Merge branch 'current' into new

Franck Nijhof, 2023-06-08 16:48:58 +02:00
commit 9ad986a3d0
GPG Key ID: D62583BA8AB11CA3 (no known key found for this signature in database)
234 changed files with 6130 additions and 1883 deletions


@@ -27,6 +27,7 @@ source/_integrations/airtouch4.markdown @LonePurpleWolf
 source/_integrations/airvisual.markdown @bachya
 source/_integrations/airvisual_pro.markdown @bachya
 source/_integrations/airzone.markdown @Noltari
+source/_integrations/airzone_cloud.markdown @Noltari
 source/_integrations/aladdin_connect.markdown @mkmer
 source/_integrations/alarm_control_panel.markdown @home-assistant/core
 source/_integrations/alert.markdown @home-assistant/core @frenck
@@ -40,7 +41,7 @@ source/_integrations/amp_motorization.markdown @starkillerOG
 source/_integrations/analytics.markdown @home-assistant/core @ludeeus
 source/_integrations/android_ip_webcam.markdown @engrbm87
 source/_integrations/androidtv.markdown @JeffLIrion @ollo69
-source/_integrations/androidtv_remote.markdown @tronikos
+source/_integrations/androidtv_remote.markdown @tronikos @Drafteed
 source/_integrations/anova.markdown @Lash-L
 source/_integrations/anthemav.markdown @hyralex
 source/_integrations/anwb_energie.markdown @klaasnicolaas
@@ -121,6 +122,7 @@ source/_integrations/cloudflare.markdown @ludeeus @ctalkington
 source/_integrations/coinbase.markdown @tombrien
 source/_integrations/color_extractor.markdown @GenericStudent
 source/_integrations/comfoconnect.markdown @michaelarnauts
+source/_integrations/command_line.markdown @gjohansson-ST
 source/_integrations/compensation.markdown @Petro31
 source/_integrations/config.markdown @home-assistant/core
 source/_integrations/configurator.markdown @home-assistant/core
@@ -135,6 +137,8 @@ source/_integrations/crownstone.markdown @Crownstone @RicArch97
 source/_integrations/cups.markdown @fabaff
 source/_integrations/dacia.markdown @epenet
 source/_integrations/daikin.markdown @fredrike
+source/_integrations/date.markdown @home-assistant/core
+source/_integrations/datetime.markdown @home-assistant/core
 source/_integrations/debugpy.markdown @frenck
 source/_integrations/deconz.markdown @Kane610
 source/_integrations/default_config.markdown @home-assistant/core
@@ -176,6 +180,7 @@ source/_integrations/ecowitt.markdown @pvizeli
 source/_integrations/efergy.markdown @tkdrob
 source/_integrations/egardia.markdown @jeroenterheerdt
 source/_integrations/eight_sleep.markdown @mezz64 @raman325
+source/_integrations/electrasmart.markdown @jafar-atili
 source/_integrations/elgato.markdown @frenck
 source/_integrations/elkm1.markdown @gwww @bdraco
 source/_integrations/elmax.markdown @albertogeniola
@@ -259,6 +264,7 @@ source/_integrations/google.markdown @allenporter
 source/_integrations/google_assistant.markdown @home-assistant/cloud
 source/_integrations/google_assistant_sdk.markdown @tronikos
 source/_integrations/google_cloud.markdown @lufton
+source/_integrations/google_generative_ai_conversation.markdown @tronikos
 source/_integrations/google_mail.markdown @tkdrob
 source/_integrations/google_sheets.markdown @tkdrob
 source/_integrations/google_travel_time.markdown @eifinger
@@ -305,7 +311,7 @@ source/_integrations/humidifier.markdown @home-assistant/core @Shulyaka
 source/_integrations/hunterdouglas_powerview.markdown @bdraco @kingy444 @trullock
 source/_integrations/hurrican_shutters_wholesale.markdown @starkillerOG
 source/_integrations/hvv_departures.markdown @vigonotion
-source/_integrations/hydrawise.markdown @ptcryan
+source/_integrations/hydrawise.markdown @dknowles2 @ptcryan
 source/_integrations/hyperion.markdown @dermotduffy
 source/_integrations/ialarm.markdown @RyuzakiKK
 source/_integrations/iammeter.markdown @lewei50
@@ -345,6 +351,7 @@ source/_integrations/jellyfin.markdown @j-stienstra @ctalkington
 source/_integrations/jewish_calendar.markdown @tsvi
 source/_integrations/juicenet.markdown @jesserockz
 source/_integrations/justnimbus.markdown @kvanzuijlen
+source/_integrations/jvc_projector.markdown @SteveEasley
 source/_integrations/kaiterra.markdown @Michsior14
 source/_integrations/kaleidescape.markdown @SteveEasley
 source/_integrations/keba.markdown @dannerph
@@ -364,6 +371,7 @@ source/_integrations/kulersky.markdown @emlove
 source/_integrations/lacrosse_view.markdown @IceBotYT
 source/_integrations/lametric.markdown @robbiet480 @frenck @bachya
 source/_integrations/landisgyr_heat_meter.markdown @vpathuis
+source/_integrations/lastfm.markdown @joostlek
 source/_integrations/launch_library.markdown @ludeeus @DurgNomis-drol
 source/_integrations/laundrify.markdown @xLarry
 source/_integrations/lcn.markdown @alengwenus
@@ -447,6 +455,7 @@ source/_integrations/nest.markdown @allenporter
 source/_integrations/netatmo.markdown @cgtobi
 source/_integrations/netdata.markdown @fabaff
 source/_integrations/netgear.markdown @hacf-fr @Quentame @starkillerOG
+source/_integrations/netgear_lte.markdown @tkdrob
 source/_integrations/network.markdown @home-assistant/core
 source/_integrations/nexia.markdown @bdraco
 source/_integrations/nexity.markdown @imicknl @vlebourl @tetienne @nyroDev
@@ -491,6 +500,7 @@ source/_integrations/openerz.markdown @misialq
 source/_integrations/openexchangerates.markdown @MartinHjelmare
 source/_integrations/opengarage.markdown @danielhiversen
 source/_integrations/openhome.markdown @bazwilliams
+source/_integrations/opensky.markdown @joostlek
 source/_integrations/opentherm_gw.markdown @mvn23
 source/_integrations/openuv.markdown @bachya
 source/_integrations/openweathermap.markdown @fabaff @freekode @nzapponi
@@ -510,6 +520,7 @@ source/_integrations/philips_js.markdown @elupus
 source/_integrations/pi_hole.markdown @johnluetke @shenxn
 source/_integrations/picnic.markdown @corneyl
 source/_integrations/pilight.markdown @trekky12
+source/_integrations/piper.markdown @balloob @synesthesiam
 source/_integrations/plaato.markdown @JohNan
 source/_integrations/plex.markdown @jjlawren
 source/_integrations/plugwise.markdown @CoMPaTech @bouwew @frenck
@@ -542,7 +553,7 @@ source/_integrations/qwikswitch.markdown @kellerza
 source/_integrations/rachio.markdown @bdraco
 source/_integrations/radarr.markdown @tkdrob
 source/_integrations/radio_browser.markdown @frenck
-source/_integrations/radiotherm.markdown @bdraco @vinnyfuria
+source/_integrations/radiotherm.markdown @vinnyfuria
 source/_integrations/rainbird.markdown @konikvranik @allenporter
 source/_integrations/raincloud.markdown @vanstinator
 source/_integrations/rainforest_eagle.markdown @gtdiehl @jcalbert @hastarin
@@ -566,7 +577,7 @@ source/_integrations/rfxtrx.markdown @danielhiversen @elupus @RobBie1221
 source/_integrations/rhasspy.markdown @balloob @synesthesiam
 source/_integrations/ridwell.markdown @bachya
 source/_integrations/risco.markdown @OnFreund
-source/_integrations/rituals_perfume_genie.markdown @milanmeu
+source/_integrations/rituals_perfume_genie.markdown @milanmeu @frenck
 source/_integrations/rmvtransport.markdown @cgtobi
 source/_integrations/roborock.markdown @humbertogontijo @Lash-L
 source/_integrations/roku.markdown @ctalkington
@@ -622,7 +633,7 @@ source/_integrations/siren.markdown @home-assistant/core @raman325
 source/_integrations/sisyphus.markdown @jkeljo
 source/_integrations/sky_hub.markdown @rogerselwyn
 source/_integrations/skybell.markdown @tkdrob
-source/_integrations/slack.markdown @bachya @tkdrob
+source/_integrations/slack.markdown @tkdrob
 source/_integrations/sleepiq.markdown @mfugate1 @kbickar
 source/_integrations/slide.markdown @ualex73
 source/_integrations/slimproto.markdown @marcelveldt
@@ -654,7 +665,7 @@ source/_integrations/speedtestdotnet.markdown @rohankapoorcom @engrbm87
 source/_integrations/spider.markdown @peternijssen
 source/_integrations/splunk.markdown @Bre77
 source/_integrations/spotify.markdown @frenck
-source/_integrations/sql.markdown @dgomes @gjohansson-ST @dougiteixeira
+source/_integrations/sql.markdown @gjohansson-ST @dougiteixeira
 source/_integrations/squeezebox.markdown @rajlaud
 source/_integrations/srp_energy.markdown @briglx
 source/_integrations/starline.markdown @anonym-tsk
@@ -678,7 +689,7 @@ source/_integrations/switch.markdown @home-assistant/core
 source/_integrations/switch_as_x.markdown @home-assistant/core
 source/_integrations/switchbee.markdown @jafar-atili
 source/_integrations/switchbot.markdown @bdraco @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski
-source/_integrations/switcher_kis.markdown @tomerfi @thecode
+source/_integrations/switcher_kis.markdown @thecode
 source/_integrations/switchmate.markdown @danielhiversen @qiz-li
 source/_integrations/symfonisk.markdown @cgtobi @jjlawren
 source/_integrations/syncthing.markdown @zhulik
@@ -686,7 +697,7 @@ source/_integrations/syncthru.markdown @nielstron
 source/_integrations/synology_dsm.markdown @hacf-fr @Quentame @mib1185
 source/_integrations/synology_srm.markdown @aerialls
 source/_integrations/system_bridge.markdown @timmo001
-source/_integrations/tado.markdown @michaelarnauts
+source/_integrations/tado.markdown @michaelarnauts @chiefdragon
 source/_integrations/tag.markdown @balloob @dmulcahey
 source/_integrations/tailscale.markdown @frenck
 source/_integrations/tankerkoenig.markdown @guillempages @mib1185
@@ -706,6 +717,7 @@ source/_integrations/thread.markdown @home-assistant/core
 source/_integrations/tibber.markdown @danielhiversen
 source/_integrations/tile.markdown @bachya
 source/_integrations/tilt_ble.markdown @apt-itude
+source/_integrations/time.markdown @home-assistant/core
 source/_integrations/time_date.markdown @fabaff
 source/_integrations/tmb.markdown @alemuro
 source/_integrations/todoist.markdown @boralyl
@@ -772,6 +784,7 @@ source/_integrations/webostv.markdown @thecode
 source/_integrations/websocket_api.markdown @home-assistant/core
 source/_integrations/wemo.markdown @esev
 source/_integrations/whirlpool.markdown @abmantis @mkmer
+source/_integrations/whisper.markdown @balloob @synesthesiam
 source/_integrations/whois.markdown @frenck
 source/_integrations/wiffi.markdown @mampfes
 source/_integrations/wilight.markdown @leofig-rj
@@ -790,6 +803,7 @@ source/_integrations/xiaomi_ble.markdown @Jc2k @Ernst79
 source/_integrations/xiaomi_miio.markdown @rytilahti @syssi @starkillerOG
 source/_integrations/xiaomi_tv.markdown @simse
 source/_integrations/xmpp.markdown @fabaff @flowolf
+source/_integrations/yale_home.markdown @bdraco
 source/_integrations/yale_smart_alarm.markdown @gjohansson-ST
 source/_integrations/yalexs_ble.markdown @bdraco
 source/_integrations/yamaha_musiccast.markdown @vigonotion @micha91
@@ -799,6 +813,7 @@ source/_integrations/yeelightsunflower.markdown @lindsaymarkward
 source/_integrations/yi.markdown @bachya
 source/_integrations/yolink.markdown @matrixd2
 source/_integrations/youless.markdown @gjong
+source/_integrations/youtube.markdown @joostlek
 source/_integrations/zamg.markdown @killer0071234
 source/_integrations/zengge.markdown @emontnemery
 source/_integrations/zeroconf.markdown @bdraco


@@ -17,7 +17,7 @@ group :jekyll_plugins do
 end
 gem 'sinatra', '3.0.6'
-gem 'nokogiri', '1.14.3'
+gem 'nokogiri', '1.15.2'
 # Windows and JRuby does not include zoneinfo files, so bundle the tzinfo-data gem
 # and associated library


@@ -121,9 +121,9 @@ social:
 # Home Assistant release details
 current_major_version: 2023
-current_minor_version: 5
-current_patch_version: 2
-date_released: 2023-05-05
+current_minor_version: 6
+current_patch_version: 0
+date_released: 2023-06-07
 # Either # or the anchor link to latest release notes in the blog post.
 # Must be prefixed with a # and have double quotes around it.

package-lock.json (generated, 1141 changed lines): file diff suppressed because it is too large.


@@ -12,10 +12,10 @@
 "tailwindcss": "^3.2.7",
 "remark-cli": "^11.0.0",
 "remark-frontmatter": "^4.0.1",
-"remark-lint": "^9.1.1",
-"remark-lint-fenced-code-flag": "^3.1.1",
-"remark-lint-no-shell-dollars": "^3.1.1",
-"remark-stringify": "^10.0.2",
+"remark-lint": "^9.1.2",
+"remark-lint-fenced-code-flag": "^3.1.2",
+"remark-lint-no-shell-dollars": "^3.1.2",
+"remark-stringify": "^10.0.3",
 "textlint": "^13.3.2",
 "textlint-filter-rule-comments": "^1.2.2",
 "textlint-rule-common-misspellings": "^1.0.1",


@@ -65,7 +65,7 @@ entity:
   type: string
 type:
   required: false
-  description: "Sets a custom card type: `custom:my-custom-card`"
+  description: "Sets a custom card type: `custom:my-custom-card`. It also can be used to force entities with a default special row format to render as a simple state. You can do this by setting the type: `simple-entity`. This can be used, for example, to replace a helper with an editable control with a read-only value."
   type: string
 name:
   required: false
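To illustrate the new `simple-entity` row type described above, here is a minimal, hypothetical entities card; the `input_number.target_temperature` helper is a placeholder and not part of this change:

```yaml
type: entities
title: Climate
entities:
  # Default row: the helper is rendered with its editable control
  - entity: input_number.target_temperature
  # Forced simple state row: the same helper rendered as a read-only value
  - entity: input_number.target_temperature
    type: simple-entity
    name: Target temperature (read-only)
```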


@@ -57,6 +57,19 @@
   aliases:
     - automations
+- term: Backup
+  definition: >-
+    Home Assistant has built-in functionality to create files containing a copy of
+    your configuration. This can be used to restore your Home Assistant as well
+    as migrate to a new system. The backup feature is available on some installation
+    types.
+  link: /integrations/backup/
+  excerpt: >-
+    Home Assistant has built-in functionality to create files containing a copy of
+    your configurations. This is available on certain installation types.
+  aliases:
+    - backups
 - term: Binary sensor
   definition: >-
     A binary sensor returns information about things that only have two states -
@@ -284,6 +297,20 @@
     other integrations.
   link: /docs/configuration/platform_options/
+- term: Reload
+  definition: >-
+    Applies the changes made to the Home Assistant configuration files. Changes
+    are normally automatically updated. However, changes made outside of the front
+    end will not be reflected in Home Assistant and require a reload.
+    To perform a manual reload, go to **Settings** > **System** >
+    **Restart Home Assistant** (top right) > **Quick reload**. More granular
+    reload options are available in *YAML configuration reloading* section
+    in **Developer tools** > **YAML**.
+  excerpt: >
+    Applies the changes made to Home Assistant configuration files. Changes are normally
+    automatically updated. However, changes made outside of the front
+    end will not be reflected in Home Assistant and require a reload.
 - term: Scene
   definition: >-
     Scenes capture the states you want certain entities to be. For example,
@@ -380,7 +407,7 @@
 - term: TTS
   definition: >-
-    TTS (text to speech) allows Home Assistant to talk to you.
+    TTS (text-to-speech) allows Home Assistant to talk to you.
   link: /integrations/tts/
 - term: Variables


@@ -10,7 +10,7 @@ The automation's `mode` configuration option controls what happens when the auto
 Mode | Description
 -|-
 `single` | (Default) Do not start a new run. Issue a warning.
-`restart` | Start a new run after first stopping previous run.
+`restart` | Start a new run after first stopping the previous run. The automation only restarts if the conditions are met.
 `queued` | Start a new run after all previous runs complete. Runs are guaranteed to execute in the order they were queued. Note that subsequent queued automations will only join the queue if any conditions it may have are met at the time it is triggered.
 `parallel` | Start a new, independent run in parallel with previous runs.
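A minimal sketch of the `restart` mode described in the table above; the motion sensor and light entity IDs are placeholders:

```yaml
automation:
  - alias: "Hallway motion light"
    mode: restart  # each new trigger stops the running sequence and restarts the 5-minute timer
    trigger:
      - platform: state
        entity_id: binary_sensor.hallway_motion
        to: "on"
    action:
      - service: light.turn_on
        target:
          entity_id: light.hallway
      - delay: "00:05:00"
      - service: light.turn_off
        target:
          entity_id: light.hallway
```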


@@ -13,11 +13,11 @@ Quick links:
 Automations based on a blueprint only need to be configured to be used. What needs to be configured differs on each blueprint.
-To create your first automation based on a blueprint, go to **{% my config %}** -> **Automations & Scenes** -> **{% my blueprints %}**. Find the blueprint that you want to use and click on "Create Automation".
+To create your first automation based on a blueprint, go to **{% my blueprints title="Settings > Automations & Scenes > Blueprints" %}**. Find the blueprint that you want to use and select **Create Automation**.
 This will open the automation editor with the blueprint selected. Give it a name and configure the blueprint and click on the blue button "Save Automation" in the bottom right.
-Done! If you want to revisit the configuration values, you can find it by going to **{% my config %}** and then **{% my automations %}**.
+Done! If you want to revisit the configuration values, you can find it by going to **Settings** and then **{% my blueprints %}**.
 ## Importing blueprints


@@ -561,8 +561,7 @@ include_entities:
 filter:
   description: >
     When filter options are provided, the entities are limited by entities
-    that at least match the given conditions. Can be either a object or a list of object.
-    Can be either a object or a list of object.
+    that at least match the given conditions. Can be either an object or a list of objects.
   type: list
   required: false
   keys:
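To show the "object or a list of objects" behavior documented above, here is a hedged sketch of an entity selector in a blueprint input; the blueprint name, input names, and the Hue integration are illustrative only:

```yaml
blueprint:
  name: Filter example
  domain: automation
  input:
    # Single filter object: only light entities can be picked
    light_entity:
      name: Light
      selector:
        entity:
          filter:
            domain: light
    # List of filter objects: Hue lights or any media player match
    other_entity:
      name: Other entity
      selector:
        entity:
          filter:
            - integration: hue
              domain: light
            - domain: media_player
```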


@@ -3,17 +3,17 @@ title: "Configuration.yaml"
 description: "Configuring Home Assistant via text files."
 ---
-While you can configure most of Home Assistant directly from the user interface under {% my config %}, some parts need you to edit `configuration.yaml`. This file contains integrations to be loaded along with their configurations. Throughout the documentation you will find snippets that you can add to your configuration file to enable specific functionality.
+While you can configure most of Home Assistant directly from the user interface under {% my config %}, some parts need you to edit `configuration.yaml`. This file contains {% term integrations %} to be loaded along with their configurations. Throughout the documentation you will find snippets that you can add to your configuration file to enable specific functionality.
 If you run into trouble while configuring Home Assistant, refer to the [configuration troubleshooting page](/docs/configuration/troubleshooting/) and the [`configuration.yaml` examples](/examples/#example-configurationyaml).
 ## Editing `configuration.yaml`
-The easiest option to edit `configuration.yaml` is to use the {% my supervisor_addon title="Studio Code Server add-on" addon="a0d7b954_vscode" %}. This add-on runs VS Code, which offers live syntax checking and auto-fill of various Home Assistant entities (if unavailable on your system, use {% my supervisor_addon title="File Editor add-on" addon="core_configurator" %} instead).
+The easiest option to edit `configuration.yaml` is to use the {% my supervisor_addon title="Studio Code Server add-on" addon="a0d7b954_vscode" %}. This add-on runs VS Code, which offers live syntax checking and auto-fill of various Home Assistant entities. See [here](/common-tasks/supervised/#installing-and-using-the-visual-studio-code-vsc-add-on) for details. If unavailable on your system, use {% my supervisor_addon title="File Editor add-on" addon="core_configurator" %} instead. Again, details can be found [here](/common-tasks/supervised/#installing-and-using-the-file-editor-add-on).
-If you prefer to use a file editor on your computer, use the {% my supervisor_addon title="Samba add-on" addon="core_samba" %} to access the files as a network share.
+If you prefer to use a file editor on your computer, use the {% my supervisor_addon title="Samba add-on" addon="core_samba" %} to access the files as a network share. More details can be found [here](/common-tasks/supervised/#installing-and-using-the-samba-add-on).
-The path to your configuration directory can be found in the Home Assistant frontend by going to {% my system_health title="Settings > System > Repairs > System information from the top right menu" %}
+The path to your configuration directory can be found in the Home Assistant {% term frontend %} by going to {% my system_health title="Settings > System > Repairs > System information from the top right menu" %}
 ![Show system menu option](/images/screenshots/System_information_menu.png)
@@ -28,7 +28,7 @@ _If you use Home Assistant Core, you can find `configuration.yaml` in the config
 ## Reloading changes
-Most integrations in Home Assistant that do not interact with devices or services can reload changes made to their configuration in `configuration.yaml`. To do this, go to {% my server_controls title="Developer Tools > YAML" %} and scroll down to the YAML configuration reloading section (alternatively, hit "c" anywhere in the UI and search for it).
+Most integrations in Home Assistant that do not interact with {% term devices %} or {% term services %} can reload changes made to their configuration in `configuration.yaml`. To do this, go to {% my server_controls title="Developer Tools > YAML" %} and scroll down to the YAML configuration reloading section (alternatively, hit "c" anywhere in the UI and search for it).
 If you can't see your integration listed there, you will need to restart Home Assistant for changes to take effect.
@@ -40,6 +40,6 @@ If you can't see your integration listed there, you will need to restart Home As
 ## Migrating to a new system
-The preferred way of migrating to a new system is by {% my supervisor_backups title="making a backup" %}. Once you have created the backup on the old system, you can download it to the system that is running the Home Assistant frontend. When setting up the new system, you may use the backup. Alternatively, you can upload it to your new system using the *Upload backup* menu option of the *Backups* menu. Then, a restore of the uploaded backup on the new system concludes the migration.
+The preferred way of migrating to a new system is by {% my supervisor_backups title="making a backup" %}. Once you have created the backup on the old system, you can download it to the system that is running the Home Assistant frontend. When setting up the new system, you may use the backup. Alternatively, you can upload it to your new system using the _Upload backup_ menu option of the _Backups_ menu. Then, a restore of the uploaded backup on the new system concludes the migration.
 If you run the container or core installation methods, you will need to manually make a backup of your configuration folder. Be aware that some of the files you need start with `.`, which is hidden by default from both `ls` (in SSH), in Windows Explorer, and macOS Finder. You'll need to ensure that you're viewing all files before you copy them.


@@ -20,6 +20,9 @@ Users should upgrade the firmware on all 700 series controllers to version 7.17.
 </div>
+- 800 series controllers
+  - Zooz 800 Series Z-Wave Long Range S2 Stick (ZST39 LR)
 - 700 series controllers
   - Aeotec Z-Stick 7 USB stick (ZWA010) (the EU version is not recommended due to RF performance issues)
   - Silicon Labs UZB-7 USB Stick (Silabs SLUSB7000A / SLUSB001A)
@@ -39,7 +42,7 @@ Users should upgrade the firmware on all 700 series controllers to version 7.17.
   - Z-Wave.Me RaZberry 7 Pro (ZMEERAZBERRY7_PRO or ZMEURAZBERRY7_PRO, 700 series)
   - Z-Wave.Me Razberry 2 (500 series)
-If you are just starting out, we recommend that you purchase a 700 series controller or a Raspberry Pi module.
+If you are just starting out, we recommend that you purchase a 700 series controller or a Raspberry Pi module. The 700 series controllers are the more recent version (when compared to the 500 series). The 700 series controllers support SmartStart, which allows you to add a device by scanning a QR code.
 <div class='note'>
 If you're using Home Assistant OS, Supervised, or Container, it's recommended to use a USB stick, not a module. Passing a module through Docker is more complicated than passing a USB stick through.


@ -0,0 +1,6 @@
---
title: "Configuration.yaml by dannytsang"
description: ""
ha_category: Example configuration.yaml
ha_external_link: https://github.com/dannytsang/homeassistant-config
---


@@ -39,22 +39,6 @@
       </li>
     </ul>
   </li>
-  <li>
-    <b>{% active_link /docs/assist/ Assist %}</b>
-    <ul>
-      <li>{% active_link /docs/assist/android/ Assist for Android %}</li>
-      <li>{% active_link /docs/assist/apple/ Assist for Apple devices %}</li>
-      <li>{% active_link /docs/assist/builtin_sentences/ Built-in sentences %}</li>
-      <li>{% active_link /docs/assist/custom_sentences/ Custom sentences %}</li>
-      <li>{% active_link /docs/assist/voice_remote_expose_devices/ Exposing devices to your voice assistant %}</li>
-      <li>{% active_link /docs/assist/voice_remote_local_assistant/ Configuring a local assistant %}</li>
-      <li>{% active_link /docs/assist/troubleshooting/ Troubleshooting Assist %}</li>
-      <li>{% active_link /docs/assist/voice_remote_local_assistant/ Configuring a local assistant %}</li>
-      <li>{% active_link /projects/worlds-most-private-voice-assistant/ Tutorial: World's most private voice assistant %}</li>
-      <li>{% active_link /projects/thirteen-usd-voice-remote/ Tutorial: $13 voice remote %}
-      </li>
-    </ul>
-  </li>
   <li>
     <b>{% active_link /docs/energy/ Home Energy Management %}</b>
     <ul>


@ -0,0 +1,32 @@
<section class="aside-module grid__item one-whole lap-one-half">
{% assign elements = site.dashboards | sort_natural: 'title' %}
<div class="section">
<h1 class="title delta">Devices</h1>
<ul class="divided sidebar-menu">
<li>{% active_link /voice_control/android/ Assist for Android %}</li>
<li>{% active_link /voice_control/apple/ Assist for Apple %}</li>
</ul>
</div>
<div class="section">
<h1 class="title delta">Voice assistants</h1>
<ul class="divided sidebar-menu">
<li>{% active_link /voice_control/using_voice_assistants_overview/ Voice assistants: Overview %}</li>
<li>{% active_link /voice_control/voice_remote_local_assistant/ Configuring a local assistant %}</li>
<li>{% active_link /voice_control/voice_remote_expose_devices/ Exposing devices to voice assistant %}</li>
<li>{% active_link /voice_control/builtin_sentences/ Built-in sentences %}</li>
<li>{% active_link /voice_control/custom_sentences/ Custom sentences %}</li>
<li>{% active_link /voice_control/using_tts_in_automation/ Using Piper TTS in automations %}</li>
<li>{% active_link /voice_control/troubleshooting/ Troubleshooting Assist %}</li>
</ul>
</div>
<div class="section">
<h1 class="title delta">Projects</h1>
<ul class="divided sidebar-menu">
<li>{% active_link /voice_control/worlds-most-private-voice-assistant/ Tutorial: World's most private voice assistant %}</li>
<li>{% active_link /voice_control/thirteen-usd-voice-remote/ Tutorial: $13 voice remote %}</li>
</ul>
</div>
</section>


@@ -28,9 +28,9 @@ The data disk feature can be used on an existing installation without losing dat
 1. Connect the data disk to your system.
 2. Go to **{% my storage title="Settings > System > Storage" %}** in the UI.
-3. Press the three dots on the top right and choose "Move datadisk"
+3. Select the "Move data disk" button.
 4. Select the data disk from the list of available devices.
-5. Press "Move".
+5. Select **Move**.
 ![Screenshot of the "Move datadisk" feature](/images/screenshots/move-datadisk.png)


@ -0,0 +1,81 @@
## Network storage
You can configure both Network File Share (NFS) and Windows samba (CIFS) targets to be used within Home Assistant and add-ons.
To list all your currently connected network storages, go to **{% my storage title="Settings > System > Storage" %}** in the UI.
{% if page.installation == "os" %}
<div class='note'>
You need to update to Home Assistant Operating System 10.2 before you can use this feature.
</div>
{% endif %}
<p class='img'>
<picture>
<source srcset="/images/screenshots/network-storage/list_dark.png" media="(prefers-color-scheme: dark)">
<img src="/images/screenshots/network-storage/list_light.png">
</picture>
Screenshot of the list of network shares inside the storage panel.
</p>
### Add a new network storage
1. Go to **{% my storage title="Settings > System > Storage" %}** in the UI.
1. Select **Add network storage**.
1. Fill out all the information for your network storage.
1. Select **Connect**.
<p class='img'>
<picture>
<source srcset="/images/screenshots/network-storage/connect_dark.png" media="(prefers-color-scheme: dark)">
<img src="/images/screenshots/network-storage/connect_light.png">
</picture>
Screenshot of connecting a new network storage.
</p>
#### Network storage configuration
{% configuration_basic "hassio.network_share" %}
Name:
description: This is the name that will be used for the mounted directory on your system.
Usage:
description: Here, you select how the target should be used.
Server:
description: The IP/hostname of the server running NFS/CIFS.
Protocol:
description: The service the server is using for the network storage.
"[NFS]<sup>1</sup> Remote share path":
description: The path used to connect to the remote storage server.
"[CIFS]<sup>2</sup> Username":
description: The username to use when connecting to the storage server.
"[CIFS]<sup>2</sup> Password":
description: The password to use when connecting to the storage server.
"[CIFS]<sup>2</sup> Share":
description: The share to connect to on the storage server.
{% endconfiguration_basic %}
<sup>1</sup> _Options prefixed with `[NFS]` are only available for NFS targets._<br>
<sup>2</sup> _Options prefixed with `[CIFS]` are only available for CIFS targets._
### Change default backup location
By default, the first network storage of type **Backup** that you add will be set as your default backup target.
If you want to change the default backup target, you can do the following:
1. Go to **{% my backup title="Settings > System > Backups" %}** in the UI.
1. Select the menu in the top right of the screen and select the **Change default backup location** option.
1. In the dialog, there is a single option to set the default backup target.
1. Choose the one you want from the list.
1. Select **Save**.
This list will contain all the network storage targets you have added of usage type **Backup**. It also contains another option to set it back to use `/backup` again.
<p class='img'>
<picture>
<source srcset="/images/screenshots/network-storage/change_backup_dark.png" media="(prefers-color-scheme: dark)">
<img src="/images/screenshots/network-storage/change_backup_light.png">
</picture>
Screenshot of changing the default backup target.
</p>


@@ -33,7 +33,7 @@ sudo apt-get upgrade -y
 Install the dependencies:
 ```bash
-sudo apt-get install -y python3 python3-dev python3-venv python3-pip bluez libffi-dev libssl-dev libjpeg-dev zlib1g-dev autoconf build-essential libopenjp2-7 libtiff5 libturbojpeg0-dev tzdata
+sudo apt-get install -y python3 python3-dev python3-venv python3-pip bluez libffi-dev libssl-dev libjpeg-dev zlib1g-dev autoconf build-essential libopenjp2-7 libtiff5 libturbojpeg0-dev tzdata ffmpeg liblapack3 liblapack-dev libatlas-base-dev
 ```
 The above-listed dependencies might differ or be missing, depending on your system or personal use of Home Assistant.


@@ -12,15 +12,34 @@ Follow this guide if you want to get started with Home Assistant easily or if yo
 We will need a few things to get started with installing Home Assistant. The links below lead to Ameridroid. If you're not in the US, you should be able to find these items in web stores in your country.
-To get started we suggest the ODROID N2+, it's the most powerful ODROID. It's fast and with built-in eMMC one of the best boards to run Home Assistant. It's also the board that powers our [Home Assistant Blue](/blue/).
-- [ODROID N2+](https://ameridroid.com/products/odroid-n2-plus?ref=eeb6nfw07e)
-- [Power Supply](https://ameridroid.com/products/12v-2a-power-supply-plug?ref=eeb6nfw07e)
-- [CR2032 Coin Cell](https://ameridroid.com/products/rtc-bios-battery?ref=eeb6nfw07e)
-- [eMMC Module](https://ameridroid.com/products/emmc-module-n2-linux-red-dot?ref=eeb6nfw07e)
-- [Case](https://ameridroid.com/products/odroid-n2-case?ref=eeb6nfw07e)
-If unavailable, we also recommend the [ODROID C4](https://ameridroid.com/products/odroid-c4?ref=eeb6nfw07e) or [ODROID M1](https://ameridroid.com/products/odroid-M1?ref=eeb6nfw07e).
+To get started, we suggest the ODROID N2+, the board that powers our [Home Assistant Blue](/blue/), or the ODROID M1.
+If unavailable, we also recommend the [ODROID C4](https://ameridroid.com/products/odroid-c4?ref=eeb6nfw07e).
+Home Assistant bundles (US market):
+The bundles come with Home Assistant pre-installed.
+* [ODROID N2+: 2 GB RAM / 16 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44748729286935?ref=eeb6nfw07e)
+* [ODROID N2+: 4 GB RAM / 64 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44748729221399?ref=eeb6nfw07e)
+* ODROID M1: 4 GB RAM / 256 GB NVMe / [16 GB &micro;SD](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44929573028119?ref=eeb6nfw07e) or [16 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44994940567831?ref=eeb6nfw07e)
+* ODROID M1: 8 GB RAM / 256 GB NVMe / [16 GB &micro;SD](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44929573093655?ref=eeb6nfw07e) or [16 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44994940633367?ref=eeb6nfw07e)
+* [ODROID M1: 8 GB RAM / 1 TB NVMe / 64 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44994940698903?ref=eeb6nfw07e)
+* ODROID XU4: 2 GB RAM / [32 GB &micro;SD](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44748729352471?ref=eeb6nfw07e) or [16 GB eMMC](https://ameridroid.com/products/odroid-n2-home-assistant-blue-bundle-limited-edition?variant=44748782305559?ref=eeb6nfw07e)
+Variants without pre-installed Home Assistant:
+* ODROID N2+, [2 GB RAM](https://ameridroid.com/products/odroid-n2-plus?variant=40371828719650?ref=eeb6nfw07e) or [4 GB RAM](https://ameridroid.com/products/odroid-n2-plus?variant=40371828752418?ref=eeb6nfw07e)
+* [ODROID C4](https://ameridroid.com/products/odroid-c4?ref=eeb6nfw07e)
+* [ODROID M1](https://ameridroid.com/products/odroid-M1?ref=eeb6nfw07e)
+* [Power Supply](https://ameridroid.com/products/12v-2a-power-supply-plug?ref=eeb6nfw07e)
+* [CR2032 Coin Cell](https://ameridroid.com/products/rtc-bios-battery?ref=eeb6nfw07e)
+* [eMMC Module](https://ameridroid.com/products/emmc-module-n2-linux-red-dot?ref=eeb6nfw07e)
+* [Case](https://ameridroid.com/products/odroid-n2-case?ref=eeb6nfw07e)
+* These are affiliated links. We get commissions for purchases made through links in this post.*
 {% endif %}
@@ -146,6 +165,7 @@ _Select and copy the URL or use the "copy" button that appear when you hover it.
 ![Screenshot of the Etcher software showing the Flash button highlighted.](/images/installation/etcher5.png)
 1. When Balena Etcher has finished writing the image, you will see a confirmation.
 ![Screenshot of the Etcher software showing that the installation has completed.](/images/installation/etcher6.png)
+* If you are having issues with Balena Etcher, try version [1.10](https://github.com/balena-io/etcher/releases/tag/v1.10.4).
 ### Start up your {{site.installation.types[page.installation_type].board}}
@@ -215,6 +235,11 @@ After downloading, decompress the image. If the image comes in a ZIP file, for e
 Follow this guide if you already are running a supported virtual machine hypervisor. If you are not familiar with virtual machines, we recommend installing Home Assistant OS directly on a [Home Assistant Yellow](/installation/yellow), a [Raspberry Pi](/installation/raspberrypi), or an [ODROID](/installation/odroid).
+{% if page.installation_type == 'macos' %}
+- If VirtualBox is not supported on your Mac, and you have experience using virtual machines, you can try running the Home Assistant Operating system on [UTM](https://mac.getutm.app/).
+{% endif %}
 ### Create the virtual machine
 Load the appliance image into your virtual machine hypervisor. (Note: You are free to assign as much resources as you wish to the VM, please assign enough based on your add-on needs).


@@ -23,17 +23,16 @@ manually:
 {% endif %}
 - Browse to your Home Assistant instance.
-- In the sidebar, select **{% my config icon %}**.
-- From the configuration menu, select **{% my integrations %}**.
+- Go to **{% my integrations title="Settings > Devices & Services" %}**.
 {% if page.ha_integration_type == 'helper' %}
-- In top of the screen, select the tab: **{% my helpers %}**.
-- In the bottom right, select the
+- At the top of the screen, select the tab: **{% my helpers %}**.
+- In the bottom right corner, select the
   **{% my config_flow_start icon domain=domain title="Create helper" %}** button.
 {% else %}
-- In the bottom right, select the
+- In the bottom right corner, select the
   **{% my config_flow_start icon domain=domain %}** button.
 {% endif %}
-- From the list, search and select **{{ name }}**.
+- From the list, select **{{ name }}**.
 - Follow the instructions on screen to complete the setup.
 {% enddetails %}


@@ -5,7 +5,6 @@
 Options for {{ name }} can be set via the user interface, by taking the following steps:
 - Browse to your Home Assistant instance.
-- In the sidebar click on _**{% my config icon %}**_.
-- From the configuration menu select: _**{% my integrations %}**_.
+- Go to **{% my integrations title="Settings > Devices & Services" %}**.
 - If multiple instances of {{ name }} are configured, choose the instance you want to configure.
-- Click on _**"Options"**_.
+- Select the cogwheel, then select **Configure**.


@@ -41,6 +41,9 @@
     <li>
       <a href="/dashboards/">Dashboards</a>
     </li>
+    <li>
+      <a href="/voice_control/">Voice control</a>
+    </li>
   </ul>
 </li>
 <li><a href="/integrations/">Integrations</a></li>


@@ -19,6 +19,8 @@
 {% include asides/docs_navigation.html %}
 {% elsif root == 'faq' %}
 {% include asides/faq_navigation.html %}
+{% elsif root == 'voice_control' %}
+{% include asides/voice_navigation.html %}
 {% elsif root == 'hassio' or root == 'addons' %}
 {% include asides/hassio_navigation.html %}
 {% elsif root == 'cloud' %}


@ -0,0 +1,64 @@
---
title: Airzone Cloud
description: Instructions on how to integrate Airzone Cloud within Home Assistant.
ha_release: 2023.6
ha_category:
- Sensor
ha_iot_class: Cloud Polling
ha_config_flow: true
ha_domain: airzone_cloud
ha_platforms:
- diagnostics
- sensor
ha_codeowners:
- '@Noltari'
ha_integration_type: integration
---
This integration interacts with the Cloud API of [Airzone devices](https://www.airzone.es/en/).
There are two main types of Airzone devices:
- [Aidoo](https://www.airzonecontrol.com/aa/en/control-solutions/aidoo/wi-fi/) / [Aidoo Pro](https://www.airzonecontrol.com/aa/en/control-solutions/aidoo/pro/)
- [Easyzone (US)](https://www.airzonecontrol.com/aa/en/control-solutions/easyzone/) / [Flexa (EU)](https://www.airzonecontrol.com/ib/es/soluciones-de-control/flexa/)
## Aidoo / Aidoo Pro
These devices are Wi-Fi controllers that are normally connected to a single air conditioner split system.
## Easyzone (US) / Flexa (EU)
These devices are connected to ducted air conditioners, motorized grilles, and individual thermostats for every room (zone). Therefore, with a single ducted air conditioning system, the user can turn on and off the air conditioner and set different desired temperatures in each room.
A typical Airzone HVAC system consists of a parent device (called *master zone* in Airzone terminology) and child devices (called *slave zones* in Airzone terminology). The [HVAC mode](https://www.home-assistant.io/integrations/climate/#service-climateset_hvac_mode) can only be changed on the parent device. On child devices, you can only enable or disable the HVAC and adjust the desired temperature for that specific device.
Note that multiple HVAC systems can be connected to the same Airzone web server. In this case, there will be one *parent zone* per HVAC system and there may also be *child zones* for each HVAC system.
{% include integrations/config_flow.md %}
{% configuration_basic %}
Username:
description: "Cloud API username"
Password:
description: "Cloud API password"
{% endconfiguration_basic %}
## Sensors
For each Airzone Aidoo (HVAC Wi-Fi controller), the following *sensors* are created:
| Condition | Description |
| :------------------ | :------------------------------------------------- |
| temperature | Measures the temperature from the HVAC thermostat. |
For each Airzone zone (thermostat), the following *sensors* are created:
| Condition | Description |
| :------------------ | :-------------------------------------------------- |
| humidity | Measures the relative humidity in the current zone. |
| temperature | Measures the temperature in the current zone. |
For each Airzone WebServer (HVAC Wi-Fi controller), the following *sensors* are created:
| Condition | Description |
| :------------------ | :------------------------------------------------- |
| rssi | Wi-Fi RSSI. |


@@ -195,6 +195,8 @@ The names must exactly match the scene names (minus underscores - Amazon discard
 In the new Alexa Skills Kit, you can also create synonyms for slot type values, which can be used in place of the base value in utterances. Synonyms will be replaced with their associated slot value in the intent request sent to the Alexa API endpoint, but only if there are not multiple synonym matches. Otherwise, the value of the synonym that was spoken will be used.
+If you want to use the `Optional ID` field next to or instead of the Synonym value, you can simply append "_Id" at the end of the template variable e.g. `Scene_Id`.
 <p class='img'>
 <img src='/images/integrations/alexa/scene_slot_synonyms.png' />
 Custom slot values with synonyms.
@@ -217,6 +219,8 @@ intent_script:
       service: scene.turn_on
       target:
         entity_id: scene.{{ Scene | replace(" ", "_") }}
+      data:
+        id: {{ Scene_Id }}
       speech:
         type: plain
         text: OK
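For readability, here is the same change consolidated into a single hypothetical `intent_script` entry; it simply mirrors the snippet in the diff above, and the intent name is a placeholder:

```yaml
intent_script:
  ActivateSceneIntent:
    action:
      - service: scene.turn_on
        target:
          entity_id: "scene.{{ Scene | replace(' ', '_') }}"
        data:
          # Scene_Id comes from the Optional ID slot described above
          id: "{{ Scene_Id }}"
    speech:
      type: plain
      text: OK
```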


@@ -16,7 +16,7 @@ Polly is a paid service via Amazon Web Services. There is a [free tier](https:/
 ## Setup
-For more information, please read the [AWS General Reference regarding Security Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html) to get the needed details. Also, check the [boto3 Documentation](https://boto3.readthedocs.io/en/latest/guide/configuration.html#shared-credentials-file) about the profiles and the [AWS Regions and Endpoints Reference](https://docs.aws.amazon.com/general/latest/gr/rande.html#pol_region) for available regions.
+For more information, please read the [AWS General Reference regarding Security Credentials](https://docs.aws.amazon.com/IAM/latest/UserGuide/security-creds.html) to get the needed details. Also, check the [boto3 Documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#shared-credentials-file) about the profiles and the [AWS Regions and Endpoints Reference](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints) for available regions.
 Available voices are listed in the [Amazon Documentation](https://docs.aws.amazon.com/polly/latest/dg/voicelist.html).
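For context, a minimal `configuration.yaml` sketch for the Amazon Polly text-to-speech platform; the credentials, region, and voice are placeholders, so check the integration page for the full list of options:

```yaml
# Example configuration.yaml entry (credentials are placeholders)
tts:
  - platform: amazon_polly
    aws_access_key_id: YOUR_ACCESS_KEY_ID
    aws_secret_access_key: YOUR_SECRET_ACCESS_KEY
    region_name: us-east-1
    voice: Joanna
```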


@ -1,23 +1,26 @@
---
title: Android TV Remote
description: Instructions on how to integrate Android TV Remote into Home Assistant.
ha_category:
  - Media Player
  - Remote
ha_release: 2023.5
ha_iot_class: Local Push
ha_config_flow: true
ha_codeowners:
  - '@tronikos'
  - '@Drafteed'
ha_quality_scale: platinum
ha_domain: androidtv_remote
ha_zeroconf: true
ha_platforms:
  - diagnostics
  - media_player
  - remote
ha_integration_type: device
---

The Android TV Remote integration allows you to control an Android TV device and launch apps on it. For this to work, the Android TV device needs to have [Android TV Remote Service](https://play.google.com/store/apps/details?id=com.google.android.tv.remote.service), which is pre-installed on most devices.

For a quick introduction on how to get started with Android TV Remote, check out this video:
@ -25,21 +28,17 @@ For a quick introduction on how to get started with Android TV Remote, check out
{% include integrations/config_flow.md %}

## Media player

This integration adds a `media_player` entity with basic playback and volume controls. The media player provides volume information and the display name of the currently active app on the Android TV. Due to API limitations, the integration will not display the playback status. It is recommended to use this integration together with the [Google Cast integration](https://www.home-assistant.io/integrations/cast/). The two media players can be combined into one using the [Universal Media Player](https://www.home-assistant.io/integrations/universal/) integration, as sketched below.
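A minimal sketch of such a combination, assuming the Android TV Remote entity is `media_player.living_room_tv` and the Cast entity is `media_player.living_room_tv_cast` (both entity IDs are placeholders):

```yaml
# Combine the Android TV Remote and Google Cast media players into one entity
media_player:
  - platform: universal
    name: Living Room TV
    children:
      # Both entity IDs are assumptions - use the IDs created in your setup
      - media_player.living_room_tv
      - media_player.living_room_tv_cast
```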
Using the `media_player.play_media` service, you can launch applications via `Deep Links` and switch channels.

### Launching apps

You can pass any URL to the device to open it in the built-in browser. Using `Deep Links`, you can launch some applications.

Examples of some `Deep Links` for popular applications:
| App | URL |
| --- | --- |
@ -48,6 +47,132 @@ Examples of URLs to pass as activity for some popular apps:
| Prime Video | https://app.primevideo.com |
| Disney+ | https://www.disneyplus.com |
Examples:
```yaml
# Launch the Netflix app
service: media_player.play_media
data:
  media_content_type: url
  media_content_id: https://www.netflix.com/title
target:
  entity_id: media_player.living_room_tv
```

```yaml
# Open a specific YouTube video:
service: media_player.play_media
data:
  media_content_type: url
  media_content_id: https://www.youtube.com/watch?v=dQw4w9WgXcQ
target:
  entity_id: media_player.living_room_tv
```
### Switch channels
You can pass the channel number to switch the channel. The channel number must be an integer.
Example:
```yaml
# Change channel to number 15:
service: media_player.play_media
data:
  media_content_type: channel
  media_content_id: 15
target:
  entity_id: media_player.living_room_tv
```
## Remote
The remote allows you to send key commands to your Android TV device with the `remote.send_command` service.
The entity has the `current_activity` attribute that shows the current foreground app on the Android TV.
{% details "List of the most common commands" %}
Navigation:
- DPAD_UP
- DPAD_DOWN
- DPAD_LEFT
- DPAD_RIGHT
- DPAD_CENTER
- BUTTON_A
- BUTTON_B
- BUTTON_X
- BUTTON_Y
Volume Control:
- VOLUME_DOWN
- VOLUME_UP
- VOLUME_MUTE
- MUTE
Media Control:
- MEDIA_PLAY_PAUSE
- MEDIA_PLAY
- MEDIA_PAUSE
- MEDIA_NEXT
- MEDIA_PREVIOUS
- MEDIA_STOP
- MEDIA_RECORD
- MEDIA_REWIND
- MEDIA_FAST_FORWARD
TV Control:
- 0
- 1
- 2
- 3
- 4
- 5
- 6
- 7
- 8
- 9
- DEL
- ENTER
- CHANNEL_UP
- CHANNEL_DOWN
- F1
- F2
- F3
- F4
- F5
- F6
- F7
- F8
- F9
- F10
- F11
- F12
- TV
- PROG_RED
- PROG_GREEN
- PROG_YELLOW
- PROG_BLUE
Other:
- BUTTON_MODE
- EXPLORER
- MENU
- INFO
- GUIDE
- TV_TELETEXT
- CAPTIONS
- DVR
- MEDIA_AUDIO_TRACK
- SETTINGS
- SEARCH
- ASSIST
{% enddetails %}
For a full list, see [here](https://github.com/tronikos/androidtvremote2/blob/main/src/androidtvremote2/remotemessage.proto#L90).

If `activity` is specified in `remote.turn_on`, it will open the specified URL in the associated app. See the [Launching apps section](#launching-apps).
Examples of service calls:

```yaml
@ -87,7 +212,7 @@ target:
entity_id: remote.living_room_tv
```
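For instance, a minimal sketch that sends a single key press (the entity ID is a placeholder):

```yaml
# Press the select/OK button on the Android TV
service: remote.send_command
target:
  entity_id: remote.living_room_tv
data:
  command: DPAD_CENTER
```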
### Dashboard example

You have to manually create buttons in Lovelace to send commands to the Android TV device or launch apps on it.


@ -15,7 +15,7 @@ ha_platforms:
  - select
---

The Assist pipeline integration provides the foundation for the [Assist](/voice_control/) voice assistant in Home Assistant.

For most users, there is no need to install this integration manually. The Assist pipeline integration is part of the default configuration and is set up automatically if needed by other integrations.

If you are not using the default integration, you need to add the following to your `configuration.yaml` file:
@ -25,4 +25,4 @@ If you are not using the default integration, you need to add the following to y
assist_pipeline:
```

For more information, refer to the procedure on [configuring a pipeline](/voice_control/voice_remote_local_assistant/).


@ -15,7 +15,7 @@ The `aws` integration provides a single place to interact with [Amazon Web Servi
## Setup

You need an AWS account to use Amazon Web Services; create one [here](https://aws.amazon.com/free/) with a 12-month free tier benefit. Please note that even in the first 12 months, you may still be billed if you use more resources than offered in the free tier. We advise you to monitor your costs in the [AWS Billing Console](https://console.aws.amazon.com/billing/) closely. You can read the [Control your AWS costs](https://aws.amazon.com/getting-started/hands-on/control-your-costs-free-tier-budgets/) guide for more information.

The `lambda`, `sns`, `sqs`, and `events` services, used in the `aws` component, all provide an **Always Free** tier for all users even after the 12-month period. The general usage in Home Automation will most likely not reach the free tier limit. Please read [Lambda Pricing](https://aws.amazon.com/lambda/pricing/), [SNS Pricing](https://aws.amazon.com/sns/pricing/), [SQS Pricing](https://aws.amazon.com/sqs/pricing/), and [EventBridge Pricing](https://aws.amazon.com/eventbridge/pricing/) for more details.
@ -104,7 +104,7 @@ context:
## Lambda Notify Usage

AWS Lambda is a notification platform and thus can be controlled by calling the `notify` service [as described here](/integrations/notify/). It will invoke a Lambda for all targets given in the notification payload. A target can be formatted as a function name, an entire ARN ([Amazon Resource Name](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html)) or a partial ARN. For more information, please see the [botocore documentation](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/services/lambda/client/invoke.html).
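As a rough sketch, a call to a configured Lambda notify service could look like this (the service name and the function ARN are placeholders):

```yaml
# Invoke the Lambda function behind a notify service named "aws_lambda"
service: notify.aws_lambda
data:
  message: "Motion detected in the garage"
  target: "arn:aws:lambda:us-east-1:123456789012:function:ha_notify_handler"
```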
The Lambda event payload will contain everything passed in the service call payload. Here is an example payload that would be sent to Lambda:
@ -132,7 +132,7 @@ The context will look like this:
## SNS Notify Usage

AWS SNS is a notification platform and thus can be controlled by calling the `notify` service [as described here](/integrations/notify/). It will publish a message to all targets given in the notification payload. A target must be a SNS topic or endpoint ARN ([Amazon Resource Name](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html)). For more information, please see the [botocore documentation](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/services/sns/client/publish.html).

If one exists, the SNS Subject will be set to the title. All attributes from the payload, except the message, will be sent as stringified message attributes.
@ -158,7 +158,7 @@ If you do not download them, you will lose them and will have to recreate a new
## SQS Notify Usage

AWS SQS is a notification platform and thus can be controlled by calling the `notify` service [as described here](/integrations/notify/). It will publish a message to the queue for all targets given in the notification payload. A target must be an SQS queue URL. For more information, please see the [SQS documentation](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-queue-message-identifiers.html) and the [botocore documentation](https://botocore.amazonaws.com/v1/documentation/api/latest/reference/services/sqs/client/send_message.html).

The SQS event payload will contain everything passed in the service call payload. SQS payloads will be published as stringified JSON. All attributes from the payload, except message, will also be sent as stringified message attributes. Here is an example message that would be published to the SQS queue:
@ -174,7 +174,7 @@ The SQS event payload will contain everything passed in the service call payload
```

## EventBridge Notify Usage

AWS EventBridge is a notification platform and thus can be controlled by calling the `notify` service [as described here](/integrations/notify/). It will publish a message to the event bus for all targets given in the notification payload. A target must be the name of an event bus accessible by the given credentials. A target is not required, and the default event bus will be used if none is specified. For more information, please see the [EventBridge documentation](https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-event-bus.html) and the [botocore documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/events/client/put_events.html).

There are two options for generating the event detail based on the service call payload. If the `detail` attribute is specified, then its value will be serialized as a JSON object and used for the event detail. If the attribute is not specified, then the value of the `message` attribute is serialized as a simple JSON object with a single key named `message` and the value of the message supplied to the service call.
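A rough sketch of a call that supplies a custom event detail follows. The notify service name, the event bus name, and the exact nesting of `detail` under the notify `data` field are all assumptions; adjust them to match your configuration.

```yaml
# Publish a custom event detail to a named event bus (names are placeholders)
service: notify.aws_events
data:
  message: "Fallback message if no detail is given"
  target: "my-home-assistant-bus"
  data:
    detail:
      source: "home_assistant"
      entity_id: "binary_sensor.front_door"
      state: "open"
```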


@ -12,15 +12,15 @@ ha_config_flow: true
ha_integration_type: integration
---

The `Azure Event Hub` integration allows you to hook into the Home Assistant event bus and send events to [Azure Event Hub](https://azure.microsoft.com/products/event-hubs/) or to an [Azure IoT Hub](https://learn.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-read-builtin).

## First time setup

This assumes you already have an Azure account. Otherwise create a free account [here](https://azure.microsoft.com/free/).

You need to create an Event Hub namespace and an Event Hub in that namespace; you can follow [this guide](https://learn.microsoft.com/azure/event-hubs/event-hubs-create). Alternatively, you can directly deploy an ARM template with the namespace and the Event Hub [from here](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.eventhub/event-hubs-create-event-hub-and-consumer-group).

You must then create a Shared Access Policy for the Event Hub with 'Send' claims or use the RootManageAccessKey from your namespace (this key has additional claims, including managing the event hub and listening, which are not needed for this purpose); for more details on the security of Event Hubs [go here](https://learn.microsoft.com/azure/event-hubs/authenticate-shared-access-signature).

Once you have the name of your namespace, instance, Shared Access Policy and the key for that policy, you can set up the integration itself.
@ -94,10 +94,10 @@ filter:
## Using the data in Azure

There are a number of ways to stream the data that comes into the Event Hub into storages in Azure, the easiest way is to use the built-in Capture function and this allows you to capture the data in Azure Blob Storage or Azure Data Lake store, [details here](https://learn.microsoft.com/azure/event-hubs/event-hubs-capture-overview).

Other storages in Azure (and outside) are possible with an [Azure Stream Analytics job](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-define-inputs#stream-data-from-event-hubs), for instance for [Cosmos DB](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-documentdb-output), [Azure SQL DB](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-sql-output-perf), [Azure Table Storage](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-define-outputs), custom writing to [Azure Blob Storage](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-custom-path-patterns-blob-storage-output) and [Topic and Queues](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-quick-create-portal#configure-job-output).

On the analytical side, Event Hub can be directly fed into [Azure Databricks Spark](https://learn.microsoft.com/azure/databricks/structured-streaming/streaming-event-hubs), [Azure Time Series Insights](https://learn.microsoft.com/azure/time-series-insights/how-to-ingest-data-event-hub) and [Microsoft Power BI](https://learn.microsoft.com/azure/stream-analytics/stream-analytics-real-time-fraud-detection).

The final way to use the data in Azure is to connect an Azure Function to the Event Hub using the [Event Hub trigger binding](https://learn.microsoft.com/azure/azure-functions/functions-bindings-event-hubs).


@ -13,15 +13,15 @@ ha_platforms:
ha_integration_type: integration
---

The `Azure Service Bus` integration allows you to send messages to [Azure Service Bus](https://azure.microsoft.com/products/service-bus/) from within Home Assistant.

## First-time setup

This assumes you already have an Azure account. Otherwise, create a free account [here](https://azure.microsoft.com/free/).

You need to create a Service Bus namespace; you can follow [this guide](https://learn.microsoft.com/azure/service-bus-messaging/service-bus-quickstart-portal#create-a-namespace-in-the-azure-portal).

You must then create a Shared Access Policy for the Service Bus with `Send` claims or use the RootManageAccessKey from your namespace (this key has additional claims, including managing the event hub and listening, which are not needed for this purpose), for more details on the security of Service Bus [go here](https://learn.microsoft.com/azure/service-bus-messaging/service-bus-authentication-and-authorization#shared-access-signature). Alternatively you can create a dedicated key for only one queue or topic, to restrict access to only that queue or topic.

Once you have the connection string with `Send` policy, you can set up the integration itself.
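As a rough sketch of what that setup could look like in `configuration.yaml` (the platform key, option names, connection string, and queue name are assumptions/placeholders; verify them against the options documented for this integration):

```yaml
# Minimal sketch - values are illustrative placeholders
notify:
  - name: service_bus_queue
    platform: azure_service_bus
    connection_string: "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=send-policy;SharedAccessKey=<key>"
    queue: home_assistant_events
```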


@ -69,7 +69,7 @@ availability_topic:
required: false
type: string
device:
description: "Information about the device this binary sensor is a part of to tie it into the [device registry](https://developers.home-assistant.io/docs/device_registry_index/). Only works through [MQTT discovery](/integrations/mqtt/#mqtt-discovery) and when [`unique_id`](#unique_id) is set. At least one of identifiers or connections must be present to identify the device."
required: false
type: map
keys:
@ -129,7 +129,7 @@ encoding:
type: string type: string
default: "utf-8" default: "utf-8"
entity_category: entity_category:
description: The [category](https://developers.home-assistant.io/docs/core/entity#generic-properties) of the entity. description: The [category](https://developers.home-assistant.io/docs/core/entity/#generic-properties) of the entity.
required: false required: false
type: string type: string
default: None default: None


@ -11,6 +11,7 @@ ha_category:
  - Presence Detection
  - Select
  - Sensor
  - Switch
ha_release: 0.64
ha_iot_class: Cloud Polling
ha_config_flow: true
@ -28,6 +29,7 @@ ha_platforms:
  - number
  - select
  - sensor
  - switch
ha_integration_type: integration
---
@ -50,6 +52,7 @@ This integration provides the following platforms:
- [Notifications](/integrations/bmw_connected_drive/#notifications): Send Points of Interest (POI) to your car.
- [Buttons](/integrations/bmw_connected_drive/#buttons): Turn on air condition, sound the horn, flash the lights, update the vehicle location and update the state.
- [Selects](/integrations/bmw_connected_drive/#selects): Display and control charging related settings for (PH)EVs.
- [Switches](/integrations/bmw_connected_drive/#switches): Display and toggle settings on your car.
- [Numbers](/integrations/bmw_connected_drive/#numbers): Display and control numeric charging related settings for (PH)EVs.
## Configuration
@ -107,10 +110,6 @@ The air conditioning of the vehicle can be activated with the `button.<your_vehi
What exactly is started here depends on the type of vehicle. It might range from just ventilation over auxiliary heating to real air conditioning. If your vehicle is equipped with auxiliary heating, only trigger this service if the vehicle is parked in a location where it is safe to use it (e.g., not in an underground parking or closed garage).
Some newer cars also support stopping an active air conditioning with the `button.<your_vehicle>_deactivate_air_conditioning` button.
This will only work if you have the option to stop the AC in the *MyBMW* app. If your car doesn't support this service, nothing will happen.
### Sound the horn

The `button.<your_vehicle>_sound_horn` button sounds the horn of the vehicle. This option is not available in some countries (among which the UK). Use this feature responsibly, as it might annoy your neighbors.
@ -149,6 +148,15 @@ Using these selects will impact the state of your vehicle. Use them with care!
- **Charging Mode**: Vehicle can be set to `IMMEDIATE_CHARGING` (charge as soon as plugged in) or `DELAYED_CHARGING` (charge only if within charging window). It can be used to start/stop charging if the charging window is set accordingly.
- **AC Charging Limit**: The maximum current a vehicle will charge with. Not available on all EVs.
## Switches
If supported by your vehicle, you can display and toggle remote services with start/stop functionality.
Using these switches will impact the state of your vehicle, so use them with care! An example automation is sketched after the list below.
- **Climate**: Toggle vehicle climatization. It is not possible to force it to heating/cooling, the vehicle will decide on its own. If turned on, it will run for 30 minutes (as if toggled via the MyBMW app).
- **Charging**: Toggle vehicle charging if plugged in. Only available on some electric vehicles.
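As a rough sketch, the **Climate** switch can be used in an automation like the following (the entity ID, trigger time, and weekdays are placeholders; check the switch entities the integration creates for your vehicle):

```yaml
# Start climatization on weekday mornings via the Climate switch
automation:
  - alias: "Precondition the car before leaving"
    trigger:
      - platform: time
        at: "07:15:00"
    condition:
      - condition: time
        weekday:
          - mon
          - tue
          - wed
          - thu
          - fri
    action:
      - service: switch.turn_on
        target:
          entity_id: switch.my_bmw_climate
```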
## Numbers

If you have a (PH)EV, you can control the charging process through Home Assistant. The number entities are created automatically depending on your vehicle's capabilities and can be changed from the UI or using the `number.set_value` service. For more information, please see the [number documentation](/integrations/number/).


@ -629,25 +629,36 @@ Learning RF Frequency, press and hold the button to learn...
Press and hold a button on the remote.

You will know it succeeded when you see the following text:

```txt
Found RF Frequency - 1 of 2!
You can now let go of the button
Press enter to continue...
```

If the attempt fails, you will see the error:

```txt
RF Frequency not found
```
If a failure occurs, you may need to simply keep pressing the button during the `Learning RF Frequency` step, as some remotes appear to not continuously transmit when buttons are held.
After a success, do one of the following two options:

1. To learn a single button press RF code, press enter and follow the prompt:

```txt
To complete learning, single press the button you want to learn
```

Short press the button and you get the code:
```txt
Found RF Frequency - 2 of 2!
b2002c0111211011211121112111212110112122101121112111202210211121112110221011211121112121102210112121111021112221101121211100017b10211111211121102111212210112121111121102111212210211121102210211111211121102122102111112121101121112122101121211000017c10211111211022102111212210112121111022102112202210211121102210221011211022102122102210112121101122102122101121211100017b10211111211121102210212210112122101121102210212210221021112110221011211121112121102210112121111121102122101121221000017b1121101121112111211121211110212210112111211121211121102210211121101121112111212111211011222110112111212111112121100005dc000000000000000000000000
Base64: b'sgAsAREhEBEhESERIREhIRARISIQESERIREgIhAhESERIRAiEBEhESERISEQIhARISERECERIiEQESEhEQABexAhEREhESEQIREhIhARISERESEQIREhIhAhESEQIhAhEREhESEQISIQIRERISEQESERISIQESEhEAABfBAhEREhECIQIREhIhARISERECIQIRIgIhAhESEQIhAiEBEhECIQISIQIhARISEQESIQISIQESEhEQABexAhEREhESEQIhAhIhARISIQESEQIhAhIhAiECERIRAiEBEhESERISEQIhARISERESEQISIQESEiEAABexEhEBEhESERIREhIREQISIQESERIREhIREhECIQIREhEBEhESERISERIRARIiEQESERISERESEhEAAF3AAAAAAAAAAAAAAAAA=='
```
2. To learn a button hold RF code, hold the button you wish to learn for 1-2 seconds then immediately press enter.
* You will see the same prompts for a short press as shown above. You should see it return a different base64 code.
* Test the base64 code to ensure it performs the button 'hold' command as expected, rather than the button 'press' command.
* This might take some trial and error to get the hold timing right before hitting enter to scan for the code.
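Once you have a Base64 code from either option above, you can send it back out through the remote entity. A minimal sketch, with a placeholder entity ID and code:

```yaml
# Send a previously learned code with the Broadlink remote entity
service: remote.send_command
target:
  entity_id: remote.bedroom_rm4_pro
data:
  command: "b64:<your Base64 code>"
```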
### Conversion of codes from other projects


@ -88,7 +88,7 @@ current_humidity_template:
required: false
type: template
current_humidity_topic:
description: The MQTT topic on which to listen for the current humidity. A `"None"` value received will reset the current humidity. Empty values (`'''`) will be ignored.
required: false
type: string
current_temperature_template:
@ -96,7 +96,7 @@ current_temperature_template:
required: false
type: template
current_temperature_topic:
description: The MQTT topic on which to listen for the current temperature. A `"None"` value received will reset the current temperature. Empty values (`'''`) will be ignored.
required: false
type: string
device:


@ -23,7 +23,7 @@ The setup requires an API Token created with `Zone:Zone:Read` and `Zone:DNS:Edit
An easy way to create this is to start with the "Edit zone DNS" template then add `Zone:Zone:Read` to the permissions.

[Cloudflare API Tokens Guide](https://developers.cloudflare.com/fundamentals/api/get-started/create-token/)

{% include integrations/config_flow.md %}


@ -17,21 +17,12 @@ ha_platforms:
  - sensor
  - switch
ha_integration_type: integration
ha_codeowners:
  - '@gjohansson-ST'
---

The `command_line` integration offers functionality that issues specific commands to get data or to control a device.

<div class='note'>

It's highly recommended to enclose the command in single quotes `'` as it ensures all characters can be used in the command and reduces the risk of unintentional escaping. To include a single quote in a command enclosed in single quotes, double it: `''`.
@ -39,76 +30,61 @@ It's highly recommended to enclose the command in single quotes `'` as it ensure
</div>
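As a small illustration of the quoting rule (the sensor itself is just a placeholder):

```yaml
# The outer command is wrapped in single quotes; the inner single
# quotes around hello world are doubled to escape them
command_line:
  - sensor:
      name: Quoting example
      command: 'echo ''hello world'''
```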
{% configuration %} {% configuration %}
command: command_line:
description: The action to take to get the value. description: The platforms to use for your command_line integration.
required: true
type: string
command_timeout:
description: Defines number of seconds for command timeout.
required: false
type: integer
default: 15
device_class:
description: Sets the [class of the device](/integrations/binary_sensor/), changing the device state and icon that is displayed on the frontend.
required: false
type: string
name:
description: Let you overwrite the name of the device.
required: false
type: string
default: "*name* from the device"
payload_on:
description: The payload that represents enabled state.
required: false
type: string
default: 'ON'
unique_id:
description: An ID that uniquely identifies this binary sensor. Set this to a unique value to allow customization through the UI.
required: false
type: string
payload_off:
description: The payload that represents disabled state.
required: false
type: string
default: 'OFF'
scan_interval:
description: Defines number of seconds for polling interval.
required: false
type: integer
default: 60
value_template:
description: Defines a [template](/docs/configuration/templating/#processing-incoming-data) to extract a value from the payload.
required: false
type: string
{% endconfiguration %}
## Cover
A `command_line`cover platform that issues specific commands when it is moved up, down and stopped. It allows anyone to integrate any type of cover into Home Assistant that can be controlled from the command line.
To enable a command line cover in your installation, add the following to your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
cover:
- platform: command_line
covers:
garage_door:
command_open: move_command up garage
command_close: move_command down garage
command_stop: move_command stop garage
```
{% configuration %}
covers:
description: The array that contains all command line covers.
required: true required: true
type: list type: list
keys: keys:
identifier: binary_sensor:
description: Name of the command line cover as slug. Multiple entries are possible. description: Binary sensor platform.
required: true required: false
type: list type: map
keys:
command:
description: The action to take to get the value.
required: true
type: string
command_timeout:
description: Defines number of seconds for command timeout.
required: false
type: integer
default: 15
device_class:
description: Sets the [class of the device](/integrations/binary_sensor/), changing the device state and icon that is displayed on the frontend.
required: false
type: string
name:
description: Let you overwrite the name of the device.
required: false
type: string
default: "*name* from the device"
payload_on:
description: The payload that represents enabled state.
required: false
type: string
default: 'ON'
unique_id:
description: An ID that uniquely identifies this binary sensor. Set this to a unique value to allow customization through the UI.
required: false
type: string
payload_off:
description: The payload that represents disabled state.
required: false
type: string
default: 'OFF'
value_template:
description: Defines a [template](/docs/configuration/templating/#processing-incoming-data) to extract a value from the payload.
required: false
type: string
scan_interval:
description: Define time in seconds between each update.
required: false
type: integer
default: 60
cover:
description: Cover platform.
required: false
type: map
keys: keys:
command_close: command_close:
description: The action to close the cover. description: The action to close the cover.
@ -134,15 +110,10 @@ covers:
required: false required: false
type: integer type: integer
default: 15 default: 15
friendly_name: name:
description: The name used to display the cover in the frontend. description: The name used to display the cover in the frontend.
required: false required: true
type: string type: string
scan_interval:
description: Defines number of seconds for polling interval.
required: false
type: integer
default: 60
unique_id: unique_id:
description: An ID that uniquely identifies this cover. Set this to a unique value to allow customization through the UI. description: An ID that uniquely identifies this cover. Set this to a unique value to allow customization through the UI.
required: false required: false
@ -150,119 +121,73 @@ covers:
value_template: value_template:
description: if specified, `command_state` will ignore the result code of the command but the template evaluating will indicate the position of the cover. For example, if your `command_state` returns a string "open", using `value_template` as in the example configuration above will allow you to translate that into the valid state `100`. description: if specified, `command_state` will ignore the result code of the command but the template evaluating will indicate the position of the cover. For example, if your `command_state` returns a string "open", using `value_template` as in the example configuration above will allow you to translate that into the valid state `100`.
required: false required: false
default: "'{% raw %}{{ value }}{% endraw%}'"
type: template type: template
{% endconfiguration %} scan_interval:
description: Define time in seconds between each update.
## Notify required: false
type: integer
The `command_line` platform allows you to use external tools for notifications from Home Assistant. The message will be passed in as STDIN. default: 15
notify:
To enable those notifications in your installation, add the following to your `configuration.yaml` file: description: Notify platform.
required: false
```yaml type: map
# Example configuration.yaml entry keys:
notify: name:
- name: NOTIFIER_NAME description: Setting the optional parameter `name` allows multiple notifiers to be created. The notifier will bind to the service `notify.NOTIFIER_NAME`.
platform: command_line required: false
command: "espeak -vmb/mb-us1" default: notify
``` type: string
command:
{% configuration %} description: The action to take.
name: required: true
description: Setting the optional parameter `name` allows multiple notifiers to be created. The notifier will bind to the service `notify.NOTIFIER_NAME`. type: string
required: false command_timeout:
default: notify description: Defines number of seconds for command timeout.
type: string required: false
command: type: integer
description: The action to take. default: 15
required: true sensor:
type: string description: Sensor platform.
command_timeout: required: false
description: Defines number of seconds for command timeout. type: map
required: false keys:
type: integer command:
default: 15 description: The action to take to get the value.
{% endconfiguration %} required: true
type: string
To use notifications, please see the [getting started with automation page](/getting-started/automation/). command_timeout:
description: Defines number of seconds for command timeout
## Sensor required: false
type: integer
To enable it, add the following lines to your `configuration.yaml`: default: 15
json_attributes:
```yaml description: Defines a list of keys to extract values from a JSON dictionary result and then set as sensor attributes.
# Example configuration.yaml entry required: false
sensor: type: [string, list]
- platform: command_line name:
command: SENSOR_COMMAND description: Name of the command sensor.
``` required: false
type: string
{% configuration %} unique_id:
command: description: An ID that uniquely identifies this sensor. Set this to a unique value to allow customization through the UI.
description: The action to take to get the value. required: false
required: true type: string
type: string unit_of_measurement:
command_timeout: description: Defines the unit of measurement of the sensor, if any.
description: Defines number of seconds for command timeout required: false
required: false type: string
type: integer value_template:
default: 15 description: "Defines a [template](/docs/configuration/templating/#processing-incoming-data) to extract a value from the payload."
json_attributes: required: false
description: Defines a list of keys to extract values from a JSON dictionary result and then set as sensor attributes. type: string
required: false scan_interval:
type: [string, list] description: Define time in seconds between each update.
name: required: false
description: Name of the command sensor. type: integer
required: false default: 60
type: string switch:
unique_id: description: Switch platform.
description: An ID that uniquely identifies this sensor. Set this to a unique value to allow customization through the UI. required: false
required: false
type: string
scan_interval:
description: Defines number of seconds for polling interval.
required: false
type: integer
default: 60
unit_of_measurement:
description: Defines the unit of measurement of the sensor, if any.
required: false
type: string
value_template:
description: "Defines a [template](/docs/configuration/templating/#processing-incoming-data) to extract a value from the payload."
required: false
type: string
{% endconfiguration %}
## Switch
The `command_line` switch platform issues specific commands when it is turned on
and off. This might very well become our most powerful platform as it allows
anyone to integrate any type of switch into Home Assistant that can be
controlled from the command line, including calling other scripts!
To enable it, add the following lines to your `configuration.yaml`:
```yaml
# Example configuration.yaml entry
switch:
- platform: command_line
switches:
kitchen_light:
command_on: switch_command on kitchen
command_off: switch_command off kitchen
```
{% configuration %}
switches:
description: The array that contains all command switches.
required: true
type: map
keys:
identifier:
description: Name of the command switch as slug. Multiple entries are possible.
required: true
type: map type: map
keys: keys:
command_on: command_on:
@ -282,19 +207,14 @@ switches:
required: false required: false
type: integer type: integer
default: 15 default: 15
friendly_name: name:
description: The name used to display the switch in the frontend. description: The name used to display the switch in the frontend.
required: false required: true
type: string type: string
icon_template: icon:
description: Defines a template for the icon of the entity. description: Defines a template for the icon of the entity.
required: false required: false
type: template type: template
scan_interval:
description: Defines number of seconds for polling interval.
required: false
type: integer
default: 60
unique_id: unique_id:
description: An ID that uniquely identifies this switch. Set this to a unique value to allow customization through the UI. description: An ID that uniquely identifies this switch. Set this to a unique value to allow customization through the UI.
required: false required: false
@ -303,21 +223,103 @@ switches:
description: "If specified, `command_state` will ignore the result code of the command but the template evaluating to `true` will indicate the switch is on." description: "If specified, `command_state` will ignore the result code of the command but the template evaluating to `true` will indicate the switch is on."
required: false required: false
type: string type: string
scan_interval:
description: Define time in seconds between each update.
required: false
type: integer
default: 30
{% endconfiguration %} {% endconfiguration %}
## Binary sensor

To use your Command binary sensor in your installation, add the following to your `configuration.yaml` file:

{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - binary_sensor:
      command: "cat /proc/sys/net/ipv4/ip_forward"
```
{% endraw%}
## Cover
The `command_line` cover platform issues specific commands when it is moved up, down, or stopped. It allows anyone to integrate any type of cover into Home Assistant that can be controlled from the command line.
To enable a command line cover in your installation, add the following to your `configuration.yaml` file:
{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - cover:
      command_open: move_command up garage
      command_close: move_command down garage
      command_stop: move_command stop garage
      name: Garage
```
{% endraw%}
## Notify
The `command_line` platform allows you to use external tools for notifications from Home Assistant. The message will be passed in as STDIN.
To enable those notifications in your installation, add the following to your `configuration.yaml` file:
{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - notify:
      command: "espeak -vmb/mb-us1"
```
{% endraw%}
To use notifications, please see the [getting started with automation page](/getting-started/automation/).
## Sensor
To enable it, add the following lines to your `configuration.yaml`:
{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      command: SENSOR_COMMAND
```
{% endraw%}
## Switch
The `command_line` switch platform issues specific commands when it is turned on
and off. This might very well become our most powerful platform as it allows
anyone to integrate any type of switch into Home Assistant that can be
controlled from the command line, including calling other scripts!
To enable it, add the following lines to your `configuration.yaml`:
{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - switch:
      name: Kitchen Light
      command_on: switch_command on kitchen
      command_off: switch_command off kitchen
```
{% endraw%}
<div class='note'>
A note on `name` for `cover` and `switch`:
The use of `friendly_name` and `object_id` has been deprecated; the slugified `name` is now also used as the identifier.
Set a `unique_id` if you need to be able to change the displayed name from the UI while the slugified `name` is used as the identifier.
</div>
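For example, a sketch of a switch that keeps a stable identifier while remaining renameable in the UI (the name, `unique_id`, and commands are placeholders):

```yaml
command_line:
  - switch:
      name: Kitchen Light
      unique_id: kitchen_light_cli_switch
      command_on: switch_command on kitchen
      command_off: switch_command off kitchen
```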
## Execution ## Execution
@ -339,42 +341,48 @@ In this section you find some real-life examples of how to use the command_line
Check the state of a [SickRage](https://github.com/sickragetv/sickrage) instance.
{% raw %}
```yaml
# Example configuration.yaml entry
command_line:
  - binary_sensor:
      command: 'netstat -na | find "33322" | find /c "LISTENING" > nul && (echo "Running") || (echo "Not running")'
      name: "sickragerunning"
      device_class: moving
      payload_on: "Running"
      payload_off: "Not running"
```
{% endraw%}
### Check RasPlex

Check if [RasPlex](https://github.com/RasPlex/RasPlex) is `online`.
{% raw %}
```yaml
command_line:
  - binary_sensor:
      command: 'ping -c 1 rasplex.local | grep "1 received" | wc -l'
      name: "is_rasplex_online"
      device_class: connectivity
      payload_on: 1
      payload_off: 0
```
{% endraw%}
An alternative solution could look like this:
{% raw %}
```yaml
command_line:
  - binary_sensor:
      name: Printer
      command: 'ping -W 1 -c 1 192.168.1.10 > /dev/null 2>&1 && echo success || echo fail'
      device_class: connectivity
      payload_on: "success"
      payload_off: "fail"
```
{% endraw%}
Consider using the [ping sensor](/integrations/ping#binary-sensor) as an alternative to the samples above.
@ -382,6 +390,7 @@ Consider to use the [ping sensor](/integrations/ping#binary-sensor) as an altern
The services running are listed in `/etc/systemd/system` and can be checked with the `systemctl` command:
{% raw %}
```bash
$ systemctl is-active home-assistant@rock64.service
active
@ -389,38 +398,40 @@ $ sudo service home-assistant@rock64.service stop
$ systemctl is-active home-assistant@rock64.service
inactive
```
{% endraw%}
A binary command line sensor can check this:
{% raw %}
```yaml
command_line:
  - binary_sensor:
      command: '/bin/systemctl is-active home-assistant@rock64.service'
      payload_on: "active"
      payload_off: "inactive"
```
{% endraw%}
## Example cover platform

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - cover:
      name: Garage door
      command_open: move_command up garage
      command_close: move_command down garage
      command_stop: move_command stop garage
      command_state: state_command garage
      value_template: >
        {% if value == 'open' %}
        100
        {% elif value == 'closed' %}
        0
        {% endif %}
```

{% endraw %}
## Examples sensor platform

@ -431,34 +442,35 @@ In this section you find some real-life examples of how to use this sensor.

Thanks to the [`proc`](https://en.wikipedia.org/wiki/Procfs) file system, various details about a system can be retrieved. Here the CPU temperature is of interest. Add something similar to your `configuration.yaml` file:

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      name: CPU Temperature
      command: "cat /sys/class/thermal/thermal_zone0/temp"
      # If errors occur, make sure configuration file is encoded as UTF-8
      unit_of_measurement: "°C"
      value_template: "{{ value | multiply(0.001) | round(1) }}"
```

{% endraw %}
### Monitoring failed login attempts on Home Assistant

If you'd like to know how many failed login attempts are made to Home Assistant, add the following to your `configuration.yaml` file:

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      name: Badlogin
      command: "grep -c 'Login attempt' /home/hass/.homeassistant/home-assistant.log"
```

{% endraw %}

Make sure to configure the [Logger integration](/integrations/logger) to monitor the [HTTP integration](/integrations/http/) at least at the `warning` level.

{% raw %}

```yaml
# Example logger settings
logger:
  logs:
    homeassistant.components.http: warning
```

{% endraw %}
### Details about the upstream Home Assistant release

You can see directly in the frontend (**Developer tools** -> **About**) what release of Home Assistant you are running. The Home Assistant releases are available on the [Python Package Index](https://pypi.python.org/pypi). This makes it possible to get the current release.

{% raw %}

```yaml
command_line:
  - sensor:
      command: python3 -c "import requests; print(requests.get('https://pypi.python.org/pypi/homeassistant/json').json()['info']['version'])"
      name: HA release
```

{% endraw %}

### Read value out of a remote text file

If you own devices that store values in text files which are accessible over HTTP, you can use the same approach as shown in the previous section. Instead of looking at the JSON response, we directly grab the sensor's value.

{% raw %}

```yaml
command_line:
  - sensor:
      command: python3 -c "import requests; print(requests.get('http://remote-host/sensor_data.txt').text)"
      name: File value
```

{% endraw %}
### Use an external script

@ -495,12 +512,15 @@ The example is doing the same as the [aREST sensor](/integrations/arest#sensor)

The one-line script to retrieve a value is shown below. Of course it would be possible to use this directly in the `configuration.yaml` file, but it requires extra care with the quotation marks.

{% raw %}

```bash
python3 -c "import requests; print(requests.get('http://10.0.0.48/analog/2').json()['return_value'])"
```

{% endraw %}

The script (saved as `arest-value.py`) that is used looks like the example below.

{% raw %}

```python
#!/usr/bin/python3
from requests import get

response = get("http://10.0.0.48/analog/2")
print(response.json()["return_value"])
```

{% endraw %}

To use the script you need to add something like the following to your `configuration.yaml` file.

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      name: Brightness
      command: "python3 /path/to/script/arest-value.py"
```

{% endraw %}
### Usage of templating in `command:`

[Templates](/docs/configuration/templating/) are supported in the `command` configuration variable. This could be used if you want to include the state of a specific sensor as an argument to your external script.

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      name: Wind direction
      command: "sh /home/pi/.homeassistant/scripts/wind_direction.sh {{ states('sensor.wind_direction') }}"
      unit_of_measurement: "Direction"
```

{% endraw %}
### Usage of JSON attributes in command output

The example shows how you can retrieve multiple values with one sensor (where the additional values are attributes) by using `value_json` and `json_attributes`.

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - sensor:
      name: JSON time
      json_attributes:
        - date
        - milliseconds_since_epoch
      command: "python3 /home/pi/.homeassistant/scripts/datetime.py"
      value_template: "{{ value_json.time }}"
```

{% endraw %}
## Example switch platform

@ -563,28 +582,23 @@ sensor:

This example demonstrates how to use a template to change the icon as its state changes. The icon references the switch's own state.

{% raw %}

```yaml
command_line:
  - switch:
      name: Driveway outside sensor
      command_on: >
        curl -X PUT -d '{"on":true}' "http://ip_address/api/sensors/27/config/"
      command_off: >
        curl -X PUT -d '{"on":false}' "http://ip_address/api/sensors/27/config/"
      command_state: curl http://ip_address/api/sensors/27/
      value_template: >
        {{ value_json.config.on }}
      icon_template: >
        {% if value_json.config.on == true %} mdi:toggle-switch
        {% else %} mdi:toggle-switch-off
        {% endif %}
```

{% endraw %}
### aREST device

@ -594,21 +608,17 @@ The command line tool [`curl`](https://curl.haxx.se/) is used to toggle a pin

which is controllable through REST.

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - switch:
      command_on: "/usr/bin/curl -X GET http://192.168.1.10/digital/4/1"
      command_off: "/usr/bin/curl -X GET http://192.168.1.10/digital/4/0"
      command_state: "/usr/bin/curl -X GET http://192.168.1.10/digital/4"
      value_template: '{{ value == "1" }}'
      name: Kitchen Lightswitch
```

{% endraw %}

Given this example, in the UI one would see the `friendly_name` of
"Kitchen Light". However, the `identifier` is `arest_pin_four`, making the
@ -623,29 +633,31 @@ This switch will shutdown your system that is hosting Home Assistant.

This switch will shut down your host immediately; there will be no confirmation.

</div>

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - switch:
      name: Home Assistant System Shutdown
      command_off: "/usr/sbin/poweroff"
```

{% endraw %}
### Control your VLC player

This switch will control a local VLC media player
([Source](https://community.home-assistant.io/t/vlc-player/106)).

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - switch:
      name: VLC
      command_on: "cvlc 1.mp3 vlc://quit &"
      command_off: "pkill vlc"
```

{% endraw %}
### Control Foscam Motion Sensor

@ -655,20 +667,17 @@ This switch supports statecmd,

which checks the current state of motion detection.

{% raw %}

```yaml
# Example configuration.yaml entry
command_line:
  - switch:
      name: Foscam Motion
      command_on: 'curl -k "https://ipaddress:443/cgi-bin/CGIProxy.fcgi?cmd=setMotionDetectConfig&isEnable=1&usr=admin&pwd=password"'
      command_off: 'curl -k "https://ipaddress:443/cgi-bin/CGIProxy.fcgi?cmd=setMotionDetectConfig&isEnable=0&usr=admin&pwd=password"'
      command_state: 'curl -k --silent "https://ipaddress:443/cgi-bin/CGIProxy.fcgi?cmd=getMotionDetectConfig&usr=admin&pwd=password" | grep -oP "(?<=isEnable>).*?(?=</isEnable>)"'
      value_template: '{{ value == "1" }}'
```

{% endraw %}

- Replace admin and password with an "Admin" privileged Foscam user
- Replace ipaddress with the local IP address of your Foscam
View File
@ -12,7 +12,7 @@ ha_domain: counter
ha_integration_type: helper
---

The Counter integration allows one to count occurrences fired by automations.

## Configuration
@ -85,7 +85,7 @@ If `restore` is set to `true`, the `initial` value will only be used when no pre
## Services

Available services: `increment`, `decrement`, `reset`, and `set_value`.

### Service `counter.increment`
@ -111,18 +111,14 @@ With this service the counter is reset to its initial value.
| ---------------------- | -------- | ----------- |
| `entity_id` | no | Name of the entity to take action, e.g., `counter.my_custom_counter`. |

### Service `counter.set_value`

This service allows setting the counter to a specific value.

| Service data attribute | Optional | Description |
| ---------------------- | -------- | ----------- |
| `entity_id` | no | Name of the entity to take action, e.g., `counter.my_custom_counter`. |
| `value` | yes | Set the counter to the given value. |
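A minimal sketch of calling the new service, for example from **Developer Tools** > **Services** (the entity ID is taken from the table above and is only illustrative):

```yaml
service: counter.set_value
target:
  entity_id: counter.my_custom_counter
data:
  value: 10
```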
### Use the service
View File
@ -0,0 +1,31 @@
---
title: Date
description: Instructions on how to set up date entities within Home Assistant.
ha_category:
- Date
ha_release: '2023.6'
ha_domain: date
ha_quality_scale: internal
ha_codeowners:
- '@home-assistant/core'
ha_integration_type: entity
---
The Date integration is built to control and monitor date values on devices.
Date entities cannot be created manually, but can be provided by other integrations. If you are looking for a way to create a similar entity, please take a look at the [Date/Time helper](/integrations/input_datetime).
## Services
### date services
Available services: `date.set_value`
### Service `date.set_value`
Set a new value for the date entity.
| Service data attribute | Optional | Description |
| ---------------------- | -------- | ----------- |
| `entity_id` | no | String or list of strings that point at the `entity_id`s of date entities to control. |
| `date` | no | New date value to set. |
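A minimal sketch of a service call (the entity ID is hypothetical and only illustrative):

```yaml
service: date.set_value
target:
  entity_id: date.irrigation_start_date
data:
  date: "2023-06-15"
```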
View File
@ -0,0 +1,31 @@
---
title: Date/Time
description: Instructions on how to set up date/time entities within Home Assistant.
ha_category:
- Date/Time
ha_release: "2023.6"
ha_domain: datetime
ha_quality_scale: internal
ha_codeowners:
- "@home-assistant/core"
ha_integration_type: entity
---
The Date/Time integration is built to control and monitor date and time values (timestamps) on devices.
Date/Time entities cannot be created manually, but can be provided by other integrations. If you are looking for a way to create a Date/Time entity, please take a look at the [Date/Time helper](/integrations/input_datetime).
## Services
### datetime services
Available services: `datetime.set_value`
### Service `datetime.set_value`
Set a new value for the datetime entity.
| Service data attribute | Optional | Description |
| ---------------------- | -------- | ------------------------------------------------------------------------------------------------------------ |
| `entity_id` | no | String or list of strings that point at the `entity_id`s of datetime entities to control. |
| `datetime` | no | New datetime value to set. If timezone is not included, the Home Assistant instance's timezone will be used. |
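A minimal sketch of a service call (the entity ID is hypothetical and only illustrative):

```yaml
service: datetime.set_value
target:
  entity_id: datetime.irrigation_next_run
data:
  datetime: "2023-06-15 07:30:00"
```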
View File
@ -40,6 +40,8 @@ ha_platforms:
  - vacuum
  - water_heater
  - weather
  - date
  - time
ha_integration_type: integration
---
View File
@ -118,7 +118,14 @@ json_attributes_template:
  required: false
  type: template
json_attributes_topic:
  description: "The MQTT topic subscribed to receive a JSON dictionary message containing device tracker attributes.
    This topic can be used to set the location of the device tracker under the following conditions:
    * If the attributes in the JSON message include `longitude`, `latitude`, and `gps_accuracy` (optional).\n
    * If the device tracker is within a configured [zone](/integrations/zone/).\n
    If these conditions are met, it is not required to configure `state_topic`.\n\n
    Be aware that any location message received at `state_topic` overrides the location received via `json_attributes_topic` until a message configured with `payload_reset` is received at `state_topic`. For a more generic usage example of the `json_attributes_topic`, refer to the [MQTT sensor](/integrations/sensor.mqtt/#json-attributes-topic-configuration) documentation."
  required: false
  type: string
name:
@ -164,8 +171,8 @@ source_type:
  required: false
  type: string
state_topic:
  description: The MQTT topic subscribed to receive device tracker state changes. The states defined in `state_topic` override the location states defined by the `json_attributes_topic`. This state override is turned inactive if the `state_topic` receives a message containing `payload_reset`. The `state_topic` can only be omitted if `json_attributes_topic` is used.
  required: false
  type: string
unique_id:
  description: "An ID that uniquely identifies this device_tracker. If two device_trackers have the same unique ID, Home Assistant will raise an exception."
@ -208,6 +215,8 @@ If the device supports GPS coordinates then they can be sent to Home Assistant b
- Attributes topic: `a4567d663eaf/attributes`
- Example attributes payload:

Example message to be received at topic `a4567d663eaf/attributes`:

```json
{
  "latitude": 32.87336,
@ -219,9 +228,15 @@ If the device supports GPS coordinates then they can be sent to Home Assistant b
To create the device_tracker with GPS coordinates support:

```bash
mosquitto_pub -h 127.0.0.1 -t homeassistant/device_tracker/a4567d663eaf/config -m '{"json_attributes_topic": "a4567d663eaf/attributes", "name": "My Tracker"}'
```
<div class='note info'>
Using `state_topic` is optional when using `json_attributes_topic` to determine the state of the device tracker.
</div>
To set the state of the device tracker to specific coordinates:

```bash
View File
@ -16,11 +16,12 @@ ha_codeowners:
  - '@2Fake'
  - '@Shutgun'
ha_domain: devolo_home_control
ha_quality_scale: gold
ha_platforms:
  - binary_sensor
  - climate
  - cover
  - diagnostics
  - light
  - sensor
  - siren
View File
@ -3,6 +3,7 @@ title: devolo Home Network
description: Instructions on how to integrate devolo Home Network devices with Home Assistant.
ha_category:
  - Binary Sensor
  - Button
  - Presence Detection
  - Sensor
  - Switch
@ -16,6 +17,7 @@ ha_domain: devolo_home_network
ha_quality_scale: platinum
ha_platforms:
  - binary_sensor
  - button
  - device_tracker
  - diagnostics
  - sensor
@ -38,6 +40,13 @@ Currently the following device types within Home Assistant are supported.
* Updates every 5 minutes
* Is disabled by default because it typically rarely changes
### Buttons
* Identify a PLC device by making its LED blink for 2 minutes
* Start pairing on a PLC device
* Restart the device
* Start WPS
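These buttons are pressed like any other button entity; a minimal sketch of a service call (the entity ID is hypothetical and depends on your device):

```yaml
service: button.press
target:
  entity_id: button.devolo_device_identify
```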
### Presence Detection

* Detect presence of devices connected to the main or the guest wifi
View File
@ -10,11 +10,11 @@ ha_domain: dialogflow
ha_integration_type: integration
---

The `dialogflow` integration is designed to be used with the [webhook](https://cloud.google.com/dialogflow/es/docs/fulfillment-webhook) integration of [Dialogflow](https://cloud.google.com/dialogflow/docs/). When a conversation ends with a user, Dialogflow sends an action and parameters to the webhook.

To be able to receive messages from Dialogflow, your Home Assistant instance needs to be accessible from the web and you need to have the external URL [configured](/docs/configuration/basic). Dialogflow will return fallback answers if your server does not answer or takes too long (more than 5 seconds).

Dialogflow can be [integrated](https://cloud.google.com/dialogflow/es/docs/integrations) with many popular messaging, virtual assistant and IoT platforms.

With Dialogflow, it is easy to create conversations like:
@ -38,7 +38,7 @@ To get the webhook URL, go to the integrations page in the configuration screen
- [Login](https://console.dialogflow.com/) with your Google account.
- Click on "Create Agent".
- Select name, language (if you are planning to use Google Actions check their [supported languages](https://support.google.com/assistant/answer/7108196)) and time zone.
- Click "Save".
- Now go to "Fulfillment" (in the left menu).
- Enable Webhook and set your Dialogflow webhook URL as the endpoint, e.g., `https://myhome.duckdns.org/api/webhook/800b4cb4d27d078a8871656a90854a292651b20635685f8ea23ddb7a09e8b417`
@ -66,7 +66,7 @@ When activated, the [`alexa` integration](/integrations/alexa/) will have Home A
## Examples

Download [this zip](https://github.com/home-assistant/home-assistant.io/blob/current/source/assets/HomeAssistant_APIAI.zip) and load it in your Dialogflow agent (**Settings** -> **Export and Import**) for example intents to use with this configuration:

{% raw %}
View File
@ -1,8 +1,9 @@
---
title: Deutscher Wetterdienst (DWD) Weather Warnings
description: Instructions on how to integrate Deutscher Wetterdienst weather warnings into Home Assistant.
ha_category:
  - Weather
ha_config_flow: true
ha_release: 0.51
ha_iot_class: Cloud Polling
ha_domain: dwd_weather_warnings
@ -16,71 +17,33 @@ ha_platforms:
ha_integration_type: integration
---

The Deutscher Wetterdienst Weather Warnings integration uses the [Deutscher Wetterdienst (DWD)](https://www.dwd.de) as a source for current and advance weather warnings. The configured sensor checks for data every 15 minutes.

{% include integrations/config_flow.md %}

{% configuration_basic %}
Warncell ID or name:
  description: Identifier of the region. It can be a warncell ID (integer) or a warncell name. It is heavily advised to use the warncell ID because a warncell name is sometimes not unique. A list of valid warncell IDs and names can be found [here](https://www.dwd.de/DE/leistungen/opendata/help/warnungen/cap_warncellids_csv.html). Some of the warncells are outdated but still listed. If the setup fails, search the list for a similar sounding warncell. If the warncell name is not unique, `" (not unique used ID)!"` will be added to the reported `region_name`. Setting this field is required.
{% endconfiguration_basic %}
### Attributes

| Attribute | Description |
| ------------ | -------------------------------------- |
| `last_update` | *(time)* Time and date (UTC) of last update from DWD. |
| `region_name` | *(str)* Requested region name. This should be the same as the region name in the configuration, if a name was given. |
| `region_id` | *(int)* Region ID assigned by DWD. This should be the same as the region id in the configuration, if an id was given. |
| `warning_count` | *(int)* Number of issued warnings. There can be more than one warning issued at once. |
| `warning_<x>` | *(list)* The warning as a whole object containing the following attributes as nested attributes. |
| `warning_<x>_level` | *(int)* Issued warning level (0 - 4).<br/>0: Keine Warnungen<br/>1: Wetterwarnungen<br/>2: Warnungen vor markantem Wetter<br/>3: Unwetterwarnungen<br/>4: Warnungen vor extremem Unwetter |
| `warning_<x>_type` | *(int)* Issued warning type. More information can be found [here](https://www.dwd.de/DE/leistungen/opendata/help/warnungen/warning_codes_pdf.pdf?__blob=publicationFile&v=5). |
| `warning_<x>_name` | *(str)* Warning name correlated with the warning type and represented as a short string. |
| `warning_<x>_headline` | *(str)* Official headline of the weather warning. |
| `warning_<x>_start` | *(time)* Starting time and date (UTC) of the issued warning. |
| `warning_<x>_end` | *(time)* Ending time and date (UTC) of the issued warning. |
| `warning_<x>_description` | *(str)* Details for the issued warning. |
| `warning_<x>_instruction` | *(str)* The DWD sometimes provides helpful information about precautions to take for the issued warning. |
| `warning_<x>_parameters` | *(list)* A list of additional warning parameters. More information can be found [here](https://www.dwd.de/DE/leistungen/opendata/help/warnungen/warning_codes_pdf.pdf?__blob=publicationFile&v=5). |

<div class="note">
View File
@ -83,7 +83,7 @@ api_key:
  <img src='/images/screenshots/ecobee-thermostat-card.png' />
</p>

You must [restart Home Assistant](/docs/configuration/#reloading-changes) for the changes to take effect. After restarting, go to {% my integrations title="**Settings** > **Devices & Services**" %} and select the cogwheel. Then, select **Configure** and continue to authorize the app according to the above **Automatic Configuration**, starting at step 2.

## Notifications
View File
@ -0,0 +1,23 @@
---
title: Electra Smart
description: Instructions for how to integrate Electra Air Conditioners within Home Assistant.
ha_category:
- Climate
ha_release: 2023.6
ha_iot_class: Cloud Polling
ha_config_flow: true
ha_codeowners:
- '@jafar-atili'
ha_domain: electrasmart
ha_platforms:
- climate
ha_integration_type: integration
---
[Electra Air](https://www.electra-air.co.il) is a company that manufactures and sells air conditioners.
To set up this integration, you must have access to the phone number used to register in the Electra Smart mobile app.
Air Conditioners configured in your Electra Smart mobile app will be discovered by Home Assistant after the Electra Smart integration is configured.
{% include integrations/config_flow.md %}
View File
@ -4,6 +4,7 @@ description: Integrate EZVIZ camera within Home Assistant.
ha_release: 0.107
ha_category:
  - Camera
  - Update
ha_iot_class: Cloud Polling
ha_domain: ezviz
ha_codeowners:
@ -13,8 +14,10 @@ ha_config_flow: true
ha_platforms:
  - binary_sensor
  - camera
  - number
  - sensor
  - switch
  - update
ha_integration_type: integration
---
@ -91,6 +94,14 @@ To enable/disable motion detection, use the Home Assistant built in services.
| -----------------------| ----------- |
| `entity_id` | String or list of strings that point at `entity_id`s of cameras. Use `entity_id: all` to target all. |
### OTA update
Triggers the device's OTA firmware update process to install the latest stable version.
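The update can also be triggered through the standard update service; a minimal sketch (the entity ID is hypothetical and depends on your camera):

```yaml
service: update.install
target:
  entity_id: update.c3w_firmware
```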
### Motion Detection Sensitivity
The motion detection sensitivity can be adjusted using the "Detection sensitivity" number entity. Note that this entity fetches information from the device and will not update while a battery-powered camera is in sleep mode; this behavior preserves battery life and prevents excessive drain.
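To change the sensitivity from an automation or script, the standard number service can be used; a minimal sketch (the entity ID and value are only illustrative):

```yaml
service: number.set_value
target:
  entity_id: number.c3w_detection_sensitivity
data:
  value: 3
```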
## Troubleshooting

- `authentication failed`: The authentication requires an EZVIZ account with two-step verification disabled. Google, Facebook, TikTok, or other Oauth-based accounts will not work.
View File
@ -15,7 +15,7 @@ ha_integration_type: integration
---

The FAA Delays integration collects and displays information about delays at US Airports based on the
[FAA's National Airspace System Status](https://nasstatus.faa.gov/).

Data measured includes:
View File
@ -25,7 +25,7 @@ notify:
{% configuration %}
page_access_token:
  description: "Access token for your Facebook page. Checkout [Facebook Messenger Platform](https://developers.facebook.com/docs/messenger-platform/webhooks) for more information."
  required: true
  type: string
name:
@ -37,8 +37,8 @@ name:
### Usage

With the Facebook notify service, you can send notifications to Facebook Messenger with the help of your Facebook page. You have to create a [Facebook Page and App](https://developers.facebook.com/docs/messenger-platform/getting-started/quick-start) for this service. You can control it by calling the notify service [as described here](/integrations/notify/). It will send a message on Messenger to the user specified by **target** on behalf of your page. See the [quick start](https://developers.facebook.com/docs/messenger-platform/getting-started/quick-start) guide for more information.

The phone number used in **target** should be registered with Facebook Messenger. The phone number of the recipient should be in +1(212)555-2368 format. If your app is not approved by Facebook, then the recipient should be either an admin, developer or tester for your Facebook app. [More information](https://developers.facebook.com/docs/messenger-platform/reference/send-api#phone_number) about the phone number.

```yaml
# Example automation notification entry
@ -102,7 +102,7 @@ if (preg_match('/get my id/', strtolower($message))) {
```

### Rich messages

You can also send rich messages (cards, buttons, images, videos, etc.). See the [Send API reference](https://developers.facebook.com/docs/messenger-platform/reference/send-api) for the supported message types and how to build them.

```yaml
# Example script with a notification entry with a rich message
View File
@ -89,6 +89,11 @@ automation:
Any field under the `<entry>` tag in the feed can be used; for example, `trigger.event.data.content` will get the body of the feed entry.
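As a minimal sketch, an automation could react to these events like this (the notify target is illustrative, and a `title` field is assumed to be present in the feed entries):

{% raw %}

```yaml
# Example automation reacting to feedreader events
automation:
  - alias: "Announce new feed entry"
    trigger:
      - platform: event
        event_type: feedreader
    action:
      - service: notify.notify
        data:
          message: "New feed entry: {{ trigger.event.data.title }}"
```

{% endraw %}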
### Video Tutorial
This video tutorial explains how to set up the feedreader and show the latest news feed item on your dashboard in Home Assistant.
<lite-youtube videoid="Va4JOKbesi0" videotitle="How to view RSS feeds on your Dashboard in Home Assistant" posterquality="maxresdefault"></lite-youtube>
For more advanced use cases, a custom integration registering to the `feedreader` event type could be used instead:

```python
View File
@ -104,7 +104,7 @@ The following attributes are available:
With Automation you can configure one or more of the following useful actions:

1. Sound an alarm and/or switch on lights when an emergency incident is received.
1. Use text-to-speech to play incident details via a media player while getting dressed.
1. Respond with a response acknowledgment using a door-sensor when leaving the house or by pressing a button to let your teammates know you are underway.
1. Cast a FireServiceRota dashboard to a Chromecast device. (this requires a Nabu Casa subscription)
View File
@ -100,8 +100,8 @@ If you like the Forecast.Solar service, or are interested in more frequent data
updates (based on a higher data resolution), you could [sign up for one
of their plans](https://doc.forecast.solar/doku.php?id=account_models#compare_plans).

To enable the use of the API key with this integration, go to {% my integrations %}.
On the Forecast.Solar integration, select the cogwheel, then select **Configure**. Enter the
API key for your account.

## Tweaking the estimations
@ -131,8 +131,8 @@ a more realistic forecast graph.
To adjust the configuration settings for your Forecast.Solar integration
instance:

1. Browse to your Home Assistant instance.
1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. If multiple instances of {{ name }} are configured, choose the instance you want to configure.
1. Select the cogwheel.
1. Select **Configure**.
View File
@ -76,10 +76,10 @@ If no password is given, it will be auto-generated.
| `password` | yes | New password for the guest wifi |
| `length` | yes | Length of the auto-generated password. (_default 12_) |
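Assuming the service name `fritz.set_guest_wifi_password` (the service name itself is not shown in this excerpt) and an illustrative device ID, a call that regenerates the guest password could look like this:

```yaml
service: fritz.set_guest_wifi_password
target:
  device_id: "abc123def456"  # illustrative device ID
data:
  length: 16
```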
## Integration options

It is possible to change some behaviors through the integration options.
To change the settings, go to {% my integrations title="**Settings** > **Devices & Services**" %}. On the **AVM FRITZ!Box Tools** integration, select the cogwheel. Then select **Configure**.

- **Consider home**: Number of seconds that must elapse before considering a disconnected device "not at home".
- **Enable old discovery method**: Needed in some scenarios like no mesh support (fw <= 6.x), mixed brands network devices or LAN switches.
View File
@ -5,74 +5,31 @@ ha_category:
  - Geolocation
ha_iot_class: Cloud Polling
ha_release: 0.79
ha_config_flow: true
ha_codeowners:
  - '@exxamalte'
ha_domain: geo_json_events
ha_platforms:
  - geo_location
ha_integration_type: service
---
The GeoJSON integration lets you ingest events from GeoJSON feeds. It retrieves events from a feed and shows information of those events filtered by distance to Home Assistant's location.

All entries in the GeoJSON feed must define a `geometry` which typically is a point or polygon with geo coordinates. In addition, this platform will look for a `title` key in the entry's `properties` and use that as the entity's name.

Entities are generated, updated and removed automatically with each update from the GeoJSON feed. Each entity defines latitude and longitude and will be shown on the map automatically. The distance in kilometers is available as the state of each entity.

The data is updated every 5 minutes.
{% include integrations/config_flow.md %}
## State Attributes

The following state attributes are available for each entity in addition to the standard ones:

| Attribute | Description |
|-------------|-------------|
| latitude | Latitude of the event. |
| longitude | Longitude of the event. |
| source | `geo_json_events` to be used in conjunction with `geo_location` automation trigger. |
| external_id | The external ID used in the feed to identify the event in the feed. |
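Since the `source` attribute is meant to be used with the `geo_location` automation trigger, a minimal automation sketch could look like this (zone and notify target are illustrative):

```yaml
# Example automation using the geo_location trigger
automation:
  - alias: "GeoJSON event entered home zone"
    trigger:
      - platform: geo_location
        source: geo_json_events
        zone: zone.home
        event: enter
    action:
      - service: notify.notify
        data:
          message: "A GeoJSON event entered the home zone."
```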
View File
@ -1,52 +0,0 @@
---
title: Goalfeed
description: Instructions on how to setup Goalfeed events within Home Assistant.
logo: goalfeed.png
ha_category:
- Other
ha_iot_class: Cloud Push
ha_release: 0.63
ha_domain: goalfeed
ha_integration_type: integration
---
The `goalfeed` integration lets you use your Goalfeed account to trigger events in Home Assistant whenever an NHL or MLB team scores.
To use this component, enter your email address and password from your goalfeed.ca account in your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
goalfeed:
username: YOUR_E_MAIL_ADDRESS
password: YOUR_PASSWORD
```
{% configuration %}
username:
required: true
description: The email address on your goalfeed.ca account.
type: string
password:
required: true
description: The password on your goalfeed.ca account.
type: string
{% endconfiguration %}
Now you can use the goal event type in your automations:
```yaml
- alias: 'Jets Goal'
trigger:
platform: event
event_type: goal
event_data:
team_name: "Winnipeg Jets"
```
Goal events have the following event data:
- **team**: Three letter code representing the team. This is unique within the leagues, but not unique across the leagues (i.e., 'WPG' or 'TOR').
- **team_name**: The team that scored (i.e., 'Winnipeg Jets' or 'Toronto Blue Jays').
- **team_hash**: A unique hash for the team (you can find these values on https://goalfeed.ca/get-teams).
- **league_id**: A unique number for the league.
- **league_name**: The short name of the league (i.e., 'NHL' or 'MLB').
View File
@ -269,22 +269,25 @@ entity_config:
Currently, the following domains are available to be used with Google Assistant, listed with their default types:

- alarm_control_panel (arm/disarm)
- button (scene)
- camera (streaming, requires compatible camera)
- climate (temperature setting, hvac_mode)
- cover (on/off/set position)
- fan (on/off/speed percentage/preset mode)
- group (on/off)
- humidifier (humidity setting/on/off/mode)
- input_boolean (on/off)
- input_button
- input_select (option/setting/mode/value)
- light (on/off/brightness/rgb color/color temp)
- lock
- media_player (on/off/set volume (via set volume)/source (via set input source)/control playback)
- scene (on)
- script (on)
- select
- sensor (temperature setting for temperature sensors and humidity setting for humidity sensors)
- switch (on/off)
- vacuum (dock/start/stop/pause)

<div class='note'>
View File
@ -98,15 +98,19 @@ If commands don't work try removing superfluous words such as "the". E.g. "play
If broadcasting doesn't work, make sure: the speakers aren't in do not disturb mode, the Home Assistant server is in the same network as the speakers, and IPv6 is disabled in the router.

The easiest way to check if the integration is working is to check [My Google Activity](https://myactivity.google.com/myactivity) for the issued commands and their responses.

## Limitations/known issues

- Multiple Google accounts are not supported.
- Personal results are not supported yet since that requires creating an OAuth client ID of the Desktop app.
- If you see the issued commands in [My Google Activity](https://myactivity.google.com/myactivity), the integration is working fine. If the commands don't have the expected outcome, don't open an issue in the Home Assistant Core project or the [underlying library](https://github.com/tronikos/gassist_text). You should instead report the issue directly to Google [here](https://github.com/googlesamples/assistant-sdk-python/issues). Examples of known Google Assistant API issues:
  - Media playback commands (other than play news, play podcast, play white noise, or play rain sounds) don't work.
  - Routines don't work.
## Configuration

On the configure page, you can set the language code of the interactions with Google Assistant. If not configured, the integration picks one based on Home Assistant's configured language and country. Supported languages are listed [here](https://developers.google.com/assistant/sdk/reference/rpc/languages).

## Services

View File

@ -30,8 +30,8 @@ tts:
API key obtaining process described in corresponding documentation: API key obtaining process described in corresponding documentation:
* [Text-to-Speech](https://cloud.google.com/text-to-speech/docs/quickstart-protocol) * [Text-to-speech](https://cloud.google.com/text-to-speech/docs/quickstart-protocol)
* [Speech-to-Text](https://cloud.google.com/speech-to-text/docs/quickstart-protocol) * [Speech-to-text](https://cloud.google.com/speech-to-text/docs/quickstart-protocol)
* [Geocoding](https://developers.google.com/maps/documentation/geocoding/start) * [Geocoding](https://developers.google.com/maps/documentation/geocoding/start)
Basic instruction for all APIs: Basic instruction for all APIs:
@ -42,36 +42,36 @@ Basic instruction for all APIs:
4. [Make sure that billing is enabled for your Google Cloud Platform project](https://cloud.google.com/billing/docs/how-to/modify-project). 4. [Make sure that billing is enabled for your Google Cloud Platform project](https://cloud.google.com/billing/docs/how-to/modify-project).
5. Enable needed Cloud API visiting one of the links below or [APIs library](https://console.cloud.google.com/apis/library), selecting your `Project` from the dropdown list and clicking the `Continue` button: 5. Enable needed Cloud API visiting one of the links below or [APIs library](https://console.cloud.google.com/apis/library), selecting your `Project` from the dropdown list and clicking the `Continue` button:
* [Text-to-Speech](https://console.cloud.google.com/flows/enableapi?apiid=texttospeech.googleapis.com)
* [Speech-to-Text](https://console.cloud.google.com/flows/enableapi?apiid=speech.googleapis.com)
* [Geocoding](https://console.cloud.google.com/flows/enableapi?apiid=geocoding-backend.googleapis.com)
* [Text-to-speech](https://console.cloud.google.com/flows/enableapi?apiid=texttospeech.googleapis.com)
* [Speech-to-text](https://console.cloud.google.com/flows/enableapi?apiid=speech.googleapis.com)
* [Geocoding](https://console.cloud.google.com/flows/enableapi?apiid=geocoding-backend.googleapis.com)
6. Set up authentication: 6. Set up authentication:
1. Visit [this link](https://console.cloud.google.com/apis/credentials/serviceaccountkey) 1. Visit [this link](https://console.cloud.google.com/apis/credentials/serviceaccountkey)
2. From the `Service account` list, select `New service account`. 2. From the `Service account` list, select `New service account`.
3. In the `Service account name` field, enter any name. 3. In the `Service account name` field, enter any name.
If you are requesting Text-to-Speech API key: If you are requesting a text-to-speech API key:
4. Don't select a value from the Role list. **No role is required to access this service**. 4. Don't select a value from the Role list. **No role is required to access this service**.
5. Click `Create`. A note appears, warning that this service account has no role. 5. Click `Create`. A note appears, warning that this service account has no role.
6. Click `Create without role`. A JSON file that contains your `API key` downloads to your computer. 6. Click `Create without role`. A JSON file that contains your `API key` downloads to your computer.
## Google Cloud Text-to-Speech ## Google Cloud text-to-speech
[Google Cloud Text-to-Speech](https://cloud.google.com/text-to-speech/) converts text into human-like speech in more than 100 voices across 20+ languages and variants. It applies groundbreaking research in speech synthesis (WaveNet) and Google's powerful neural networks to deliver high-fidelity audio. With this easy-to-use API, you can create lifelike interactions with your users that transform customer service, device interaction, and other applications. [Google Cloud text-to-speech](https://cloud.google.com/text-to-speech/) converts text into human-like speech in more than 100 voices across 20+ languages and variants. It applies groundbreaking research in speech synthesis (WaveNet) and Google's powerful neural networks to deliver high-fidelity audio. With this easy-to-use API, you can create lifelike interactions with your users that transform customer service, device interaction, and other applications.
### Pricing ### Pricing
The Cloud Text-to-Speech API is priced monthly based on the amount of characters to synthesize into audio sent to the service. The Cloud text-to-speech API is priced monthly based on the amount of characters to synthesize into audio sent to the service.
| Feature | Monthly free tier | Paid usage | | Feature | Monthly free tier | Paid usage |
|-------------------------------|---------------------------|-----------------------------------| |-------------------------------|---------------------------|-----------------------------------|
| Standard (non-WaveNet) voices | 0 to 4 million characters | $4.00 USD / 1 million characters | | Standard (non-WaveNet) voices | 0 to 4 million characters | $4.00 USD / 1 million characters |
| WaveNet voices | 0 to 1 million characters | $16.00 USD / 1 million characters | | WaveNet voices | 0 to 1 million characters | $16.00 USD / 1 million characters |
### Text-to-Speech configuration ### Text-to-speech configuration
{% configuration %} {% configuration %}
key_file: key_file:
@ -113,7 +113,7 @@ gain:
type: float type: float
default: 0.0 default: 0.0
profiles: profiles:
description: "An identifier which selects 'audio effects' profiles that are applied on (post synthesized) text to speech. Effects are applied on top of each other in the order they are given. Supported profile ids listed [here](https://cloud.google.com/text-to-speech/docs/audio-profiles)." description: "An identifier which selects 'audio effects' profiles that are applied on (post synthesized) text-to-speech. Effects are applied on top of each other in the order they are given. Supported profile ids listed [here](https://cloud.google.com/text-to-speech/docs/audio-profiles)."
required: false required: false
type: list type: list
default: "[]" default: "[]"
@ -126,7 +126,7 @@ text_type:
### Full configuration example ### Full configuration example
The Google Cloud Text-to-Speech configuration can look like: The Google Cloud text-to-speech configuration can look like:
```yaml ```yaml
# Example configuration.yaml entry # Example configuration.yaml entry

View File

@ -0,0 +1,48 @@
---
title: Google Generative AI Conversation
description: Instructions on how to integrate Google Generative AI as a conversation agent
ha_category:
- Voice
ha_release: 2023.6
ha_iot_class: Cloud Polling
ha_config_flow: true
ha_codeowners:
- '@tronikos'
ha_domain: google_generative_ai_conversation
ha_integration_type: service
---
The Google Generative AI integration adds a conversation agent powered by [Google Generative AI](https://developers.generativeai.google/) in Home Assistant.
This conversation agent is unable to control your house. It can only query information that has been provided by Home Assistant. To be able to answer questions about your house, Home Assistant will need to provide Google Generative AI with the details of your house, which include areas, devices and their states.
This integration requires an API key to use, [which you can generate here](https://makersuite.google.com/app/apikey).
{% include integrations/config_flow.md %}
### Generate an API Key
The Google Generative AI API key is used to authenticate requests to the Google Generative AI API. To generate an API key, take the following steps:
- Join the PaLM API and MakerSuite [waitlist](https://makersuite.google.com/waitlist).
- Wait several days for an email with the subject "It's your turn to use the PaLM API and MakerSuite".
- Visit the [API Keys page](https://makersuite.google.com/app/apikey) to retrieve the API key you'll use to configure the integration.
{% include integrations/option_flow.md %}
{% configuration_basic %}
Prompt Template:
description: The starting text for the AI language model to generate new text from. This text can include information about your Home Assistant instance, devices, and areas and is written using [Home Assistant Templating](/docs/configuration/templating/).
Model:
description: Model used to generate response.
Temperature:
description: Creativity allowed in the responses. Higher values produce a more random and varied response. A temperature of zero will be deterministic.
Top P:
description: Probability threshold for top-p sampling.
Top K:
description: Number of top-scored tokens to consider during generation.
{% endconfiguration_basic %}

View File

@ -27,26 +27,26 @@ If you have already set up the correct credentials, you can do step 1 and then s
{% details "Generate Client ID and Client Secret" %} {% details "Generate Client ID and Client Secret" %}
This section explains how to generate a Client ID and Client Secret on This section explains how to generate a Client ID and Client Secret on
[Google Developers Console](https://console.cloud.google.com/apis/library/gmail.googleapis.com?project=home-assistant-17698). [Google Developers Console](https://console.cloud.google.com/apis/library/gmail.googleapis.com).
1. First, go to the Google Developers Console to enable [Gmail API](https://console.cloud.google.com/apis/library/gmail.googleapis.com?project=home-assistant-17698) 1. First, go to the Google Developers Console to enable [Gmail API](https://console.cloud.google.com/apis/library/gmail.googleapis.com)
2. The wizard will ask you to choose a project to manage your application. Select a project and click continue. 2. The wizard will ask you to choose a project to manage your application. Select a project and select **Continue**.
3. Verify that your Gmail API was enabled and click 'Go to credentials' 3. Verify that your Gmail API was enabled and select **Go to credentials**.
4. Navigate to APIs & Services (left sidebar) > [Credentials](https://console.cloud.google.com/apis/credentials) 4. Navigate to **APIs & Services** (left sidebar) > [Credentials](https://console.cloud.google.com/apis/credentials)
5. Click on the field on the left of the screen, **OAuth Consent Screen**. 5. Click on the field on the left of the screen, **OAuth Consent Screen**.
6. Select **External** and **Create**. 6. Select **External** and **Create**.
7. Set the *App Name* (the name of the application asking for consent) to anything you want, e.g., *Home Assistant*. 7. Set the **App Name** (the name of the application asking for consent) to anything you want, e.g., *Home Assistant*.
8. You then need to select a *Support email*. To do this, click the drop-down box and select your email address. 8. You then need to select a **Support email**. To do this, from the dropdown menu, select your email address.
9. You finally need to complete the section: *Developer contact information*. To do this, enter your email address (the same as above is fine). 9. You finally need to complete the section: **Developer contact information**. To do this, enter your email address (the same as above is fine).
10. Scroll to the bottom and click **Save and Continue**. You don't have to fill out anything else, or it may enable additional review. 10. Scroll to the bottom and select **Save and Continue**. You don't have to fill out anything else, or it may enable additional review.
11. You will then be automatically taken to the Scopes page. You do not need to add any scopes here, so click Save and Continue to move to the Optional info page. You do not need to add anything to the Optional info page, so click Save and Continue, which will take you to the Summary page. Click Back to Dashboard. 11. You will then be automatically taken to the **Scopes** page. You do not need to add any scopes here, so select **Save and Continue** to move to the **Optional info** page. You do not need to add anything to the **Optional info** page, so select **Save and Continue**, which will take you to the **Summary** page. Select **Back to Dashboard**.
12. Click **OAuth consent screen** again and set *Publish Status* to **Production** otherwise your credentials will expire every 7 days. 12. Select **OAuth consent screen** again and set *Publish Status* to **Production**. Otherwise your credentials will expire every 7 days.
13. Make sure **Publishing status** is set to production. 13. Make sure **Publishing status** is set to production.
14. Click **Credentials** in the menu on the left-hand side of the screen, then click **Create credentials** (at the top of the screen), then select *OAuth client ID*. 14. Select **Credentials** in the menu on the left-hand side of the screen, then select **Create credentials** (at the top of the screen), then select **OAuth client ID**.
15. Set the Application type to *Web application* and give this credential set a name (like "Home Assistant Credentials"). 15. Set the Application type to *Web application* and give this credential set a name (like "Home Assistant Credentials").
16. Add https://my.home-assistant.io/redirect/oauth to *Authorized redirect URIs* then click **Create**. This is not a placeholder and is the URI that must be used. 16. Add `https://my.home-assistant.io/redirect/oauth` to **Authorized redirect URIs** then select **Create**. This is not a placeholder and is the URI that must be used.
17. You will then be presented with a pop-up saying *OAuth client created* showing *Your Client ID* and *Your Client Secret*. Make a note of these (for example, copy and paste them into a text editor), as you will need these shortly. Once you have noted these strings, click **OK**. If you need to find these credentials again at any point, then navigate to *APIs & Services > Credentials*, and you will see *Home Assistant Credentials* (or whatever you named them in the previous step) under *OAuth 2.0 Client IDs*. To view both the *Client ID* and *Client secret*, click on the pencil icon; this will take you to the settings page for these credentials, and the information will be on the right-hand side of the page. 17. You will then be presented with a pop-up saying **OAuth client created** showing **Your Client ID** and **Your Client Secret**. Make a note of these (for example, copy and paste them into a text editor), as you will need them shortly. Once you have noted these strings, select **OK**. If you need to find these credentials again at any point, then navigate to **APIs & Services** > **Credentials**, and you will see **Home Assistant Credentials** (or whatever you named them in the previous step) under **OAuth 2.0 Client IDs**. To view both the **Client ID** and **Client secret**, select the pencil icon. This will take you to the settings page for these credentials, and the information will be on the right-hand side of the page.
18. Double-check that the *Gmail API* has been automatically enabled. To do this, select **Library** from the menu, then search for *Gmail API*. If it is enabled you will see *API Enabled* with a green tick next to it. If it is not enabled, then enable it. 18. Double-check that the **Gmail API** has been automatically enabled. To do this, select **Library** from the menu, then search for **Gmail API**. If it is enabled, you will see **API Enabled** with a green tick next to it. If it is not enabled, then enable it.
{% enddetails %} {% enddetails %}
@ -60,11 +60,11 @@ The integration setup will next give you instructions to enter the [Application
2. **NOTE**: You may get a message telling you that the app has not been verified and you will need to acknowledge that in order to proceed. 2. **NOTE**: You may get a message telling you that the app has not been verified and you will need to acknowledge that in order to proceed.
3. You can now see the details of what you are authorizing Home Assistant to access with two options at the bottom. Click **Continue**. 3. You can now see the details of what you are authorizing Home Assistant to access with two options at the bottom. Select **Continue**.
4. The page will now display *Link account to Home Assistant?*, note *Your instance URL*. If this is not correct, please refer to [My Home Assistant](/integrations/my). If everything looks good, click **Link Account**. 4. The page will now display **Link account to Home Assistant?**, note **Your instance URL**. If this is not correct, refer to [My Home Assistant](/integrations/my). If everything looks good, select **Link Account**.
5. You may close the window, and return back to Home Assistant where you should see a *Success!* message from Home Assistant. 5. You may close the window, and return back to Home Assistant where you should see a **Success!** message from Home Assistant.
{% enddetails %} {% enddetails %}

View File

@ -20,7 +20,7 @@ The integration currently only has access to that one document that is created d
## Prerequisites ## Prerequisites
You need to configure developer credentials to allow Home Assistant to access your Google Account. You need to configure developer credentials to allow Home Assistant to access your Google Account.
These credentials are the same as the ones for [Nest](/integrations/nest) and [Google Mail](/integrations/google_mail). These credentials are the same as the ones for [Nest](/integrations/nest), [YouTube](/integrations/youtube) and [Google Mail](/integrations/google_mail).
These are not the same as the one for [Google Calendar](/integrations/google). These are not the same as the one for [Google Calendar](/integrations/google).
If you have already set up the correct credentials, you can do step 1 and then skip to step 13 on the below instructions. If you have already set up the correct credentials, you can do step 1 and then skip to step 13 on the below instructions.
@ -72,6 +72,11 @@ The integration setup will next give you instructions to enter the [Application
If you have an error with your credentials you can delete them in the [Application Credentials](/integrations/application_credentials/) user interface. If you have an error with your credentials you can delete them in the [Application Credentials](/integrations/application_credentials/) user interface.
### Video Tutorial
This video tutorial explains how to set up the Google Sheets integration and how you can add data from Home Assistant to a Google Sheet.
<lite-youtube videoid="hgGMgoxLYwo" videotitle="How to use Google Sheets in Home Assistant - TUTORIAL" posterquality="maxresdefault"></lite-youtube>
### Service `google_sheets.append_sheet` ### Service `google_sheets.append_sheet`
You can use the service `google_sheets.append_sheet` to add a row of data to the Sheets document created at setup. You can use the service `google_sheets.append_sheet` to add a row of data to the Sheets document created at setup.
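As a rough sketch, a call to this service from a script or automation could look like the following. The worksheet name, data keys, and config entry ID are illustrative placeholders; use the values from your own setup.

```yaml
# Example service call (values are illustrative)
service: google_sheets.append_sheet
data:
  config_entry: YOUR_CONFIG_ENTRY_ID  # the config entry ID of your Google Sheets integration
  worksheet: Sheet1                   # optional: the worksheet (tab) to append to
  data:
    sensor: living_room_temperature
    value: 21.5
```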

View File

@ -1,6 +1,6 @@
--- ---
title: Google Translate Text-to-Speech title: Google Translate text-to-speech
description: Instructions on how to setup Google Translate Text-to-Speech with Home Assistant. description: Instructions on how to setup Google Translate text-to-speech with Home Assistant.
ha_category: ha_category:
- Text-to-speech - Text-to-speech
ha_release: 0.35 ha_release: 0.35
@ -11,7 +11,7 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The `google_translate` text-to-speech platform uses the unofficial [Google Translate Text-to-Speech engine](https://translate.google.com/) to read a text with natural sounding voices. Contrary to what the name suggests, the integration only does text-to-speech and does not translate messages sent to it. The `google_translate` text-to-speech platform uses the unofficial [Google Translate text-to-speech engine](https://translate.google.com/) to read a text with natural sounding voices. Contrary to what the name suggests, the integration only does text-to-speech and does not translate messages sent to it.
## Configuration ## Configuration

View File

@ -48,6 +48,10 @@ entity_id:
description: The entity you want to track. description: The entity you want to track.
required: true required: true
type: string type: string
unique_id:
description: An ID that uniquely identifies this entity. Set this to a unique value to allow customization through the UI.
required: false
type: string
state: state:
description: The states you want to track. description: The states you want to track.
required: true required: true

View File

@ -318,7 +318,7 @@ The HomeKit Accessory Protocol Specification only allows a maximum of 150 unique
### Multiple HomeKit instances ### Multiple HomeKit instances
If you create a HomeKit integration via the UI (i.e., **Settings** -> **Devices & Services**), it must be configured via the UI **only**. While the UI only offers limited configuration options at the moment, any attempt to configure a HomeKit instance created in the UI via the `configuration.yaml` file will result in another instance of HomeKit running on a different port. If you create a HomeKit integration via the UI (i.e., **Settings** > **Devices & Services**), it must be configured via the UI **only**. While the UI only offers limited configuration options at the moment, any attempt to configure a HomeKit instance created in the UI via the `configuration.yaml` file will result in another instance of HomeKit running on a different port.
It is recommended to only edit a HomeKit instance in the UI that was created in the UI, and likewise, only edit a HomeKit instance in YAML that was created in YAML. It is recommended to only edit a HomeKit instance in the UI that was created in the UI, and likewise, only edit a HomeKit instance in YAML that was created in YAML.
@ -328,7 +328,7 @@ When exposing a Camera, Activity based remote (a `remote` that supports activiti
To quickly add all accessory mode entities in the UI: To quickly add all accessory mode entities in the UI:
1. Create a new bridge via the UI (i.e., **{% my config_flow_start title="Settings >> Devices & Services" domain=page.ha_domain %}**). 1. Create a new bridge via the UI (i.e., **{% my config_flow_start title="Settings > Devices & Services" domain=page.ha_domain %}**).
2. Select `media_player`, `remote`, `lock`, and `camera` domains. 2. Select `media_player`, `remote`, `lock`, and `camera` domains.
3. Complete the flow as normal. 3. Complete the flow as normal.
4. Additional HomeKit entries for each entity that must operate in accessory mode will be created for each entity that does not already have one. 4. Additional HomeKit entries for each entity that must operate in accessory mode will be created for each entity that does not already have one.
@ -337,7 +337,7 @@ To quickly add all accessory mode entities in the UI:
To add a single entity in accessory mode: To add a single entity in accessory mode:
1. Create a new bridge via the UI (i.e., **{% my config_flow_start title="Settings >> Devices & Services" domain=page.ha_domain %}**) 1. Create a new bridge via the UI (i.e., **{% my config_flow_start title="Settings > Devices & Services" domain=page.ha_domain %}**)
2. Before pairing the bridge, access the options for the bridge. 2. Before pairing the bridge, access the options for the bridge.
3. Change the mode to `accessory` 3. Change the mode to `accessory`
4. Select the entity. 4. Select the entity.
@ -424,7 +424,7 @@ The following integrations are currently supported:
# Device Triggers # Device Triggers
Devices that support triggers can be added to the bridge by accessing options for the bridge in **{% my integrations title="Settings >> Devices & Services" %}**. To use this feature, Advanced Mode must be enabled in your user profile. Devices that support triggers can be added to the bridge by accessing options for the bridge in **{% my integrations title="Settings > Devices & Services" %}**. To use this feature, Advanced Mode must be enabled in your user profile.
Bridged device triggers are represented as a single press button on stateless programmable switches. This allows a HomeKit automation to run when a device trigger fires. Because the Apple Home app currently only shows the number of the button and not the name, users may find it easier to identify the name of the button in the `Eve for HomeKit` app. Bridged device triggers are represented as a single press button on stateless programmable switches. This allows a HomeKit automation to run when a device trigger fires. Because the Apple Home app currently only shows the number of the button and not the name, users may find it easier to identify the name of the button in the `Eve for HomeKit` app.
@ -549,7 +549,7 @@ Remember that the iOS device needs to be in the same local network as the Home A
#### `Home Assistant Bridge` doesn't appear in the Home App (for pairing) - Docker #### `Home Assistant Bridge` doesn't appear in the Home App (for pairing) - Docker
Set `network_mode: host` in your `docker-compose.yaml`. If you have further problems this [issue](https://github.com/home-assistant/home-assistant/issues/15692) might help. Set `network_mode: host` in your `docker-compose.yaml`. If you have further problems this [issue](https://github.com/home-assistant/core/issues/15692) might help.
You can also try to use `avahi-daemon` in reflector mode together with the option `advertise_ip`, see above. You can also try to use `avahi-daemon` in reflector mode together with the option `advertise_ip`, see above.
@ -592,7 +592,7 @@ To use the HomeKit integration with multiple different Home Assistant instances
#### Specific entity doesn't work #### Specific entity doesn't work
Although we try our best, some entities don't work with the HomeKit integration yet. The result will be that either pairing fails completely or all Home Assistant accessories will stop working. Use the filter to identify which entity is causing the issue. It's best to try pairing and step by step including more entities. If it works, unpair and repeat until you find the one that is causing the issues. To help others and the developers, please open a new issue here: [home-assistant/issues/new](https://github.com/home-assistant/home-assistant/issues/new?labels=component:%20homekit) Although we try our best, some entities don't work with the HomeKit integration yet. The result will be that either pairing fails completely or all Home Assistant accessories will stop working. Use the filter to identify which entity is causing the issue. It's best to try pairing and step by step including more entities. If it works, unpair and repeat until you find the one that is causing the issues. To help others and the developers, please open a new issue here: [core/issues/new](https://github.com/home-assistant/core/issues/new)
If you have any iOS 12.x devices signed into your iCloud account, media player entities with `device_class: tv` may trigger this condition. Filtering the entity or signing the iOS 12.x device out of iCloud should resolve the issue after restarting other devices. If you have any iOS 12.x devices signed into your iCloud account, media player entities with `device_class: tv` may trigger this condition. Filtering the entity or signing the iOS 12.x device out of iCloud should resolve the issue after restarting other devices.
@ -626,7 +626,7 @@ Ensure that the [`ffmpeg`](/integrations/ffmpeg) integration is configured corre
#### Camera streaming is unstable or slow #### Camera streaming is unstable or slow
If your camera supports native H.264 streams, Home Assistant can avoid converting the video stream, which is an expensive operation. To enable native H.264 streaming when configured via YAML, change the `video_codec` to `copy`. To allow native H.264 streaming when setting up HomeKit via the UI, go to **Settings** -> **Devices & Services** in the UI, click **Options** for your HomeKit Bridge, and check the box for your camera on the `Cameras that support native H.264 streams` screen. If your camera supports native H.264 streams, Home Assistant can avoid converting the video stream, which is an expensive operation. To enable native H.264 streaming when configured via YAML, change the `video_codec` to `copy`. To allow native H.264 streaming when setting up HomeKit via the UI, go to **Settings** > **Devices & Services** in the UI, click **Options** for your HomeKit Bridge, and check the box for your camera on the `Cameras that support native H.264 streams` screen.
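For reference, a minimal YAML sketch of the equivalent option when HomeKit is configured via `configuration.yaml` (the camera entity ID is illustrative):

```yaml
# Example configuration.yaml sketch (camera entity ID is illustrative)
homekit:
  filter:
    include_entities:
      - camera.front_door
  entity_config:
    camera.front_door:
      video_codec: copy  # pass the native H.264 stream through instead of transcoding
```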
#### Multiple camera streams #### Multiple camera streams

View File

@ -43,7 +43,7 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The [HomeKit](https://developer.apple.com/homekit/) controller integration allows you to connect accessories with the "Works with HomeKit" logo to Home Assistant. This integration should not be confused with the [HomeKit](/integrations/homekit/) integration, which allows you to control Home Assistant devices via HomeKit. The [HomeKit](https://developer.apple.com/apple-home/) controller integration allows you to connect accessories with the "Works with HomeKit" logo to Home Assistant. This integration should not be confused with the [HomeKit](/integrations/homekit/) integration, which allows you to control Home Assistant devices via HomeKit.
The integration will automatically detect HomeKit compatible devices that are ready to pair if the [`zeroconf`](/integrations/zeroconf/) integration is enabled. This is enabled by default on new installations via the [`default_config`](/integrations/default_config/) component. The integration will automatically detect HomeKit compatible devices that are ready to pair if the [`zeroconf`](/integrations/zeroconf/) integration is enabled. This is enabled by default on new installations via the [`default_config`](/integrations/default_config/) component.
@ -165,7 +165,7 @@ homekit:
`netdisco` is not used by Home Assistant to discover HomeKit devices, so if it can't see your device the problem is more likely to be environmental than with Home Assistant itself. `netdisco` is not used by Home Assistant to discover HomeKit devices, so if it can't see your device the problem is more likely to be environmental than with Home Assistant itself.
Alternatively, if you are less comfortable with the command line you could use Discovery for [Mac](https://apps.apple.com/us/app/discovery-dns-sd-browser/id1381004916?mt=12) or [iOS](https://apps.apple.com/us/app/discovery-dns-sd-browser/id305441017), Android [Service Browser](https://play.google.com/store/apps/details?id=com.druk.servicebrowser) or [All My Lan](https://www.microsoft.com/en-us/p/all-my-lan/9wzdncrdn19v). These are a less useful diagnostic as they aren't running from the same point on your network as Home Assistant. Even if it is visible in this tool it might still be a networking issue. They can sometimes give clues. Alternatively, if you are less comfortable with the command line you could use Discovery for [Mac](https://apps.apple.com/app/discovery-dns-sd-browser/id1381004916) or [iOS](https://apps.apple.com/app/discovery-dns-sd-browser/id305441017), Android [Service Browser](https://play.google.com/store/apps/details?id=com.druk.servicebrowser) or [All My Lan](https://apps.microsoft.com/store/detail/all-my-lan/9WZDNCRDN19V). These are a less useful diagnostic as they aren't running from the same point on your network as Home Assistant. Even if it is visible in this tool it might still be a networking issue. They can sometimes give clues.
Where a discovery tool does give an IP, check it is what you expect (compare to DHCP leases in your router for example). Can you ping it? If not, you have a network problem. Where a discovery tool does give an IP, check it is what you expect (compare to DHCP leases in your router for example). Can you ping it? If not, you have a network problem.

View File

@ -175,7 +175,7 @@ target:
#### Overrides #### Overrides
You can pass any of the parameters listed [here](https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerRegistration/showNotification#Parameters) in the `data` dictionary. Please note, Chrome specifies that the maximum size for an icon is 320px by 320px, the maximum `badge` size is 96px by 96px and the maximum icon size for an action button is 128px by 128px. You can pass any of the parameters listed [here](https://developer.mozilla.org/docs/Web/API/ServiceWorkerRegistration/showNotification#Parameters) in the `data` dictionary. Please note, Chrome specifies that the maximum size for an icon is 320px by 320px, the maximum `badge` size is 96px by 96px and the maximum icon size for an action button is 128px by 128px.
#### URL #### URL

View File

@ -10,6 +10,7 @@ ha_release: 0.71
ha_iot_class: Cloud Polling ha_iot_class: Cloud Polling
ha_domain: hydrawise ha_domain: hydrawise
ha_codeowners: ha_codeowners:
- '@dknowles2'
- '@ptcryan' - '@ptcryan'
ha_platforms: ha_platforms:
- binary_sensor - binary_sensor

View File

@ -74,10 +74,12 @@ To get the Estimated distance sensor to work, in most cases, it has to be calibr
- [Feasycom FSC-BP103B](https://www.feasycom.com/bluetooth-ibeacon-da14531) - [Feasycom FSC-BP103B](https://www.feasycom.com/bluetooth-ibeacon-da14531)
- [Feasycom FSC-BP104D](https://www.feasycom.com/dialog-da14531-bluetooth-low-energy-beacon) - [Feasycom FSC-BP104D](https://www.feasycom.com/dialog-da14531-bluetooth-low-energy-beacon)
- [Feasycom FSC-BP108](https://www.feasycom.com/bluetooth-5-1-waterproof-bluetooth-beacon) - [Feasycom FSC-BP108](https://www.feasycom.com/bluetooth-5-1-waterproof-bluetooth-beacon)
- [MikroTik TG-BT5-IN](https://mikrotik.com/product/tg_bt5_in) (Additional sensors such as angle or impact are not compatible)
- [NRF51822 iBeacon](https://www.aliexpress.com/item/32826502025.html) - [NRF51822 iBeacon](https://www.aliexpress.com/item/32826502025.html)
- [NRF52810 iBeacon](https://www.aliexpress.com/item/1005003211033416.html) - [NRF52810 iBeacon](https://www.aliexpress.com/item/1005003211033416.html)
- [Pawscout Tag](https://pawscout.com/shop/pawscout-tag/) - [Pawscout Tag](https://pawscout.com/shop/pawscout-tag/)
- [SwiftFinder](https://www.amazon.com/dp/B089MD5NP7) (Requires being paired to a phone first before it starts transmitting once a minute, otherwise it stays asleep) - [SwiftFinder](https://www.amazon.com/dp/B089MD5NP7) (Requires being paired to a phone first before it starts transmitting once a minute, otherwise it stays asleep)
- [Teltonika EYE Sensor](https://teltonika-gps.com/products/accessories/sensors-beacons/eye) (Additional sensors such as accelerometer, temperature, and humidity are not compatible)
## Example automation ## Example automation

View File

@ -50,14 +50,16 @@ Below is an example for setting up the integration to connect to your Microsoft
- Password: Your password - Password: Your password
- Charset: `US-ASCII` - Charset: `US-ASCII`
### Selecting an alternate SSL cipher list (advanced mode) ### Selecting an alternate SSL cipher list or disable SSL verification (advanced mode)
If the default IMAP server settings do not work, you might try to set an alternate SSL cipher list. If the default IMAP server settings do not work, you might try to set an alternate SSL cipher list.
The SSL cipher list option allows you to select the list of SSL ciphers to be accepted from this endpoint: `default` (_system default_), `modern`, or `intermediate` (_inspired by [Mozilla Security/Server Side TLS](https://wiki.mozilla.org/Security/Server_Side_TLS)_) The SSL cipher list option allows you to select the list of SSL ciphers to be accepted from this endpoint: `default` (_system default_), `modern`, or `intermediate` (_inspired by [Mozilla Security/Server Side TLS](https://wiki.mozilla.org/Security/Server_Side_TLS)_)
If you are using self-signed certificates, you can turn off SSL verification.
<div class='note info'> <div class='note info'>
The SSL cipher list is an advanced setting. The option is available only when advanced mode is enabled (see user settings). The SSL cipher list and verify SSL are advanced settings. The options are available only when advanced mode is enabled (see user settings).
</div> </div>
@ -70,7 +72,15 @@ Email providers may limit the number of reported emails. The number may be less
When a new message arrives that meets the search criteria the `imap` integration will send a custom [event](/docs/automation/trigger/#event-trigger) that can be used to trigger an automation. When a new message arrives that meets the search criteria the `imap` integration will send a custom [event](/docs/automation/trigger/#event-trigger) that can be used to trigger an automation.
It is also possible to create a template [`binary_sensor` or `sensor`](/integrations/template/#trigger-based-template-binary-sensors-buttons-numbers-selects-and-sensors) based on the [event data](/docs/automation/templating/#event). It is also possible to create a template [`binary_sensor` or `sensor`](/integrations/template/#trigger-based-template-binary-sensors-buttons-numbers-selects-and-sensors) based on the [event data](/docs/automation/templating/#event).
The table below shows what attributes come with `trigger.event.data`. The data is a dictionary that has the keys that are shown below: The table below shows what attributes come with `trigger.event.data`. The data is a dictionary that has the keys that are shown below.
The attributes shown in the table are also available as variables for the custom event data template. The [example](/integrations/imap/#example---custom-event-data-template) shows how to use this as an event filter.
<div class='note info'>
The custom event data template is an advanced feature. The option is available only when advanced mode is enabled (see user settings). The `text` attribute is not size limited when used as a variable in the template.
</div>
{% configuration_basic %} {% configuration_basic %}
server: server:
@ -82,7 +92,7 @@ search:
folder: folder:
description: The IMAP folder configuration description: The IMAP folder configuration
text: text:
description: The email body `text` of the the message (only the first 2048 bytes will be available) description: The email body `text` of the message (by default, only the first 2048 bytes will be available.)
sender: sender:
description: The `sender` of the message description: The `sender` of the message
subject: subject:
@ -91,11 +101,21 @@ date:
description: A `datetime` object of the `date` sent description: A `datetime` object of the `date` sent
headers: headers:
description: The `headers` of the message in the form of a dictionary. The values are iterable as headers can occur more than once. description: The `headers` of the message in the form of a dictionary. The values are iterable as headers can occur more than once.
custom:
description: Holds the result of the custom event data [template](/docs/configuration/templating). All attributes are available as a variable in the template.
{% endconfiguration_basic %} {% endconfiguration_basic %}
The `event_type` for the custom event should be set to `imap_content`. The configuration below shows how you can use the event data in a template `sensor`. The `event_type` for the custom event should be set to `imap_content`. The configuration below shows how you can use the event data in a template `sensor`.
If the default maximum message size (2048 bytes) to be used in events is too small for your needs, then this maximum size setting can be increased. You need to have your profile set to _advanced_ mode to do this.
<div class='note warning'>
Increasing the default maximum message size (2048 bytes) could have a negative impact on performance as event data is also logged by the `recorder`. If the total event data size exceeds the maximum event size (32168 bytes), the event will be skipped.
</div>
{% raw %} {% raw %}
```yaml ```yaml
@ -116,11 +136,10 @@ template:
Sender: "{{ trigger.event.data['sender'] }}" Sender: "{{ trigger.event.data['sender'] }}"
Date: "{{ trigger.event.data['date'] }}" Date: "{{ trigger.event.data['date'] }}"
Subject: "{{ trigger.event.data['subject'] }}" Subject: "{{ trigger.event.data['subject'] }}"
To: "{{ trigger.event.data['headers']['Delivered-To'][0] }}" To: "{{ trigger.event.data['headers'].get('Delivered-To', ['n/a'])[0] }}"
Subject: "{{ trigger.event.data['headers']['Subject'][0] }}" Return-Path: "{{ trigger.event.data['headers'].get('Return-Path',['n/a'])[0] }}"
Return_Path: "{{ trigger.event.data['headers']['Return-Path'][0] }}" Received-first: "{{ trigger.event.data['headers'].get('Received',['n/a'])[0] }}"
Received-first: "{{ trigger.event.data['headers']['Received'][0] }}" Received-last: "{{ trigger.event.data['headers'].get('Received',['n/a'])[-1] }}"
Received-last: "{{ trigger.event.data['headers']['Received'][-1] }}"
``` ```
{% endraw %} {% endraw %}
@ -197,3 +216,37 @@ template:
{% endraw %} {% endraw %}
By making small changes to the regular expressions defined above, a similar structure can parse other types of data out of the body text of other emails. By making small changes to the regular expressions defined above, a similar structure can parse other types of data out of the body text of other emails.
## Example - custom event data template
We can define a custom event data template to help filter events. This can be handy if, for example, we have multiple senders we want to allow.
We define the following template to return true if part of the `sender` is `@example.com`:
{% raw %}
```jinja2
{{ "@example.com" in sender }}
```
{% endraw %}
This will render to `True` if the sender is allowed. The result is added to the event data as `trigger.event.data["custom"]`.
The example below sets the state of the template sensor to the subject of the email, but only if the sender address matches.
{% raw %}
```yaml
template:
- trigger:
- platform: event
event_type: "imap_content"
id: "custom_event"
event_data:
custom: True
sensor:
- name: event filtered by template
state: '{{ trigger.event.data["subject"] }}'
```
{% endraw %}

View File

@ -0,0 +1,46 @@
---
title: JVC Projector
description: Instructions on how to integrate JVC Projector into Home Assistant.
ha_category:
- Remote
ha_release: '2023.6'
ha_iot_class: Local Polling
ha_config_flow: true
ha_codeowners:
- '@SteveEasley'
ha_domain: jvc_projector
ha_platforms:
- remote
ha_integration_type: device
---
The JVC Projector integration allows for the automation of [JVC Projectors](https://www.jvc.com/usa/projectors/).
## Supported Models
This integration is intended for the automation of any modern JVC Projector with a LAN network port.
{% include integrations/config_flow.md %}
## Remote
The JVC Projector remote platform will create a [Remote](/integrations/remote/) entity for the device. This entity allows you to send the following commands via the [remote.send_command](/integrations/remote/) service.
- `menu`
- `up`
- `down`
- `left`
- `right`
- `ok`
- `back`
- `info`
- `input`
- `hide`
- `mpc`
- `cmd`
- `advanced_menu`
- `picture_mode`
- `color_profile`
- `lens_control`
- `setting_memory`
- `gamma_settings`
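As a sketch, one of the commands above can be sent from a script or automation via `remote.send_command` (the entity ID is illustrative):

```yaml
# Example service call (entity ID is illustrative)
service: remote.send_command
target:
  entity_id: remote.jvc_projector
data:
  command: menu
```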

View File

@ -390,3 +390,74 @@ data:
{% endconfiguration %} {% endconfiguration %}
To use notifications, please see the [getting started with automation page](/getting-started/automation/). To use notifications, please see the [getting started with automation page](/getting-started/automation/).
## Keypress events
Key presses of keyboards/remotes can be overwritten in Kodi and configured to send an event to Home Assistant, which can then be used in automations to, for instance, turn up/down the volume of a TV/receiver.
A keypress can be overwritten in Kodi by using the [Kodi keymap XML](https://kodi.wiki/view/Keymap) or from within the Kodi GUI using the [Keymap Editor add-on](https://kodi.wiki/view/Add-on:Keymap_Editor).
An example of the Kodi keymap configuration using XML, which will overwrite the volume_up/volume_down buttons and instead send an event to Home Assistant:
```xml
<keymap>
<global>
<keyboard>
<volume_up>NotifyAll("KodiLivingroom", "OnKeyPress", {"key":"volume_up"})</volume_up>
<volume_down>NotifyAll("KodiLivingroom", "OnKeyPress", {"key":"volume_down"})</volume_down>
</keyboard>
</global>
</keymap>
```
The `"KodiLivingroom"` can be set to any value and will be present in the event data as the `"sender"`.
The `"OnKeyPress"` is needed to identify the event in Home Assistant; do not change this.
The `{"key":"volume_up"}` can contain any JSON, which will be present in the event data under the `"data"` key; normally this is used to identify which key was pressed.
For possible keyboard key names, see the [list of keynames](https://kodi.wiki/view/List_of_keynames).
For other actions, see the [keymap documentation](https://kodi.wiki/view/Keymap#Keynames).
For the example above, when the volume up key is pressed, an event in Home Assistant will be fired that looks like this:
```yaml
event_type: kodi_keypress
data:
type: keypress
device_id: 72e5g0ay5621f5d719qd8cydj943421a
entity_id: media_player.kodi_livingroom
sender: KodiLivingroom
data:
key: volume_up
```
An example of an automation to turn up/down the volume of a receiver using the event:
{% raw %}
```yaml
alias: Kodi keypress
mode: parallel
max: 10
trigger:
- platform: event
event_type: kodi_keypress
event_data:
entity_id: media_player.kodi_livingroom
action:
- choose:
- conditions:
- condition: template
value_template: "{{trigger.event.data.data.key=='volume_up'}}"
sequence:
- service: media_player.volume_up
target:
entity_id: media_player.receiver
- conditions:
- condition: template
value_template: "{{trigger.event.data.data.key=='volume_down'}}"
sequence:
- service: media_player.volume_down
target:
entity_id: media_player.receiver
```
{% endraw %}

View File

@ -10,6 +10,7 @@ ha_codeowners:
- '@IceBotYT' - '@IceBotYT'
ha_domain: lacrosse_view ha_domain: lacrosse_view
ha_platforms: ha_platforms:
- diagnostics
- sensor - sensor
ha_integration_type: integration ha_integration_type: integration
--- ---

View File

@ -9,38 +9,15 @@ ha_domain: lastfm
ha_platforms: ha_platforms:
- sensor - sensor
ha_integration_type: integration ha_integration_type: integration
ha_config_flow: true
ha_codeowners:
- '@joostlek'
--- ---
The `lastfm` sensor platform will allow you to see whenever a user starts scrobbling, their play count, last song played, and top song played on [Last.fm](https://www.last.fm/). The `lastfm` sensor platform will allow you to see whenever a user starts scrobbling, their play count, last song played, and top song played on [Last.fm](https://www.last.fm/).
## Setup ## Prerequisites
To get an API key you need to create an [API account](https://www.last.fm/api/account/create). To get an API key you need to create an [API account](https://www.last.fm/api/account/create).
## Configuration {% include integrations/config_flow.md %}
To use Last.fm sensor with your installation, add the following to your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
sensor:
- platform: lastfm
api_key: YOUR_API_KEY
users:
- user1
- user2
```
{% configuration %}
api_key:
description: Your Last.fm API key.
required: true
type: string
users:
description: List of users.
required: true
type: list
keys:
username:
description: Username of the user to monitor.
{% endconfiguration %}

View File

@ -43,7 +43,7 @@ Most lights do not support all attributes. You can check the integration documen
| ---------------------- | -------- | ----------- | | ---------------------- | -------- | ----------- |
| `entity_id` | no | String or list of strings that point at `entity_id`s of lights. To target all lights, set `entity_id` to `all`. | `entity_id` | no | String or list of strings that point at `entity_id`s of lights. To target all lights, set `entity_id` to `all`.
| `transition` | yes | Number that represents the time (in seconds) the light should take to transition to the new state. | `transition` | yes | Number that represents the time (in seconds) the light should take to transition to the new state.
| `profile` | yes | String with the name of one of the [built-in profiles](https://github.com/home-assistant/home-assistant/blob/master/homeassistant/components/light/light_profiles.csv) (relax, energize, concentrate, reading) or one of the custom profiles defined in `light_profiles.csv` in the current working directory. Light profiles define an xy color, brightness and a transition value (if no transition is desired, set to 0 or leave out the column entirely). If a profile is given, and a brightness is set, then the profile brightness will be overwritten. | `profile` | yes | String with the name of one of the [built-in profiles](https://github.com/home-assistant/core/blob/master/homeassistant/components/light/light_profiles.csv) (relax, energize, concentrate, reading) or one of the custom profiles defined in `light_profiles.csv` in the current working directory. Light profiles define an xy color, brightness and a transition value (if no transition is desired, set to 0 or leave out the column entirely). If a profile is given, and a brightness is set, then the profile brightness will be overwritten.
| `hs_color` | yes | A list containing two floats representing the hue and saturation of the color you want the light to be. Hue is scaled 0-360, and saturation is scaled 0-100. | `hs_color` | yes | A list containing two floats representing the hue and saturation of the color you want the light to be. Hue is scaled 0-360, and saturation is scaled 0-100.
| `xy_color` | yes | A list containing two floats representing the xy color you want the light to be. Two comma-separated floats that represent the color in XY. | `xy_color` | yes | A list containing two floats representing the xy color you want the light to be. Two comma-separated floats that represent the color in XY.
| `rgb_color` | yes | A list containing three integers between 0 and 255 representing the RGB color you want the light to be. Three comma-separated integers that represent the color in RGB, within square brackets. | `rgb_color` | yes | A list containing three integers between 0 and 255 representing the RGB color you want the light to be. Three comma-separated integers that represent the color in RGB, within square brackets.
@ -116,6 +116,6 @@ Turns one or multiple lights off.
### Service `light.toggle` ### Service `light.toggle`
Toggles the state of one or multiple lights. Takes the same arguments as [`turn_on`](#service-lightturn_on) service. Toggles the state of one or multiple lights. Takes the same arguments as the [`light.turn_on`](#service-lightturn_on) service.
*Note*: If `light.toggle` is used for a group of lights, it will toggle the individual state of each light. If you want the lights to be treated as a single light, use [Light Groups](/integrations/light.group/) instead. *Note*: If `light.toggle` is used for a group of lights, it will toggle the individual state of each light. If you want the lights to be treated as a single light, use [Light Groups](/integrations/group#binary-sensor-light-and-switch-groups) instead.
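For example, a minimal sketch of a toggle call that also passes `turn_on` arguments (the entity ID and values are illustrative):

```yaml
# Example service call (entity ID and values are illustrative)
service: light.toggle
target:
  entity_id: light.living_room
data:
  brightness: 200
  transition: 2
```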

View File

@ -106,8 +106,10 @@ The first use of a light or switch will try to register with your Lightwave WiFi
# TRVs # TRVs
Lightwave Thermostatic Radiator Valves (TRV) are supported but require an additional proxy to capture the current TRV temperature. Lightwave Thermostatic Radiator Valves (TRV) are supported.
See [LWProxy](https://github.com/ColinRobbins/Homeassistant-Lightwave-TRV)
Earlier versions of this integration required a proxy; see [LWProxy](https://github.com/ColinRobbins/Homeassistant-Lightwave-TRV).
This capability is still supported, but no longer required.
```yaml ```yaml
# Example TRV configuration.yaml for TRVs # Example TRV configuration.yaml for TRVs
@ -117,8 +119,6 @@ lightwave:
R99D1: R99D1:
name: Bedroom Light name: Bedroom Light
trv: trv:
proxy_ip: 127.0.0.1 # Proxy address, do not change unless running on a different server
proxy_port: 7878 # Do not change, unless a port clash
trvs: trvs:
R1Dh: # The ID of the TRV. R1Dh: # The ID of the TRV.
name: Bedroom TRV name: Bedroom TRV

View File

@ -60,6 +60,12 @@ It is recommended to assign a static IP address to your main repeater. This ensu
</div> </div>
<div class='note'>
If you are using RadioRA2 software version 12 or later, the `lutron` user with the password `integration` is not configured by default. To configure a new telnet user, go to **Settings** > **Integration** in your project and add a new telnet login. Once configured, use the transfer tab to push your changes to the RadioRA2 main repeater(s).
</div>
## Keypad buttons ## Keypad buttons
Individual buttons on keypads are not represented as entities. Instead, they fire events called `lutron_event` whose payloads include `id` and `action` attributes. Individual buttons on keypads are not represented as entities. Instead, they fire events called `lutron_event` whose payloads include `id` and `action` attributes.
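A minimal automation sketch that reacts to such an event could look like the following. The `id` and `action` values in `event_data`, as well as the light entity, are placeholders rather than values from this documentation; check the actual event payload (for example in **Developer Tools** > **Events**) for your keypad.

```yaml
# Example automation sketch (event_data values and entity ID are illustrative)
alias: Lutron keypad button
trigger:
  - platform: event
    event_type: lutron_event
    event_data:
      id: keypad_button_1  # placeholder: the id reported by your keypad button
      action: press        # placeholder: the action reported for the button
action:
  - service: light.toggle
    target:
      entity_id: light.hallway
```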

View File

@ -11,7 +11,7 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The `marytts` text-to-speech platform uses [MaryTTS](http://mary.dfki.de/) Text-to-Speech engine to read a text with natural sounding voices. The `marytts` text-to-speech platform uses [MaryTTS](http://mary.dfki.de/) text-to-speech engine to read a text with natural sounding voices.
## Configuration ## Configuration

View File

@ -223,6 +223,10 @@ The Philips Hue V2 bridge supports Matter since a recent update (the beta progra
- Device events for example for dimmer remotes are not supported. - Device events for example for dimmer remotes are not supported.
- Only basic control of lights is supported, no scenes, events, effects etc. - Only basic control of lights is supported, no scenes, events, effects etc.
### Tasmota
Tasmota supports Matter over IP on all ESP32-based devices (currently experimental). Follow the [instructions](https://tasmota.github.io/docs/Matter/).
### TP-Link Tapo P125M (power plug) ### TP-Link Tapo P125M (power plug)
- Look for the M addition in the model name, a device without the M (regular P125) is not Matter compliant. - Look for the M addition in the model name, a device without the M (regular P125) is not Matter compliant.

View File

@ -62,9 +62,13 @@ homeassistant:
recording: /mnt/recordings recording: /mnt/recordings
``` ```
Please note, that the folder must be accessible locally. Home Assistant <div class='note'>
cannot connect to external or remote network shares using this configuration
option. If you want to use media from a network storage, the network storage must be connected first. Refer to [these instructions on how to connect network storage](/common-tasks/os/#network-storage).
The media from the network storage is then automatically added to the local media browser.
</div>
## Playing media from a Media Source ## Playing media from a Media Source

View File

@ -15,6 +15,7 @@ ha_platforms:
- number - number
- sensor - sensor
- switch - switch
- time
ha_integration_type: integration ha_integration_type: integration
--- ---

View File

@ -16,7 +16,11 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The `metoffice` weather platform uses the Met Office's [DataPoint API](https://www.metoffice.gov.uk/datapoint) for weather data. You can get an API key by registering for a Met Office [account](https://register.metoffice.gov.uk/WaveRegistrationClient/public/register.do?service=datapoint). As their website is not as straightforward, after registration and verifying your account you can login [here](https://register.metoffice.gov.uk/MyAccountClient/account/view) to retrieve your API key. The `metoffice` weather platform uses the Met Office's [DataPoint API](https://www.metoffice.gov.uk/datapoint) for weather data.
## Getting started
Their website is not the most straightforward, so check the [getting started guide](https://www.metoffice.gov.uk/services/data/datapoint/getting-started).
1. Register for a [Met Office account](https://register.metoffice.gov.uk/WaveRegistrationClient/public/register.do?service=datapoint).
2. After registration and verification of your account, [login](https://register.metoffice.gov.uk/MyAccountClient/account/view) to retrieve your API key.
{% include integrations/config_flow.md %} {% include integrations/config_flow.md %}

View File

@ -1,6 +1,6 @@
--- ---
title: Microsoft Text-to-Speech (TTS) title: Microsoft text-to-speech (TTS)
description: Instructions on how to set up Microsoft Text-to-Speech with Home Assistant. description: Instructions on how to set up Microsoft text-to-speech with Home Assistant.
ha_category: ha_category:
- Text-to-speech - Text-to-speech
ha_iot_class: Cloud Push ha_iot_class: Cloud Push
@ -11,7 +11,7 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The `microsoft` text-to-speech platform uses the [TTS engine of the Microsoft Speech Service](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/text-to-speech) to read a text with natural sounding voices. This integration uses an API that is part of the Cognitive Services offering and is known as the Microsoft Speech API. For this integration to work, you need a free API key. You can use your [Azure subscription](https://azure.microsoft.com) to create an [Azure Speech resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesSpeechServices). The `microsoft` text-to-speech platform uses the [TTS engine of the Microsoft Speech Service](https://learn.microsoft.com/azure/cognitive-services/speech-service/text-to-speech) to read a text with natural sounding voices. This integration uses an API that is part of the Cognitive Services offering and is known as the Microsoft Speech API. For this integration to work, you need a free API key. You can use your [Azure subscription](https://azure.microsoft.com) to create an [Azure Speech resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesSpeechServices).
## Configuration ## Configuration
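If you set up the platform via YAML, a minimal entry could look like the sketch below; the key value is a placeholder and the options documented in the table that follows are optional.

```yaml
# Example configuration.yaml entry (sketch; API key value is a placeholder)
tts:
  - platform: microsoft
    api_key: YOUR_API_KEY
```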
@ -30,7 +30,7 @@ api_key:
required: true required: true
type: string type: string
language: language:
description: The language to use. Note that if you set the language to anything other than the default, you will need to specify a matching voice type as well. For the supported languages check the list of [available languages](https://github.com/home-assistant/home-assistant/blob/dev/homeassistant/components/microsoft/tts.py#L20). description: The language to use. Note that if you set the language to anything other than the default, you will need to specify a matching voice type as well. For the supported languages check the list of [available languages](https://github.com/home-assistant/core/blob/dev/homeassistant/generated/microsoft_tts.py).
required: false required: false
type: string type: string
default: "`en-us`" default: "`en-us`"
@ -40,7 +40,7 @@ gender:
type: string type: string
default: "`Female`" default: "`Female`"
type: type:
description: "The voice type you want to use. Accepted values are listed as the service name mapping [in the documentation](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/language-support#text-to-speech)." description: "The voice type you want to use. Accepted values are listed as the service name mapping [in the documentation](https://learn.microsoft.com/azure/cognitive-services/speech-service/language-support?tabs=tts)."
required: false required: false
type: string type: string
default: "`JennyNeural`" default: "`JennyNeural`"
@ -64,7 +64,7 @@ contour:
required: false required: false
type: string type: string
region: region:
description: "The region of your API endpoint. See [documentation](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/regions)." description: "The region of your API endpoint. See [documentation](https://learn.microsoft.com/azure/cognitive-services/speech-service/regions)."
required: false required: false
type: string type: string
default: "`eastus`" default: "`eastus`"
@ -72,9 +72,9 @@ region:
<div class='note'> <div class='note'>
Not all Azure regions support high-quality neural voices. Use [this overview](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/regions#neural-and-standard-voices) to determine the availability of standard and neural voices by region/endpoint. Not all Azure regions support high-quality neural voices. Use [this overview](https://learn.microsoft.com/azure/cognitive-services/speech-service/regions) to determine the availability of standard and neural voices by region/endpoint.
New users ([any newly created Azure Speech resource after August 31st, 2021](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/text-to-speech#migrate-to-neural-voice)) can only use neural voices. Existing resources can continue using standard voices through August 31st, 2024. New users ([any newly created Azure Speech resource after August 31st, 2021](https://learn.microsoft.com/azure/cognitive-services/speech-service/text-to-speech#more-about-neural-text-to-speech-features)) can only use neural voices. Existing resources can continue using standard voices through August 31st, 2024.
</div> </div>

View File

@ -11,17 +11,17 @@ ha_integration_type: integration
The `microsoft_face` integration platform is the main integration for Microsoft The `microsoft_face` integration platform is the main integration for Microsoft
Azure Cognitive service Azure Cognitive service
[Face](https://azure.microsoft.com/en-us/services/cognitive-services/face/). [Face](https://azure.microsoft.com/products/cognitive-services/vision-services).
All data are stored in your own private instance in the Azure cloud. All data are stored in your own private instance in the Azure cloud.
## Setup ## Setup
You need an API key, which is free, but requires an You need an API key, which is free, but requires an
[Azure registration](https://azure.microsoft.com/en-us/free/) using your [Azure registration](https://azure.microsoft.com/free/) using your
Microsoft ID. The free resource (*F0*) is limited to 20 requests per minute and Microsoft ID. The free resource (*F0*) is limited to 20 requests per minute and
30k requests in a month. If you don't want to use the Azure cloud, you can also 30k requests in a month. If you don't want to use the Azure cloud, you can also
get an API key by registering with get an API key by registering with
[cognitive-services](https://azure.microsoft.com/en-us/try/cognitive-services/). [cognitive-services](https://azure.microsoft.com/try/cognitive-services/).
Please note that all keys on cognitive services must be recreated every 90 days. Please note that all keys on cognitive services must be recreated every 90 days.
## Configuration ## Configuration
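As a minimal sketch, assuming the integration only needs the API key described above, the YAML entry could look like this:

```yaml
# Example configuration.yaml entry (sketch; key value is a placeholder)
microsoft_face:
  api_key: YOUR_API_KEY
```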

View File

@ -10,19 +10,19 @@ ha_integration_type: integration
--- ---
The `microsoft_face_detect` image processing platform allows you to use the The `microsoft_face_detect` image processing platform allows you to use the
[Microsoft Face Identify](https://www.microsoft.com/cognitive-services/en-us/) [Microsoft Face Identify](https://azure.microsoft.com/products/cognitive-services/)
API through Home Assistant. This platform enables you to detect faces on camera API through Home Assistant. This platform enables you to detect faces on camera
and fire an event with attributes. and fire an event with attributes.
Please refer to the [Microsoft Face component](/integrations/microsoft_face/) configuration on Please refer to the [Microsoft Face integration](/integrations/microsoft_face/) configuration on
how to set up the API key. how to set up the API key.
For using the result inside an automation rule, For using the result inside an automation rule,
take a look at the [Image Processing component](/integrations/image_processing/) page. take a look at the [Image Processing integration](/integrations/image_processing/) page.
<div class='note'> <div class='note'>
The free version of the Microsoft Face identify API limits the number of requests possible per month. Therefore, it is strongly recommended that you limit the `scan_interval` when setting up an instance of this entity as detailed on the main [Image Processing component](/integrations/image_processing/) page. The free version of the Microsoft Face identify API limits the number of requests possible per month. Therefore, it is strongly recommended that you limit the `scan_interval` when setting up an instance of this entity as detailed on the main [Image Processing integration](/integrations/image_processing/) page.
</div> </div>
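A minimal platform sketch, assuming the usual image processing `source` schema; the camera entity below is a placeholder:

```yaml
# Example configuration.yaml entry (sketch; camera.demo_camera is a placeholder)
image_processing:
  - platform: microsoft_face_detect
    source:
      - entity_id: camera.demo_camera
```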

View File

@ -10,19 +10,19 @@ ha_integration_type: integration
--- ---
The `microsoft_face_identify` image processing platform lets you use The `microsoft_face_identify` image processing platform lets you use
[Microsoft Face identify](https://www.microsoft.com/cognitive-services/en-us/) [Microsoft Face identify](https://azure.microsoft.com/products/cognitive-services/)
API through Home Assistant. This platform allows you to identify persons on API through Home Assistant. This platform allows you to identify persons on
camera and fire an event with attributes. camera and fire an event with attributes.
Please refer to the [Microsoft Face component](/integrations/microsoft_face/) configuration on Please refer to the [Microsoft Face integration](/integrations/microsoft_face/) configuration on
how to set up the API key. how to set up the API key.
For using the result inside an automation rule, For using the result inside an automation rule,
take a look at the [Image Processing component](/integrations/image_processing/) page. take a look at the [Image Processing integration](/integrations/image_processing/) page.
<div class='note'> <div class='note'>
The free version of the Microsoft Face identify API limits the number of requests possible per month. Therefore, it is strongly recommended that you limit the `scan_interval` when setting up an instance of this entity as detailed on the main [Image Processing component](/integrations/image_processing/) page. The free version of the Microsoft Face identify API limits the number of requests possible per month. Therefore, it is strongly recommended that you limit the `scan_interval` when setting up an instance of this entity as detailed on the main [Image Processing integration](/integrations/image_processing/) page.
</div> </div>
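A minimal platform sketch, assuming the usual image processing `source` schema; the camera entity and the face group name are placeholders:

```yaml
# Example configuration.yaml entry (sketch; entity and group are placeholders)
image_processing:
  - platform: microsoft_face_identify
    source:
      - entity_id: camera.demo_camera
    group: family
```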

View File

@ -169,7 +169,7 @@ For Ubiquiti routers/access points the "Enable multicast enhancement (IGMPv3)" s
### Bypassing UDP multicast ### Bypassing UDP multicast
If UDP Multicast does not work in your setup (due to network limitations), this integration can be used in local polling mode. If UDP Multicast does not work in your setup (due to network limitations), this integration can be used in local polling mode.
Go to Settings -> Integrations -> on the already set up Motion Blinds integration click "configure" --> disable the "Wait for push" option (disabled by default). Go to Settings -> Integrations -> on the already set up Motion Blinds integration click "configure" --> disable the "Wait for multicast push on update" option (disabled by default).
The default update interval of the Motion Blinds integration is every 10 minutes. When UDP multicast pushes do not work, this polling interval can be a bit high. The default update interval of the Motion Blinds integration is every 10 minutes. When UDP multicast pushes do not work, this polling interval can be a bit high.
To increase the polling interval: To increase the polling interval:
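The specific steps are not shown here. As a generic sketch (not necessarily the method described on this page), an automation can trigger extra state updates on a schedule with the `homeassistant.update_entity` service; the cover entity below is a placeholder.

```yaml
# Generic polling sketch (not integration-specific); cover.motion_blind is a placeholder
automation:
  - alias: "Poll Motion Blinds more often"
    trigger:
      - platform: time_pattern
        minutes: "/2"
    action:
      - service: homeassistant.update_entity
        entity_id: cover.motion_blind
```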

View File

@ -68,7 +68,11 @@ The Mosquitto project runs a [public broker](https://test.mosquitto.org). This i
MQTT broker settings are configured when the MQTT integration is first set up and can be changed later if needed. MQTT broker settings are configured when the MQTT integration is first set up and can be changed later if needed.
Add the MQTT integration, then provide your broker's hostname (or IP address) and port and (if required) the username and password that Home Assistant should use. To change the settings later, click on "Configure" on the integration page in the UI, then "Re-configure MQTT". Add the MQTT integration, then provide your broker's hostname (or IP address) and port and (if required) the username and password that Home Assistant should use. To change the settings later, follow these steps:
1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. On the MQTT integration, select the cogwheel.
1. Select **Configure**, then **Re-configure MQTT**.
<div class='note'> <div class='note'>
@ -109,7 +113,7 @@ With a secure broker connection it is possible to use a client certificate for a
#### Using WebSockets as transport #### Using WebSockets as transport
You can select `websockets` as transport method if your MQTT broker supports it. When you select `websockets` and click `NEXT` you will be able to add a WebSockets path (default = `/` and WebSockets headers (optional). The target WebSockets URI: `ws://{broker}:{port}{WebSockets path}` is built with `broker`, `port` and `ws_path` (WebSocket path) settings. You can select `websockets` as transport method if your MQTT broker supports it. When you select `websockets` and click `NEXT`, you will be able to add a WebSockets path (default = `/`) and WebSockets headers (optional). The target WebSockets URI: `ws://{broker}:{port}{WebSockets path}` is built with `broker`, `port` and `ws_path` (WebSocket path) settings.
To configure the WebSockets headers, supply a valid JSON dictionary string, e.g., `{ "Authorization": "token", "x-header": "some header" }`. The default transport method is `tcp`. The WebSockets transport can be secured using TLS and optionally using user credentials or a client certificate. To configure the WebSockets headers, supply a valid JSON dictionary string, e.g., `{ "Authorization": "token", "x-header": "some header" }`. The default transport method is `tcp`. The WebSockets transport can be secured using TLS and optionally using user credentials or a client certificate.
<div class='note'> <div class='note'>
@ -120,7 +124,12 @@ A configured client certificate will only be active if broker certificate valida
## Configure MQTT options ## Configure MQTT options
To change the settings, click on "Configure" in the integration page in the UI, then "Re-configure MQTT". Click `NEXT` to open the MQTT options page. To change the settings, follow these steps:
1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. On the MQTT integration, select the cogwheel.
1. Select **Configure**, then **Re-configure MQTT**.
1. To open the MQTT options page, select **Next**.
### Discovery options ### Discovery options
@ -140,13 +149,17 @@ MQTT Birth and Last Will messages can be customized or disabled from the UI. To
The `mosquitto` broker package ships commandline tools (often as `*-clients` package) to send and receive MQTT messages. For sending test messages to a broker running on `localhost` check the example below: The `mosquitto` broker package ships commandline tools (often as `*-clients` package) to send and receive MQTT messages. For sending test messages to a broker running on `localhost` check the example below:
```bash ```bash
mosquitto_pub -h 127.0.0.1 -t home-assistant/switch/1/on -m "Switch is ON" mosquitto_pub -h 127.0.0.1 -t homeassistant/switch/1/on -m "Switch is ON"
``` ```
Another way to send MQTT messages manually is to use the "MQTT" integration in the frontend. Choose "Settings" on the left menu, click "Devices & Services", and choose "Configure" in the "Mosquitto broker" tile. Enter something similar to the example below into the "topic" field under "Publish a packet" and press "PUBLISH" . Another way to send MQTT messages manually is to use the **MQTT** integration in the frontend. Choose "Settings" on the left menu, click "Devices & Services", and choose "Configure" in the "Mosquitto broker" tile. Enter something similar to the example below into the "topic" field under "Publish a packet" and press "PUBLISH" .
1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. On the Mosquitto broker integration, select the cogwheel, then select **Configure**.
1. Enter something similar to the example below into the **topic** field under **Publish a packet**. Select **Publish**.
```bash ```bash
home-assistant/switch/1/power homeassistant/switch/1/power
``` ```
and in the Payload field and in the Payload field
@ -155,23 +168,23 @@ and in the Payload field
ON ON
``` ```
In the "Listen to a topic" field, type `#` to see everything, or "home-assistant/switch/#" to just follow a published topic, then press "START LISTENING". The messages should appear similar to the text below: In the "Listen to a topic" field, type `#` to see everything, or "homeassistant/switch/#" to just follow a published topic, then press "START LISTENING". The messages should appear similar to the text below:
```bash ```bash
Message 23 received on home-assistant/switch/1/power/stat/POWER at 12:16 PM: Message 23 received on homeassistant/switch/1/power/stat/POWER at 12:16 PM:
ON ON
QoS: 0 - Retain: false QoS: 0 - Retain: false
Message 22 received on home-assistant/switch/1/power/stat/RESULT at 12:16 PM: Message 22 received on homeassistant/switch/1/power/stat/RESULT at 12:16 PM:
{ {
"POWER": "ON" "POWER": "ON"
} }
QoS: 0 - Retain: false QoS: 0 - Retain: false
``` ```
For reading all messages sent on the topic `home-assistant` to a broker running on localhost: For reading all messages sent on the topic `homeassistant` to a broker running on localhost:
```bash ```bash
mosquitto_sub -h 127.0.0.1 -v -t "home-assistant/#" mosquitto_sub -h 127.0.0.1 -v -t "homeassistant/#"
``` ```
## MQTT Discovery ## MQTT Discovery
@ -218,6 +231,7 @@ The discovery topic needs to follow a specific format:
<discovery_prefix>/<component>/[<node_id>/]<object_id>/config <discovery_prefix>/<component>/[<node_id>/]<object_id>/config
``` ```
- `<discovery_prefix>`: The Discovery Prefix defaults to `homeassistant`. This prefix can be [changed](#discovery-options).
- `<component>`: One of the supported MQTT components, e.g. `binary_sensor`. - `<component>`: One of the supported MQTT components, e.g. `binary_sensor`.
- `<node_id>` (*Optional*): ID of the node providing the topic, this is not used by Home Assistant but may be used to structure the MQTT topic. The ID of the node must only consist of characters from the character class `[a-zA-Z0-9_-]` (alphanumerics, underscore and hyphen). - `<node_id>` (*Optional*): ID of the node providing the topic, this is not used by Home Assistant but may be used to structure the MQTT topic. The ID of the node must only consist of characters from the character class `[a-zA-Z0-9_-]` (alphanumerics, underscore and hyphen).
- `<object_id>`: The ID of the device. This is only to allow for separate topics for each device and is not used for the `entity_id`. The ID of the device must only consist of characters from the character class `[a-zA-Z0-9_-]` (alphanumerics, underscore and hyphen). - `<object_id>`: The ID of the device. This is only to allow for separate topics for each device and is not used for the `entity_id`. The ID of the device must only consist of characters from the character class `[a-zA-Z0-9_-]` (alphanumerics, underscore and hyphen).
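For example, a device could announce a binary sensor with a single retained config message similar to the sketch below (the topic and payload values are illustrative):

```bash
mosquitto_pub -r -h 127.0.0.1 -t "homeassistant/binary_sensor/garden/config" \
  -m '{"name": "Garden motion", "device_class": "motion", "state_topic": "homeassistant/binary_sensor/garden/state"}'
```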
@ -316,10 +330,6 @@ Configuration variable names in the discovery payload may be abbreviated to cons
'fan_mode_stat_t': 'fan_mode_state_topic', 'fan_mode_stat_t': 'fan_mode_state_topic',
'frc_upd': 'force_update', 'frc_upd': 'force_update',
'g_tpl': 'green_template', 'g_tpl': 'green_template',
'hold_cmd_tpl': 'hold_command_template',
'hold_cmd_t': 'hold_command_topic',
'hold_stat_tpl': 'hold_state_template',
'hold_stat_t': 'hold_state_topic',
'hs_cmd_t': 'hs_command_topic', 'hs_cmd_t': 'hs_command_topic',
'hs_cmd_tpl': 'hs_command_template', 'hs_cmd_tpl': 'hs_command_template',
'hs_stat_t': 'hs_state_topic', 'hs_stat_t': 'hs_state_topic',
@ -481,7 +491,6 @@ Configuration variable names in the discovery payload may be abbreviated to cons
'tilt_clsd_val': 'tilt_closed_value', 'tilt_clsd_val': 'tilt_closed_value',
'tilt_cmd_t': 'tilt_command_topic', 'tilt_cmd_t': 'tilt_command_topic',
'tilt_cmd_tpl': 'tilt_command_template', 'tilt_cmd_tpl': 'tilt_command_template',
'tilt_inv_stat': 'tilt_invert_state',
'tilt_max': 'tilt_max', 'tilt_max': 'tilt_max',
'tilt_min': 'tilt_min', 'tilt_min': 'tilt_min',
'tilt_opnd_val': 'tilt_opened_value', 'tilt_opnd_val': 'tilt_opened_value',
@ -495,10 +504,6 @@ Configuration variable names in the discovery payload may be abbreviated to cons
'val_tpl': 'value_template', 'val_tpl': 'value_template',
'whit_cmd_t': 'white_command_topic', 'whit_cmd_t': 'white_command_topic',
'whit_scl': 'white_scale', 'whit_scl': 'white_scale',
'whit_val_cmd_t': 'white_value_command_topic',
'whit_val_scl': 'white_value_scale',
'whit_val_stat_t': 'white_value_state_topic',
'whit_val_tpl': 'white_value_template',
'xy_cmd_t': 'xy_command_topic', 'xy_cmd_t': 'xy_command_topic',
'xy_cmd_tpl': 'xy_command_template', 'xy_cmd_tpl': 'xy_command_template',
'xy_stat_t': 'xy_state_topic', 'xy_stat_t': 'xy_state_topic',
@ -536,7 +541,7 @@ The following software has built-in support for MQTT discovery:
- [IOTLink](https://iotlink.gitlab.io) (starting with 2.0.0) - [IOTLink](https://iotlink.gitlab.io) (starting with 2.0.0)
- [MiFlora MQTT Daemon](https://github.com/ThomDietrich/miflora-mqtt-daemon) - [MiFlora MQTT Daemon](https://github.com/ThomDietrich/miflora-mqtt-daemon)
- [Nuki Hub](https://github.com/technyon/nuki_hub) - [Nuki Hub](https://github.com/technyon/nuki_hub)
- [Nuki Smart Lock 3.0 Pro](https://support.nuki.io/hc/en-us/articles/12947926779409-MQTT-support) - [Nuki Smart Lock 3.0 Pro](https://support.nuki.io/hc/articles/12947926779409-MQTT-support), [more info](https://developer.nuki.io/t/mqtt-api-specification-v1-3/17626)
- [OpenMQTTGateway](https://github.com/1technophile/OpenMQTTGateway) - [OpenMQTTGateway](https://github.com/1technophile/OpenMQTTGateway)
- [room-assistant](https://github.com/mKeRix/room-assistant) (starting with 1.1.0) - [room-assistant](https://github.com/mKeRix/room-assistant) (starting with 1.1.0)
- [SmartHome](https://github.com/roncoa/SmartHome) - [SmartHome](https://github.com/roncoa/SmartHome)
@ -784,14 +789,14 @@ You must include either `topic` or `topic_template`, but not both. If providing
</p> </p>
```yaml ```yaml
topic: home-assistant/light/1/command topic: homeassistant/light/1/command
payload: on payload: on
``` ```
{% raw %} {% raw %}
```yaml ```yaml
topic: home-assistant/light/1/state topic: homeassistant/light/1/state
payload_template: "{{ states('device_tracker.paulus') }}" payload_template: "{{ states('device_tracker.paulus') }}"
``` ```
@ -800,7 +805,7 @@ payload_template: "{{ states('device_tracker.paulus') }}"
{% raw %} {% raw %}
```yaml ```yaml
topic_template: "home-assistant/light/{{ states('sensor.light_active') }}/state" topic_template: "homeassistant/light/{{ states('sensor.light_active') }}/state"
payload_template: "{{ states('device_tracker.paulus') }}" payload_template: "{{ states('device_tracker.paulus') }}"
``` ```
@ -811,7 +816,7 @@ If you want to send JSON using the YAML editor then you need to format/escape
it properly. Like: it properly. Like:
```yaml ```yaml
topic: home-assistant/light/1/state topic: homeassistant/light/1/state
payload: "{\"Status\":\"off\", \"Data\":\"something\"}"` payload: "{\"Status\":\"off\", \"Data\":\"something\"}"`
``` ```
@ -844,7 +849,7 @@ data:
Example of how to use `qos` and `retain`: Example of how to use `qos` and `retain`:
```yaml ```yaml
topic: home-assistant/light/1/command topic: homeassistant/light/1/command
payload: on payload: on
qos: 2 qos: 2
retain: true retain: true

View File

@ -13,7 +13,7 @@ ha_platforms:
ha_integration_type: integration ha_integration_type: integration
--- ---
The `Microsoft Teams` platform allows you to send notifications from Home Assistant to a team channel in [Microsoft Teams](https://products.office.com/en-us/microsoft-teams/group-chat-software). The `Microsoft Teams` platform allows you to send notifications from Home Assistant to a team channel in [Microsoft Teams](https://www.microsoft.com/microsoft-teams/group-chat-software).
## Setup ## Setup

View File

@ -2,7 +2,6 @@
title: Google Nest title: Google Nest
description: Instructions on how to integrate Nest into Home Assistant. description: Instructions on how to integrate Nest into Home Assistant.
ha_category: ha_category:
- Binary Sensor
- Camera - Camera
- Climate - Climate
- Doorbell - Doorbell
@ -56,11 +55,10 @@ Adding Nest to your Home Assistant instance can be done via the user interface,
{% details "Manual configuration steps" %} {% details "Manual configuration steps" %}
1. Browse to your Home Assistant instance. 1. Browse to your Home Assistant instance.
1. In the sidebar click on _**{% my config icon %}**_. 1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. From the configuration menu select: _**{% my integrations %}**_. 1. In the bottom right corner, select the
1. In the bottom right, click on the **{% my config_flow_start icon domain=page.ha_domain %}** button.
_**{% my config_flow_start icon domain=page.ha_domain %}**_ button. 1. From the list, select **Nest** and follow the instructions on screen.
1. From the list, search and select _**"Nest"**_ and follow the instructions.
{% enddetails %} {% enddetails %}
@ -424,9 +422,8 @@ To improve security and reduce phishing risk Google has [deprecated](https://dev
{% details "Reconfigure the integration %} {% details "Reconfigure the integration %}
1. Make sure to upgrade to the latest version of Home Assistant. 1. Make sure to upgrade to the latest version of Home Assistant.
1. In the sidebar click on _**{% my config icon %}**_. 1. Go to **{% my integrations title="Settings > Devices & Services" %}**.
1. From the configuration menu select: _**{% my integrations %}**_. 1. The **Nest** integration should appear with an alert.
1. The *Nest* integration should appear with alert.
![Screenshot of success](/images/integrations/nest/attention.png) ![Screenshot of success](/images/integrations/nest/attention.png)
@ -581,342 +578,6 @@ logger:
<div class='note warning'> <div class='note warning'>
The Legacy [Works with Nest](https://developers.nest.com/) API is not accepting new signups. The Legacy [Works with Nest](https://developers.nest.com/) API is deprecated, and will be shut down by Google in September 2023.
</div> </div>
{% details "Legacy Works with Nest Configuration Steps" %}
The Nest integration is the main integration to integrate all [Nest](https://nest.com/) related platforms. To connect Nest, you will have to [sign up for a developer account](https://developers.nest.com/products) and get a `client_id` and `client_secret`.
There is currently support for the following device types within Home Assistant:
- [Binary Sensor](#binary-sensor)
- [Camera](#camera)
- [Climate](#climate)
- [Sensor](#sensor)
**Setting up developer account**
1. Visit [Nest Developers](https://developers.nest.com/), and sign in. Create an account if you don't have one already.
2. Fill in account details:
* The "Company Information" can be anything. We recommend using your name.
3. Submit changes
4. Click "[Products](https://developers.nest.com/products)" at top of page.
5. Click "[Create New Product](https://developers.nest.com/products/new)"
6. Fill in details:
* Product name must be unique. We recommend [email] - Home Assistant.
* The description, users, URLs can all be anything you want.
* Leave the "Redirect URI" Field blank
7. For permissions check every box and if it's an option select the read/write option. Note: there are important permissions under the "Other Permissions" category. If you are only adding a thermostat, do not just select the permissions under "Thermostat". You still need to check the boxes under "Other Permissions" in order to give you access to features like away mode, ETA, structure read/write, and postal code.
* The description requires a specific format to be accepted.
* Use "[Home Assistant] [Edit] [For Home Automation]" as the description as it is not super important.
8. Click "Create Product"
9. Once the new product page opens the "Product ID" and "Product Secret" are located on the right side. These will be used as `client_id` and `client_secret` below.
10. Add the Nest integration to your `configuration.yaml` and restart Home Assistant. Then, go to `Settings > Devices & Services` and select `CONFIGURE` next to `Nest`. Click the link in the configurator pop up to log into your Nest account and complete the OAuth. Copy the resulting PIN code into the pop up.
Connecting to the Nest Developer API requires outbound port 9553 on your firewall. The configuration will fail if this is not accessible.
**Configuration**
```yaml
# Example configuration.yaml entry
nest:
client_id: CLIENT_ID
client_secret: CLIENT_SECRET
```
```yaml
# Example configuration.yaml entry to show only devices at your vacation and primary homes
nest:
client_id: CLIENT_ID
client_secret: CLIENT_SECRET
structure:
- Vacation
- Primary
```
{% configuration %}
client_id:
description: Your Nest developer client ID.
required: true
type: string
client_secret:
description: Your Nest developer client secret.
required: true
type: string
structure:
description: The structure or structures you would like to include devices from. If not specified, this will include all structures in your Nest account.
required: false
type: list
{% endconfiguration %}
**Service `set_away_mode`**
You can use the service `nest/set_away_mode` to set the structure(s) to "Home" or "Away".
| Service data attribute | Optional | Description |
| ---------------------- | -------- | ----------- |
| `away_mode` | no | String, must be `away` or `home`.
| `structure` | yes | String, will default to all configured Nest structures if not specified.
Examples:
```yaml
# Example script to set away, no structure specified so will execute for all
script:
nest_set_away:
sequence:
- service: nest.set_away_mode
data:
away_mode: away
```
```yaml
# Example script to set home, structure specified
script:
nest_set_home:
sequence:
- service: nest.set_away_mode
data:
away_mode: home
structure:
- Apartment
```
**Service `set_eta`**
You can use the service `nest/set_eta` to set or update the estimated time of arrival window. Calling this service will automatically set the structure(s) to "Away". Structures must have an associated Nest thermostat in order to use ETA function.
| Service data attribute | Optional | Description |
| ---------------------- | -------- | ----------- |
| `eta` | no | Time period, estimated time of arrival from now.
| `eta_window` | yes | Time period, estimated time of arrival window. Default is 1 minute.
| `trip_id` | yes | String, unique ID for the trip. Default is auto-generated using a timestamp. Using an existing `trip_id` will update that trip's ETA.
| `structure` | yes | String, will default to all configured Nest structures if not specified.
Examples:
```yaml
# Example script to set ETA, no structure specified so will execute for all
script:
nest_set_eta:
sequence:
- service: nest.set_eta
data:
eta: 00:10:30
trip_id: Leave Work
```
```yaml
# Example script to update ETA and specify window, structure specified
script:
nest_update_eta:
sequence:
- service: nest.set_eta
data:
eta: 00:11:00
eta_window: 00:05
trip_id: Leave Work
structure:
- Apartment
```
**Service `cancel_eta`**
You can use the service `nest/cancel_eta` to cancel an existing estimated time of arrival window. Structures must have an associated Nest thermostat in order to use ETA function.
| Service data attribute | Optional | Description |
| ---------------------- | -------- | ----------- |
| `trip_id` | no | String, unique ID for the trip. Using an existing `trip_id` will update that trip's ETA.
| `structure` | yes | String, will default to all configured Nest structures if not specified.
Examples:
```yaml
# Example script to cancel ETA, no structure specified so will execute for all
script:
nest_cancel_eta:
sequence:
- service: nest.cancel_eta
data:
trip_id: Leave Work
```
```yaml
# Example script to cancel ETA, structure specified
script:
nest_cancel_eta:
sequence:
- service: nest.cancel_eta
data:
trip_id: Leave Work
structure:
- Apartment
```
**Troubleshooting**
- If you're getting [rickrolled](https://www.youtube.com/watch?v=dQw4w9WgXcQ) by the Legacy API instead of being able to see your Nest cameras, you may not have set up your developer account's permissions correctly. Go back through and make sure you've selected read/write under every category that it's an option.
**Platforms**
<div class='note'>
You must have the [Nest component](/integrations/nest/) configured to use the platforms below.
</div>
**Binary Sensor**
The `nest` binary sensor platform lets you monitor various states of your [Nest](https://nest.com) devices.
<div class='note'>
You must have the [Nest component](/integrations/nest/) configured to use these sensors. The binary sensors will be set up if the `nest` integration is configured and the required configuration for the `nest binary sensor` is set.
</div>
**Configuration**
To enable binary sensors and customize which sensors are setup, you can extend the [Nest component](/integrations/nest/) configuration in your `configuration.yaml` file with the following settings:
```yaml
# Example configuration.yaml entry
nest:
binary_sensors:
monitored_conditions:
- 'fan'
- 'target'
```
By default all binary sensors for your available Nest devices will be monitored. Leave `monitored_conditions` blank to disable all binary sensors for the [Nest component](/integrations/nest/).
{% configuration %}
monitored_conditions:
description: States to monitor.
required: false
type: list
{% endconfiguration %}
The following conditions are available by device:
- Nest Home:
- away
- Nest Thermostat:
- online
- fan
- is\_using\_emergency\_heat
- is\_locked
- has\_leaf
- Nest Protect:
- online
- Nest Camera:
- online
- motion\_detected
- person\_detected
- sound\_detected
**Camera**
The `nest` platform allows you to watch still frames from a video stream (not live stream) of your [Nest](https://nest.com/camera/meet-nest-cam/) camera in Home Assistant.
<div class='note'>
The Legacy API integration allows you to watch still frames from a video stream (not live stream). The Legacy API also supports the `camera.turn_on` and `camera.turn_off` services.
</div>
Nest Camera supports the `camera.turn_on` and `camera.turn_off` services since the 0.75 release.
**Climate**
The `nest` climate platform lets you control a thermostat from [Nest](https://nest.com).
<div class='note'>
Please note due to limitations with the European Nest Thermostat E, integration with Home Assistant for that thermostat is not possible.
</div>
<p class='img'>
<img src='/images/screenshots/nest-thermostat-card.png' />
</p>
**Sensor**
The `nest` sensor platform lets you monitor sensors connected to your [Nest](https://nest.com) devices.
<div class='note'>
The sensors will be set up if the `nest` integration is configured and the required configuration for the `nest sensor` is set.
</div>
**Configuration**
To enable sensors and customize which sensors are setup, you can extend the [Nest component](/integrations/nest/) configuration in your `configuration.yaml` file with the following settings:
```yaml
# Example configuration.yaml entry
nest:
sensors:
monitored_conditions:
- 'temperature'
- 'target'
```
By default all sensors for your available Nest devices will be monitored. Leave `monitored_conditions` blank to disable all sensors for the [Nest component](/integrations/nest/).
{% configuration %}
monitored_conditions:
description: States to monitor.
required: false
type: list
{% endconfiguration %}
The following conditions are available by device:
- Nest Home:
- `eta`: Estimated time of arrival.
- `security_state`: `ok` or `deter`. [Security State](#security-state). Only available when Nest Camera exists.
- Nest Thermostat:
- `humidity`
- `preset_mode`
- `temperature`
- `target`
- `hvac_state`: The currently active state of the HVAC system, `heat`, `cool` or `off` (previously `heating`, `cooling` or `off`).
- Nest Protect:
- `co_status`: `Ok`, `Warning` or `Emergency`
- `smoke_status`: `Ok`, `Warning` or `Emergency`
- `battery_health`: `Ok` or `Replace`
- `color_status`: `gray`, `green`, `yellow` or `red`. Indicates device status by color in the Nest app UI. It is an aggregate condition for battery+smoke+CO states, and reflects the actual color indicators displayed in the Nest app.
- Nest Camera: none
**Security State**
<div class='note warning'>
This feature is not designed to transform your Home Assistant into a security system; neither Home Assistant nor Nest is liable for damages,
or consequential damages of any character arising as a result of using this feature.
This feature does not depend on the [Nest Secure alarm system](https://nest.com/alarm-system/overview/) and is not a reflection of the status of that system,
nor does it react to state changes in that system.
</div>
<div class='note'>
This feature uses a new [Nest Security API](https://developers.nest.com/documentation/cloud/security-guide).
You may need to change your ["Product"](https://developers.nest.com/products) permission setting to include `Security State Read`.
After this permission change, you may need to re-authorize your client.
</div>
If a Nest Cam detects the presence of a person (see `person_detected` in [binary_sensor.nest](#binary-sensor)) while the structure is in `away` mode (see `away` in [binary_sensor.nest](#binary-sensor)), the structure enters `deter` mode.
A `deter` state is re-evaluated after several minutes and relaxed to `ok` if no further `person_detected` events have occurred.
The `security_state` automatically switches to `ok` when the structure state is `home`.
{% enddetails %}

View File

@ -109,3 +109,7 @@ Displays the current link rate of the device indicating the maximum possible dat
### Link type ### Link type
Displays the current link type: wired, 2.4GHz or 5GHz. Displays the current link type: wired, 2.4GHz or 5GHz.
## Troubleshooting
- If you get a "Connection or login error" when trying to setup the NETGEAR integration, please try using the IP address of the router (often "192.168.1.1") as host instead of the default "routerlogin.net".

View File

@ -14,6 +14,8 @@ ha_platforms:
- notify - notify
- sensor - sensor
ha_integration_type: integration ha_integration_type: integration
ha_codeowners:
- '@tkdrob'
--- ---
The NETGEAR LTE integration for Home Assistant allows you to observe and control [NETGEAR LTE modems](https://www.netgear.com/home/mobile-wifi/lte-modems/). The NETGEAR LTE integration for Home Assistant allows you to observe and control [NETGEAR LTE modems](https://www.netgear.com/home/mobile-wifi/lte-modems/).

View File

@ -12,7 +12,7 @@ ha_codeowners:
ha_integration_type: system ha_integration_type: system
--- ---
This integration provides network configuration for integrations such as [Zeroconf](/integrations/zeroconf/). It is managed by going to **{% my network title="Settings >> System >> Network" %}** and is only available to users that have "Advanced Mode" enabled on their {% my profile title="user profile" %}. This integration provides network configuration for integrations such as [Zeroconf](/integrations/zeroconf/). It is managed by going to **{% my network title="Settings > System > Network" %}** and is only available to users that have "Advanced Mode" enabled on their {% my profile title="user profile" %}.
**{% my general badge %}** **{% my general badge %}**

View File

@ -4,7 +4,7 @@ description: Instructions on how to use public transit data from Nextbus in Home
ha_category: ha_category:
- Sensor - Sensor
- Transport - Transport
ha_iot_class: Local Polling ha_iot_class: Cloud Polling
ha_release: 0.93 ha_release: 0.93
ha_codeowners: ha_codeowners:
- '@vividboarder' - '@vividboarder'

View File

@ -17,6 +17,7 @@ ha_platforms:
- select - select
- sensor - sensor
- switch - switch
- water_heater
ha_integration_type: integration ha_integration_type: integration
--- ---
@ -28,9 +29,12 @@ Supported devices:
- S1145/S1155 - S1145/S1155
- F1245/F1255 - F1245/F1255
- F1355/F1355 - F1355/F1355
- S2125
- S320/S325
- F370 - F370
- F470 - F470
- F730 - F730
- S735
- F750 - F750
- SMO40 - SMO40
- SMOS40 - SMOS40

View File

@ -68,3 +68,8 @@ Events generated by Nuki are sent as events of type `nuki_event` with the follow
| -------------------- | ------------------------------------------ | | -------------------- | ------------------------------------------ |
| `type` | The type of the event. Values: `ring` | `type` | The type of the event. Values: `ring`
| `entity_id` | The ID of the entity generating the event. | `entity_id` | The ID of the entity generating the event.
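For example, a sketch of an automation reacting to the ring event could look like this; the notifier service and message are placeholders.

```yaml
# Example automation for the nuki_event ring event (notify.notify is a placeholder)
automation:
  - alias: "Nuki ring notification"
    trigger:
      - platform: event
        event_type: nuki_event
        event_data:
          type: ring
    action:
      - service: notify.notify
        data:
          message: "Someone rang the Nuki Opener"
```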
## MQTT support
The Nuki Smart Lock 3.0 Pro also [supports MQTT](https://support.nuki.io/hc/en-us/articles/12947926779409-MQTT-support) and can directly integrate with Home Assistant through [MQTT discovery](/integrations/mqtt/#mqtt-discovery).
Specific information can be found [here](https://developer.nuki.io/t/mqtt-api-specification-v1-3/17626).

View File

@ -36,15 +36,23 @@ If running Home assistant Core in a venv, ensure that libxml2 and libxslt python
Most of the ONVIF devices support more than one audio/video profile. Each profile provides different image quality, or in the case of an NVR, separate connected cameras. This integration will add entities for all compatible profiles with the video encoding set to H.264. Usually, the first profile has the highest quality and it is the profile used by default. However, you may want to use a lower quality image. You may disable unwanted entities through the Home Assistant UI. Most of the ONVIF devices support more than one audio/video profile. Each profile provides different image quality, or in the case of an NVR, separate connected cameras. This integration will add entities for all compatible profiles with the video encoding set to H.264. Usually, the first profile has the highest quality and it is the profile used by default. However, you may want to use a lower quality image. You may disable unwanted entities through the Home Assistant UI.
### Extra configuration of the integration {% include integrations/option_flow.md %}
You can configure specific FFmpeg options through the integration options flow by clicking the gear icon on the top right of the integration details page.
| Option | Description | | Option | Description |
| -------| ----------- | | -------| ----------- |
| RTSP transport mechanism | RTSP transport protocols. The possible options are: `tcp`, `udp`, `udp_multicast`, `http`. | | RTSP transport mechanism | RTSP transport protocols. The possible options are: `tcp`, `udp`, `udp_multicast`, `http`. |
| Extra FFmpeg arguments | Extra options to pass to `ffmpeg`, e.g., image quality or video filter options. More details in [`ffmpeg` integration](/integrations/ffmpeg). | | Extra FFmpeg arguments | Extra options to pass to `ffmpeg`, e.g., image quality or video filter options. More details in [`ffmpeg` integration](/integrations/ffmpeg). |
| Use wallclock as timestamps | ([Advanced Mode](/blog/2019/07/17/release-96/#advanced-mode) only) Rewrite the camera timestamps. This may help with playback or crashing issues from Wi-Fi cameras or cameras of certain brands (e.g., EZVIZ). | | Use wallclock as timestamps | ([Advanced Mode](/blog/2019/07/17/release-96/#advanced-mode) only) Rewrite the camera timestamps. This may help with playback or crashing issues from Wi-Fi cameras or cameras of certain brands (e.g., EZVIZ). |
| Enable Webhooks | If the device supports notifications via a Webhook, the integration will attempt to set up a Webhook. Disable this option to force falling back to trying PullPoint if the device supports it. |
#### Snapshots
Some cameras will not produce usable snapshots with larger stream sizes.
By default, the integration will only enable the camera entity for the first H264 profile. If you are unable to get a working snapshot:
- If additional camera entities are available for other profiles, try enabling those entities.
- Set the `Extra FFmpeg arguments` to `-pred 1 -ss 00:00:05 -frames:v 1` to cause the snapshot to be taken 5 seconds into the stream.
### Supported Sensors ### Supported Sensors

View File

@ -50,4 +50,4 @@ Top P:
### Talking to Super Mario over the phone ### Talking to Super Mario over the phone
You can use an OpenAI Conversation integration to [talk to Super Mario over a classic landline phone](/projects/worlds-most-private-voice-assistant/). You can use an OpenAI Conversation integration to [talk to Super Mario over a classic landline phone](/voice_control/worlds-most-private-voice-assistant/).

View File

@ -9,6 +9,8 @@ ha_domain: opensky
ha_platforms: ha_platforms:
- sensor - sensor
ha_integration_type: integration ha_integration_type: integration
ha_codeowners:
- '@joostlek'
--- ---
The `opensky` sensor allows one to track overhead flights in a given region. It uses crowd-sourced data from the [OpenSky Network](https://opensky-network.org/) public API. It will also fire Home Assistant events when flights enter and exit the defined region. The `opensky` sensor allows one to track overhead flights in a given region. It uses crowd-sourced data from the [OpenSky Network](https://opensky-network.org/) public API. It will also fire Home Assistant events when flights enter and exit the defined region.
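As a sketch, an automation could react to those events; the event name used below (`opensky_entry`) and the notifier are assumptions and should be checked against the integration's event documentation.

```yaml
# Sketch: notify when a flight enters the region
# (event name opensky_entry is an assumption; notify.notify is a placeholder)
automation:
  - alias: "Flight entered region"
    trigger:
      - platform: event
        event_type: opensky_entry
    action:
      - service: notify.notify
        data:
          message: "A flight entered the tracked region"
```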

Some files were not shown because too many files have changed in this diff.