Assist: Cleanup topics (#29626)

* Assist: Cleanup topics

- Apply sentence style capitalization
- Add related topics

* Add link to voice assistant page

* Move dashboard procedure to Devices section

* Add more related links

* Add link
c0ffeeca7 2023-10-31 17:20:48 +01:00 committed by GitHub
parent 305443962a
commit 49d2478349
10 changed files with 61 additions and 17 deletions

View File

@@ -6,11 +6,12 @@
<ul class="divided sidebar-menu">
<li>{% active_link /voice_control/android/ Assist for Android %}</li>
<li>{% active_link /voice_control/apple/ Assist for Apple %}</li>
<li>{% active_link /voice_control/start_assist_from_dashboard/ Starting Assist from your dashboard %}</li>
</ul>
</div>
<div class="section">
<h1 class="title delta">Voice assistants</h1>
<h1 class="title delta">{% active_link /voice_control/ Voice assistants %}</h1>
<ul class="divided sidebar-menu">
<li>{% active_link /voice_control/voice_remote_local_assistant/ Configuring a local assistant %}</li>
<li>{% active_link /voice_control/voice_remote_cloud_assistant/ Configuring a cloud assistant %}</li>
@@ -21,7 +22,6 @@
<li>{% active_link /voice_control/custom_sentences/ Custom sentences %}</li>
<li>{% active_link /voice_control/using_tts_in_automation/ Using Piper TTS in automations %}</li>
<li>{% active_link /voice_control/assist_create_open_ai_personality/ Creating a personality with OpenAI %}</li>
<li>{% active_link /voice_control/start_assist_from_dashboard/ Starting Assist from your dashboard %}</li>
<li>{% active_link /voice_control/troubleshooting/ Troubleshooting Assist %}</li>
</ul>
</div>

View File

@@ -1,5 +1,5 @@
---
title: "Assist - Entity Aliases"
title: "Assist - entity aliases"
---
Assist will use the names of your entities, as well as any aliases you've configured.

View File

@@ -133,4 +133,10 @@ Depending on your watch, you can assign Assist to a button so that you can start
- Select **Always**.
![List of assistants](/images/assist/android_watch_7.png)
3. Now, use your key and speak a command.
## Related topics
- [Home Assistant Companion App](https://companion.home-assistant.io/docs/getting_started/)
- [Exposing devices to Assist](/voice_control/voice_remote_expose_devices/)
- [Starting Assist from your dashboard](/voice_control/start_assist_from_dashboard/)

View File

@@ -36,3 +36,9 @@ Using OpenAI requires an OpenAI account. For this tutorial, the free trial optio
- Leave the other settings unchanged and select **Create**.
4. You can repeat this with other OpenAI personalities. You can add as many OpenAI Conversation integrations as you would like.
- To add a new personality, you need to create a new API key. Then, add a new OpenAI Conversation integration with that API key.
## Related topics
- [Home Assistant Cloud](https://www.nabucasa.com)
- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)
- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)

View File

@@ -1,5 +1,5 @@
---
title: "Assist - Default Sentences"
title: "Assist - default sentences"
---
Home Assistant comes with built-in sentences [contributed by the community](https://github.com/home-assistant/intents/) for [dozens of languages](https://developers.home-assistant.io/docs/voice/intent-recognition/supported-languages).
@@ -41,8 +41,8 @@ To get an idea of the specific sentences that are supported for your language, y
![Example of a folder of assistant sentence test files](/images/assist/intents-test-files.png)
- The screenshot below shows sentences used to test the command to turn on the lights. Note that *Living room* here is just a placeholder.
It could be any area that you have in your home.
![Example of a set of test sentences](/images/assist/assist-test-file-light-turn-on.png)
@@ -50,11 +50,16 @@ To get an idea of the specific sentences that are supported for your language, y
- On GitHub, in the [tests](https://github.com/home-assistant/intents/tree/main/tests) folder, open the subfolder for your language.
- Open the file of interest.
![Sentences definition for turning on the light](/images/assist/assist-sentence-definition-01.png)
- `()` means alternative elements.
- `[]` means optional elements.
- `<>` means an expansion rule. To view these rules, search for `expansion_rules` in the [_common.yaml](https://github.com/home-assistant/intents/blob/main/sentences/en/_common.yaml) file.
- The syntax is explained in detail in the [template sentence syntax documentation](https://developers.home-assistant.io/docs/voice/intent-recognition/template-sentence-syntax/). A small illustrative example follows below.
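To make the notation concrete, here is a minimal sketch of a sentence definition using that syntax. It is illustrative only and not copied from a real sentence or test file in the intents repository:

```yaml
# Illustrative only: a minimal sentence definition using the syntax above.
language: "en"
intents:
  HassTurnOn:
    data:
      - sentences:
          # [please]        -> optional word
          # (turn|switch)   -> alternatives: matches "turn" or "switch"
          # <name>          -> expansion rule defined under expansion_rules in _common.yaml
          - "[please] (turn|switch) on [the] <name>"
```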
## Related topics
- [Create aliases](/voice_control/aliases/)
- [Create your own sentences](/voice_control/custom_sentences/)
- [Template sentence syntax documentation](https://developers.home-assistant.io/docs/voice/intent-recognition/template-sentence-syntax/)
- [Sentence test cases](https://github.com/home-assistant/intents/tree/main/sentences)

View File

@@ -97,6 +97,7 @@ Things you can try if the execution is very slow:
## Related topics
- [$13 voice assistant for Home Assistant](/voice_control/thirteen-usd-voice-remote/)
- [wake word training environment](https://colab.research.google.com/drive/1q1oe2zOyZp7UsB3jJiQ1IFn8z5YfjwEb?usp=sharing#scrollTo=1cbqBebHXjFD)
- [Samba add-on installed](/common-tasks/os/#configuring-access-to-files)
- [openWakeWord](https://github.com/dscripka/openWakeWord)
- [Wake word training environment](https://colab.research.google.com/drive/1q1oe2zOyZp7UsB3jJiQ1IFn8z5YfjwEb?usp=sharing#scrollTo=1cbqBebHXjFD)
- [Installing the Samba add-on](/common-tasks/os/#configuring-access-to-files)
- [openWakeWord add-on](https://github.com/dscripka/openWakeWord)
- [About wake words](/voice_control/about_wake_word/)

View File

@@ -1,5 +1,5 @@
---
title: "Assist - Custom Sentences"
title: "Assist - custom sentences"
---
You may add your own sentences to the intent recognizer by either extending an [existing intent](https://developers.home-assistant.io/docs/intent_builtin/) or creating a new one. You may also [customize responses](#customizing-responses) for existing intents.
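As a rough sketch of what such a file can look like (the file name, the sentence, and the entity name below are made up for illustration, not taken from this page), a custom sentence that extends the built-in `HassTurnOn` intent could be:

```yaml
# config/custom_sentences/en/garden.yaml  (illustrative file name)
# Adds an extra way to phrase a turn-on command for an existing entity.
language: "en"
intents:
  HassTurnOn:
    data:
      - sentences:
          - "water the plants"
        slots:
          name: "Garden sprinkler"   # assumes an entity exposed to Assist under this name
```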
@@ -122,7 +122,7 @@ intent_script:
{% endraw %}
## Customizing Responses
## Customizing responses
Responses for existing intents can be customized as well in `config/custom_sentences/<language>`:
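For instance, a minimal override could look like the sketch below (the file name and wording are illustrative, and the nesting under `responses` and `intents` is assumed from the structure of these files):

```yaml
# config/custom_sentences/en/responses.yaml  (illustrative file name)
language: "en"
responses:
  intents:
    HassTurnOn:
      default: "Okay, it is on now."
```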
@@ -138,3 +138,12 @@ responses:
```
{% endraw %}
## Related topics
- [View existing intents](https://developers.home-assistant.io/docs/intent_builtin/)
- [Create aliases](/voice_control/aliases/)
- [$13 voice assistant for Home Assistant](/voice_control/thirteen-usd-voice-remote/)
- [Assist for Apple](/voice_control/apple/)
- [Assist for Android](/voice_control/android/)

View File

@@ -14,5 +14,9 @@ If you are using Home Assistant in kiosk mode, for example if you have a tablet
- You can use any assistant you have previously set up.
- If you have assistants in different languages, you can add a button for each of these languages.
6. If you are using Assist with your voice, enable **Start listening**.
- If you don't want to use voice but just want to type, you do not need to enable listening.
7. **Save** your new button card. A YAML sketch of a comparable card is shown below.
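For reference, a comparable button card expressed in YAML might look like this. The card is normally created through the UI as described above, and the icon and the `assist` tap action options shown here are assumptions rather than something taken from this page:

```yaml
# Dashboard button card that opens the Assist dialog when tapped.
type: button
icon: mdi:comment-processing-outline
tap_action:
  action: assist
  pipeline_id: last_used    # use the most recently used assistant
  start_listening: true     # start voice input immediately; omit if you only want to type
```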
## Related topics
- [Assist for Android](/voice_control/android/)

View File

@@ -9,7 +9,13 @@ This is to avoid that sensitive devices, such as locks and garage doors, can ina
1. Go to **Settings** > **Voice assistants**.
2. Open the **Expose** tab.
![Expose entities tab](/images/assist/assistant-expose-01.png)
3. Select **Expose entities**.
1. Select all entities you want to be able to control by voice.
![Expose entities tab](/images/assist/assistant-expose-02.png)
## Related topics
- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)
- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)

View File

@@ -16,7 +16,7 @@
[Grandstream HT801](https://amzn.to/40k7mRa)
- includes a 5&nbsp;V power adapter and an Ethernet cable
- RJ11 phone cable to connect the phone to the Grandstream
- [Home Assistant Cloud](https://www.nabucasa.com) or a manually configured [Assist Pipeline](/integrations/assist_pipeline)
- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/) or a manually configured [local assistant pipeline](/voice_control/voice_remote_local_assistant/)
## Setting up Grandstream
@@ -125,3 +125,10 @@ The phone shown in the video by TheFes is a *Heemaf type 1955*, which was used b
The phone used during creation of this tutorial is a 1953 [*Tischstation Mod.29 HF-TR* by Autophon AG](https://www.radiomuseum.org/r/autophon_tischstation_mod29_hf_tr.html).
![Analog phone Tischstation Mod.29 by Autophon AG](/images/assist/autophon-mod-29.jpg)
## Related topics
- [Grandstream HT801](https://amzn.to/40k7mRa)
- [Home Assistant Cloud](https://www.nabucasa.com)
- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)
- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)