diff --git a/source/_includes/asides/voice_navigation.html b/source/_includes/asides/voice_navigation.html
index 83e10316064..94db543939d 100644
--- a/source/_includes/asides/voice_navigation.html
+++ b/source/_includes/asides/voice_navigation.html
@@ -6,11 +6,12 @@
-
Voice assistants
+ {% active_link /voice_control/ Voice assistants %}
diff --git a/source/voice_control/aliases.markdown b/source/voice_control/aliases.markdown
index 0f71a23da63..bae43a5f4c0 100644
--- a/source/voice_control/aliases.markdown
+++ b/source/voice_control/aliases.markdown
@@ -1,5 +1,5 @@
---
-title: "Assist - Entity Aliases"
+title: "Assist - entity aliases"
---
Assist will use the names of your entities, as well as any aliases you've configured.
diff --git a/source/voice_control/android.markdown b/source/voice_control/android.markdown
index f7fbe58bfda..5bf8e675397 100644
--- a/source/voice_control/android.markdown
+++ b/source/voice_control/android.markdown
@@ -133,4 +133,10 @@ Depending on your watch, you can assign Assist to a button so that you can start
- Select **Always**.

-3. Now, use your key and speak a command.
\ No newline at end of file
+3. Now, use your key and speak a command.
+
+## Related topics
+
+- [Home Assistant Companion App](https://companion.home-assistant.io/docs/getting_started/)
+- [Exposing devices to Assist](/voice_control/voice_remote_expose_devices/)
+- [Starting Assist from your dashboard](/voice_control/start_assist_from_dashboard/)
diff --git a/source/voice_control/assist_create_open_ai_personality.markdown b/source/voice_control/assist_create_open_ai_personality.markdown
index 3489a04097b..685e06f084b 100644
--- a/source/voice_control/assist_create_open_ai_personality.markdown
+++ b/source/voice_control/assist_create_open_ai_personality.markdown
@@ -36,3 +36,9 @@ Using OpenAI requires an OpenAI account. For this tutorial, the free trial optio
- Leave the other settings unchanged and select **Create**.
4. You can repeat this with other OpenAI personalities. You can add as many OpenAI Conversation integrations as you would like.
- To add a new personality, you need to create a new API key. Then, add a new OpenAI Conversation integration with that API key.
+
+## Related topics
+
+- [Home Assistant Cloud](https://www.nabucasa.com)
+- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)
+- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)
diff --git a/source/voice_control/builtin_sentences.markdown b/source/voice_control/builtin_sentences.markdown
index 2f6e2dcf6e7..abc9be20fb8 100644
--- a/source/voice_control/builtin_sentences.markdown
+++ b/source/voice_control/builtin_sentences.markdown
@@ -1,5 +1,5 @@
---
-title: "Assist - Default Sentences"
+title: "Assist - default sentences"
---
Home Assistant comes with built-in sentences [contributed by the community](https://github.com/home-assistant/intents/) for [dozens of languages](https://developers.home-assistant.io/docs/voice/intent-recognition/supported-languages).
@@ -41,8 +41,8 @@ To get an idea of the specific sentences that are supported for your language, y

- - The screenshot below shows sentences used to test the command to turn on the lights. Note that *Living room* here is just a place holder.
- It could be any area that you have in your home.
+ - The screenshot below shows sentences used to test the command to turn on the lights. Note that *Living room* here is just a placeholder.
+ It could be any area in your home.

@@ -50,11 +50,16 @@ To get an idea of the specific sentences that are supported for your language, y
- On GitHub, in the [tests](https://github.com/home-assistant/intents/tree/main/tests) folder, open the subfolder for your language.
- Open the file of interest.
- 
+ 
- () mean alternative elements.
- [] mean optional elements.
- <> mean an expansion rule. To view these rules, search for `expansion_rules` in the [_common.yaml](https://github.com/home-assistant/intents/blob/main/sentences/en/_common.yaml) file.
- The syntax is explained in detail in the [template sentence syntax documentation](https://developers.home-assistant.io/docs/voice/intent-recognition/template-sentence-syntax/).
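+
+As a hypothetical illustration combining these elements, a template such as:
+
+```
+(turn|switch) on [the] light[s] [in <area>]
+```
+
+would match "turn on the lights", "switch on the light in the living room", and similar phrasings. `<area>` here stands in for an expansion rule of that name; check `_common.yaml` for the rules actually defined for your language.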
+
+## Related topics
+
+- [Create aliases](/voice_control/aliases/)
+- [Create your own sentences](/voice_control/custom_sentences/)
+- [Template sentence syntax documentation](https://developers.home-assistant.io/docs/voice/intent-recognition/template-sentence-syntax/)
+- [Sentence test cases](https://github.com/home-assistant/intents/tree/main/sentences)
diff --git a/source/voice_control/create_wake_word.markdown b/source/voice_control/create_wake_word.markdown
index 32a94014657..4e4dce7e507 100644
--- a/source/voice_control/create_wake_word.markdown
+++ b/source/voice_control/create_wake_word.markdown
@@ -97,6 +97,7 @@ Things you can try if the execution is very slow:
## Related topics
- [$13 voice assistant for Home Assistant](/voice_control/thirteen-usd-voice-remote/)
-- [wake word training environment](https://colab.research.google.com/drive/1q1oe2zOyZp7UsB3jJiQ1IFn8z5YfjwEb?usp=sharing#scrollTo=1cbqBebHXjFD)
-- [Samba add-on installed](/common-tasks/os/#configuring-access-to-files)
-- [openWakeWord](https://github.com/dscripka/openWakeWord)
+- [Wake word training environment](https://colab.research.google.com/drive/1q1oe2zOyZp7UsB3jJiQ1IFn8z5YfjwEb?usp=sharing#scrollTo=1cbqBebHXjFD)
+- [Installing the Samba add-on](/common-tasks/os/#configuring-access-to-files)
+- [openWakeWord project](https://github.com/dscripka/openWakeWord)
+- [About wake words](/voice_control/about_wake_word/)
diff --git a/source/voice_control/custom_sentences.markdown b/source/voice_control/custom_sentences.markdown
index 15c25d0b904..967f0709cce 100644
--- a/source/voice_control/custom_sentences.markdown
+++ b/source/voice_control/custom_sentences.markdown
@@ -1,5 +1,5 @@
---
-title: "Assist - Custom Sentences"
+title: "Assist - custom sentences"
---
You may add your own sentences to the intent recognizer by either extending an [existing intent](https://developers.home-assistant.io/docs/intent_builtin/) or creating a new one. You may also [customize responses](#customizing-responses) for existing intents.
@@ -122,7 +122,7 @@ intent_script:
{% endraw %}
-## Customizing Responses
+## Customizing responses
Responses for existing intents can be customized as well in `config/custom_sentences/`:
@@ -138,3 +138,12 @@ responses:
```
{% endraw %}
+
+## Related topics
+
+- [View existing intents](https://developers.home-assistant.io/docs/intent_builtin/)
+- [Create aliases](/voice_control/aliases/)
+- [$13 voice assistant for Home Assistant](/voice_control/thirteen-usd-voice-remote/)
+- [Assist for Apple](/voice_control/apple/)
+- [Assist for Android](/voice_control/android/)
\ No newline at end of file
diff --git a/source/voice_control/start_assist_from_dashboard.markdown b/source/voice_control/start_assist_from_dashboard.markdown
index b7cb836d765..f31b3b6f3c5 100644
--- a/source/voice_control/start_assist_from_dashboard.markdown
+++ b/source/voice_control/start_assist_from_dashboard.markdown
@@ -14,5 +14,9 @@ If you are using Home Assistant in kiosk mode, for example if you have a tablet
- You can use any assistant you have previously set up.
- If you have assistants in different languages, you can add a button for each of these languages.
6. If you are using Assist with your voice, enable **Start listening**.
- - If you don't want to use voice but just want to type, you do not need to enable listening.
+ - If you don't want to use voice but just want to type, you do not need to enable listening.
7. **Save** your new button card.
+
+## Related topics
+
+- [Assist for Android](/voice_control/android/)
diff --git a/source/voice_control/voice_remote_expose_devices.markdown b/source/voice_control/voice_remote_expose_devices.markdown
index d16a5bd7e17..41c7db07351 100644
--- a/source/voice_control/voice_remote_expose_devices.markdown
+++ b/source/voice_control/voice_remote_expose_devices.markdown
@@ -9,7 +9,13 @@ This is to avoid that sensitive devices, such as locks and garage doors, can ina
1. Go to **Settings** > **Voice assistants**.
2. Open the **Expose** tab.
- 
+ 
3. Select **Expose entities**.
1. Select all entities you want to be able to control by voice.
- 
\ No newline at end of file
+ 
+
+## Related topics
+
+- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)
+- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)
\ No newline at end of file
diff --git a/source/voice_control/worlds-most-private-voice-assistant.markdown b/source/voice_control/worlds-most-private-voice-assistant.markdown
index 748a1181f0d..351fae4be23 100644
--- a/source/voice_control/worlds-most-private-voice-assistant.markdown
+++ b/source/voice_control/worlds-most-private-voice-assistant.markdown
@@ -16,7 +16,7 @@ your smart home and issue commands and get responses.
[Grandstream HT801](https://amzn.to/40k7mRa)
- includes a 5 V power adapter and an Ethernet cable
- RJ11 phone cable to connect the phone to the Grandstream
-- [Home Assistant Cloud](https://www.nabucasa.com) or a manually configured [Assist Pipeline](/integrations/assist_pipeline)
+- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/) or a manually configured [local assistant pipeline](/voice_control/voice_remote_local_assistant/)
## Setting up Grandstream
@@ -125,3 +125,10 @@ The phone shown in the video by TheFes is a *Heemaf type 1955*, which was used b
The phone used during creation of this tutorial is a 1953 [*Tischstation Mod.29 HF-TR* by Autophon AG](https://www.radiomuseum.org/r/autophon_tischstation_mod29_hf_tr.html).

+
+## Related topics
+
+- [Grandstream HT801](https://amzn.to/40k7mRa)
+- [Home Assistant Cloud](https://www.nabucasa.com)
+- [Cloud assistant pipeline](/voice_control/voice_remote_cloud_assistant/)
+- [Local assistant pipeline](/voice_control/voice_remote_local_assistant/)
\ No newline at end of file