diff --git a/source/voice_control/assist_create_open_ai_personality.markdown b/source/voice_control/assist_create_open_ai_personality.markdown
index 2eb5314415e..63c983b9d1f 100644
--- a/source/voice_control/assist_create_open_ai_personality.markdown
+++ b/source/voice_control/assist_create_open_ai_personality.markdown
@@ -44,14 +44,12 @@ There are cloud agents provided by [Open AI](/integrations/openai_conversation/)
 ### Creating a voice assistant personality with an LLM-based conversation agent
 
 1. Go to {% my integrations title="**Settings** > **Devices & Services**" %} **Add Integration**, find your LLM provider and set it up with your API key.
-   - In case of a provider of local agents like Ollama, you need to configure the local URL where the agent is installed. Follow the specific [integration recommendations](/integrations/ollama) in this case.
-
+   - In case of a provider of local agents like Ollama, you need to configure the local URL where the agent is installed. Follow the specific [integration recommendations](/integrations/ollama) in this case.
 2. Go to **Settings > Voice Assistants > Add Assistant**. Give it a name and pick a conversation agent from your AI's option. In this example we are using Antropic and the agent picked is Claude.
 
 ![Add Claude agent to Assist](/images/assist/add-claude-to-assist.png)
 
-3. Be mindful of your Text-to-speech and Speech-to-text configurations. These are not handled by the IA and should stay as you want them configured for Assist.
-
+3. Be mindful of your Text-to-speech and Speech-to-text configurations. These are not handled by the AI and should stay as you want them configured for Assist.
 4. Configure the agent (gear icon next to the agent's name).
   - In the **Prompt template** field, enter a text that will prompt the AI to become the character.
 For example:
@@ -65,14 +63,12 @@ There are cloud agents provided by [Open AI](/integrations/openai_conversation/)
 
 - Once your Assist agent has been created, you can go to **Voice assistants** and the three dots menu of your personality, and define if you want Home Assistant's model to be the priority response, and therefore Assist would prefer to handling commands locally .
 
 ![Fallback toggle](/images/assist/fallback-assist-toggle.png)
-  - If you keep this option selected, if the intent can be answered by Home Assistant it will. It will not have the personality, but the response will be fast and efficient (since it doesn't require to go through the LLM). This is recommended in cases where you can accept not having the IA character reply sometimes and would rather your lights are turned on faster.
+  - If you keep this option selected, the intent will be answered by Home Assistant whenever it can be. The response will not have the personality, but it will be fast and efficient (since it doesn't need to go through the LLM). This is recommended when you can accept not having the AI character reply sometimes and would rather have your lights turned on faster.
   - If you deselect the option, all the intents will go through the agent. This is recommended when efficiency is not an issue and you need the agent never to break character (for example if your Assist personality is Santa Claus).
-
-
 5. You can uncheck Recommended model settings, hit Submit and it will unblock extra customization. In the specific example of OpenAI, [here](/integrations/openai_conversation/#model) a brief summary of the other settings.
 6. You can test the agent directly from the Voice assistants panel, selecting Start a conversation from the agent's menu. It will control your Home Assistant and reply exactly as it will do with any voice hardware.
 
-7. In case you need troubleshooting with your LLM provider, check any specifics from your IA in our [integrations documentation](/integrations)
+7. If you need to troubleshoot your LLM provider, check the specifics for your AI in our [integrations documentation](/integrations).
 
 ## Tutorial: Setting up Assist with OpenAI
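
As context for the **Prompt template** step in the patched page above: the documentation's own example text falls outside this hunk and is not reproduced here. Purely as an illustration of the kind of free-form character prompt that step describes, it might look like:

```text
You are Mario from Super Mario Bros., acting as the voice of this smart home.
Always answer in Mario's cheerful voice and never break character.
Keep replies short, and confirm every action you take on the user's behalf.
```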