Mirror of https://github.com/home-assistant/home-assistant.io.git
Synced 2025-07-12 20:06:52 +00:00
Add instructions for using ollama experimental home control (#34465)
* Add instructions for using ollama experimental home control

* Update source/_integrations/ollama.markdown

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update source/_integrations/ollama.markdown

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: c0ffeeca7 <38767475+c0ffeeca7@users.noreply.github.com>
parent 27a7f531e7
commit e0989d65e6
@@ -13,6 +13,8 @@ ha_integration_type: service
related:
  - docs: /docs/configuration/templating/
    title: Home Assistant Templating
  - docs: /voice_control/voice_remote_expose_devices/
    title: Exposing entities to Assist
  - docs: /docs/automation/trigger/#sentence-trigger
    title: Sentence trigger
ha_platforms:
@@ -21,7 +23,9 @@ ha_platforms:
The **Ollama** {% term integration %} adds a conversation agent in Home Assistant powered by a local [Ollama](https://ollama.com/) server.

By default, this conversation agent is unable to control your house. The Ollama conversation agent can be used in automations, but not as a [sentence trigger](/docs/automation/trigger/#sentence-trigger). It can only query information that has been provided by Home Assistant. To be able to answer questions about your house, Home Assistant needs to provide Ollama with the details of your house, including areas, devices, and their states.
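For example, an automation can send a question to the agent with the `conversation.process` service and reuse the reply. A minimal sketch, with a placeholder `agent_id` and question:

```yaml
# Ask the Ollama conversation agent a question and store its reply.
- service: conversation.process
  data:
    agent_id: conversation.ollama  # placeholder: select your own Ollama agent
    text: "Which lights are on in the living room?"
  response_variable: ollama_reply
# Show the spoken part of the reply as a persistent notification.
- service: notify.persistent_notification
  data:
    message: "{{ ollama_reply.response.speech.plain.speech }}"
```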
Controlling Home Assistant is an experimental feature that gives the AI access to the Assist API of Home Assistant. You can control which devices and entities it can access from the {% my voice_assistants title="exposed entities page" %}. The AI can provide you with information about your devices and control them.

This integration does not integrate with [sentence triggers](/docs/automation/trigger/#sentence-trigger).
This integration requires an external Ollama server, which is available for macOS, Linux, and Windows. Follow the [download instructions](https://ollama.com/download) to install the server. Once installed, configure Ollama to be [accessible over the network](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network).
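For example, on a Linux host where Ollama runs as a systemd service, exposing it on the network typically comes down to setting the `OLLAMA_HOST` environment variable (a sketch; see the linked FAQ for other platforms):

```bash
# Let Ollama listen on all interfaces instead of localhost only.
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama
```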
@@ -34,9 +38,12 @@ URL:
Model:
  description: Name of the [Ollama model](https://ollama.com/library) to use, such as `mistral` or `llama2:13b`. Models will be automatically downloaded during setup.
Prompt template:
  description: The starting text for the AI language model to generate new text from. This text can include information about your Home Assistant instance, devices, and areas, and is written using [Home Assistant Templating](/docs/configuration/templating/). An example prompt is shown below the configuration options.
Instructions:
  description: Instructions for the AI on how it should respond to your requests. These are written using [Home Assistant Templating](/docs/configuration/templating/).
Control Home Assistant:
  description: Whether the model is allowed to interact with Home Assistant. It can only control or provide information about entities that are [exposed](/voice_control/voice_remote_expose_devices/) to it. This feature is experimental; see [Controlling Home Assistant](#controlling-home-assistant) below for details on model limitations.
Max history messages:
  description: Maximum number of messages to keep for each conversation (0 = no limit). Limiting this value will cause older messages in a conversation to be dropped.
@@ -45,3 +52,18 @@ Keep alive:
  description: Duration in seconds for the Ollama host to keep the model in memory after receiving a message (-1 = no limit, 0 = no retention). Default value is -1.
{% endconfiguration_basic %}
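As an illustration of the templating mentioned above, a prompt template can mix fixed instructions with template output. A hypothetical sketch (not the integration's default prompt):

```text
You are a voice assistant for Home Assistant.
The current time is {{ now().strftime("%H:%M") }} and today is {{ now().strftime("%A") }}.
Answer questions about the smart home truthfully and keep responses short.
```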
## Controlling Home Assistant
If you want to experiment with local LLMs using Home Assistant, we currently recommend using the `llama3.1:8b` model and exposing fewer than 25 entities. Note that smaller models are more likely to make mistakes than larger models.
Only models that support [Tools](https://ollama.com/search?c=tools) may control Home Assistant.
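If you want the model available before you set up the integration (it is otherwise downloaded automatically during setup), you can pull a tools-capable model on the Ollama host, for example:

```bash
# Pull the currently recommended tools-capable model on the Ollama server.
ollama pull llama3.1:8b
```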
Smaller models may not [reliably maintain a conversation](https://llama.meta.com/docs/model-cards-and-prompt-formats/llama3_1/#llama-3.1-instruct) when control of Home Assistant is enabled. However, you can use multiple Ollama configurations that share the same model but use different prompts:

- Add the Ollama integration without enabling control of Home Assistant. You can use this conversation agent to have a conversation.
- Add an additional Ollama integration, using the same model, but with control of Home Assistant enabled. You can use this conversation agent to control Home Assistant.