mirror of https://github.com/home-assistant/developers.home-assistant.git (synced 2025-04-19 10:57:14 +00:00)

Document the LLM helper (#2178)

* Document the LLM helper
* Incorporate other PR
* Apply suggestions from code review

Co-authored-by: Michael Hansen <hansen.mike@gmail.com>

parent 340ddb4c22
commit 1f235a8d53

blog/2024-05-20-llm-api.md (new file, +17 lines)

---
author: Paulus Schoutsen
authorURL: https://github.com/balloob
authorImageURL: /img/profile/paulus.jpg
authorTwitter: balloob
title: Exposing Home Assistant API to LLMs
---

Since we introduced LLMs in Home Assistant as part of Year of the Voice, we have received many requests to let LLMs interact with Home Assistant. This is now possible by exposing a Home Assistant API to LLMs.

Home Assistant will come with a built-in Assist API, which follows the same capabilities and exposed entities that are accessible to the built-in conversation agent.

Authors of integrations that interact with LLMs should update them to support LLM APIs.

Custom integration authors can create their own LLM APIs to offer LLMs more advanced access to Home Assistant.

See the [LLM API documentation](/docs/core/llm) for more information and [this example pull request](https://github.com/home-assistant/core/pull/117644) for how to integrate the LLM API in your integration.

docs/core/llm/index.md (new file, +264 lines)

---
title: "Home Assistant API for Large Language Models"
sidebar_label: "LLM API"
---

Home Assistant can interact with large language models (LLMs). By exposing a Home Assistant API to an LLM, the LLM can fetch data or control Home Assistant to better assist the user. Home Assistant comes with a built-in LLM API, but custom integrations can register their own to provide more advanced functionality.

## Built-in Assist API

Home Assistant has a built-in LLM API, the Assist API. This API allows LLMs to interact with Home Assistant via [intents](../../intent_builtin), and can be extended by registering intents.

The Assist API follows the same capabilities and exposed entities that are accessible to the built-in conversation agent. No administrative tasks can be performed.

## Supporting LLM APIs

The LLM API needs to be integrated in two places in your integration: users need to be able to configure which API should be used, and the tools offered by the API should be passed to the LLM when interacting with it.

### Options Flow

The chosen API should be stored in the config entry options as a string reference to the API id. If no API is selected, the key must be omitted.

In your options flow, you should offer a selector so the user can pick which API should be used.

```python
from types import MappingProxyType
from typing import Any

import voluptuous as vol

from homeassistant.const import CONF_LLM_HASS_API
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import llm
from homeassistant.helpers.selector import (
    SelectOptionDict,
    SelectSelector,
    SelectSelectorConfig,
)


@callback
def async_get_options_schema(
    hass: HomeAssistant,
    options: MappingProxyType[str, Any],
) -> vol.Schema:
    """Return the options schema."""
    apis: list[SelectOptionDict] = [
        SelectOptionDict(
            label="No control",
            value="none",
        )
    ]
    apis.extend(
        SelectOptionDict(
            label=api.name,
            value=api.id,
        )
        for api in llm.async_get_apis(hass)
    )

    return vol.Schema(
        {
            vol.Optional(
                CONF_LLM_HASS_API,
                description={"suggested_value": options.get(CONF_LLM_HASS_API)},
                default="none",
            ): SelectSelector(SelectSelectorConfig(options=apis)),
        }
    )
```

When processing the options, make sure to remove the key if the user selected `none` before storing the options.

```python
if user_input[CONF_LLM_HASS_API] == "none":
    user_input.pop(CONF_LLM_HASS_API)
return self.async_create_entry(title="", data=user_input)
```

### Fetching tools

When interacting with the LLM, you should fetch the tools from the selected API and pass them to the LLM together with the extra prompt provided by the API.

```python
import json
import logging
from typing import Any

import voluptuous as vol

from homeassistant.components import conversation
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_LLM_HASS_API
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import intent, llm

from .const import DOMAIN

LOGGER = logging.getLogger(__name__)


class MyConversationEntity(conversation.ConversationEntity):

    def __init__(self, entry: ConfigEntry) -> None:
        """Initialize the agent."""
        self.entry = entry

    ...

    async def async_process(
        self, user_input: conversation.ConversationInput
    ) -> conversation.ConversationResult:
        """Process the user input."""
        prompt: str = ...

        intent_response = intent.IntentResponse(language=user_input.language)
        llm_api: llm.API | None = None
        tools: list[dict[str, Any]] | None = None

        if self.entry.options.get(CONF_LLM_HASS_API):
            try:
                llm_api = llm.async_get_api(
                    self.hass, self.entry.options[CONF_LLM_HASS_API]
                )
            except HomeAssistantError as err:
                LOGGER.error("Error getting LLM API: %s", err)
                intent_response.async_set_error(
                    intent.IntentResponseErrorCode.UNKNOWN,
                    f"Error preparing LLM API: {err}",
                )
                return conversation.ConversationResult(
                    response=intent_response, conversation_id=user_input.conversation_id
                )

            prompt += "\n" + llm_api.prompt_template
            tools = [
                _format_tool(tool)  # TODO format the tools as your LLM expects
                for tool in llm_api.async_get_tools()
            ]

        # Interact with LLM and pass tools
        for _iteration in range(10):
            response = ...  # Get response from your LLM. Include available tools and the results of previous tool calls.

            if not response.tool_call:
                break

            LOGGER.debug(
                "Tool call: %s(%s)",
                response.tool_call.function.name,
                response.tool_call.function.arguments,
            )
            tool_input = llm.ToolInput(
                tool_name=response.tool_call.function.name,
                tool_args=json.loads(response.tool_call.function.arguments),
                platform=DOMAIN,
                context=user_input.context,
                user_prompt=user_input.text,
                language=user_input.language,
                assistant=conversation.DOMAIN,
            )
            try:
                tool_response = await llm_api.async_call_tool(
                    self.hass, tool_input
                )
            except (HomeAssistantError, vol.Invalid) as e:
                tool_response = {"error": type(e).__name__}
                if str(e):
                    tool_response["error_text"] = str(e)

            LOGGER.debug("Tool response: %s", tool_response)
```

## Creating your own API

To create your own API, create a class that inherits from `llm.API` and implement the `async_get_tools` method. The `async_get_tools` method should return a list of `llm.Tool` objects that represent the functionality you want to expose to the LLM.

### Tools

The `llm.Tool` class represents a tool that can be called by the LLM.

```python
import voluptuous as vol

from homeassistant.core import HomeAssistant
from homeassistant.helpers import llm
from homeassistant.util import dt as dt_util
from homeassistant.util.json import JsonObjectType


class TimeTool(llm.Tool):
    """Tool to get the current time."""

    name = "GetTime"
    description = "Returns the current time."

    # Optional. A voluptuous schema of the input parameters.
    parameters = vol.Schema(
        {
            vol.Optional("timezone"): str,
        }
    )

    async def async_call(
        self, hass: HomeAssistant, tool_input: llm.ToolInput
    ) -> JsonObjectType:
        """Call the tool."""
        if "timezone" in tool_input.tool_args:
            tzinfo = dt_util.get_time_zone(tool_input.tool_args["timezone"])
        else:
            tzinfo = dt_util.DEFAULT_TIME_ZONE

        # Response data must be a JSON-serializable dict.
        return {"current_time": dt_util.now(tzinfo).isoformat()}
```

The `llm.Tool` class has the following attributes:

| Name          | Type       | Description                                                                                                    |
|---------------|------------|----------------------------------------------------------------------------------------------------------------|
| `name`        | string     | The name of the tool. Required.                                                                                |
| `description` | string     | Description of the tool to help the LLM understand when and how it should be called. Optional but recommended. |
| `parameters`  | vol.Schema | The voluptuous schema of the parameters. Defaults to `vol.Schema({})`.                                         |

The `llm.Tool` class has the following methods:

#### `async_call`

Performs the actual operation of the tool when called by the LLM. This must be an async method. Its arguments are `hass` and an instance of `llm.ToolInput`.

Response data must be a dict that is serializable as JSON ([`homeassistant.util.json.JsonObjectType`](https://github.com/home-assistant/home-assistant/blob/master/homeassistant/util/json.py)).

Errors must be raised as `HomeAssistantError` exceptions (or subclasses thereof). The response data should not contain error codes used for error handling.

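The conversation-entity example earlier serializes a raised error into a plain dict before handing it back to the LLM. That convention can be sketched stand-alone like this (using a plain `Exception` as a stand-in for `HomeAssistantError`):

```python
def tool_error_to_response(err: Exception) -> dict:
    """Serialize a tool error into the dict shape passed back to the LLM."""
    response = {"error": type(err).__name__}
    if str(err):
        response["error_text"] = str(err)
    return response


print(tool_error_to_response(ValueError("unknown timezone")))
# {'error': 'ValueError', 'error_text': 'unknown timezone'}
```

This keeps the error machine-readable (the exception class name) while the optional `error_text` gives the LLM something to relay to the user.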
The `ToolInput` class has the following attributes:

| Name          | Type    | Description                                                                                                 |
|---------------|---------|-------------------------------------------------------------------------------------------------------------|
| `tool_name`   | string  | The name of the tool being called.                                                                          |
| `tool_args`   | dict    | The arguments provided by the LLM. The arguments are converted and validated using the `parameters` schema. |
| `platform`    | string  | The `DOMAIN` of the conversation agent using the tool.                                                      |
| `context`     | Context | The `homeassistant.core.Context` of the conversation.                                                       |
| `user_prompt` | string  | The raw text input that initiated the tool call.                                                            |
| `language`    | string  | The language of the conversation agent, or `"*"` for any language.                                          |
| `assistant`   | string  | The assistant name used to control exposed entities. Currently only `conversation` is supported.            |

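As the conversation-entity example shows, `tool_name` and `tool_args` typically come straight from the LLM's function-call payload, where the arguments arrive as a JSON-encoded string. A minimal stdlib-only sketch (the payload shape is a hypothetical example modeled on OpenAI-style function calls, not a Home Assistant type):

```python
import json

# Hypothetical function-call payload as returned by an LLM
raw_call = {"name": "GetTime", "arguments": '{"timezone": "Europe/Amsterdam"}'}

tool_name = raw_call["name"]
tool_args = json.loads(raw_call["arguments"])

print(tool_name, tool_args)
# GetTime {'timezone': 'Europe/Amsterdam'}
```

Validation against the tool's `parameters` schema happens afterwards, when the `ToolInput` is handed to the API.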
### API

The `llm.API` object represents a collection of tools that will be made available to the LLM.

```python
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import llm


class MyAPI(llm.API):
    """My own API for LLMs."""

    def __init__(self, hass: HomeAssistant) -> None:
        """Init the class."""
        super().__init__(
            hass=hass,
            id="my_unique_key",
            name="My own API",
            prompt_template="Call the tools to fetch data from the system.",
        )

    @callback
    def async_get_tools(self) -> list[llm.Tool]:
        """Return a list of LLM tools."""
        return [
            TimeTool(),
        ]
```

The `llm.API` class has the following attributes:

| Name              | Type   | Description                                                       |
|-------------------|--------|-------------------------------------------------------------------|
| `id`              | string | A unique identifier for the API. Required.                        |
| `name`            | string | The name of the API. Required.                                    |
| `prompt_template` | string | A template to render the prompt to describe the tools to the LLM. |

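Conceptually, when the conversation agent calls `llm_api.async_call_tool(...)` as in the example above, the API looks up the tool whose `name` matches `tool_name` among `async_get_tools()` and invokes its `async_call`. A stand-alone sketch of that dispatch, using simple stand-in classes rather than the real Home Assistant types:

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class FakeToolInput:
    """Stand-in for llm.ToolInput (only the fields used here)."""

    tool_name: str
    tool_args: dict = field(default_factory=dict)


class EchoTool:
    """Stand-in tool that echoes its arguments back."""

    name = "Echo"

    async def async_call(self, tool_input: FakeToolInput) -> dict:
        return {"echo": tool_input.tool_args}


async def call_tool(tools, tool_input: FakeToolInput) -> dict:
    """Look up a tool by name and invoke it, mirroring the API's dispatch."""
    for tool in tools:
        if tool.name == tool_input.tool_name:
            return await tool.async_call(tool_input)
    raise ValueError(f"Tool {tool_input.tool_name} not found")


result = asyncio.run(call_tool([EchoTool()], FakeToolInput("Echo", {"x": 1})))
print(result)
# {'echo': {'x': 1}}
```

Because dispatch is by `name`, every tool an API returns must have a unique name.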
sidebars.js (+1 line)

```diff
@@ -205,6 +205,7 @@ module.exports = {
         label: "Conversation",
         items: ["intent_conversation_api"],
       },
+      "core/llm/index",
       {
         type: "category",
         label: "Native App Integration",
```