Compare commits


65 Commits

Author SHA1 Message Date
Paul Bottein
2725b95a10 Update frontend to 20251001.3 2025-10-11 14:32:34 +02:00
tronikos
73383e6c26 Add reconfigure flow in Google Assistant SDK (#153802) 2025-10-10 16:24:46 +02:00
Matthias Alphart
217894ee8b Update knx-frontend to 2025.10.9.185845 (#154103) 2025-10-10 16:24:14 +02:00
Thomas D
c7321a337e Add device_tracker platform to Volvo integration (#153437) 2025-10-10 16:23:07 +02:00
Denis Shulyaka
517124dfbe Anthropic web search support (#153753) 2025-10-10 16:21:21 +02:00
hanwg
f49299b009 Add edit message media feature for Telegram bot (#151034) 2025-10-10 15:50:54 +02:00
Shay Levy
1001da08f6 Fix Shelly RPC cover update when the device is not initialized (#154159) 2025-10-10 16:50:45 +03:00
Lars
0da019404c Remove deprecated extra attributes from fritzbox climate (#154152) 2025-10-10 15:48:22 +02:00
Jan Čermák
9a4280d0de Add attachments support to OpenRouter AI task (#154161) 2025-10-10 15:44:33 +02:00
Lukas
c28e105df5 Pooldose update api (#153497)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-10-10 15:43:47 +02:00
Matthias Alphart
68787248f6 Update xknx to 3.9.1 (#154146) 2025-10-10 15:42:38 +02:00
Tom Matheussen
36be6b6187 Add configured number to Satel Integra subentry titles (#154155) 2025-10-10 15:27:23 +02:00
Jordan Harvey
42dea92c51 Add time platform to nintendo_parental integration (#153866)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-10-10 15:20:59 +02:00
Justus
4b828d4753 IOmeter bump version v0.2.0 (#154150) 2025-10-10 15:15:58 +02:00
Robert Resch
8e79c38f34 Bump deebot-client to 15.1.0 (#154154) 2025-10-10 16:07:45 +03:00
jvmahon
c92107b8d4 Inherit MatterEntityDescription in Matter entities (#154083) 2025-10-10 15:04:01 +02:00
epenet
b25622f40e Use SI constants in CO unit converter (#153187) 2025-10-10 14:59:20 +02:00
Petro31
e887d5e6ad Fix delay_on and auto_off with multiple triggers (#153839) 2025-10-10 14:21:11 +02:00
TheJulianJES
1f19e40cfe Adjust OTBR config entry name for ZBT-2 (#153940) 2025-10-10 14:19:08 +02:00
Bram Kragten
3d2d2271d3 Update frontend to 20251001.2 (#154143) 2025-10-10 14:08:17 +02:00
Jack Thomasson
d1dd5eecd6 use a consistent python version for uv (#154022) 2025-10-10 13:59:45 +02:00
Jan Bouwhuis
cdec29ffb7 Add MQTT select subentry support (#153637) 2025-10-10 13:46:21 +02:00
peteS-UK
07f3e00f18 Fix for multiple Lyrion Music Server on a single Home Assistant server for Squeezebox (#154081) 2025-10-10 13:36:46 +02:00
starkillerOG
084d029168 Add Reolink survaillance rule switch entities (#154132)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-10 13:35:07 +02:00
tronikos
17e997ee18 Add module-level statistics to SolarEdge (#152581) 2025-10-10 13:07:39 +02:00
J. Diego Rodríguez Royo
16d4c6c95a Add Spotless series features to Home Connect integration (#153016) 2025-10-10 13:00:17 +02:00
epenet
0205a636ef Filter out invalid Renault vehicles (#154070)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-10 12:52:21 +02:00
hanwg
4707fd2f94 Update quality scale for Telegram bot (#154122) 2025-10-10 12:47:55 +02:00
J. Nick Koston
ad3cadab83 Bump propcache to 0.4.1 (#154033) 2025-10-10 11:45:58 +01:00
Jordan Harvey
3fce815415 Add reauthentication to Nintendo Switch Parental controls integration (#154077) 2025-10-10 12:31:46 +02:00
Erik Montnemery
ee67619cb1 Add mg/m³ as a valid UOM for sensor/number Carbon Monoxide device class (#154074)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
2025-10-10 11:31:31 +01:00
TheJulianJES
1a744a2c91 Fix HA hardware configuration message for Thread without HAOS (#153933) 2025-10-10 11:37:43 +02:00
Erik Montnemery
951978e483 Include unit class in units_changed statistics issue (#154069) 2025-10-10 10:50:40 +03:00
Marcus Gustavsson
54d30377d3 Add ConfigFlow to Prowl integration (#133771)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
2025-10-10 09:41:45 +02:00
Shay Levy
eb04dda197 Use Entity Description in Shelly BLU TRV button (#154118) 2025-10-10 10:24:08 +03:00
Fabien Kleinbourg
1e192aadfa sharkiq dependency bump to 1.4.2 (#153931) 2025-10-10 08:40:20 +02:00
Erwin Douna
6f680f3d03 Portainer fix offline endpoint (#154101) 2025-10-10 08:14:49 +02:00
starkillerOG
f0663dc275 Bump reolink-aio to 0.16.2 (#154117) 2025-10-10 02:20:11 +03:00
Paulus Schoutsen
96bb67bef9 Z-Wave: ESPHome discovery to update all options (#154113) 2025-10-09 17:14:53 -04:00
G Johansson
929d76e236 Add validation for ObjectSelector (#153081) 2025-10-09 21:46:03 +02:00
Artur Pragacz
fe1ff083de Improve comments in the core config (#154096) 2025-10-09 21:08:16 +02:00
puddly
90c68f8ad0 Prevent reloading the ZHA integration while adapter firmware is being updated (#152626) 2025-10-09 21:00:02 +02:00
Shay Levy
6b79aa7738 Use Entity Description in Shelly cover platform (#154085) 2025-10-09 21:04:06 +03:00
Joost Lekkerkerker
f6fb4c8d5a Add unique id to nederlandse spoorwegen (#154013) 2025-10-09 19:00:47 +02:00
hanwg
a6e575ecfa Add diagnostics for Telegram bot (#154016) 2025-10-09 18:20:00 +02:00
Thomas D
85392ae167 Bump dependency for Volvo integration (#154084) 2025-10-09 18:15:22 +02:00
G Johansson
9d124be491 Remove deprecated set state directly in alarmcontrolpanel (#154038) 2025-10-09 18:06:13 +02:00
G Johansson
8bca3931ab Remove deprecated cover state constants (#154037) 2025-10-09 18:05:49 +02:00
Kevin McCormack
0367a01287 Enable strict typing for GitHub integration (#154048) 2025-10-09 17:50:24 +02:00
Manu
86e2c2f361 Add jet lag prevention event support to Sleep as Android integration (#154075) 2025-10-09 17:48:44 +02:00
Daniel De Sousa
335c8e50a2 Add switchbot_cloud climate TURN_OFF, TURN_ON support. (#154017) 2025-10-09 17:47:26 +02:00
eskerda
8152a9e5da Update Citybikes component with third-party library and fields (#151009) 2025-10-09 17:27:31 +02:00
Felipe Santos
250e562caf Fix devcontainer mistakenly using Python 3.14 (#154046) 2025-10-09 17:25:59 +02:00
Maciej Bieniek
a3b641e53d Bump brother to version 5.1.1 (#154080) 2025-10-09 17:00:41 +02:00
Shay Levy
135ea4c02e Fix Shelly orphaned entity removal logic (#154031) 2025-10-09 16:21:58 +03:00
Simone Chemelli
bc980c1212 Bump aioamazondevices to 6.4.0 (#154071) 2025-10-09 15:20:25 +02:00
Shay Levy
59ca88a7e8 Update Shelly block valve platform to use entity description (#154068) 2025-10-09 12:05:19 +03:00
Erik Montnemery
d45114cd11 Improve unit handling in recorder (#153941) 2025-10-09 10:29:42 +02:00
David Rapan
2eba650064 Mark Shelly docs-troubleshooting as done (#154066) 2025-10-09 11:22:32 +03:00
Christopher Fenner
de4adb8855 Make sensor names translatable in OpenWeatherMap integration (#153872) 2025-10-09 09:47:25 +02:00
Joost Lekkerkerker
1d86c03b02 Migrate Nederlandse Spoorwegen sensor to timestamp (#154011) 2025-10-09 09:25:11 +02:00
Klaas Schoute
77fb1036cc Bump autarco to v3.2.0 (#154039) 2025-10-09 01:13:57 +03:00
Shay Levy
b15b4e4888 Fix Shelly virtual components roles migration (#153987) 2025-10-09 00:06:32 +03:00
puddly
dddf6d5f1a Add new ZBT-2 VID:PID pair for discovery (#154036) 2025-10-08 15:59:49 -05:00
Abílio Costa
66fb5f4d95 Simplify firing of trigger actions (#152772)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-10-08 21:40:20 +01:00
258 changed files with 9861 additions and 1929 deletions


@@ -221,6 +221,7 @@ homeassistant.components.generic_thermostat.*
 homeassistant.components.geo_location.*
 homeassistant.components.geocaching.*
 homeassistant.components.gios.*
+homeassistant.components.github.*
 homeassistant.components.glances.*
 homeassistant.components.go2rtc.*
 homeassistant.components.goalzero.*

CODEOWNERS generated

@@ -762,8 +762,8 @@ build.json @home-assistant/supervisor
 /homeassistant/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
 /tests/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
 /homeassistant/components/intesishome/ @jnimmo
-/homeassistant/components/iometer/ @MaestroOnICe
-/tests/components/iometer/ @MaestroOnICe
+/homeassistant/components/iometer/ @jukrebs
+/tests/components/iometer/ @jukrebs
 /homeassistant/components/ios/ @robbiet480
 /tests/components/ios/ @robbiet480
 /homeassistant/components/iotawatt/ @gtdiehl @jyavenard
@@ -1479,8 +1479,8 @@ build.json @home-assistant/supervisor
 /tests/components/snoo/ @Lash-L
 /homeassistant/components/snooz/ @AustinBrunkhorst
 /tests/components/snooz/ @AustinBrunkhorst
-/homeassistant/components/solaredge/ @frenck @bdraco
-/tests/components/solaredge/ @frenck @bdraco
+/homeassistant/components/solaredge/ @frenck @bdraco @tronikos
+/tests/components/solaredge/ @frenck @bdraco @tronikos
 /homeassistant/components/solaredge_local/ @drobtravels @scheric
 /homeassistant/components/solarlog/ @Ernst79 @dontinelli
 /tests/components/solarlog/ @Ernst79 @dontinelli


@@ -34,9 +34,11 @@ WORKDIR /usr/src
 COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
-RUN uv python install 3.13.2
 USER vscode
+ENV UV_PYTHON=3.13.2
+RUN uv python install
 ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv"
 RUN uv venv $VIRTUAL_ENV
 ENV PATH="$VIRTUAL_ENV/bin:$PATH"


@@ -2,10 +2,9 @@
 from __future__ import annotations

-import asyncio
 from datetime import timedelta
 import logging
-from typing import TYPE_CHECKING, Any, Final, final
+from typing import Any, Final, final

 from propcache.api import cached_property
 import voluptuous as vol
@@ -28,8 +27,6 @@ from homeassistant.helpers import config_validation as cv
 from homeassistant.helpers.config_validation import make_entity_service_schema
 from homeassistant.helpers.entity import Entity, EntityDescription
 from homeassistant.helpers.entity_component import EntityComponent
-from homeassistant.helpers.entity_platform import EntityPlatform
-from homeassistant.helpers.frame import ReportBehavior, report_usage
 from homeassistant.helpers.typing import ConfigType
 from homeassistant.util.hass_dict import HassKey
@@ -149,68 +146,11 @@ class AlarmControlPanelEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_A
     )

     _alarm_control_panel_option_default_code: str | None = None
-    __alarm_legacy_state: bool = False
-
-    def __init_subclass__(cls, **kwargs: Any) -> None:
-        """Post initialisation processing."""
-        super().__init_subclass__(**kwargs)
-        if any(method in cls.__dict__ for method in ("_attr_state", "state")):
-            # Integrations should use the 'alarm_state' property instead of
-            # setting the state directly.
-            cls.__alarm_legacy_state = True
-
-    def __setattr__(self, name: str, value: Any, /) -> None:
-        """Set attribute.
-
-        Deprecation warning if setting '_attr_state' directly
-        unless already reported.
-        """
-        if name == "_attr_state":
-            self._report_deprecated_alarm_state_handling()
-        return super().__setattr__(name, value)
-
-    @callback
-    def add_to_platform_start(
-        self,
-        hass: HomeAssistant,
-        platform: EntityPlatform,
-        parallel_updates: asyncio.Semaphore | None,
-    ) -> None:
-        """Start adding an entity to a platform."""
-        super().add_to_platform_start(hass, platform, parallel_updates)
-        if self.__alarm_legacy_state:
-            self._report_deprecated_alarm_state_handling()
-
-    @callback
-    def _report_deprecated_alarm_state_handling(self) -> None:
-        """Report on deprecated handling of alarm state.
-
-        Integrations should implement alarm_state instead of using state directly.
-        """
-        report_usage(
-            "is setting state directly."
-            f" Entity {self.entity_id} ({type(self)}) should implement the 'alarm_state'"
-            " property and return its state using the AlarmControlPanelState enum",
-            core_integration_behavior=ReportBehavior.ERROR,
-            custom_integration_behavior=ReportBehavior.LOG,
-            breaks_in_ha_version="2025.11",
-            integration_domain=self.platform.platform_name if self.platform else None,
-            exclude_integrations={DOMAIN},
-        )

     @final
     @property
     def state(self) -> str | None:
         """Return the current state."""
-        if (alarm_state := self.alarm_state) is not None:
-            return alarm_state
-        if self._attr_state is not None:
-            # Backwards compatibility for integrations that set state directly
-            # Should be removed in 2025.11
-            if TYPE_CHECKING:
-                assert isinstance(self._attr_state, str)
-            return self._attr_state
-        return None
+        return self.alarm_state

     @cached_property
     def alarm_state(self) -> AlarmControlPanelState | None:


@@ -1472,10 +1472,10 @@ class AlexaModeController(AlexaCapability):
             # Return state instead of position when using ModeController.
             mode = self.entity.state
             if mode in (
-                cover.STATE_OPEN,
-                cover.STATE_OPENING,
-                cover.STATE_CLOSED,
-                cover.STATE_CLOSING,
+                cover.CoverState.OPEN,
+                cover.CoverState.OPENING,
+                cover.CoverState.CLOSED,
+                cover.CoverState.CLOSING,
                 STATE_UNKNOWN,
             ):
                 return f"{cover.ATTR_POSITION}.{mode}"
@@ -1594,11 +1594,11 @@ class AlexaModeController(AlexaCapability):
                 ["Position", AlexaGlobalCatalog.SETTING_OPENING], False
             )
             self._resource.add_mode(
-                f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}",
+                f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}",
                 [AlexaGlobalCatalog.VALUE_OPEN],
             )
             self._resource.add_mode(
-                f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}",
+                f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}",
                 [AlexaGlobalCatalog.VALUE_CLOSE],
             )
             self._resource.add_mode(
@@ -1651,22 +1651,22 @@ class AlexaModeController(AlexaCapability):
                 raise_labels.append(AlexaSemantics.ACTION_OPEN)
             self._semantics.add_states_to_value(
                 [AlexaSemantics.STATES_CLOSED],
-                f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}",
+                f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}",
             )
             self._semantics.add_states_to_value(
                 [AlexaSemantics.STATES_OPEN],
-                f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}",
+                f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}",
             )
             self._semantics.add_action_to_directive(
                 lower_labels,
                 "SetMode",
-                {"mode": f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}"},
+                {"mode": f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}"},
             )
             self._semantics.add_action_to_directive(
                 raise_labels,
                 "SetMode",
-                {"mode": f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}"},
+                {"mode": f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}"},
             )
         return self._semantics.serialize_semantics()


@@ -1261,9 +1261,9 @@ async def async_api_set_mode(
     elif instance == f"{cover.DOMAIN}.{cover.ATTR_POSITION}":
         position = mode.split(".")[1]

-        if position == cover.STATE_CLOSED:
+        if position == cover.CoverState.CLOSED:
             service = cover.SERVICE_CLOSE_COVER
-        elif position == cover.STATE_OPEN:
+        elif position == cover.CoverState.OPEN:
             service = cover.SERVICE_OPEN_COVER
         elif position == "custom":
             service = cover.SERVICE_STOP_COVER


@@ -8,5 +8,5 @@
   "iot_class": "cloud_polling",
   "loggers": ["aioamazondevices"],
   "quality_scale": "platinum",
-  "requirements": ["aioamazondevices==6.2.9"]
+  "requirements": ["aioamazondevices==6.4.0"]
 }


@@ -4,12 +4,15 @@ from __future__ import annotations
from collections.abc import Mapping
from functools import partial
import json
import logging
from typing import Any, cast
import anthropic
import voluptuous as vol
from voluptuous_openapi import convert
from homeassistant.components.zone import ENTITY_ID_HOME
from homeassistant.config_entries import (
ConfigEntry,
ConfigEntryState,
@@ -18,7 +21,13 @@ from homeassistant.config_entries import (
ConfigSubentryFlow,
SubentryFlowResult,
)
from homeassistant.const import CONF_API_KEY, CONF_LLM_HASS_API, CONF_NAME
from homeassistant.const import (
ATTR_LATITUDE,
ATTR_LONGITUDE,
CONF_API_KEY,
CONF_LLM_HASS_API,
CONF_NAME,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import llm
from homeassistant.helpers.selector import (
@@ -37,12 +46,23 @@ from .const import (
CONF_RECOMMENDED,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_WEB_SEARCH,
CONF_WEB_SEARCH_CITY,
CONF_WEB_SEARCH_COUNTRY,
CONF_WEB_SEARCH_MAX_USES,
CONF_WEB_SEARCH_REGION,
CONF_WEB_SEARCH_TIMEZONE,
CONF_WEB_SEARCH_USER_LOCATION,
DEFAULT_CONVERSATION_NAME,
DOMAIN,
RECOMMENDED_CHAT_MODEL,
RECOMMENDED_MAX_TOKENS,
RECOMMENDED_TEMPERATURE,
RECOMMENDED_THINKING_BUDGET,
RECOMMENDED_WEB_SEARCH,
RECOMMENDED_WEB_SEARCH_MAX_USES,
RECOMMENDED_WEB_SEARCH_USER_LOCATION,
WEB_SEARCH_UNSUPPORTED_MODELS,
)
_LOGGER = logging.getLogger(__name__)
@@ -168,6 +188,14 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET
) >= user_input.get(CONF_MAX_TOKENS, RECOMMENDED_MAX_TOKENS):
errors[CONF_THINKING_BUDGET] = "thinking_budget_too_large"
if user_input.get(CONF_WEB_SEARCH, RECOMMENDED_WEB_SEARCH):
model = user_input.get(CONF_CHAT_MODEL, RECOMMENDED_CHAT_MODEL)
if model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS)):
errors[CONF_WEB_SEARCH] = "web_search_unsupported_model"
elif user_input.get(
CONF_WEB_SEARCH_USER_LOCATION, RECOMMENDED_WEB_SEARCH_USER_LOCATION
):
user_input.update(await self._get_location_data())
if not errors:
if self._is_new:
@@ -215,6 +243,68 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
errors=errors or None,
)
async def _get_location_data(self) -> dict[str, str]:
"""Get approximate location data of the user."""
location_data: dict[str, str] = {}
zone_home = self.hass.states.get(ENTITY_ID_HOME)
if zone_home is not None:
client = await self.hass.async_add_executor_job(
partial(
anthropic.AsyncAnthropic,
api_key=self._get_entry().data[CONF_API_KEY],
)
)
location_schema = vol.Schema(
{
vol.Optional(
CONF_WEB_SEARCH_CITY,
description="Free text input for the city, e.g. `San Francisco`",
): str,
vol.Optional(
CONF_WEB_SEARCH_REGION,
description="Free text input for the region, e.g. `California`",
): str,
}
)
response = await client.messages.create(
model=RECOMMENDED_CHAT_MODEL,
messages=[
{
"role": "user",
"content": "Where are the following coordinates located: "
f"({zone_home.attributes[ATTR_LATITUDE]},"
f" {zone_home.attributes[ATTR_LONGITUDE]})? Please respond "
"only with a JSON object using the following schema:\n"
f"{convert(location_schema)}",
},
{
"role": "assistant",
"content": "{", # hints the model to skip any preamble
},
],
max_tokens=RECOMMENDED_MAX_TOKENS,
)
_LOGGER.debug("Model response: %s", response.content)
location_data = location_schema(
json.loads(
"{"
+ "".join(
block.text
for block in response.content
if isinstance(block, anthropic.types.TextBlock)
)
)
or {}
)
if self.hass.config.country:
location_data[CONF_WEB_SEARCH_COUNTRY] = self.hass.config.country
location_data[CONF_WEB_SEARCH_TIMEZONE] = self.hass.config.time_zone
_LOGGER.debug("Location data: %s", location_data)
return location_data
async_step_user = async_step_set_options
async_step_reconfigure = async_step_set_options
@@ -273,6 +363,18 @@ def anthropic_config_option_schema(
CONF_THINKING_BUDGET,
default=RECOMMENDED_THINKING_BUDGET,
): int,
vol.Optional(
CONF_WEB_SEARCH,
default=RECOMMENDED_WEB_SEARCH,
): bool,
vol.Optional(
CONF_WEB_SEARCH_MAX_USES,
default=RECOMMENDED_WEB_SEARCH_MAX_USES,
): int,
vol.Optional(
CONF_WEB_SEARCH_USER_LOCATION,
default=RECOMMENDED_WEB_SEARCH_USER_LOCATION,
): bool,
}
)
return schema
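The `_get_location_data` helper above ends the request with an assistant turn containing only `"{"`, which nudges the model to continue a JSON object rather than write a preamble; the reply text therefore lacks the opening brace, and the flow parses it with `json.loads("{" + ...)`. A sketch of just the parsing step, where `model_reply` is a hypothetical continuation rather than a real API response:

```python
import json

# Hypothetical model continuation; in the real flow this text comes from
# the TextBlocks of an Anthropic messages.create() response.
model_reply = '"city": "San Francisco", "region": "California"}'

# The assistant prefill already supplied "{", so it is re-added before
# parsing, mirroring json.loads("{" + "".join(...)) in the config flow.
location = json.loads("{" + model_reply)
assert location == {"city": "San Francisco", "region": "California"}
```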


@@ -18,9 +18,26 @@ RECOMMENDED_TEMPERATURE = 1.0
 CONF_THINKING_BUDGET = "thinking_budget"
 RECOMMENDED_THINKING_BUDGET = 0
 MIN_THINKING_BUDGET = 1024

+CONF_WEB_SEARCH = "web_search"
+RECOMMENDED_WEB_SEARCH = False
+CONF_WEB_SEARCH_USER_LOCATION = "user_location"
+RECOMMENDED_WEB_SEARCH_USER_LOCATION = False
+CONF_WEB_SEARCH_MAX_USES = "web_search_max_uses"
+RECOMMENDED_WEB_SEARCH_MAX_USES = 5
+CONF_WEB_SEARCH_CITY = "city"
+CONF_WEB_SEARCH_REGION = "region"
+CONF_WEB_SEARCH_COUNTRY = "country"
+CONF_WEB_SEARCH_TIMEZONE = "timezone"
+
 NON_THINKING_MODELS = [
     "claude-3-5",  # Both sonnet and haiku
     "claude-3-opus",
     "claude-3-haiku",
 ]
+
+WEB_SEARCH_UNSUPPORTED_MODELS = [
+    "claude-3-haiku",
+    "claude-3-opus",
+    "claude-3-5-sonnet-20240620",
+    "claude-3-5-sonnet-20241022",
+]
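`WEB_SEARCH_UNSUPPORTED_MODELS` is consumed as a list of model-name prefixes: the config flow gates web search with `model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS))`, and `str.startswith` accepts a tuple so one call tests every prefix. A small sketch (`web_search_supported` is a hypothetical helper, not part of the integration):

```python
WEB_SEARCH_UNSUPPORTED_MODELS = [
    "claude-3-haiku",
    "claude-3-opus",
    "claude-3-5-sonnet-20240620",
    "claude-3-5-sonnet-20241022",
]

def web_search_supported(model: str) -> bool:
    # str.startswith accepts a tuple of prefixes, so a single call
    # covers every unsupported model family, including dated variants.
    return not model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS))

assert not web_search_supported("claude-3-opus-20240229")
assert web_search_supported("claude-sonnet-4-20250514")
```

Prefix matching is what lets a bare family name like `claude-3-opus` also reject dated releases of that model.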


@@ -1,12 +1,17 @@
"""Base entity for Anthropic."""
from collections.abc import AsyncGenerator, Callable, Iterable
from dataclasses import dataclass, field
import json
from typing import Any
import anthropic
from anthropic import AsyncStream
from anthropic.types import (
CitationsDelta,
CitationsWebSearchResultLocation,
CitationWebSearchResultLocationParam,
ContentBlockParam,
InputJSONDelta,
MessageDeltaUsage,
MessageParam,
@@ -16,11 +21,16 @@ from anthropic.types import (
RawContentBlockStopEvent,
RawMessageDeltaEvent,
RawMessageStartEvent,
RawMessageStopEvent,
RedactedThinkingBlock,
RedactedThinkingBlockParam,
ServerToolUseBlock,
ServerToolUseBlockParam,
SignatureDelta,
TextBlock,
TextBlockParam,
TextCitation,
TextCitationParam,
TextDelta,
ThinkingBlock,
ThinkingBlockParam,
@@ -29,9 +39,15 @@ from anthropic.types import (
ThinkingDelta,
ToolParam,
ToolResultBlockParam,
ToolUnionParam,
ToolUseBlock,
ToolUseBlockParam,
Usage,
WebSearchTool20250305Param,
WebSearchToolRequestErrorParam,
WebSearchToolResultBlock,
WebSearchToolResultBlockParam,
WebSearchToolResultError,
)
from anthropic.types.message_create_params import MessageCreateParamsStreaming
from voluptuous_openapi import convert
@@ -48,6 +64,13 @@ from .const import (
CONF_MAX_TOKENS,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_WEB_SEARCH,
CONF_WEB_SEARCH_CITY,
CONF_WEB_SEARCH_COUNTRY,
CONF_WEB_SEARCH_MAX_USES,
CONF_WEB_SEARCH_REGION,
CONF_WEB_SEARCH_TIMEZONE,
CONF_WEB_SEARCH_USER_LOCATION,
DOMAIN,
LOGGER,
MIN_THINKING_BUDGET,
@@ -73,6 +96,69 @@ def _format_tool(
)
@dataclass(slots=True)
class CitationDetails:
"""Citation details for a content part."""
index: int = 0
"""Start position of the text."""
length: int = 0
"""Length of the relevant data."""
citations: list[TextCitationParam] = field(default_factory=list)
"""Citations for the content part."""
@dataclass(slots=True)
class ContentDetails:
"""Native data for AssistantContent."""
citation_details: list[CitationDetails] = field(default_factory=list)
def has_content(self) -> bool:
"""Check if there is any content."""
return any(detail.length > 0 for detail in self.citation_details)
def has_citations(self) -> bool:
"""Check if there are any citations."""
return any(detail.citations for detail in self.citation_details)
def add_citation_detail(self) -> None:
"""Add a new citation detail."""
if not self.citation_details or self.citation_details[-1].length > 0:
self.citation_details.append(
CitationDetails(
index=self.citation_details[-1].index
+ self.citation_details[-1].length
if self.citation_details
else 0
)
)
def add_citation(self, citation: TextCitation) -> None:
"""Add a citation to the current detail."""
if not self.citation_details:
self.citation_details.append(CitationDetails())
citation_param: TextCitationParam | None = None
if isinstance(citation, CitationsWebSearchResultLocation):
citation_param = CitationWebSearchResultLocationParam(
type="web_search_result_location",
title=citation.title,
url=citation.url,
cited_text=citation.cited_text,
encrypted_index=citation.encrypted_index,
)
if citation_param:
self.citation_details[-1].citations.append(citation_param)
def delete_empty(self) -> None:
"""Delete empty citation details."""
self.citation_details = [
detail for detail in self.citation_details if detail.citations
]
def _convert_content(
chat_content: Iterable[conversation.Content],
) -> list[MessageParam]:
@@ -81,15 +167,31 @@ def _convert_content(
for content in chat_content:
if isinstance(content, conversation.ToolResultContent):
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json.dumps(content.tool_result),
)
if not messages or messages[-1]["role"] != "user":
if content.tool_name == "web_search":
tool_result_block: ContentBlockParam = WebSearchToolResultBlockParam(
type="web_search_tool_result",
tool_use_id=content.tool_call_id,
content=content.tool_result["content"]
if "content" in content.tool_result
else WebSearchToolRequestErrorParam(
type="web_search_tool_result_error",
error_code=content.tool_result.get("error_code", "unavailable"), # type: ignore[typeddict-item]
),
)
external_tool = True
else:
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json.dumps(content.tool_result),
)
external_tool = False
if not messages or messages[-1]["role"] != (
"assistant" if external_tool else "user"
):
messages.append(
MessageParam(
role="user",
role="assistant" if external_tool else "user",
content=[tool_result_block],
)
)
@@ -151,13 +253,56 @@ def _convert_content(
redacted_thinking_block
)
if content.content:
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(type="text", text=content.content)
)
current_index = 0
for detail in (
content.native.citation_details
if isinstance(content.native, ContentDetails)
else [CitationDetails(length=len(content.content))]
):
if detail.index > current_index:
# Add text block for any text without citations
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[current_index : detail.index],
)
)
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[
detail.index : detail.index + detail.length
],
citations=detail.citations,
)
if detail.citations
else TextBlockParam(
type="text",
text=content.content[
detail.index : detail.index + detail.length
],
)
)
current_index = detail.index + detail.length
if current_index < len(content.content):
# Add text block for any remaining text without citations
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[current_index:],
)
)
if content.tool_calls:
messages[-1]["content"].extend( # type: ignore[union-attr]
[
ToolUseBlockParam(
ServerToolUseBlockParam(
type="server_tool_use",
id=tool_call.id,
name="web_search",
input=tool_call.tool_args,
)
if tool_call.external and tool_call.tool_name == "web_search"
else ToolUseBlockParam(
type="tool_use",
id=tool_call.id,
name=tool_call.tool_name,
@@ -173,10 +318,12 @@ def _convert_content(
return messages
async def _transform_stream(
async def _transform_stream( # noqa: C901 - This is complex, but better to have it in one place
chat_log: conversation.ChatLog,
stream: AsyncStream[MessageStreamEvent],
) -> AsyncGenerator[conversation.AssistantContentDeltaDict]:
) -> AsyncGenerator[
conversation.AssistantContentDeltaDict | conversation.ToolResultContentDeltaDict
]:
"""Transform the response stream into HA format.
A typical stream of responses might look something like the following:
@@ -209,11 +356,13 @@ async def _transform_stream(
if stream is None:
raise TypeError("Expected a stream of messages")
current_tool_block: ToolUseBlockParam | None = None
current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = None
current_tool_args: str
content_details = ContentDetails()
content_details.add_citation_detail()
input_usage: Usage | None = None
has_content = False
has_native = False
first_block: bool
async for response in stream:
LOGGER.debug("Received response: %s", response)
@@ -222,6 +371,7 @@ async def _transform_stream(
if response.message.role != "assistant":
raise ValueError("Unexpected message role")
input_usage = response.message.usage
first_block = True
elif isinstance(response, RawContentBlockStartEvent):
if isinstance(response.content_block, ToolUseBlock):
current_tool_block = ToolUseBlockParam(
@@ -232,17 +382,37 @@ async def _transform_stream(
)
current_tool_args = ""
elif isinstance(response.content_block, TextBlock):
if has_content:
if ( # Do not start a new assistant content just for citations, concatenate consecutive blocks with citations instead.
first_block
or (
not content_details.has_citations()
and response.content_block.citations is None
and content_details.has_content()
)
):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
yield {"role": "assistant"}
has_native = False
has_content = True
first_block = False
content_details.add_citation_detail()
if response.content_block.text:
content_details.citation_details[-1].length += len(
response.content_block.text
)
yield {"content": response.content_block.text}
elif isinstance(response.content_block, ThinkingBlock):
if has_native:
if first_block or has_native:
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
has_content = False
first_block = False
elif isinstance(response.content_block, RedactedThinkingBlock):
LOGGER.debug(
"Some of Claudes internal reasoning has been automatically "
@@ -250,15 +420,60 @@ async def _transform_stream(
"responses"
)
if has_native:
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
has_content = False
first_block = False
yield {"native": response.content_block}
has_native = True
elif isinstance(response.content_block, ServerToolUseBlock):
current_tool_block = ServerToolUseBlockParam(
type="server_tool_use",
id=response.content_block.id,
name=response.content_block.name,
input="",
)
current_tool_args = ""
elif isinstance(response.content_block, WebSearchToolResultBlock):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {
"role": "tool_result",
"tool_call_id": response.content_block.tool_use_id,
"tool_name": "web_search",
"tool_result": {
"type": "web_search_tool_result_error",
"error_code": response.content_block.content.error_code,
}
if isinstance(
response.content_block.content, WebSearchToolResultError
)
else {
"content": [
{
"type": "web_search_result",
"encrypted_content": block.encrypted_content,
"page_age": block.page_age,
"title": block.title,
"url": block.url,
}
for block in response.content_block.content
]
},
}
first_block = True
elif isinstance(response, RawContentBlockDeltaEvent):
if isinstance(response.delta, InputJSONDelta):
current_tool_args += response.delta.partial_json
elif isinstance(response.delta, TextDelta):
content_details.citation_details[-1].length += len(response.delta.text)
yield {"content": response.delta.text}
elif isinstance(response.delta, ThinkingDelta):
yield {"thinking_content": response.delta.thinking}
@@ -271,6 +486,8 @@ async def _transform_stream(
)
}
has_native = True
elif isinstance(response.delta, CitationsDelta):
content_details.add_citation(response.delta.citation)
elif isinstance(response, RawContentBlockStopEvent):
if current_tool_block is not None:
tool_args = json.loads(current_tool_args) if current_tool_args else {}
@@ -281,6 +498,7 @@ async def _transform_stream(
id=current_tool_block["id"],
tool_name=current_tool_block["name"],
tool_args=tool_args,
external=current_tool_block["type"] == "server_tool_use",
)
]
}
@@ -290,6 +508,12 @@ async def _transform_stream(
chat_log.async_trace(_create_token_stats(input_usage, usage))
if response.delta.stop_reason == "refusal":
raise HomeAssistantError("Potential policy violation detected")
elif isinstance(response, RawMessageStopEvent):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
def _create_token_stats(
@@ -337,21 +561,11 @@ class AnthropicBaseLLMEntity(Entity):
"""Generate an answer for the chat log."""
options = self.subentry.data
tools: list[ToolParam] | None = None
if chat_log.llm_api:
tools = [
_format_tool(tool, chat_log.llm_api.custom_serializer)
for tool in chat_log.llm_api.tools
]
system = chat_log.content[0]
if not isinstance(system, conversation.SystemContent):
raise TypeError("First message must be a system message")
messages = _convert_content(chat_log.content[1:])
client = self.entry.runtime_data
thinking_budget = options.get(CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET)
model = options.get(CONF_CHAT_MODEL, RECOMMENDED_CHAT_MODEL)
model_args = MessageCreateParamsStreaming(
@@ -361,8 +575,8 @@ class AnthropicBaseLLMEntity(Entity):
system=system.content,
stream=True,
)
if tools:
model_args["tools"] = tools
thinking_budget = options.get(CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET)
if (
not model.startswith(tuple(NON_THINKING_MODELS))
and thinking_budget >= MIN_THINKING_BUDGET
@@ -376,6 +590,34 @@ class AnthropicBaseLLMEntity(Entity):
CONF_TEMPERATURE, RECOMMENDED_TEMPERATURE
)
tools: list[ToolUnionParam] = []
if chat_log.llm_api:
tools = [
_format_tool(tool, chat_log.llm_api.custom_serializer)
for tool in chat_log.llm_api.tools
]
if options.get(CONF_WEB_SEARCH):
web_search = WebSearchTool20250305Param(
name="web_search",
type="web_search_20250305",
max_uses=options.get(CONF_WEB_SEARCH_MAX_USES),
)
if options.get(CONF_WEB_SEARCH_USER_LOCATION):
web_search["user_location"] = {
"type": "approximate",
"city": options.get(CONF_WEB_SEARCH_CITY, ""),
"region": options.get(CONF_WEB_SEARCH_REGION, ""),
"country": options.get(CONF_WEB_SEARCH_COUNTRY, ""),
"timezone": options.get(CONF_WEB_SEARCH_TIMEZONE, ""),
}
tools.append(web_search)
if tools:
model_args["tools"] = tools
client = self.entry.runtime_data
# To prevent infinite loops, we limit the number of iterations
for _iteration in range(MAX_TOOL_ITERATIONS):
try:

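The hunk above caps the agent's tool-call loop with `MAX_TOOL_ITERATIONS` rather than looping until the model stops requesting tools. A minimal standalone sketch of that pattern (hypothetical names, not the actual Home Assistant implementation):

```python
# Sketch of capping an LLM tool-call loop, assuming a hypothetical
# `step` callable that returns True once the model produced a final
# answer (i.e. no further tool calls are pending).
MAX_TOOL_ITERATIONS = 10


def run_agent(step):
    """Call `step` until it reports completion or the cap is reached."""
    for _iteration in range(MAX_TOOL_ITERATIONS):
        if step():
            # Converged: report how many iterations were needed.
            return _iteration + 1
    # Cap reached: fail loudly instead of looping forever on tool calls.
    raise RuntimeError("Tool-call loop did not converge")


# Example: the model keeps calling tools twice, then answers.
steps = iter([False, False, True])
iterations_used = run_agent(lambda: next(steps))
```

The cap trades completeness for safety: a misbehaving model that keeps emitting tool calls is cut off deterministically.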
View File

@@ -35,11 +35,17 @@
"temperature": "Temperature",
"llm_hass_api": "[%key:common::config_flow::data::llm_hass_api%]",
"recommended": "Recommended model settings",
"thinking_budget_tokens": "Thinking budget"
"thinking_budget": "Thinking budget",
"web_search": "Enable web search",
"web_search_max_uses": "Maximum web searches",
"user_location": "Include home location"
},
"data_description": {
"prompt": "Instruct how the LLM should respond. This can be a template.",
"thinking_budget_tokens": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking."
"thinking_budget": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking.",
"web_search": "The web search tool gives Claude direct access to real-time web content, allowing it to answer questions with up-to-date information beyond its knowledge cutoff",
"web_search_max_uses": "Limit the number of searches performed per response",
"user_location": "Localize search results based on home location"
}
}
},
@@ -48,7 +54,8 @@
"entry_not_loaded": "Cannot add things while the configuration is disabled."
},
"error": {
"thinking_budget_too_large": "Maximum tokens must be greater than the thinking budget."
"thinking_budget_too_large": "Maximum tokens must be greater than the thinking budget.",
"web_search_unsupported_model": "Web search is not supported by the selected model. Please choose a compatible model or disable web search."
}
}
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/autarco",
"iot_class": "cloud_polling",
"requirements": ["autarco==3.1.0"]
"requirements": ["autarco==3.2.0"]
}

View File

@@ -8,7 +8,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["brother", "pyasn1", "pysmi", "pysnmp"],
"requirements": ["brother==5.1.0"],
"requirements": ["brother==5.1.1"],
"zeroconf": [
{
"type": "_printer._tcp.local.",

View File

@@ -4,5 +4,6 @@
"codeowners": [],
"documentation": "https://www.home-assistant.io/integrations/citybikes",
"iot_class": "cloud_polling",
"quality_scale": "legacy"
"quality_scale": "legacy",
"requirements": ["python-citybikes==0.3.3"]
}

View File

@@ -5,8 +5,11 @@ from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
import sys
import aiohttp
from citybikes import __version__ as CITYBIKES_CLIENT_VERSION
from citybikes.asyncio import Client as CitybikesClient
import voluptuous as vol
from homeassistant.components.sensor import (
@@ -15,21 +18,18 @@ from homeassistant.components.sensor import (
SensorEntity,
)
from homeassistant.const import (
ATTR_ID,
ATTR_LATITUDE,
ATTR_LOCATION,
ATTR_LONGITUDE,
ATTR_NAME,
APPLICATION_NAME,
CONF_LATITUDE,
CONF_LONGITUDE,
CONF_NAME,
CONF_RADIUS,
EVENT_HOMEASSISTANT_CLOSE,
UnitOfLength,
__version__,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import PlatformNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.entity import async_generate_entity_id
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.event import async_track_time_interval
@@ -40,31 +40,33 @@ from homeassistant.util.unit_system import US_CUSTOMARY_SYSTEM
_LOGGER = logging.getLogger(__name__)
ATTR_EMPTY_SLOTS = "empty_slots"
ATTR_EXTRA = "extra"
ATTR_FREE_BIKES = "free_bikes"
ATTR_NETWORK = "network"
ATTR_NETWORKS_LIST = "networks"
ATTR_STATIONS_LIST = "stations"
ATTR_TIMESTAMP = "timestamp"
HA_USER_AGENT = (
f"{APPLICATION_NAME}/{__version__} "
f"python-citybikes/{CITYBIKES_CLIENT_VERSION} "
f"Python/{sys.version_info[0]}.{sys.version_info[1]}"
)
ATTR_UID = "uid"
ATTR_LATITUDE = "latitude"
ATTR_LONGITUDE = "longitude"
ATTR_EMPTY_SLOTS = "empty_slots"
ATTR_TIMESTAMP = "timestamp"
CONF_NETWORK = "network"
CONF_STATIONS_LIST = "stations"
DEFAULT_ENDPOINT = "https://api.citybik.es/{uri}"
PLATFORM = "citybikes"
MONITORED_NETWORKS = "monitored-networks"
DATA_CLIENT = "client"
NETWORKS_URI = "v2/networks"
REQUEST_TIMEOUT = 5 # In seconds; argument to asyncio.timeout
REQUEST_TIMEOUT = aiohttp.ClientTimeout(total=5)
SCAN_INTERVAL = timedelta(minutes=5) # Timely, and doesn't suffocate the API
STATIONS_URI = "v2/networks/{uid}?fields=network.stations"
CITYBIKES_ATTRIBUTION = (
"Information provided by the CityBikes Project (https://citybik.es/#about)"
)
@@ -87,72 +89,6 @@ PLATFORM_SCHEMA = vol.All(
),
)
NETWORK_SCHEMA = vol.Schema(
{
vol.Required(ATTR_ID): cv.string,
vol.Required(ATTR_NAME): cv.string,
vol.Required(ATTR_LOCATION): vol.Schema(
{
vol.Required(ATTR_LATITUDE): cv.latitude,
vol.Required(ATTR_LONGITUDE): cv.longitude,
},
extra=vol.REMOVE_EXTRA,
),
},
extra=vol.REMOVE_EXTRA,
)
NETWORKS_RESPONSE_SCHEMA = vol.Schema(
{vol.Required(ATTR_NETWORKS_LIST): [NETWORK_SCHEMA]}
)
STATION_SCHEMA = vol.Schema(
{
vol.Required(ATTR_FREE_BIKES): cv.positive_int,
vol.Required(ATTR_EMPTY_SLOTS): vol.Any(cv.positive_int, None),
vol.Required(ATTR_LATITUDE): cv.latitude,
vol.Required(ATTR_LONGITUDE): cv.longitude,
vol.Required(ATTR_ID): cv.string,
vol.Required(ATTR_NAME): cv.string,
vol.Required(ATTR_TIMESTAMP): cv.string,
vol.Optional(ATTR_EXTRA): vol.Schema(
{vol.Optional(ATTR_UID): cv.string}, extra=vol.REMOVE_EXTRA
),
},
extra=vol.REMOVE_EXTRA,
)
STATIONS_RESPONSE_SCHEMA = vol.Schema(
{
vol.Required(ATTR_NETWORK): vol.Schema(
{vol.Required(ATTR_STATIONS_LIST): [STATION_SCHEMA]}, extra=vol.REMOVE_EXTRA
)
}
)
class CityBikesRequestError(Exception):
"""Error to indicate a CityBikes API request has failed."""
async def async_citybikes_request(hass, uri, schema):
"""Perform a request to CityBikes API endpoint, and parse the response."""
try:
session = async_get_clientsession(hass)
async with asyncio.timeout(REQUEST_TIMEOUT):
req = await session.get(DEFAULT_ENDPOINT.format(uri=uri))
json_response = await req.json()
return schema(json_response)
except (TimeoutError, aiohttp.ClientError):
_LOGGER.error("Could not connect to CityBikes API endpoint")
except ValueError:
_LOGGER.error("Received non-JSON data from CityBikes API endpoint")
except vol.Invalid as err:
_LOGGER.error("Received unexpected JSON from CityBikes API endpoint: %s", err)
raise CityBikesRequestError
async def async_setup_platform(
hass: HomeAssistant,
@@ -175,6 +111,14 @@ async def async_setup_platform(
radius, UnitOfLength.FEET, UnitOfLength.METERS
)
client = CitybikesClient(user_agent=HA_USER_AGENT, timeout=REQUEST_TIMEOUT)
hass.data[PLATFORM][DATA_CLIENT] = client
async def _async_close_client(event):
await client.close()
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_CLOSE, _async_close_client)
# Create a single instance of CityBikesNetworks.
networks = hass.data.setdefault(CITYBIKES_NETWORKS, CityBikesNetworks(hass))
@@ -194,10 +138,10 @@ async def async_setup_platform(
devices = []
for station in network.stations:
dist = location_util.distance(
latitude, longitude, station[ATTR_LATITUDE], station[ATTR_LONGITUDE]
latitude, longitude, station.latitude, station.longitude
)
station_id = station[ATTR_ID]
station_uid = str(station.get(ATTR_EXTRA, {}).get(ATTR_UID, ""))
station_id = station.id
station_uid = str(station.extra.get(ATTR_UID, ""))
if radius > dist or stations_list.intersection((station_id, station_uid)):
if name:
@@ -216,6 +160,7 @@ class CityBikesNetworks:
def __init__(self, hass):
"""Initialize the networks instance."""
self.hass = hass
self.client = hass.data[PLATFORM][DATA_CLIENT]
self.networks = None
self.networks_loading = asyncio.Condition()
@@ -224,24 +169,21 @@ class CityBikesNetworks:
try:
await self.networks_loading.acquire()
if self.networks is None:
networks = await async_citybikes_request(
self.hass, NETWORKS_URI, NETWORKS_RESPONSE_SCHEMA
)
self.networks = networks[ATTR_NETWORKS_LIST]
except CityBikesRequestError as err:
self.networks = await self.client.networks.fetch()
except aiohttp.ClientError as err:
raise PlatformNotReady from err
else:
result = None
minimum_dist = None
for network in self.networks:
network_latitude = network[ATTR_LOCATION][ATTR_LATITUDE]
network_longitude = network[ATTR_LOCATION][ATTR_LONGITUDE]
network_latitude = network.location.latitude
network_longitude = network.location.longitude
dist = location_util.distance(
latitude, longitude, network_latitude, network_longitude
)
if minimum_dist is None or dist < minimum_dist:
minimum_dist = dist
result = network[ATTR_ID]
result = network.id
return result
finally:
@@ -257,22 +199,20 @@ class CityBikesNetwork:
self.network_id = network_id
self.stations = []
self.ready = asyncio.Event()
self.client = hass.data[PLATFORM][DATA_CLIENT]
async def async_refresh(self, now=None):
"""Refresh the state of the network."""
try:
network = await async_citybikes_request(
self.hass,
STATIONS_URI.format(uid=self.network_id),
STATIONS_RESPONSE_SCHEMA,
)
self.stations = network[ATTR_NETWORK][ATTR_STATIONS_LIST]
self.ready.set()
except CityBikesRequestError as err:
if now is not None:
self.ready.clear()
else:
network = await self.client.network(uid=self.network_id).fetch()
except aiohttp.ClientError as err:
if now is None:
raise PlatformNotReady from err
self.ready.clear()
return
self.stations = network.stations
self.ready.set()
class CityBikesStation(SensorEntity):
@@ -290,16 +230,13 @@ class CityBikesStation(SensorEntity):
async def async_update(self) -> None:
"""Update station state."""
for station in self._network.stations:
if station[ATTR_ID] == self._station_id:
station_data = station
break
self._attr_name = station_data.get(ATTR_NAME)
self._attr_native_value = station_data.get(ATTR_FREE_BIKES)
station = next(s for s in self._network.stations if s.id == self._station_id)
self._attr_name = station.name
self._attr_native_value = station.free_bikes
self._attr_extra_state_attributes = {
ATTR_UID: station_data.get(ATTR_EXTRA, {}).get(ATTR_UID),
ATTR_LATITUDE: station_data.get(ATTR_LATITUDE),
ATTR_LONGITUDE: station_data.get(ATTR_LONGITUDE),
ATTR_EMPTY_SLOTS: station_data.get(ATTR_EMPTY_SLOTS),
ATTR_TIMESTAMP: station_data.get(ATTR_TIMESTAMP),
ATTR_UID: station.extra.get(ATTR_UID),
ATTR_LATITUDE: station.latitude,
ATTR_LONGITUDE: station.longitude,
ATTR_EMPTY_SLOTS: station.empty_slots,
ATTR_TIMESTAMP: station.timestamp,
}
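The rewritten `async_update` above locates the station with a bare `next(...)`, which raises `StopIteration` if the station id is no longer in the network's list. A minimal standalone sketch of the same lookup with a defensive default (the `Station` dataclass here is a hypothetical stand-in, not the integration's model class):

```python
from dataclasses import dataclass


@dataclass
class Station:
    """Hypothetical stand-in for a CityBikes station record."""

    id: str
    free_bikes: int


def find_station(stations, station_id):
    # next() with a default avoids StopIteration when the id is absent.
    return next((s for s in stations if s.id == station_id), None)


stations = [Station("a", 3), Station("b", 0)]
found = find_station(stations, "b")
missing = find_station(stations, "zzz")
```

With the default in place, a vanished station yields `None` and the caller can skip the update instead of crashing.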

View File

@@ -7,14 +7,7 @@ from typing import Any, cast
from aiocomelit import ComelitSerialBridgeObject
from aiocomelit.const import COVER, STATE_COVER, STATE_OFF, STATE_ON
from homeassistant.components.cover import (
STATE_CLOSED,
STATE_CLOSING,
STATE_OPEN,
STATE_OPENING,
CoverDeviceClass,
CoverEntity,
)
from homeassistant.components.cover import CoverDeviceClass, CoverEntity, CoverState
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
@@ -128,9 +121,9 @@ class ComelitCoverEntity(ComelitBridgeBaseEntity, RestoreEntity, CoverEntity):
await super().async_added_to_hass()
if (state := await self.async_get_last_state()) is not None:
if state.state == STATE_CLOSED:
self._last_action = STATE_COVER.index(STATE_CLOSING)
if state.state == STATE_OPEN:
self._last_action = STATE_COVER.index(STATE_OPENING)
if state.state == CoverState.CLOSED:
self._last_action = STATE_COVER.index(CoverState.CLOSING)
if state.state == CoverState.OPEN:
self._last_action = STATE_COVER.index(CoverState.OPENING)
self._attr_is_closed = state.state == STATE_CLOSED
self._attr_is_closed = state.state == CoverState.CLOSED

View File

@@ -13,7 +13,7 @@ from propcache.api import cached_property
import voluptuous as vol
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ( # noqa: F401
from homeassistant.const import (
SERVICE_CLOSE_COVER,
SERVICE_CLOSE_COVER_TILT,
SERVICE_OPEN_COVER,
@@ -24,19 +24,9 @@ from homeassistant.const import ( # noqa: F401
SERVICE_STOP_COVER_TILT,
SERVICE_TOGGLE,
SERVICE_TOGGLE_COVER_TILT,
STATE_CLOSED,
STATE_CLOSING,
STATE_OPEN,
STATE_OPENING,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.deprecation import (
DeprecatedConstantEnum,
all_with_deprecated_constants,
check_if_deprecated_constant,
dir_with_deprecated_constants,
)
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.typing import ConfigType
@@ -63,15 +53,6 @@ class CoverState(StrEnum):
OPENING = "opening"
# STATE_* below are deprecated as of 2024.11
# when imported from homeassistant.components.cover
# use the CoverState enum instead.
_DEPRECATED_STATE_CLOSED = DeprecatedConstantEnum(CoverState.CLOSED, "2025.11")
_DEPRECATED_STATE_CLOSING = DeprecatedConstantEnum(CoverState.CLOSING, "2025.11")
_DEPRECATED_STATE_OPEN = DeprecatedConstantEnum(CoverState.OPEN, "2025.11")
_DEPRECATED_STATE_OPENING = DeprecatedConstantEnum(CoverState.OPENING, "2025.11")
class CoverDeviceClass(StrEnum):
"""Device class for cover."""
@@ -463,11 +444,3 @@ class CoverEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
return (
fns["close"] if self._cover_is_last_toggle_direction_open else fns["open"]
)
# These can be removed if no deprecated constant are in this module anymore
__getattr__ = ft.partial(check_if_deprecated_constant, module_globals=globals())
__dir__ = ft.partial(
dir_with_deprecated_constants, module_globals_keys=[*globals().keys()]
)
__all__ = all_with_deprecated_constants(globals())

View File

@@ -24,6 +24,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter
from .const import DOMAIN
@@ -146,6 +147,7 @@ class DukeEnergyCoordinator(DataUpdateCoordinator[None]):
name=f"{name_prefix} Consumption",
source=DOMAIN,
statistic_id=consumption_statistic_id,
unit_class=EnergyConverter.UNIT_CLASS,
unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR
if meter["serviceType"] == "ELECTRIC"
else UnitOfVolume.CENTUM_CUBIC_FEET,

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/ecovacs",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.11", "deebot-client==15.0.0"]
"requirements": ["py-sucks==0.9.11", "deebot-client==15.1.0"]
}

View File

@@ -20,6 +20,7 @@ from homeassistant.components.recorder.statistics import (
from homeassistant.components.recorder.util import get_instance
from homeassistant.const import UnitOfEnergy
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter
from .const import DOMAIN, LOGGER
@@ -153,6 +154,7 @@ class ElviaImporter:
name=f"{self.metering_point_id} Consumption",
source=DOMAIN,
statistic_id=statistic_id,
unit_class=EnergyConverter.UNIT_CLASS,
unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
statistics=statistics,

View File

@@ -13,27 +13,14 @@ from homeassistant.components.climate import (
ClimateEntityFeature,
HVACMode,
)
from homeassistant.const import (
ATTR_BATTERY_LEVEL,
ATTR_TEMPERATURE,
PRECISION_HALVES,
UnitOfTemperature,
)
from homeassistant.const import ATTR_TEMPERATURE, PRECISION_HALVES, UnitOfTemperature
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import (
ATTR_STATE_BATTERY_LOW,
ATTR_STATE_HOLIDAY_MODE,
ATTR_STATE_SUMMER_MODE,
ATTR_STATE_WINDOW_OPEN,
DOMAIN,
LOGGER,
)
from .const import DOMAIN, LOGGER
from .coordinator import FritzboxConfigEntry, FritzboxDataUpdateCoordinator
from .entity import FritzBoxDeviceEntity
from .model import ClimateExtraAttributes
from .sensor import value_scheduled_preset
HVAC_MODES = [HVACMode.HEAT, HVACMode.OFF]
@@ -202,26 +189,6 @@ class FritzboxThermostat(FritzBoxDeviceEntity, ClimateEntity):
self.check_active_or_lock_mode()
await self.async_set_hkr_state(PRESET_API_HKR_STATE_MAPPING[preset_mode])
@property
def extra_state_attributes(self) -> ClimateExtraAttributes:
"""Return the device specific state attributes."""
# deprecated with #143394, can be removed in 2025.11
attrs: ClimateExtraAttributes = {
ATTR_STATE_BATTERY_LOW: self.data.battery_low,
}
# the following attributes are available since fritzos 7
if self.data.battery_level is not None:
attrs[ATTR_BATTERY_LEVEL] = self.data.battery_level
if self.data.holiday_active is not None:
attrs[ATTR_STATE_HOLIDAY_MODE] = self.data.holiday_active
if self.data.summer_active is not None:
attrs[ATTR_STATE_SUMMER_MODE] = self.data.summer_active
if self.data.window_open is not None:
attrs[ATTR_STATE_WINDOW_OPEN] = self.data.window_open
return attrs
def check_active_or_lock_mode(self) -> None:
"""Check if in summer/vacation mode or lock enabled."""
if self.data.holiday_active or self.data.summer_active:

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20251001.0"]
"requirements": ["home-assistant-frontend==20251001.3"]
}

View File

@@ -167,6 +167,6 @@ class GitHubDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
)
self.hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, self.unsubscribe)
def unsubscribe(self, *args) -> None:
def unsubscribe(self, *args: Any) -> None:
"""Unsubscribe to repository events."""
self._client.repos.events.unsubscribe(subscription_id=self._subscription_id)

View File

@@ -182,10 +182,10 @@ FAN_SPEED_MAX_SPEED_COUNT = 5
COVER_VALVE_STATES = {
cover.DOMAIN: {
"closed": cover.STATE_CLOSED,
"closing": cover.STATE_CLOSING,
"open": cover.STATE_OPEN,
"opening": cover.STATE_OPENING,
"closed": cover.CoverState.CLOSED.value,
"closing": cover.CoverState.CLOSING.value,
"open": cover.CoverState.OPEN.value,
"opening": cover.CoverState.OPENING.value,
},
valve.DOMAIN: {
"closed": valve.STATE_CLOSED,

View File

@@ -8,7 +8,12 @@ from typing import Any
import voluptuous as vol
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult, OptionsFlow
from homeassistant.config_entries import (
SOURCE_REAUTH,
SOURCE_RECONFIGURE,
ConfigFlowResult,
OptionsFlow,
)
from homeassistant.core import callback
from homeassistant.helpers import config_entry_oauth2_flow
@@ -40,6 +45,12 @@ class OAuth2FlowHandler(
"prompt": "consent",
}
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle a reconfiguration flow."""
return await self.async_step_user(user_input)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
@@ -60,6 +71,10 @@ class OAuth2FlowHandler(
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data=data
)
if self.source == SOURCE_RECONFIGURE:
return self.async_update_reload_and_abort(
self._get_reconfigure_entry(), data=data
)
return self.async_create_entry(
title=DEFAULT_NAME,

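The reconfigure support added above reuses the user step and then branches on `self.source` when finishing the flow: reauth and reconfigure both update an existing entry, while a fresh flow creates one. A toy sketch of that routing, with plain-string stand-ins for Home Assistant's flow sources and results:

```python
# Hypothetical simplification of the source-based routing shown in the
# diff; the real flow uses Home Assistant's OAuth2 config flow base class.
SOURCE_REAUTH = "reauth"
SOURCE_RECONFIGURE = "reconfigure"


def finish_flow(source: str) -> str:
    """Pick the terminal flow action based on how the flow started."""
    if source == SOURCE_REAUTH:
        return "update_reload_and_abort:reauth_entry"
    if source == SOURCE_RECONFIGURE:
        return "update_reload_and_abort:reconfigure_entry"
    # A plain user flow creates a brand-new config entry.
    return "create_entry"


result = finish_flow(SOURCE_RECONFIGURE)
```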
View File

@@ -30,7 +30,8 @@
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
},
"create_entry": {
"default": "[%key:common::config_flow::create_entry::authenticated%]"

View File

@@ -99,6 +99,20 @@ CLEANING_MODE_OPTIONS = {
"ConsumerProducts.CleaningRobot.EnumType.CleaningModes.Silent",
"ConsumerProducts.CleaningRobot.EnumType.CleaningModes.Standard",
"ConsumerProducts.CleaningRobot.EnumType.CleaningModes.Power",
"ConsumerProducts.CleaningRobot.EnumType.CleaningMode.IntelligentMode",
"ConsumerProducts.CleaningRobot.EnumType.CleaningMode.VacuumOnly",
"ConsumerProducts.CleaningRobot.EnumType.CleaningMode.MopOnly",
"ConsumerProducts.CleaningRobot.EnumType.CleaningMode.VacuumAndMop",
"ConsumerProducts.CleaningRobot.EnumType.CleaningMode.MopAfterVacuum",
)
}
SUCTION_POWER_OPTIONS = {
bsh_key_to_translation_key(option): option
for option in (
"ConsumerProducts.CleaningRobot.EnumType.SuctionPower.Silent",
"ConsumerProducts.CleaningRobot.EnumType.SuctionPower.Standard",
"ConsumerProducts.CleaningRobot.EnumType.SuctionPower.Max",
)
}
@@ -309,6 +323,10 @@ PROGRAM_ENUM_OPTIONS = {
OptionKey.CONSUMER_PRODUCTS_CLEANING_ROBOT_CLEANING_MODE,
CLEANING_MODE_OPTIONS,
),
(
OptionKey.CONSUMER_PRODUCTS_CLEANING_ROBOT_SUCTION_POWER,
SUCTION_POWER_OPTIONS,
),
(OptionKey.CONSUMER_PRODUCTS_COFFEE_MAKER_BEAN_AMOUNT, BEAN_AMOUNT_OPTIONS),
(
OptionKey.CONSUMER_PRODUCTS_COFFEE_MAKER_COFFEE_TEMPERATURE,

View File

@@ -30,6 +30,7 @@ from .const import (
INTENSIVE_LEVEL_OPTIONS,
PROGRAMS_TRANSLATION_KEYS_MAP,
SPIN_SPEED_OPTIONS,
SUCTION_POWER_OPTIONS,
TEMPERATURE_OPTIONS,
TRANSLATION_KEYS_PROGRAMS_MAP,
VARIO_PERFECT_OPTIONS,
@@ -168,6 +169,16 @@ PROGRAM_SELECT_OPTION_ENTITY_DESCRIPTIONS = (
for translation_key, value in CLEANING_MODE_OPTIONS.items()
},
),
HomeConnectSelectEntityDescription(
key=OptionKey.CONSUMER_PRODUCTS_CLEANING_ROBOT_SUCTION_POWER,
translation_key="suction_power",
options=list(SUCTION_POWER_OPTIONS),
translation_key_values=SUCTION_POWER_OPTIONS,
values_translation_key={
value: translation_key
for translation_key, value in SUCTION_POWER_OPTIONS.items()
},
),
HomeConnectSelectEntityDescription(
key=OptionKey.CONSUMER_PRODUCTS_COFFEE_MAKER_BEAN_AMOUNT,
translation_key="bean_amount",

View File

@@ -202,6 +202,22 @@ set_program_and_options:
- consumer_products_cleaning_robot_enum_type_cleaning_modes_silent
- consumer_products_cleaning_robot_enum_type_cleaning_modes_standard
- consumer_products_cleaning_robot_enum_type_cleaning_modes_power
- consumer_products_cleaning_robot_enum_type_cleaning_mode_intelligent_mode
- consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_only
- consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_only
- consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_and_mop
- consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_after_vacuum
consumer_products_cleaning_robot_option_suction_power:
example: consumer_products_cleaning_robot_enum_type_suction_power_standard
required: false
selector:
select:
mode: dropdown
translation_key: suction_power
options:
- consumer_products_cleaning_robot_enum_type_suction_power_silent
- consumer_products_cleaning_robot_enum_type_suction_power_standard
- consumer_products_cleaning_robot_enum_type_suction_power_max
coffee_maker_options:
collapsed: true
fields:

View File

@@ -324,7 +324,19 @@
"options": {
"consumer_products_cleaning_robot_enum_type_cleaning_modes_silent": "Silent",
"consumer_products_cleaning_robot_enum_type_cleaning_modes_standard": "Standard",
"consumer_products_cleaning_robot_enum_type_cleaning_modes_power": "Power"
"consumer_products_cleaning_robot_enum_type_cleaning_modes_power": "Power",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_intelligent_mode": "Intelligent mode",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_only": "Vacuum only",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_only": "Mop only",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_and_mop": "Vacuum and mop",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_after_vacuum": "Mop after vacuum"
}
},
"suction_power": {
"options": {
"consumer_products_cleaning_robot_enum_type_suction_power_silent": "Silent",
"consumer_products_cleaning_robot_enum_type_suction_power_standard": "Standard",
"consumer_products_cleaning_robot_enum_type_suction_power_max": "Max"
}
},
"bean_amount": {
@@ -519,6 +531,10 @@
"name": "Cleaning mode",
"description": "Defines the favored cleaning mode."
},
"consumer_products_cleaning_robot_option_suction_power": {
"name": "Suction power",
"description": "Defines the suction power."
},
"consumer_products_coffee_maker_option_bean_amount": {
"name": "Bean amount",
"description": "Describes the amount of coffee beans used in a coffee machine program."
@@ -1196,7 +1212,20 @@
"state": {
"consumer_products_cleaning_robot_enum_type_cleaning_modes_silent": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_modes_silent%]",
"consumer_products_cleaning_robot_enum_type_cleaning_modes_standard": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_modes_standard%]",
"consumer_products_cleaning_robot_enum_type_cleaning_modes_power": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_modes_power%]"
"consumer_products_cleaning_robot_enum_type_cleaning_modes_power": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_modes_power%]",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_intelligent_mode": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_mode_intelligent_mode%]",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_only": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_only%]",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_only": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_only%]",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_and_mop": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_mode_vacuum_and_mop%]",
"consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_after_vacuum": "[%key:component::home_connect::selector::cleaning_mode::options::consumer_products_cleaning_robot_enum_type_cleaning_mode_mop_after_vacuum%]"
}
},
"suction_power": {
"name": "[%key:component::home_connect::services::set_program_and_options::fields::consumer_products_cleaning_robot_option_suction_power::name%]",
"state": {
"consumer_products_cleaning_robot_enum_type_suction_power_silent": "[%key:component::home_connect::selector::suction_power::options::consumer_products_cleaning_robot_enum_type_suction_power_silent%]",
"consumer_products_cleaning_robot_enum_type_suction_power_standard": "[%key:component::home_connect::selector::suction_power::options::consumer_products_cleaning_robot_enum_type_suction_power_standard%]",
"consumer_products_cleaning_robot_enum_type_suction_power_max": "[%key:component::home_connect::selector::suction_power::options::consumer_products_cleaning_robot_enum_type_suction_power_max%]"
}
},
"bean_amount": {

View File

@@ -13,6 +13,12 @@
"pid": "4001",
"description": "*zbt-2*",
"known_devices": ["ZBT-2"]
},
{
"vid": "303A",
"pid": "831A",
"description": "*zbt-2*",
"known_devices": ["ZBT-2"]
}
]
}

View File

@@ -1,15 +1,20 @@
"""Home Assistant Hardware integration helpers."""
from __future__ import annotations
from collections import defaultdict
from collections.abc import AsyncIterator, Awaitable, Callable
from contextlib import asynccontextmanager
import logging
from typing import Protocol
from typing import TYPE_CHECKING, Protocol
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback as hass_callback
from . import DATA_COMPONENT
from .util import FirmwareInfo
if TYPE_CHECKING:
from .util import FirmwareInfo
_LOGGER = logging.getLogger(__name__)
@@ -51,6 +56,7 @@ class HardwareInfoDispatcher:
self._notification_callbacks: defaultdict[
str, set[Callable[[FirmwareInfo], None]]
] = defaultdict(set)
self._active_firmware_updates: dict[str, str] = {}
def register_firmware_info_provider(
self, domain: str, platform: HardwareFirmwareInfoModule
@@ -118,6 +124,36 @@ class HardwareInfoDispatcher:
if fw_info is not None:
yield fw_info
def register_firmware_update_in_progress(
self, device: str, source_domain: str
) -> None:
"""Register that a firmware update is in progress for a device."""
if device in self._active_firmware_updates:
current_domain = self._active_firmware_updates[device]
raise ValueError(
f"Firmware update already in progress for {device} by {current_domain}"
)
self._active_firmware_updates[device] = source_domain
def unregister_firmware_update_in_progress(
self, device: str, source_domain: str
) -> None:
"""Unregister a firmware update for a device."""
if device not in self._active_firmware_updates:
raise ValueError(f"No firmware update in progress for {device}")
if self._active_firmware_updates[device] != source_domain:
current_domain = self._active_firmware_updates[device]
raise ValueError(
f"Firmware update for {device} is owned by {current_domain}, not {source_domain}"
)
del self._active_firmware_updates[device]
def is_firmware_update_in_progress(self, device: str) -> bool:
"""Check if a firmware update is in progress for a device."""
return device in self._active_firmware_updates
@hass_callback
def async_register_firmware_info_provider(
@@ -141,3 +177,42 @@ def async_notify_firmware_info(
) -> Awaitable[None]:
"""Notify the dispatcher of new firmware information."""
return hass.data[DATA_COMPONENT].notify_firmware_info(domain, firmware_info)
@hass_callback
def async_register_firmware_update_in_progress(
hass: HomeAssistant, device: str, source_domain: str
) -> None:
"""Register that a firmware update is in progress for a device."""
return hass.data[DATA_COMPONENT].register_firmware_update_in_progress(
device, source_domain
)
@hass_callback
def async_unregister_firmware_update_in_progress(
hass: HomeAssistant, device: str, source_domain: str
) -> None:
"""Unregister a firmware update for a device."""
return hass.data[DATA_COMPONENT].unregister_firmware_update_in_progress(
device, source_domain
)
@hass_callback
def async_is_firmware_update_in_progress(hass: HomeAssistant, device: str) -> bool:
"""Check if a firmware update is in progress for a device."""
return hass.data[DATA_COMPONENT].is_firmware_update_in_progress(device)
@asynccontextmanager
async def async_firmware_update_context(
hass: HomeAssistant, device: str, source_domain: str
) -> AsyncIterator[None]:
"""Register a device as having its firmware being actively updated."""
async_register_firmware_update_in_progress(hass, device, source_domain)
try:
yield
finally:
async_unregister_firmware_update_in_progress(hass, device, source_domain)


@@ -67,7 +67,7 @@
}
},
"abort": {
"not_hassio_thread": "The OpenThread Border Router add-on can only be installed with Home Assistant OS. If you would like to use the {model} as a Thread border router, please flash the firmware manually using the [web flasher]({docs_web_flasher_url}) and set up OpenThread Border Router to communicate with it.",
"not_hassio_thread": "The OpenThread Border Router add-on can only be installed with Home Assistant OS. If you would like to use the {model} as a Thread border router, please manually set up OpenThread Border Router to communicate with it.",
"otbr_addon_already_running": "The OpenThread Border Router add-on is already running, it cannot be installed again.",
"zha_still_using_stick": "This {model} is in use by the Zigbee Home Automation integration. Please migrate your Zigbee network to another adapter or delete the integration and try again.",
"otbr_still_using_stick": "This {model} is in use by the OpenThread Border Router add-on. If you use the Thread network, make sure you have alternative border routers. Uninstall the add-on and try again.",


@@ -275,6 +275,7 @@ class BaseFirmwareUpdateEntity(
expected_installed_firmware_type=self.entity_description.expected_firmware_type,
bootloader_reset_methods=self.bootloader_reset_methods,
progress_callback=self._update_progress,
domain=self._config_entry.domain,
)
finally:
self._attr_in_progress = False


@@ -26,6 +26,7 @@ from homeassistant.helpers.singleton import singleton
from . import DATA_COMPONENT
from .const import (
DOMAIN,
OTBR_ADDON_MANAGER_DATA,
OTBR_ADDON_NAME,
OTBR_ADDON_SLUG,
@@ -33,6 +34,7 @@ from .const import (
ZIGBEE_FLASHER_ADDON_NAME,
ZIGBEE_FLASHER_ADDON_SLUG,
)
from .helpers import async_firmware_update_context
from .silabs_multiprotocol_addon import (
WaitingAddonManager,
get_multiprotocol_addon_manager,
@@ -359,45 +361,50 @@ async def async_flash_silabs_firmware(
expected_installed_firmware_type: ApplicationType,
bootloader_reset_methods: Sequence[ResetTarget] = (),
progress_callback: Callable[[int, int], None] | None = None,
*,
domain: str = DOMAIN,
) -> FirmwareInfo:
"""Flash firmware to the SiLabs device."""
firmware_info = await guess_firmware_info(hass, device)
_LOGGER.debug("Identified firmware info: %s", firmware_info)
async with async_firmware_update_context(hass, device, domain):
firmware_info = await guess_firmware_info(hass, device)
_LOGGER.debug("Identified firmware info: %s", firmware_info)
fw_image = await hass.async_add_executor_job(parse_firmware_image, fw_data)
fw_image = await hass.async_add_executor_job(parse_firmware_image, fw_data)
flasher = Flasher(
device=device,
probe_methods=(
ApplicationType.GECKO_BOOTLOADER.as_flasher_application_type(),
ApplicationType.EZSP.as_flasher_application_type(),
ApplicationType.SPINEL.as_flasher_application_type(),
ApplicationType.CPC.as_flasher_application_type(),
),
bootloader_reset=tuple(
m.as_flasher_reset_target() for m in bootloader_reset_methods
),
)
async with AsyncExitStack() as stack:
for owner in firmware_info.owners:
await stack.enter_async_context(owner.temporarily_stop(hass))
try:
# Enter the bootloader with indeterminate progress
await flasher.enter_bootloader()
# Flash the firmware, with progress
await flasher.flash_firmware(fw_image, progress_callback=progress_callback)
except Exception as err:
raise HomeAssistantError("Failed to flash firmware") from err
probed_firmware_info = await probe_silabs_firmware_info(
device,
probe_methods=(expected_installed_firmware_type,),
flasher = Flasher(
device=device,
probe_methods=(
ApplicationType.GECKO_BOOTLOADER.as_flasher_application_type(),
ApplicationType.EZSP.as_flasher_application_type(),
ApplicationType.SPINEL.as_flasher_application_type(),
ApplicationType.CPC.as_flasher_application_type(),
),
bootloader_reset=tuple(
m.as_flasher_reset_target() for m in bootloader_reset_methods
),
)
if probed_firmware_info is None:
raise HomeAssistantError("Failed to probe the firmware after flashing")
async with AsyncExitStack() as stack:
for owner in firmware_info.owners:
await stack.enter_async_context(owner.temporarily_stop(hass))
return probed_firmware_info
try:
# Enter the bootloader with indeterminate progress
await flasher.enter_bootloader()
# Flash the firmware, with progress
await flasher.flash_firmware(
fw_image, progress_callback=progress_callback
)
except Exception as err:
raise HomeAssistantError("Failed to flash firmware") from err
probed_firmware_info = await probe_silabs_firmware_info(
device,
probe_methods=(expected_installed_firmware_type,),
)
if probed_firmware_info is None:
raise HomeAssistantError("Failed to probe the firmware after flashing")
return probed_firmware_info
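Inside the new update context, `async_flash_silabs_firmware` uses `AsyncExitStack` to temporarily stop every integration that owns the serial port, flash, then restart them in reverse order on exit. A minimal sketch of that stop/restart pattern (the `temporarily_stop` helper here is a hypothetical stand-in for the real owner contexts):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager


@asynccontextmanager
async def temporarily_stop(name: str, log: list[str]):
    """Hypothetical stand-in for an owner's temporarily_stop() context."""
    log.append(f"stop {name}")
    try:
        yield
    finally:
        log.append(f"restart {name}")


async def main() -> list[str]:
    log: list[str] = []
    owners = ["zha", "otbr"]  # illustrative owner names
    async with AsyncExitStack() as stack:
        # Enter a variable number of contexts, as the flasher does
        for owner in owners:
            await stack.enter_async_context(temporarily_stop(owner, log))
        log.append("flash firmware")
    return log


log = asyncio.run(main())
# The stack unwinds LIFO: owners restart in reverse order after the flash
assert log == ["stop zha", "stop otbr", "flash firmware", "restart otbr", "restart zha"]
```

`AsyncExitStack` is the idiomatic choice here because the number of owners is only known at runtime, and its LIFO unwinding restarts them even when flashing fails.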


@@ -1,12 +1,12 @@
{
"domain": "iometer",
"name": "IOmeter",
"codeowners": ["@MaestroOnICe"],
"codeowners": ["@jukrebs"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/iometer",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["iometer==0.1.0"],
"requirements": ["iometer==0.2.0"],
"zeroconf": ["_iometer._tcp.local."]
}


@@ -34,6 +34,7 @@ from homeassistant.helpers.device_registry import (
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util.unit_conversion import EnergyConverter, VolumeConverter
from .const import DOMAIN
from .coordinator import IstaConfigEntry, IstaCoordinator
@@ -49,6 +50,7 @@ class IstaSensorEntityDescription(SensorEntityDescription):
"""Ista EcoTrend Sensor Description."""
consumption_type: IstaConsumptionType
unit_class: str | None = None
value_type: IstaValueType | None = None
@@ -84,6 +86,7 @@ SENSOR_DESCRIPTIONS: tuple[IstaSensorEntityDescription, ...] = (
suggested_display_precision=1,
consumption_type=IstaConsumptionType.HEATING,
value_type=IstaValueType.ENERGY,
unit_class=EnergyConverter.UNIT_CLASS,
),
IstaSensorEntityDescription(
key=IstaSensorEntity.HEATING_COST,
@@ -104,6 +107,7 @@ SENSOR_DESCRIPTIONS: tuple[IstaSensorEntityDescription, ...] = (
state_class=SensorStateClass.TOTAL,
suggested_display_precision=1,
consumption_type=IstaConsumptionType.HOT_WATER,
unit_class=VolumeConverter.UNIT_CLASS,
),
IstaSensorEntityDescription(
key=IstaSensorEntity.HOT_WATER_ENERGY,
@@ -114,6 +118,7 @@ SENSOR_DESCRIPTIONS: tuple[IstaSensorEntityDescription, ...] = (
suggested_display_precision=1,
consumption_type=IstaConsumptionType.HOT_WATER,
value_type=IstaValueType.ENERGY,
unit_class=EnergyConverter.UNIT_CLASS,
),
IstaSensorEntityDescription(
key=IstaSensorEntity.HOT_WATER_COST,
@@ -135,6 +140,7 @@ SENSOR_DESCRIPTIONS: tuple[IstaSensorEntityDescription, ...] = (
suggested_display_precision=1,
entity_registry_enabled_default=False,
consumption_type=IstaConsumptionType.WATER,
unit_class=VolumeConverter.UNIT_CLASS,
),
IstaSensorEntityDescription(
key=IstaSensorEntity.WATER_COST,
@@ -276,6 +282,7 @@ class IstaSensor(CoordinatorEntity[IstaCoordinator], SensorEntity):
"name": f"{self.device_entry.name} {self.name}",
"source": DOMAIN,
"statistic_id": statistic_id,
"unit_class": self.entity_description.unit_class,
"unit_of_measurement": self.entity_description.native_unit_of_measurement,
}
if statistics:


@@ -36,6 +36,11 @@ from homeassistant.helpers.device_registry import DeviceEntry
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import (
EnergyConverter,
TemperatureConverter,
VolumeConverter,
)
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
@@ -254,6 +259,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": DOMAIN,
"name": "Outdoor temperature",
"statistic_id": f"{DOMAIN}:temperature_outdoor",
"unit_class": TemperatureConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfTemperature.CELSIUS,
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
@@ -267,6 +273,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": DOMAIN,
"name": "Energy consumption 1",
"statistic_id": f"{DOMAIN}:energy_consumption_kwh",
"unit_class": EnergyConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfEnergy.KILO_WATT_HOUR,
"mean_type": StatisticMeanType.NONE,
"has_sum": True,
@@ -279,6 +286,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": DOMAIN,
"name": "Energy consumption 2",
"statistic_id": f"{DOMAIN}:energy_consumption_mwh",
"unit_class": EnergyConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfEnergy.MEGA_WATT_HOUR,
"mean_type": StatisticMeanType.NONE,
"has_sum": True,
@@ -293,6 +301,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": DOMAIN,
"name": "Gas consumption 1",
"statistic_id": f"{DOMAIN}:gas_consumption_m3",
"unit_class": VolumeConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfVolume.CUBIC_METERS,
"mean_type": StatisticMeanType.NONE,
"has_sum": True,
@@ -307,6 +316,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": DOMAIN,
"name": "Gas consumption 2",
"statistic_id": f"{DOMAIN}:gas_consumption_ft3",
"unit_class": VolumeConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfVolume.CUBIC_FEET,
"mean_type": StatisticMeanType.NONE,
"has_sum": True,
@@ -319,6 +329,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": RECORDER_DOMAIN,
"name": None,
"statistic_id": "sensor.statistics_issues_issue_1",
"unit_class": VolumeConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfVolume.CUBIC_METERS,
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
@@ -331,6 +342,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": RECORDER_DOMAIN,
"name": None,
"statistic_id": "sensor.statistics_issues_issue_2",
"unit_class": None,
"unit_of_measurement": "cats",
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
@@ -343,6 +355,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": RECORDER_DOMAIN,
"name": None,
"statistic_id": "sensor.statistics_issues_issue_3",
"unit_class": VolumeConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfVolume.CUBIC_METERS,
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
@@ -355,6 +368,7 @@ async def _insert_statistics(hass: HomeAssistant) -> None:
"source": RECORDER_DOMAIN,
"name": None,
"statistic_id": "sensor.statistics_issues_issue_4",
"unit_class": VolumeConverter.UNIT_CLASS,
"unit_of_measurement": UnitOfVolume.CUBIC_METERS,
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
@@ -375,6 +389,7 @@ async def _insert_wrong_wind_direction_statistics(hass: HomeAssistant) -> None:
"source": RECORDER_DOMAIN,
"name": None,
"statistic_id": "sensor.statistics_issues_issue_5",
"unit_class": None,
"unit_of_measurement": DEGREE,
"mean_type": StatisticMeanType.ARITHMETIC,
"has_sum": False,
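The new `unit_class` key ties each statistics metadata entry to a unit converter, while entries with no matching converter (like the `"cats"` example above) carry `None`. A plain-dict sketch of the metadata shape, with string literals standing in for the converter constants (the `"energy"` value is an assumption about `EnergyConverter.UNIT_CLASS`, and the identifiers are illustrative):

```python
# Plain-dict stand-in for the recorder's StatisticMetaData entries above
metadata = {
    "source": "kitchen_sink",
    "name": "Energy consumption 1",
    "statistic_id": "kitchen_sink:energy_consumption_kwh",
    "unit_class": "energy",  # assumed value of EnergyConverter.UNIT_CLASS
    "unit_of_measurement": "kWh",
    "has_sum": True,
}

# Units without a known converter keep unit_class=None, as with "cats" above
unknown = {**metadata, "unit_class": None, "unit_of_measurement": "cats"}
assert metadata["unit_class"] == "energy"
assert unknown["unit_class"] is None
```

Deriving the value from the converter's own `UNIT_CLASS` constant, rather than repeating a string literal, keeps the metadata in sync with the converter registry.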


@@ -11,9 +11,9 @@
"loggers": ["xknx", "xknxproject"],
"quality_scale": "silver",
"requirements": [
"xknx==3.9.0",
"xknx==3.9.1",
"xknxproject==3.8.2",
"knx-frontend==2025.8.24.205840"
"knx-frontend==2025.10.9.185845"
],
"single_config_entry": true
}


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
from typing import Any
@@ -26,7 +27,7 @@ from homeassistant.const import ATTR_TEMPERATURE, Platform, UnitOfTemperature
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -182,6 +183,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.CLIMATE, async_add_entities)
@dataclass(frozen=True)
class MatterClimateEntityDescription(ClimateEntityDescription, MatterEntityDescription):
"""Describe Matter Climate entities."""
class MatterClimate(MatterEntity, ClimateEntity):
"""Representation of a Matter climate entity."""
@@ -423,7 +429,7 @@ class MatterClimate(MatterEntity, ClimateEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.CLIMATE,
entity_description=ClimateEntityDescription(
entity_description=MatterClimateEntityDescription(
key="MatterThermostat",
name=None,
),


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
from math import floor
from typing import Any
@@ -22,7 +23,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import LOGGER
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -61,10 +62,15 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.COVER, async_add_entities)
@dataclass(frozen=True)
class MatterCoverEntityDescription(CoverEntityDescription, MatterEntityDescription):
"""Describe Matter Cover entities."""
class MatterCover(MatterEntity, CoverEntity):
"""Representation of a Matter Cover."""
entity_description: CoverEntityDescription
entity_description: MatterCoverEntityDescription
@property
def is_closed(self) -> bool | None:
@@ -198,7 +204,7 @@ class MatterCover(MatterEntity, CoverEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.COVER,
entity_description=CoverEntityDescription(
entity_description=MatterCoverEntityDescription(
key="MatterCover",
name=None,
),
@@ -214,7 +220,7 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.COVER,
entity_description=CoverEntityDescription(
entity_description=MatterCoverEntityDescription(
key="MatterCoverPositionAwareLift", name=None
),
entity_class=MatterCover,
@@ -229,7 +235,7 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.COVER,
entity_description=CoverEntityDescription(
entity_description=MatterCoverEntityDescription(
key="MatterCoverPositionAwareTilt", name=None
),
entity_class=MatterCover,
@@ -244,7 +250,7 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.COVER,
entity_description=CoverEntityDescription(
entity_description=MatterCoverEntityDescription(
key="MatterCoverPositionAwareLiftAndTilt", name=None
),
entity_class=MatterCover,


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from chip.clusters import Objects as clusters
@@ -18,7 +19,7 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -46,6 +47,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.EVENT, async_add_entities)
@dataclass(frozen=True)
class MatterEventEntityDescription(EventEntityDescription, MatterEntityDescription):
"""Describe Matter Event entities."""
class MatterEventEntity(MatterEntity, EventEntity):
"""Representation of a Matter Event entity."""
@@ -132,7 +138,7 @@ class MatterEventEntity(MatterEntity, EventEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.EVENT,
entity_description=EventEntityDescription(
entity_description=MatterEventEntityDescription(
key="GenericSwitch",
device_class=EventDeviceClass.BUTTON,
translation_key="button",


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import TYPE_CHECKING, Any
from chip.clusters import Objects as clusters
@@ -18,7 +19,7 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -52,6 +53,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.FAN, async_add_entities)
@dataclass(frozen=True)
class MatterFanEntityDescription(FanEntityDescription, MatterEntityDescription):
"""Describe Matter Fan entities."""
class MatterFan(MatterEntity, FanEntity):
"""Representation of a Matter fan."""
@@ -308,7 +314,7 @@ class MatterFan(MatterEntity, FanEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.FAN,
entity_description=FanEntityDescription(
entity_description=MatterFanEntityDescription(
key="MatterFan",
name=None,
),


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from chip.clusters import Objects as clusters
@@ -29,7 +30,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util import color as color_util
from .const import LOGGER
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
from .util import (
@@ -85,10 +86,15 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.LIGHT, async_add_entities)
@dataclass(frozen=True)
class MatterLightEntityDescription(LightEntityDescription, MatterEntityDescription):
"""Describe Matter Light entities."""
class MatterLight(MatterEntity, LightEntity):
"""Representation of a Matter light."""
entity_description: LightEntityDescription
entity_description: MatterLightEntityDescription
_supports_brightness = False
_supports_color = False
_supports_color_temperature = False
@@ -458,7 +464,7 @@ class MatterLight(MatterEntity, LightEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(
entity_description=MatterLightEntityDescription(
key="MatterLight",
name=None,
),
@@ -487,7 +493,7 @@ DISCOVERY_SCHEMAS = [
# Additional schema to match (HS Color) lights with incorrect/missing device type
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(
entity_description=MatterLightEntityDescription(
key="MatterHSColorLightFallback",
name=None,
),
@@ -508,7 +514,7 @@ DISCOVERY_SCHEMAS = [
# Additional schema to match (XY Color) lights with incorrect/missing device type
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(
entity_description=MatterLightEntityDescription(
key="MatterXYColorLightFallback",
name=None,
),
@@ -529,7 +535,7 @@ DISCOVERY_SCHEMAS = [
# Additional schema to match (color temperature) lights with incorrect/missing device type
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(
entity_description=MatterLightEntityDescription(
key="MatterColorTemperatureLightFallback",
name=None,
),


@@ -3,6 +3,7 @@
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from typing import Any
from chip.clusters import Objects as clusters
@@ -19,7 +20,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import LOGGER
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -52,6 +53,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.LOCK, async_add_entities)
@dataclass(frozen=True)
class MatterLockEntityDescription(LockEntityDescription, MatterEntityDescription):
"""Describe Matter Lock entities."""
class MatterLock(MatterEntity, LockEntity):
"""Representation of a Matter lock."""
@@ -254,7 +260,7 @@ class MatterLock(MatterEntity, LockEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.LOCK,
entity_description=LockEntityDescription(
entity_description=MatterLockEntityDescription(
key="MatterLock",
name=None,
),


@@ -11,7 +11,8 @@ from matter_server.client.models.device_types import DeviceType
from matter_server.client.models.node import MatterEndpoint
from homeassistant.const import Platform
from homeassistant.helpers.entity import EntityDescription
from .entity import MatterEntityDescription
type SensorValueTypes = type[
clusters.uint | int | clusters.Nullable | clusters.float32 | float
@@ -54,7 +55,7 @@ class MatterEntityInfo:
attributes_to_watch: list[type[ClusterAttributeDescriptor]]
# the entity description to use
entity_description: EntityDescription
entity_description: MatterEntityDescription
# entity class to use to instantiate the entity
entity_class: type
@@ -80,7 +81,7 @@ class MatterDiscoverySchema:
platform: Platform
# platform-specific entity description
entity_description: EntityDescription
entity_description: MatterEntityDescription
# entity class to use to instantiate the entity
entity_class: type


@@ -42,6 +42,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.SWITCH, async_add_entities)
@dataclass(frozen=True)
class MatterSwitchEntityDescription(SwitchEntityDescription, MatterEntityDescription):
"""Describe Matter Switch entities."""
class MatterSwitch(MatterEntity, SwitchEntity):
"""Representation of a Matter switch."""
@@ -168,7 +173,7 @@ class MatterNumericSwitch(MatterSwitch):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.SWITCH,
entity_description=SwitchEntityDescription(
entity_description=MatterSwitchEntityDescription(
key="MatterPlug",
device_class=SwitchDeviceClass.OUTLET,
name=None,
@@ -179,7 +184,7 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.SWITCH,
entity_description=SwitchEntityDescription(
entity_description=MatterSwitchEntityDescription(
key="MatterPowerToggle",
device_class=SwitchDeviceClass.SWITCH,
translation_key="power",
@@ -207,7 +212,7 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.SWITCH,
entity_description=SwitchEntityDescription(
entity_description=MatterSwitchEntityDescription(
key="MatterSwitch",
device_class=SwitchDeviceClass.OUTLET,
name=None,


@@ -25,7 +25,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.restore_state import ExtraStoredData
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -67,6 +67,11 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.UPDATE, async_add_entities)
@dataclass(frozen=True)
class MatterUpdateEntityDescription(UpdateEntityDescription, MatterEntityDescription):
"""Describe Matter Update entities."""
class MatterUpdate(MatterEntity, UpdateEntity):
"""Representation of a Matter node capable of updating."""
@@ -250,7 +255,7 @@ class MatterUpdate(MatterEntity, UpdateEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.UPDATE,
entity_description=UpdateEntityDescription(
entity_description=MatterUpdateEntityDescription(
key="MatterUpdate", device_class=UpdateDeviceClass.FIRMWARE
),
entity_class=MatterUpdate,


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
from typing import TYPE_CHECKING, Any
@@ -20,7 +21,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -58,6 +59,13 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.VACUUM, async_add_entities)
@dataclass(frozen=True)
class MatterStateVacuumEntityDescription(
StateVacuumEntityDescription, MatterEntityDescription
):
"""Describe Matter Vacuum entities."""
class MatterVacuum(MatterEntity, StateVacuumEntity):
"""Representation of a Matter Vacuum cleaner entity."""
@@ -65,7 +73,7 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
_supported_run_modes: (
dict[int, clusters.RvcRunMode.Structs.ModeOptionStruct] | None
) = None
entity_description: StateVacuumEntityDescription
entity_description: MatterStateVacuumEntityDescription
_platform_translation_key = "vacuum"
def _get_run_mode_by_tag(
@@ -212,7 +220,7 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.VACUUM,
entity_description=StateVacuumEntityDescription(
entity_description=MatterStateVacuumEntityDescription(
key="MatterVacuumCleaner", name=None
),
entity_class=MatterVacuum,


@@ -2,6 +2,8 @@
from __future__ import annotations
from dataclasses import dataclass
from chip.clusters import Objects as clusters
from matter_server.client.models import device_types
@@ -16,7 +18,7 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -34,11 +36,16 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.VALVE, async_add_entities)
@dataclass(frozen=True)
class MatterValveEntityDescription(ValveEntityDescription, MatterEntityDescription):
"""Describe Matter Valve entities."""
class MatterValve(MatterEntity, ValveEntity):
"""Representation of a Matter Valve."""
_feature_map: int | None = None
entity_description: ValveEntityDescription
entity_description: MatterValveEntityDescription
_platform_translation_key = "valve"
async def async_open_valve(self) -> None:
@@ -128,7 +135,7 @@ class MatterValve(MatterEntity, ValveEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.VALVE,
entity_description=ValveEntityDescription(
entity_description=MatterValveEntityDescription(
key="MatterValve",
device_class=ValveDeviceClass.WATER,
name=None,


@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any, cast
from chip.clusters import Objects as clusters
@@ -26,7 +27,7 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .entity import MatterEntity
from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
@@ -50,6 +51,13 @@ async def async_setup_entry(
matter.register_platform_handler(Platform.WATER_HEATER, async_add_entities)
@dataclass(frozen=True)
class MatterWaterHeaterEntityDescription(
WaterHeaterEntityDescription, MatterEntityDescription
):
"""Describe Matter Water Heater entities."""
class MatterWaterHeater(MatterEntity, WaterHeaterEntity):
"""Representation of a Matter WaterHeater entity."""
@@ -171,7 +179,7 @@ class MatterWaterHeater(MatterEntity, WaterHeaterEntity):
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.WATER_HEATER,
entity_description=WaterHeaterEntityDescription(
entity_description=MatterWaterHeaterEntityDescription(
key="MatterWaterHeater",
name=None,
),


@@ -25,6 +25,7 @@ from homeassistant.const import UnitOfEnergy
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util, slugify
from homeassistant.util.unit_conversion import EnergyConverter
from .const import DOMAIN
@@ -156,6 +157,7 @@ class MillHistoricDataUpdateCoordinator(DataUpdateCoordinator):
name=f"{heater.name}",
source=DOMAIN,
statistic_id=statistic_id,
unit_class=EnergyConverter.UNIT_CLASS,
unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
)
async_add_external_statistics(self.hass, metadata, statistics)


@@ -458,6 +458,7 @@ SUBENTRY_PLATFORMS = [
Platform.LOCK,
Platform.NOTIFY,
Platform.NUMBER,
Platform.SELECT,
Platform.SENSOR,
Platform.SWITCH,
]
@@ -1141,6 +1142,7 @@ ENTITY_CONFIG_VALIDATOR: dict[
Platform.LOCK.value: None,
Platform.NOTIFY.value: None,
Platform.NUMBER.value: validate_number_platform_config,
Platform.SELECT: None,
Platform.SENSOR.value: validate_sensor_platform_config,
Platform.SWITCH.value: None,
}
@@ -1367,6 +1369,7 @@ PLATFORM_ENTITY_FIELDS: dict[str, dict[str, PlatformField]] = {
custom_filtering=True,
),
},
Platform.SELECT.value: {},
Platform.SENSOR.value: {
CONF_DEVICE_CLASS: PlatformField(
selector=SENSOR_DEVICE_CLASS_SELECTOR, required=False
@@ -3103,6 +3106,34 @@ PLATFORM_MQTT_FIELDS: dict[str, dict[str, PlatformField]] = {
),
CONF_RETAIN: PlatformField(selector=BOOLEAN_SELECTOR, required=False),
},
Platform.SELECT.value: {
CONF_COMMAND_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=True,
validator=valid_publish_topic,
error="invalid_publish_topic",
),
CONF_COMMAND_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=validate(cv.template),
error="invalid_template",
),
CONF_STATE_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=False,
validator=valid_subscribe_topic,
error="invalid_subscribe_topic",
),
CONF_VALUE_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=validate(cv.template),
error="invalid_template",
),
CONF_OPTIONS: PlatformField(selector=OPTIONS_SELECTOR, required=True),
CONF_RETAIN: PlatformField(selector=BOOLEAN_SELECTOR, required=False),
},
Platform.SENSOR.value: {
CONF_STATE_TOPIC: PlatformField(
selector=TEXT_SELECTOR,


@@ -346,6 +346,7 @@
"mode_state_template": "Operation mode value template",
"on_command_type": "ON command type",
"optimistic": "Optimistic",
"options": "Set options",
"payload_off": "Payload \"off\"",
"payload_on": "Payload \"on\"",
"payload_press": "Payload \"press\"",
@@ -393,6 +394,7 @@
"mode_state_template": "Defines a [template](https://www.home-assistant.io/docs/configuration/templating/#using-value-templates-with-mqtt) to extract the operation mode state. [Learn more.]({url}#mode_state_template)",
"on_command_type": "Defines when the payload \"on\" is sent. Using \"Last\" (the default) will send any style (brightness, color, etc) topics first and then a payload \"on\" to the command topic. Using \"First\" will send the payload \"on\" and then any style topics. Using \"Brightness\" will only send brightness commands instead of the payload \"on\" to turn the light on.",
"optimistic": "Flag that defines if the {platform} entity works in optimistic mode. [Learn more.]({url}#optimistic)",
"options": "List of options that can be selected.",
"payload_off": "The payload that represents the \"off\" state.",
"payload_on": "The payload that represents the \"on\" state.",
"payload_press": "The payload to send when the button is triggered.",
@@ -1334,6 +1336,7 @@
"lock": "[%key:component::lock::title%]",
"notify": "[%key:component::notify::title%]",
"number": "[%key:component::number::title%]",
"select": "[%key:component::select::title%]",
"sensor": "[%key:component::sensor::title%]",
"switch": "[%key:component::switch::title%]"
}


@@ -13,6 +13,7 @@ import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA as SENSOR_PLATFORM_SCHEMA,
SensorDeviceClass,
SensorEntity,
)
from homeassistant.config_entries import SOURCE_IMPORT
@@ -128,10 +129,13 @@ async def async_setup_entry(
subentry.data[CONF_NAME],
subentry.data[CONF_FROM],
subentry.data[CONF_TO],
subentry.subentry_id,
subentry.data.get(CONF_VIA),
parse_time(subentry.data[CONF_TIME])
if CONF_TIME in subentry.data
else None,
(
parse_time(subentry.data[CONF_TIME])
if CONF_TIME in subentry.data
else None
),
)
],
config_subentry_id=subentry.subentry_id,
@@ -142,6 +146,7 @@ async def async_setup_entry(
class NSDepartureSensor(SensorEntity):
"""Implementation of a NS Departure Sensor."""
_attr_device_class = SensorDeviceClass.TIMESTAMP
_attr_attribution = "Data provided by NS"
_attr_icon = "mdi:train"
@@ -151,6 +156,7 @@ class NSDepartureSensor(SensorEntity):
name: str,
departure: str,
heading: str,
subentry_id: str,
via: str | None,
time: dt.time | None,
) -> None:
@@ -161,21 +167,16 @@ class NSDepartureSensor(SensorEntity):
self._via = via
self._heading = heading
self._time = time
self._state: str | None = None
self._trips: list[Trip] | None = None
self._first_trip: Trip | None = None
self._next_trip: Trip | None = None
self._attr_unique_id = f"{subentry_id}-actual_departure"
@property
def name(self) -> str:
"""Return the name of the sensor."""
return self._name
@property
def native_value(self) -> str | None:
"""Return the next departure time."""
return self._state
@property
def extra_state_attributes(self) -> dict[str, Any] | None:
"""Return the state attributes."""
@@ -269,7 +270,7 @@ class NSDepartureSensor(SensorEntity):
(datetime.now() + timedelta(minutes=30)).time() < self._time
or (datetime.now() - timedelta(minutes=30)).time() > self._time
):
self._state = None
self._attr_native_value = None
self._trips = None
self._first_trip = None
return
@@ -309,7 +310,7 @@ class NSDepartureSensor(SensorEntity):
if len(filtered_times) > 0:
sorted_times = sorted(filtered_times, key=lambda x: x[1])
self._first_trip = self._trips[sorted_times[0][0]]
self._state = sorted_times[0][1].strftime("%H:%M")
self._attr_native_value = sorted_times[0][1]
# Filter again to remove trains that leave at the exact same time.
filtered_times = [
@@ -326,7 +327,7 @@ class NSDepartureSensor(SensorEntity):
else:
self._first_trip = None
self._state = None
self._attr_native_value = None
except (
requests.exceptions.ConnectionError,

View File

@@ -10,13 +10,13 @@ from pynintendoparental.exceptions import (
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_SESSION_TOKEN, DOMAIN
from .coordinator import NintendoParentalConfigEntry, NintendoUpdateCoordinator
_PLATFORMS: list[Platform] = [Platform.SENSOR]
_PLATFORMS: list[Platform] = [Platform.SENSOR, Platform.TIME]
async def async_setup_entry(
@@ -31,7 +31,7 @@ async def async_setup_entry(
client_session=async_get_clientsession(hass),
)
except (InvalidSessionTokenException, InvalidOAuthConfigurationException) as err:
raise ConfigEntryError(
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="auth_expired",
) from err

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import TYPE_CHECKING, Any
@@ -59,3 +60,41 @@ class NintendoConfigFlow(ConfigFlow, domain=DOMAIN):
data_schema=vol.Schema({vol.Required(CONF_API_TOKEN): str}),
errors=errors,
)
async def async_step_reauth(
self, user_input: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauthentication on an API error."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm reauth dialog."""
errors: dict[str, str] = {}
reauth_entry = self._get_reauth_entry()
if self.auth is None:
self.auth = Authenticator.generate_login(
client_session=async_get_clientsession(self.hass)
)
if user_input is not None:
try:
await self.auth.complete_login(
self.auth, user_input[CONF_API_TOKEN], False
)
except (ValueError, InvalidSessionTokenException, HttpException):
errors["base"] = "invalid_auth"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data={
**reauth_entry.data,
CONF_SESSION_TOKEN: self.auth.get_session_token,
},
)
return self.async_show_form(
step_id="reauth_confirm",
description_placeholders={"link": self.auth.login_url},
data_schema=vol.Schema({vol.Required(CONF_API_TOKEN): str}),
errors=errors,
)

View File

@@ -3,3 +3,7 @@
DOMAIN = "nintendo_parental"
CONF_UPDATE_INTERVAL = "update_interval"
CONF_SESSION_TOKEN = "session_token"
BEDTIME_ALARM_MIN = "16:00"
BEDTIME_ALARM_MAX = "23:00"
BEDTIME_ALARM_DISABLE = "00:00"

View File

@@ -9,6 +9,15 @@
"data_description": {
"api_token": "The link copied from the Nintendo website"
}
},
"reauth_confirm": {
"description": "To obtain your access token, click [Nintendo Login]({link}) to sign in to your Nintendo account. Then, for the account you want to link, right-click on the red **Select this person** button and choose **Copy Link Address**.",
"data": {
"api_token": "Access token"
},
"data_description": {
"api_token": "The link copied from the Nintendo website"
}
}
},
"error": {
@@ -17,7 +26,8 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
}
},
"entity": {
@@ -28,11 +38,19 @@
"time_remaining": {
"name": "Screen time remaining"
}
},
"time": {
"bedtime_alarm": {
"name": "Bedtime alarm"
}
}
},
"exceptions": {
"auth_expired": {
"message": "Authentication expired. Please remove and re-add the integration to reconnect."
"message": "Authentication token expired."
},
"bedtime_alarm_out_of_range": {
"message": "{value} not accepted. Bedtime Alarm must be between {bedtime_alarm_min} and {bedtime_alarm_max}. To disable, set to {bedtime_alarm_disable}."
}
}
}

View File

@@ -0,0 +1,100 @@
"""Time platform for Nintendo Parental."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from datetime import time
from enum import StrEnum
import logging
from typing import Any
from pynintendoparental.exceptions import BedtimeOutOfRangeError
from homeassistant.components.time import TimeEntity, TimeEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import BEDTIME_ALARM_DISABLE, BEDTIME_ALARM_MAX, BEDTIME_ALARM_MIN, DOMAIN
from .coordinator import NintendoParentalConfigEntry, NintendoUpdateCoordinator
from .entity import Device, NintendoDevice
_LOGGER = logging.getLogger(__name__)
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
class NintendoParentalTime(StrEnum):
"""Store keys for Nintendo Parental time."""
BEDTIME_ALARM = "bedtime_alarm"
@dataclass(kw_only=True, frozen=True)
class NintendoParentalTimeEntityDescription(TimeEntityDescription):
"""Description for Nintendo Parental time entities."""
value_fn: Callable[[Device], time | None]
set_value_fn: Callable[[Device, time], Coroutine[Any, Any, None]]
TIME_DESCRIPTIONS: tuple[NintendoParentalTimeEntityDescription, ...] = (
NintendoParentalTimeEntityDescription(
key=NintendoParentalTime.BEDTIME_ALARM,
translation_key=NintendoParentalTime.BEDTIME_ALARM,
value_fn=lambda device: device.bedtime_alarm,
set_value_fn=lambda device, value: device.set_bedtime_alarm(value=value),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: NintendoParentalConfigEntry,
async_add_devices: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the time platform."""
async_add_devices(
NintendoParentalTimeEntity(entry.runtime_data, device, entity)
for device in entry.runtime_data.api.devices.values()
for entity in TIME_DESCRIPTIONS
)
class NintendoParentalTimeEntity(NintendoDevice, TimeEntity):
"""Represent a single time entity."""
entity_description: NintendoParentalTimeEntityDescription
def __init__(
self,
coordinator: NintendoUpdateCoordinator,
device: Device,
description: NintendoParentalTimeEntityDescription,
) -> None:
"""Initialize the time entity."""
super().__init__(coordinator=coordinator, device=device, key=description.key)
self.entity_description = description
@property
def native_value(self) -> time | None:
"""Return the time."""
return self.entity_description.value_fn(self._device)
async def async_set_value(self, value: time) -> None:
"""Update the value."""
try:
await self.entity_description.set_value_fn(self._device, value)
except BedtimeOutOfRangeError as exc:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="bedtime_alarm_out_of_range",
translation_placeholders={
"value": value.strftime("%H:%M"),
"bedtime_alarm_max": BEDTIME_ALARM_MAX,
"bedtime_alarm_min": BEDTIME_ALARM_MIN,
"bedtime_alarm_disable": BEDTIME_ALARM_DISABLE,
},
) from exc
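The range check behind `BedtimeOutOfRangeError` can be sketched from the constants added to `const.py` above (16:00–23:00, with 00:00 disabling the alarm). This is an assumption about the behavior of `pynintendoparental`, which performs the actual validation:

```python
from datetime import time

BEDTIME_ALARM_MIN = "16:00"
BEDTIME_ALARM_MAX = "23:00"
BEDTIME_ALARM_DISABLE = "00:00"


def bedtime_in_range(value: time) -> bool:
    """Mirror the range the integration reports in its error message."""
    if value.strftime("%H:%M") == BEDTIME_ALARM_DISABLE:
        return True  # 00:00 is the sentinel for "disabled"
    lo = time.fromisoformat(BEDTIME_ALARM_MIN)
    hi = time.fromisoformat(BEDTIME_ALARM_MAX)
    return lo <= value <= hi
```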

View File

@@ -124,7 +124,7 @@ class NumberDeviceClass(StrEnum):
CO = "carbon_monoxide"
"""Carbon Monoxide gas concentration.
Unit of measurement: `ppm` (parts per million)
Unit of measurement: `ppm` (parts per million), `mg/m³`
"""
CO2 = "carbon_dioxide"
@@ -475,7 +475,10 @@ DEVICE_CLASS_UNITS: dict[NumberDeviceClass, set[type[StrEnum] | str | None]] = {
NumberDeviceClass.ATMOSPHERIC_PRESSURE: set(UnitOfPressure),
NumberDeviceClass.BATTERY: {PERCENTAGE},
NumberDeviceClass.BLOOD_GLUCOSE_CONCENTRATION: set(UnitOfBloodGlucoseConcentration),
NumberDeviceClass.CO: {CONCENTRATION_PARTS_PER_MILLION},
NumberDeviceClass.CO: {
CONCENTRATION_PARTS_PER_MILLION,
CONCENTRATION_MILLIGRAMS_PER_CUBIC_METER,
},
NumberDeviceClass.CO2: {CONCENTRATION_PARTS_PER_MILLION},
NumberDeviceClass.CONDUCTIVITY: set(UnitOfConductivity),
NumberDeviceClass.CURRENT: set(UnitOfElectricCurrent),
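Allowing `mg/m³` alongside `ppm` for CO implies a molar-mass-based conversion between the two units. A standalone sketch of the standard formula, assuming CO's molar mass of 28.01 g/mol and a molar volume of 24.45 L/mol (25 °C, 1 atm); the values Home Assistant actually uses live in `homeassistant.util.unit_conversion` and may differ:

```python
CO_MOLAR_MASS = 28.01  # g/mol, carbon monoxide
MOLAR_VOLUME = 24.45   # L/mol of ideal gas at 25 degrees C, 1 atm


def co_ppm_to_mg_per_m3(ppm: float) -> float:
    """Convert a CO concentration from ppm to mg/m3."""
    return ppm * CO_MOLAR_MASS / MOLAR_VOLUME
```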

View File

@@ -40,7 +40,10 @@ class OpenRouterAITaskEntity(
"""OpenRouter AI Task entity."""
_attr_name = None
_attr_supported_features = ai_task.AITaskEntityFeature.GENERATE_DATA
_attr_supported_features = (
ai_task.AITaskEntityFeature.GENERATE_DATA
| ai_task.AITaskEntityFeature.SUPPORT_ATTACHMENTS
)
async def _async_generate_data(
self,

View File

@@ -2,13 +2,17 @@
from __future__ import annotations
import base64
from collections.abc import AsyncGenerator, Callable
import json
from mimetypes import guess_file_type
from pathlib import Path
from typing import TYPE_CHECKING, Any, Literal
import openai
from openai.types.chat import (
ChatCompletionAssistantMessageParam,
ChatCompletionContentPartImageParam,
ChatCompletionFunctionToolParam,
ChatCompletionMessage,
ChatCompletionMessageFunctionToolCallParam,
@@ -26,6 +30,7 @@ from voluptuous_openapi import convert
from homeassistant.components import conversation
from homeassistant.config_entries import ConfigSubentry
from homeassistant.const import CONF_MODEL
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
@@ -165,6 +170,43 @@ async def _transform_response(
yield data
async def async_prepare_files_for_prompt(
hass: HomeAssistant, files: list[tuple[Path, str | None]]
) -> list[ChatCompletionContentPartImageParam]:
"""Append files to a prompt.
Caller needs to ensure that the files are allowed.
"""
def append_files_to_content() -> list[ChatCompletionContentPartImageParam]:
content: list[ChatCompletionContentPartImageParam] = []
for file_path, mime_type in files:
if not file_path.exists():
raise HomeAssistantError(f"`{file_path}` does not exist")
if mime_type is None:
mime_type = guess_file_type(file_path)[0]
if not mime_type or not mime_type.startswith(("image/", "application/pdf")):
raise HomeAssistantError(
"Only images and PDF are supported by the OpenRouter API, "
f"`{file_path}` is not an image file or PDF"
)
base64_file = base64.b64encode(file_path.read_bytes()).decode("utf-8")
content.append(
{
"type": "image_url",
"image_url": {"url": f"data:{mime_type};base64,{base64_file}"},
}
)
return content
return await hass.async_add_executor_job(append_files_to_content)
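The encoding step inside `append_files_to_content` produces an RFC 2397 data URL. Isolated as a hypothetical helper for illustration:

```python
import base64


def to_data_url(mime_type: str, payload: bytes) -> str:
    """Build a data URL like the one embedded in the image_url part above."""
    b64 = base64.b64encode(payload).decode("utf-8")
    return f"data:{mime_type};base64,{b64}"
```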
class OpenRouterEntity(Entity):
"""Base entity for Open Router."""
@@ -216,6 +258,24 @@ class OpenRouterEntity(Entity):
if (m := _convert_content_to_chat_message(content))
]
last_content = chat_log.content[-1]
# Handle attachments by adding them to the last user message
if last_content.role == "user" and last_content.attachments:
last_message: ChatCompletionMessageParam = model_args["messages"][-1]
assert last_message["role"] == "user" and isinstance(
last_message["content"], str
)
# Encode files with base64 and append them to the text prompt
files = await async_prepare_files_for_prompt(
self.hass,
[(a.path, a.mime_type) for a in last_content.attachments],
)
last_message["content"] = [
{"type": "text", "text": last_message["content"]},
*files,
]
if structure:
if TYPE_CHECKING:
assert structure_name is not None

View File

@@ -1,4 +1,29 @@
{
"entity": {
"sensor": {
"clouds": {
"default": "mdi:weather-cloudy"
},
"precipitation_kind": {
"default": "mdi:weather-snowy-rainy"
},
"rain": {
"default": "mdi:weather-rainy"
},
"snow": {
"default": "mdi:weather-snowy"
},
"uv_index": {
"default": "mdi:weather-sunny"
},
"visibility_distance": {
"default": "mdi:eye"
},
"weather_code": {
"default": "mdi:barcode"
}
}
},
"services": {
"get_minute_forecast": {
"service": "mdi:weather-snowy-rainy"

View File

@@ -64,60 +64,55 @@ from .coordinator import OWMUpdateCoordinator
WEATHER_SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
SensorEntityDescription(
key=ATTR_API_WEATHER,
name="Weather",
translation_key=ATTR_API_WEATHER,
),
SensorEntityDescription(
key=ATTR_API_DEW_POINT,
name="Dew Point",
translation_key=ATTR_API_DEW_POINT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_TEMPERATURE,
name="Temperature",
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_FEELS_LIKE_TEMPERATURE,
name="Feels like temperature",
translation_key=ATTR_API_FEELS_LIKE_TEMPERATURE,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_WIND_SPEED,
name="Wind speed",
native_unit_of_measurement=UnitOfSpeed.METERS_PER_SECOND,
device_class=SensorDeviceClass.WIND_SPEED,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_WIND_GUST,
name="Wind gust",
translation_key=ATTR_API_WIND_GUST,
native_unit_of_measurement=UnitOfSpeed.METERS_PER_SECOND,
device_class=SensorDeviceClass.WIND_SPEED,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_WIND_BEARING,
name="Wind bearing",
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT_ANGLE,
device_class=SensorDeviceClass.WIND_DIRECTION,
),
SensorEntityDescription(
key=ATTR_API_HUMIDITY,
name="Humidity",
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.HUMIDITY,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_PRESSURE,
name="Pressure",
native_unit_of_measurement=UnitOfPressure.HPA,
device_class=SensorDeviceClass.PRESSURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -125,37 +120,37 @@ WEATHER_SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
),
SensorEntityDescription(
key=ATTR_API_CLOUDS,
name="Cloud coverage",
translation_key=ATTR_API_CLOUDS,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_RAIN,
name="Rain",
translation_key=ATTR_API_RAIN,
native_unit_of_measurement=UnitOfVolumetricFlux.MILLIMETERS_PER_HOUR,
device_class=SensorDeviceClass.PRECIPITATION_INTENSITY,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_SNOW,
name="Snow",
translation_key=ATTR_API_SNOW,
native_unit_of_measurement=UnitOfVolumetricFlux.MILLIMETERS_PER_HOUR,
device_class=SensorDeviceClass.PRECIPITATION_INTENSITY,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_PRECIPITATION_KIND,
name="Precipitation kind",
translation_key=ATTR_API_PRECIPITATION_KIND,
),
SensorEntityDescription(
key=ATTR_API_UV_INDEX,
name="UV Index",
translation_key=ATTR_API_UV_INDEX,
native_unit_of_measurement=UV_INDEX,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=ATTR_API_VISIBILITY_DISTANCE,
name="Visibility",
translation_key=ATTR_API_VISIBILITY_DISTANCE,
native_unit_of_measurement=UnitOfLength.METERS,
device_class=SensorDeviceClass.DISTANCE,
state_class=SensorStateClass.MEASUREMENT,
@@ -163,11 +158,11 @@ WEATHER_SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
),
SensorEntityDescription(
key=ATTR_API_CONDITION,
name="Condition",
translation_key=ATTR_API_CONDITION,
),
SensorEntityDescription(
key=ATTR_API_WEATHER_CODE,
name="Weather Code",
translation_key=ATTR_API_WEATHER_CODE,
),
)

View File

@@ -41,6 +41,46 @@
}
}
},
"entity": {
"sensor": {
"dew_point": {
"name": "[%key:component::weather::entity_component::_::state_attributes::dew_point::name%]"
},
"feels_like_temperature": {
"name": "[%key:component::weather::entity_component::_::state_attributes::apparent_temperature::name%]"
},
"wind_gust": {
"name": "[%key:component::weather::entity_component::_::state_attributes::wind_gust_speed::name%]"
},
"clouds": {
"name": "[%key:component::weather::entity_component::_::state_attributes::cloud_coverage::name%]"
},
"rain": {
"name": "Rain intensity"
},
"snow": {
"name": "Snow intensity"
},
"precipitation_kind": {
"name": "Precipitation kind"
},
"uv_index": {
"name": "[%key:component::weather::entity_component::_::state_attributes::uv_index::name%]"
},
"visibility_distance": {
"name": "[%key:component::weather::entity_component::_::state_attributes::visibility::name%]"
},
"condition": {
"name": "Condition"
},
"weather": {
"name": "[%key:component::weather::title%]"
},
"weather_code": {
"name": "Weather code"
}
}
},
"issues": {
"deprecated_v25": {
"title": "OpenWeatherMap API V2.5 deprecated",

View File

@@ -35,6 +35,7 @@ from homeassistant.helpers import issue_registry as ir
from homeassistant.helpers.aiohttp_client import async_create_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter, VolumeConverter
from .const import CONF_LOGIN_DATA, CONF_TOTP_SECRET, CONF_UTILITY, DOMAIN
@@ -149,6 +150,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
name=f"{name_prefix} cost",
source=DOMAIN,
statistic_id=cost_statistic_id,
unit_class=None,
unit_of_measurement=None,
)
compensation_metadata = StatisticMetaData(
@@ -157,8 +159,14 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
name=f"{name_prefix} compensation",
source=DOMAIN,
statistic_id=compensation_statistic_id,
unit_class=None,
unit_of_measurement=None,
)
consumption_unit_class = (
EnergyConverter.UNIT_CLASS
if account.meter_type == MeterType.ELEC
else VolumeConverter.UNIT_CLASS
)
consumption_unit = (
UnitOfEnergy.KILO_WATT_HOUR
if account.meter_type == MeterType.ELEC
@@ -170,6 +178,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
name=f"{name_prefix} consumption",
source=DOMAIN,
statistic_id=consumption_statistic_id,
unit_class=consumption_unit_class,
unit_of_measurement=consumption_unit,
)
return_metadata = StatisticMetaData(
@@ -178,6 +187,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
name=f"{name_prefix} return",
source=DOMAIN,
statistic_id=return_statistic_id,
unit_class=consumption_unit_class,
unit_of_measurement=consumption_unit,
)

View File

@@ -75,6 +75,9 @@ async def _title(hass: HomeAssistant, discovery_info: HassioServiceInfo) -> str:
if device and ("Connect_ZBT-1" in device or "SkyConnect" in device):
return f"Home Assistant Connect ZBT-1 ({discovery_info.name})"
if device and "Nabu_Casa_ZBT-2" in device:
return f"Home Assistant Connect ZBT-2 ({discovery_info.name})"
return discovery_info.name

View File

@@ -7,11 +7,13 @@ from typing import Any
from pooldose.client import PooldoseClient
from pooldose.request_status import RequestStatus
from pooldose.type_definitions import APIVersionResponse
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST, CONF_MAC
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from .const import DOMAIN
@@ -38,9 +40,9 @@ class PooldoseConfigFlow(ConfigFlow, domain=DOMAIN):
async def _validate_host(
self, host: str
) -> tuple[str | None, dict[str, str] | None, dict[str, str] | None]:
) -> tuple[str | None, APIVersionResponse | None, dict[str, str] | None]:
"""Validate the host and return (serial_number, api_versions, errors)."""
client = PooldoseClient(host)
client = PooldoseClient(host, websession=async_get_clientsession(self.hass))
client_status = await client.connect()
if client_status == RequestStatus.HOST_UNREACHABLE:
return None, None, {"base": "cannot_connect"}
@@ -124,7 +126,14 @@ class PooldoseConfigFlow(ConfigFlow, domain=DOMAIN):
step_id="user",
data_schema=SCHEMA_DEVICE,
errors=errors,
description_placeholders=api_versions,
# API version details are only relevant for error display; pass the
# placeholders when available, or None so no version info is shown
description_placeholders={
"api_version_is": api_versions.get("api_version_is") or "",
"api_version_should": api_versions.get("api_version_should") or "",
}
if api_versions
else None,
)
await self.async_set_unique_id(serial_number, raise_on_progress=False)

View File

@@ -4,10 +4,10 @@ from __future__ import annotations
from datetime import timedelta
import logging
from typing import Any
from pooldose.client import PooldoseClient
from pooldose.request_status import RequestStatus
from pooldose.type_definitions import DeviceInfoDict, StructuredValuesDict
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -18,10 +18,10 @@ _LOGGER = logging.getLogger(__name__)
type PooldoseConfigEntry = ConfigEntry[PooldoseCoordinator]
class PooldoseCoordinator(DataUpdateCoordinator[dict[str, Any]]):
class PooldoseCoordinator(DataUpdateCoordinator[StructuredValuesDict]):
"""Coordinator for PoolDose integration."""
device_info: dict[str, Any]
device_info: DeviceInfoDict
config_entry: PooldoseConfigEntry
def __init__(
@@ -46,7 +46,7 @@ class PooldoseCoordinator(DataUpdateCoordinator[dict[str, Any]]):
self.device_info = self.client.device_info
_LOGGER.debug("Device info: %s", self.device_info)
async def _async_update_data(self) -> dict[str, Any]:
async def _async_update_data(self) -> StructuredValuesDict:
"""Fetch data from the PoolDose API."""
try:
status, instant_values = await self.client.instant_values_structured()
@@ -62,7 +62,7 @@ class PooldoseCoordinator(DataUpdateCoordinator[dict[str, Any]]):
if status != RequestStatus.SUCCESS:
raise UpdateFailed(f"API returned status: {status}")
if instant_values is None:
if not instant_values:
raise UpdateFailed("No data received from API")
_LOGGER.debug("Instant values structured: %s", instant_values)

View File

@@ -2,7 +2,9 @@
from __future__ import annotations
from typing import Any
from typing import Literal
from pooldose.type_definitions import DeviceInfoDict, ValueDict
from homeassistant.const import CONF_MAC
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
@@ -14,13 +16,13 @@ from .coordinator import PooldoseCoordinator
def device_info(
info: dict | None, unique_id: str, mac: str | None = None
info: DeviceInfoDict | None, unique_id: str, mac: str | None = None
) -> DeviceInfo:
"""Create device info for PoolDose devices."""
if info is None:
info = {}
api_version = info.get("API_VERSION", "").removesuffix("/")
api_version = (info.get("API_VERSION") or "").removesuffix("/")
return DeviceInfo(
identifiers={(DOMAIN, unique_id)},
@@ -51,9 +53,9 @@ class PooldoseEntity(CoordinatorEntity[PooldoseCoordinator]):
self,
coordinator: PooldoseCoordinator,
serial_number: str,
device_properties: dict[str, Any],
device_properties: DeviceInfoDict,
entity_description: EntityDescription,
platform_name: str,
platform_name: Literal["sensor", "switch", "number", "binary_sensor", "select"],
) -> None:
"""Initialize PoolDose entity."""
super().__init__(coordinator)
@@ -66,18 +68,7 @@ class PooldoseEntity(CoordinatorEntity[PooldoseCoordinator]):
coordinator.config_entry.data.get(CONF_MAC),
)
@property
def available(self) -> bool:
"""Return True if the entity is available."""
if not super().available or self.coordinator.data is None:
return False
# Check if the entity type exists in coordinator data
platform_data = self.coordinator.data.get(self.platform_name, {})
return self.entity_description.key in platform_data
def get_data(self) -> dict | None:
def get_data(self) -> ValueDict | None:
"""Get data for this entity, only if available."""
if not self.available:
return None
platform_data = self.coordinator.data.get(self.platform_name, {})
platform_data = self.coordinator.data[self.platform_name]
return platform_data.get(self.entity_description.key)

View File

@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/pooldose",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["python-pooldose==0.5.0"]
"requirements": ["python-pooldose==0.7.0"]
}

View File

@@ -48,7 +48,7 @@ rules:
discovery: done
docs-data-update: done
docs-examples: todo
docs-known-limitations: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
@@ -72,5 +72,5 @@ rules:
# Platinum
async-dependency: done
inject-websession: todo
inject-websession: done
strict-typing: todo

View File

@@ -19,8 +19,6 @@ from .entity import PooldoseEntity
_LOGGER = logging.getLogger(__name__)
PLATFORM_NAME = "sensor"
SENSOR_DESCRIPTIONS: tuple[SensorEntityDescription, ...] = (
SensorEntityDescription(
key="temperature",
@@ -146,18 +144,16 @@ async def async_setup_entry(
assert config_entry.unique_id is not None
coordinator = config_entry.runtime_data
data = coordinator.data
sensor_data = coordinator.data["sensor"]
serial_number = config_entry.unique_id
sensor_data = data.get(PLATFORM_NAME, {}) if data else {}
async_add_entities(
PooldoseSensor(
coordinator,
serial_number,
coordinator.device_info,
description,
PLATFORM_NAME,
"sensor",
)
for description in SENSOR_DESCRIPTIONS
if description.key in sensor_data
@@ -171,16 +167,17 @@ class PooldoseSensor(PooldoseEntity, SensorEntity):
def native_value(self) -> float | int | str | None:
"""Return the current value of the sensor."""
data = self.get_data()
if isinstance(data, dict) and "value" in data:
if data is not None:
return data["value"]
return None
@property
def native_unit_of_measurement(self) -> str | None:
"""Return the unit of measurement."""
if self.entity_description.key == "temperature":
data = self.get_data()
if isinstance(data, dict) and "unit" in data and data["unit"] is not None:
return data["unit"] # °C or °F
if (
self.entity_description.key == "temperature"
and (data := self.get_data()) is not None
):
return data["unit"] # °C or °F
return super().native_unit_of_measurement

View File

@@ -2,3 +2,5 @@
DOMAIN = "portainer"
DEFAULT_NAME = "Portainer"
ENDPOINT_STATUS_DOWN = 2

View File

@@ -21,7 +21,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
from .const import DOMAIN, ENDPOINT_STATUS_DOWN
type PortainerConfigEntry = ConfigEntry[PortainerCoordinator]
@@ -110,6 +110,14 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
mapped_endpoints: dict[int, PortainerCoordinatorData] = {}
for endpoint in endpoints:
if endpoint.status == ENDPOINT_STATUS_DOWN:
_LOGGER.debug(
"Skipping offline endpoint: %s (ID: %d)",
endpoint.name,
endpoint.id,
)
continue
try:
containers = await self.portainer.get_containers(endpoint.id)
except PortainerConnectionError as err:

View File

@@ -1 +1,41 @@
"""The prowl component."""
import logging
import prowlpy
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from .const import DOMAIN, PLATFORMS
from .helpers import async_verify_key
_LOGGER = logging.getLogger(__name__)
CONFIG_SCHEMA = cv.platform_only_config_schema(DOMAIN)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up a Prowl service."""
try:
if not await async_verify_key(hass, entry.data[CONF_API_KEY]):
raise ConfigEntryError(
"Unable to validate Prowl API key (Key invalid or expired)"
)
except TimeoutError as ex:
raise ConfigEntryNotReady("API call to Prowl failed") from ex
except prowlpy.APIError as ex:
if str(ex).startswith("Not accepted: exceeded rate limit"):
raise ConfigEntryNotReady("Prowl API rate limit exceeded") from ex
raise ConfigEntryError("Failed to validate Prowl API key ({ex})") from ex
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File

@@ -0,0 +1,68 @@
"""The config flow for the Prowl component."""
from __future__ import annotations
import logging
from typing import Any
import prowlpy
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY, CONF_NAME
from .const import DOMAIN
from .helpers import async_verify_key
_LOGGER = logging.getLogger(__name__)
class ProwlConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for the Prowl component."""
VERSION = 1
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle user configuration."""
errors = {}
if user_input:
api_key = user_input[CONF_API_KEY]
self._async_abort_entries_match({CONF_API_KEY: api_key})
errors = await self._validate_api_key(api_key)
if not errors:
return self.async_create_entry(
title=user_input[CONF_NAME],
data={
CONF_API_KEY: api_key,
},
)
return self.async_show_form(
step_id="user",
data_schema=self.add_suggested_values_to_schema(
vol.Schema(
{
vol.Required(CONF_API_KEY): str,
vol.Required(CONF_NAME): str,
},
),
user_input or {CONF_NAME: "Prowl"},
),
errors=errors,
)
async def _validate_api_key(self, api_key: str) -> dict[str, str]:
"""Validate the provided API key."""
ret = {}
try:
if not await async_verify_key(self.hass, api_key):
ret = {"base": "invalid_api_key"}
except TimeoutError:
ret = {"base": "api_timeout"}
except prowlpy.APIError:
ret = {"base": "bad_api_response"}
return ret

View File

@@ -1,3 +1,6 @@
"""Constants for the Prowl Notification service."""
from homeassistant.const import Platform
DOMAIN = "prowl"
PLATFORMS = [Platform.NOTIFY]

View File

@@ -0,0 +1,21 @@
"""Helper functions for Prowl."""
import asyncio
from functools import partial
import prowlpy
from homeassistant.core import HomeAssistant
async def async_verify_key(hass: HomeAssistant, api_key: str) -> bool:
"""Validate API key."""
prowl = await hass.async_add_executor_job(partial(prowlpy.Prowl, api_key))
try:
async with asyncio.timeout(10):
await hass.async_add_executor_job(prowl.verify_key)
return True
except prowlpy.APIError as ex:
if str(ex).startswith("Invalid API key"):
return False
raise

View File

@@ -2,6 +2,7 @@
"domain": "prowl",
"name": "Prowl",
"codeowners": [],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/prowl",
"integration_type": "service",
"iot_class": "cloud_push",

View File

@@ -16,11 +16,14 @@ from homeassistant.components.notify import (
ATTR_TITLE_DEFAULT,
PLATFORM_SCHEMA as NOTIFY_PLATFORM_SCHEMA,
BaseNotificationService,
NotifyEntity,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
_LOGGER = logging.getLogger(__name__)
@@ -34,19 +37,31 @@ async def async_get_service(
discovery_info: DiscoveryInfoType | None = None,
) -> ProwlNotificationService:
"""Get the Prowl notification service."""
return await hass.async_add_executor_job(
partial(ProwlNotificationService, hass, config[CONF_API_KEY])
)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the notify entities."""
prowl = ProwlNotificationEntity(hass, entry.title, entry.data[CONF_API_KEY])
async_add_entities([prowl])
class ProwlNotificationService(BaseNotificationService):
"""Implement the notification service for Prowl."""
"""Implement the notification service for Prowl.
def __init__(self, hass: HomeAssistant, prowl: prowlpy.Prowl) -> None:
This class is used for legacy configuration via configuration.yaml
"""
def __init__(self, hass: HomeAssistant, api_key: str) -> None:
"""Initialize the service."""
self._hass = hass
self._prowl = prowl
self._prowl = prowlpy.Prowl(api_key)
async def async_send_message(self, message: str, **kwargs: Any) -> None:
"""Send the message to the user."""
@@ -80,3 +95,47 @@ class ProwlNotificationService(BaseNotificationService):
) from ex
_LOGGER.error("Unexpected error when calling Prowl API: %s", str(ex))
raise HomeAssistantError("Unexpected error when calling Prowl API") from ex
class ProwlNotificationEntity(NotifyEntity):
"""Implement the notification service for Prowl.
This class is used for Prowl config entries.
"""
def __init__(self, hass: HomeAssistant, name: str, api_key: str) -> None:
"""Initialize the service."""
self._hass = hass
self._prowl = prowlpy.Prowl(api_key)
self._attr_name = name
self._attr_unique_id = name
async def async_send_message(self, message: str, title: str | None = None) -> None:
"""Send the message."""
_LOGGER.debug("Sending Prowl notification from entity %s", self.name)
try:
async with asyncio.timeout(10):
await self._hass.async_add_executor_job(
partial(
self._prowl.send,
application="Home-Assistant",
event=title or ATTR_TITLE_DEFAULT,
description=message,
priority=0,
url=None,
)
)
except TimeoutError as ex:
_LOGGER.error("Timeout accessing Prowl API")
raise HomeAssistantError("Timeout accessing Prowl API") from ex
except prowlpy.APIError as ex:
if str(ex).startswith("Invalid API key"):
_LOGGER.error("Invalid API key for Prowl service")
raise HomeAssistantError("Invalid API key for Prowl service") from ex
if str(ex).startswith("Not accepted"):
_LOGGER.error("Prowl returned: exceeded rate limit")
raise HomeAssistantError(
"Prowl service reported: exceeded rate limit"
) from ex
_LOGGER.error("Unexpected error when calling Prowl API: %s", str(ex))
raise HomeAssistantError("Unexpected error when calling Prowl API") from ex

View File

@@ -0,0 +1,21 @@
{
"config": {
"step": {
"user": {
"description": "Enter the Prowl API key and its name.",
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"name": "[%key:common::config_flow::data::name%]"
}
}
},
"error": {
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]",
"api_timeout": "[%key:common::config_flow::error::timeout_connect%]",
"bad_api_response": "[%key:common::config_flow::error::unknown%]"
},
"abort": {
"already_configured": "API key is already configured"
}
}
}

View File

@@ -54,6 +54,7 @@ CONTEXT_ID_AS_BINARY_SCHEMA_VERSION = 36
EVENT_TYPE_IDS_SCHEMA_VERSION = 37
STATES_META_SCHEMA_VERSION = 38
CIRCULAR_MEAN_SCHEMA_VERSION = 49
UNIT_CLASS_SCHEMA_VERSION = 51
LEGACY_STATES_EVENT_ID_INDEX_SCHEMA_VERSION = 28
LEGACY_STATES_EVENT_FOREIGN_KEYS_FIXED_SCHEMA_VERSION = 43

View File

@@ -574,13 +574,18 @@ class Recorder(threading.Thread):
statistic_id: str,
*,
new_statistic_id: str | UndefinedType = UNDEFINED,
new_unit_class: str | None | UndefinedType = UNDEFINED,
new_unit_of_measurement: str | None | UndefinedType = UNDEFINED,
on_done: Callable[[], None] | None = None,
) -> None:
"""Update statistics metadata for a statistic_id."""
self.queue_task(
UpdateStatisticsMetadataTask(
on_done,
statistic_id,
new_statistic_id,
new_unit_class,
new_unit_of_measurement,
)
)

View File

@@ -71,7 +71,7 @@ class LegacyBase(DeclarativeBase):
"""Base class for tables, used for schema migration."""
SCHEMA_VERSION = 51
_LOGGER = logging.getLogger(__name__)
@@ -756,6 +756,7 @@ class _StatisticsMeta:
)
source: Mapped[str | None] = mapped_column(String(32))
unit_of_measurement: Mapped[str | None] = mapped_column(String(255))
unit_class: Mapped[str | None] = mapped_column(String(255))
has_mean: Mapped[bool | None] = mapped_column(Boolean)
has_sum: Mapped[bool | None] = mapped_column(Boolean)
name: Mapped[str | None] = mapped_column(String(255))

View File

@@ -9,6 +9,7 @@ from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.event import async_has_entity_registry_updated_listeners
from .core import Recorder
from .statistics import async_update_statistics_metadata
from .util import filter_unique_constraint_integrity_error, get_instance, session_scope
_LOGGER = logging.getLogger(__name__)
@@ -27,8 +28,8 @@ def async_setup(hass: HomeAssistant) -> None:
assert event.data["action"] == "update" and "old_entity_id" in event.data
old_entity_id = event.data["old_entity_id"]
new_entity_id = event.data["entity_id"]
async_update_statistics_metadata(
hass, old_entity_id, new_statistic_id=new_entity_id
)
instance.async_update_states_metadata(
old_entity_id, new_entity_id=new_entity_id

View File

@@ -103,7 +103,11 @@ from .queries import (
migrate_single_short_term_statistics_row_to_timestamp,
migrate_single_statistics_row_to_timestamp,
)
from .statistics import (
_PRIMARY_UNIT_CONVERTERS,
cleanup_statistics_timestamp_migration,
get_start_time,
)
from .tasks import RecorderTask
from .util import (
database_job_retry_wrapper,
@@ -2037,6 +2041,21 @@ class _SchemaVersion50Migrator(_SchemaVersionMigrator, target_version=50):
connection.execute(text("UPDATE statistics_meta SET has_mean=NULL"))
class _SchemaVersion51Migrator(_SchemaVersionMigrator, target_version=51):
def _apply_update(self) -> None:
"""Version specific update method."""
# Add unit class column to StatisticsMeta
_add_columns(self.session_maker, "statistics_meta", ["unit_class VARCHAR(255)"])
with session_scope(session=self.session_maker()) as session:
connection = session.connection()
for conv in _PRIMARY_UNIT_CONVERTERS:
connection.execute(
update(StatisticsMeta)
.where(StatisticsMeta.unit_of_measurement.in_(conv.VALID_UNITS))
.values(unit_class=conv.UNIT_CLASS)
)
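The version-51 migrator above backfills `unit_class` with one bulk `UPDATE` per converter, matching rows whose unit is in that converter's `VALID_UNITS`. A toy `sqlite3` demonstration of the same backfill shape (the converter tables here are made up; the real migration iterates `_PRIMARY_UNIT_CONVERTERS` through SQLAlchemy):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE statistics_meta "
    "(id INTEGER PRIMARY KEY, unit_of_measurement TEXT, unit_class TEXT)"
)
conn.executemany(
    "INSERT INTO statistics_meta (unit_of_measurement, unit_class) VALUES (?, NULL)",
    [("kWh",), ("Wh",), ("psi",), ("bogus",)],
)

# One UPDATE per converter, keyed on that converter's valid units.
converters = {"energy": ("kWh", "Wh", "MJ"), "pressure": ("psi", "bar", "Pa")}
for unit_class, valid_units in converters.items():
    placeholders = ",".join("?" * len(valid_units))
    conn.execute(
        f"UPDATE statistics_meta SET unit_class = ? "
        f"WHERE unit_of_measurement IN ({placeholders})",
        (unit_class, *valid_units),
    )

rows = conn.execute(
    "SELECT unit_of_measurement, unit_class FROM statistics_meta ORDER BY id"
).fetchall()
print(rows)
# [('kWh', 'energy'), ('Wh', 'energy'), ('psi', 'pressure'), ('bogus', None)]
```

Rows whose unit belongs to no converter keep a NULL `unit_class`, which is exactly the "no conversion supported" state the new column encodes.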
def _migrate_statistics_columns_to_timestamp_removing_duplicates(
hass: HomeAssistant,
instance: Recorder,

View File

@@ -70,6 +70,8 @@ class StatisticMetaData(TypedDict):
name: str | None
source: str
statistic_id: str
unit_class: str | None
"""Specifies the unit conversion class to use, if applicable."""
unit_of_measurement: str | None

View File

@@ -35,6 +35,7 @@ import voluptuous as vol
from homeassistant.const import ATTR_UNIT_OF_MEASUREMENT
from homeassistant.core import HomeAssistant, callback, valid_entity_id
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.frame import report_usage
from homeassistant.helpers.recorder import DATA_RECORDER
from homeassistant.helpers.singleton import singleton
from homeassistant.helpers.typing import UNDEFINED, UndefinedType
@@ -46,6 +47,7 @@ from homeassistant.util.unit_conversion import (
AreaConverter,
BaseUnitConverter,
BloodGlucoseConcentrationConverter,
CarbonMonoxideConcentrationConverter,
ConductivityConverter,
DataRateConverter,
DistanceConverter,
@@ -193,43 +195,50 @@ QUERY_STATISTICS_SUMMARY_SUM = (
.label("rownum"),
)
_PRIMARY_UNIT_CONVERTERS: list[type[BaseUnitConverter]] = [
ApparentPowerConverter,
AreaConverter,
BloodGlucoseConcentrationConverter,
ConductivityConverter,
DataRateConverter,
DistanceConverter,
DurationConverter,
ElectricCurrentConverter,
ElectricPotentialConverter,
EnergyConverter,
EnergyDistanceConverter,
InformationConverter,
MassConverter,
MassVolumeConcentrationConverter,
PowerConverter,
PressureConverter,
ReactiveEnergyConverter,
ReactivePowerConverter,
SpeedConverter,
TemperatureConverter,
UnitlessRatioConverter,
VolumeConverter,
VolumeFlowRateConverter,
]
_SECONDARY_UNIT_CONVERTERS: list[type[BaseUnitConverter]] = [
CarbonMonoxideConcentrationConverter,
]
STATISTIC_UNIT_TO_UNIT_CONVERTER: dict[str | None, type[BaseUnitConverter]] = {
unit: conv for conv in _PRIMARY_UNIT_CONVERTERS for unit in conv.VALID_UNITS
}
"""Map of units to unit converter.
This map includes units which can be converted without knowing the unit class.
"""
UNIT_CLASS_TO_UNIT_CONVERTER: dict[str | None, type[BaseUnitConverter]] = {
conv.UNIT_CLASS: conv
for conv in chain(_PRIMARY_UNIT_CONVERTERS, _SECONDARY_UNIT_CONVERTERS)
}
"""Map of unit class to converter."""
DATA_SHORT_TERM_STATISTICS_RUN_CACHE = "recorder_short_term_statistics_run_cache"
@@ -315,14 +324,32 @@ class StatisticsRow(BaseStatisticsRow, total=False):
change: float | None
def _get_unit_converter(
unit_class: str | None, from_unit: str | None
) -> type[BaseUnitConverter] | None:
"""Return the unit converter for the given unit class and unit.
The converter is looked up by unit class when the given unit is valid for
that class; otherwise it falls back to a lookup by unit alone.
"""
if (
conv := UNIT_CLASS_TO_UNIT_CONVERTER.get(unit_class)
) is not None and from_unit in conv.VALID_UNITS:
return conv
if (conv := STATISTIC_UNIT_TO_UNIT_CONVERTER.get(from_unit)) is not None:
return conv
return None
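`_get_unit_converter` above prefers the stored `unit_class` and only falls back to the unit-keyed map, which is what makes secondary converters such as `CarbonMonoxideConcentrationConverter` reachable solely through an explicit class. A toy sketch of that precedence (the converter classes are simplified stand-ins, not the real ones):

```python
class EnergyConverter:
    """Toy primary converter; only the class/unit tables matter here."""
    UNIT_CLASS = "energy"
    VALID_UNITS = {"kWh", "Wh"}


class CarbonMonoxideConverter:
    """Toy secondary converter whose units may overlap other classes."""
    UNIT_CLASS = "carbon_monoxide"
    VALID_UNITS = {"ppm", "mg/m³"}


# Primary converters feed the unit map; secondary ones are reachable
# only through an explicit unit_class, mirroring the code above.
UNIT_CLASS_TO_CONVERTER = {
    c.UNIT_CLASS: c for c in (EnergyConverter, CarbonMonoxideConverter)
}
UNIT_TO_CONVERTER = {u: EnergyConverter for u in EnergyConverter.VALID_UNITS}


def get_unit_converter(unit_class, from_unit):
    conv = UNIT_CLASS_TO_CONVERTER.get(unit_class)
    if conv is not None and from_unit in conv.VALID_UNITS:
        return conv
    return UNIT_TO_CONVERTER.get(from_unit)


print(get_unit_converter("carbon_monoxide", "ppm").UNIT_CLASS)  # carbon_monoxide
print(get_unit_converter(None, "kWh").UNIT_CLASS)               # energy
print(get_unit_converter("energy", "ppm"))                      # None
```

The third call shows the guard: a mismatched class/unit pair falls through to the unit map rather than returning the wrong converter.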
def get_display_unit(
hass: HomeAssistant,
statistic_id: str,
unit_class: str | None,
statistic_unit: str | None,
) -> str | None:
"""Return the unit which the statistic will be displayed in."""
if (converter := _get_unit_converter(unit_class, statistic_unit)) is None:
return statistic_unit
state_unit: str | None = statistic_unit
@@ -337,13 +364,14 @@ def get_display_unit(
def _get_statistic_to_display_unit_converter(
unit_class: str | None,
statistic_unit: str | None,
state_unit: str | None,
requested_units: dict[str, str] | None,
allow_none: bool = True,
) -> Callable[[float | None], float | None] | Callable[[float], float] | None:
"""Prepare a converter from the statistics unit to display unit."""
if (converter := _get_unit_converter(unit_class, statistic_unit)) is None:
return None
display_unit: str | None
@@ -367,24 +395,25 @@ def _get_statistic_to_display_unit_converter(
return converter.converter_factory(from_unit=statistic_unit, to_unit=display_unit)
def _get_display_to_statistic_unit_converter_func(
unit_class: str | None,
display_unit: str | None,
statistic_unit: str | None,
) -> Callable[[float], float] | None:
"""Prepare a converter from the display unit to the statistics unit."""
if (
display_unit == statistic_unit
or (converter := _get_unit_converter(unit_class, statistic_unit)) is None
):
return None
return converter.converter_factory(from_unit=display_unit, to_unit=statistic_unit)
def _get_unit_converter_func(
unit_class: str | None, from_unit: str, to_unit: str
) -> Callable[[float | None], float | None] | None:
"""Prepare a converter from a unit to another unit."""
if (conv := _get_unit_converter(unit_class, from_unit)) is not None:
if from_unit in conv.VALID_UNITS and to_unit in conv.VALID_UNITS:
if from_unit == to_unit:
return None
@@ -394,9 +423,11 @@ def _get_unit_converter(
raise HomeAssistantError
def can_convert_units(
unit_class: str | None, from_unit: str | None, to_unit: str | None
) -> bool:
"""Return True if it's possible to convert from from_unit to to_unit."""
if (converter := _get_unit_converter(unit_class, from_unit)) is not None:
if from_unit in converter.VALID_UNITS and to_unit in converter.VALID_UNITS:
return True
return False
@@ -863,18 +894,71 @@ def clear_statistics(instance: Recorder, statistic_ids: list[str]) -> None:
instance.statistics_meta_manager.delete(session, statistic_ids)
@callback
def async_update_statistics_metadata(
hass: HomeAssistant,
statistic_id: str,
*,
new_statistic_id: str | UndefinedType = UNDEFINED,
new_unit_class: str | None | UndefinedType = UNDEFINED,
new_unit_of_measurement: str | None | UndefinedType = UNDEFINED,
on_done: Callable[[], None] | None = None,
_called_from_ws_api: bool = False,
) -> None:
"""Update statistics metadata for a statistic_id."""
if new_unit_of_measurement is not UNDEFINED and new_unit_class is UNDEFINED:
if not _called_from_ws_api:
report_usage(
(
"doesn't specify unit_class when calling "
"async_update_statistics_metadata"
),
breaks_in_ha_version="2026.11",
exclude_integrations={DOMAIN},
)
unit = new_unit_of_measurement
if unit in STATISTIC_UNIT_TO_UNIT_CONVERTER:
new_unit_class = STATISTIC_UNIT_TO_UNIT_CONVERTER[unit].UNIT_CLASS
else:
new_unit_class = None
if TYPE_CHECKING:
# After the above check, new_unit_class is guaranteed to not be UNDEFINED
assert new_unit_class is not UNDEFINED
if new_unit_of_measurement is not UNDEFINED and new_unit_class is not None:
if (converter := UNIT_CLASS_TO_UNIT_CONVERTER.get(new_unit_class)) is None:
raise HomeAssistantError(f"Unsupported unit_class: '{new_unit_class}'")
if new_unit_of_measurement not in converter.VALID_UNITS:
raise HomeAssistantError(
f"Unsupported unit_of_measurement '{new_unit_of_measurement}' "
f"for unit_class '{new_unit_class}'"
)
get_instance(hass).async_update_statistics_metadata(
statistic_id,
new_statistic_id=new_statistic_id,
new_unit_class=new_unit_class,
new_unit_of_measurement=new_unit_of_measurement,
on_done=on_done,
)
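When a caller supplies a unit but no `unit_class`, the function above infers the class from the unit-to-converter map before validating. A sketch of that inference with a toy sentinel and unit table (stand-ins for `UNDEFINED` and `STATISTIC_UNIT_TO_UNIT_CONVERTER`):

```python
# Sentinel mimicking homeassistant.helpers.typing.UNDEFINED.
UNDEFINED = object()

# Toy unit-to-class map standing in for STATISTIC_UNIT_TO_UNIT_CONVERTER.
UNIT_TO_CLASS = {"kWh": "energy", "Wh": "energy", "psi": "pressure"}


def resolve_unit_class(new_unit, new_unit_class):
    """Infer unit_class only when a unit was given without a class."""
    if new_unit is not UNDEFINED and new_unit_class is UNDEFINED:
        return UNIT_TO_CLASS.get(new_unit)  # None for unconvertible units
    return new_unit_class


print(resolve_unit_class("kWh", UNDEFINED))       # energy
print(resolve_unit_class("furlongs", UNDEFINED))  # None
print(resolve_unit_class("kWh", "energy"))        # energy
```

An explicitly passed class always wins; the inference only papers over callers that have not yet been updated, which is why the real code pairs it with a `report_usage` deprecation warning.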
def update_statistics_metadata(
instance: Recorder,
statistic_id: str,
new_statistic_id: str | None | UndefinedType,
new_unit_class: str | None | UndefinedType,
new_unit_of_measurement: str | None | UndefinedType,
) -> None:
"""Update statistics metadata for a statistic_id."""
statistics_meta_manager = instance.statistics_meta_manager
if new_unit_class is not UNDEFINED and new_unit_of_measurement is not UNDEFINED:
with session_scope(session=instance.get_session()) as session:
statistics_meta_manager.update_unit_of_measurement(
session, statistic_id, new_unit_class, new_unit_of_measurement
)
if new_statistic_id is not UNDEFINED and new_statistic_id is not None:
with session_scope(
@@ -926,13 +1010,16 @@ def _statistic_by_id_from_metadata(
return {
meta["statistic_id"]: {
"display_unit_of_measurement": get_display_unit(
hass, meta["statistic_id"], meta["unit_of_measurement"]
hass,
meta["statistic_id"],
meta["unit_class"],
meta["unit_of_measurement"],
),
"mean_type": meta["mean_type"],
"has_sum": meta["has_sum"],
"name": meta["name"],
"source": meta["source"],
"unit_class": UNIT_CLASSES.get(meta["unit_of_measurement"]),
"unit_class": meta["unit_class"],
"unit_of_measurement": meta["unit_of_measurement"],
}
for _, meta in metadata.values()
@@ -1008,7 +1095,7 @@ def list_statistic_ids(
"has_sum": meta["has_sum"],
"name": meta["name"],
"source": meta["source"],
"unit_class": UNIT_CLASSES.get(meta["unit_of_measurement"]),
"unit_class": meta["unit_class"],
"unit_of_measurement": meta["unit_of_measurement"],
}
@@ -1744,10 +1831,13 @@ def statistic_during_period(
else:
result["change"] = None
unit_class = metadata[1]["unit_class"]
state_unit = unit = metadata[1]["unit_of_measurement"]
if state := hass.states.get(statistic_id):
state_unit = state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
convert = _get_statistic_to_display_unit_converter(
unit_class, unit, state_unit, units
)
if not convert:
return result
@@ -1830,10 +1920,13 @@ def _augment_result_with_change(
metadata_by_id = _metadata[row.metadata_id]
statistic_id = metadata_by_id["statistic_id"]
unit_class = metadata_by_id["unit_class"]
state_unit = unit = metadata_by_id["unit_of_measurement"]
if state := hass.states.get(statistic_id):
state_unit = state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
convert = _get_statistic_to_display_unit_converter(
unit_class, unit, state_unit, units
)
if convert is not None:
prev_sums[statistic_id] = convert(row.sum)
@@ -2426,11 +2519,12 @@ def _sorted_statistics_to_dict(
metadata_by_id = metadata[meta_id]
statistic_id = metadata_by_id["statistic_id"]
if convert_units:
unit_class = metadata_by_id["unit_class"]
state_unit = unit = metadata_by_id["unit_of_measurement"]
if state := hass.states.get(statistic_id):
state_unit = state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
convert = _get_statistic_to_display_unit_converter(
unit_class, unit, state_unit, units, allow_none=False
)
else:
convert = None
@@ -2501,6 +2595,27 @@ def _async_import_statistics(
statistics: Iterable[StatisticData],
) -> None:
"""Validate timestamps and insert an import_statistics job in the queue."""
# If unit class is not set, we try to set it based on the unit of measurement
# Note: This can't happen from the type checker's perspective, but we need
# to guard against custom integrations that have not been updated to set
# the unit_class.
if "unit_class" not in metadata:
unit = metadata["unit_of_measurement"] # type: ignore[unreachable]
if unit in STATISTIC_UNIT_TO_UNIT_CONVERTER:
metadata["unit_class"] = STATISTIC_UNIT_TO_UNIT_CONVERTER[unit].UNIT_CLASS
else:
metadata["unit_class"] = None
if (unit_class := metadata["unit_class"]) is not None:
if (converter := UNIT_CLASS_TO_UNIT_CONVERTER.get(unit_class)) is None:
raise HomeAssistantError(f"Unsupported unit_class: '{unit_class}'")
if metadata["unit_of_measurement"] not in converter.VALID_UNITS:
raise HomeAssistantError(
f"Unsupported unit_of_measurement '{metadata['unit_of_measurement']}' "
f"for unit_class '{unit_class}'"
)
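The guard above rejects imports whose `unit_of_measurement` does not belong to the declared `unit_class`. A standalone sketch of the same validation (toy converter table; `ValueError` stands in for `HomeAssistantError`):

```python
class EnergyConverter:
    """Toy converter table; the real code uses UNIT_CLASS_TO_UNIT_CONVERTER."""
    UNIT_CLASS = "energy"
    VALID_UNITS = {"kWh", "Wh", "MJ"}


UNIT_CLASS_TO_CONVERTER = {"energy": EnergyConverter}


def check_metadata(metadata: dict) -> None:
    """Reject a unit_class/unit_of_measurement pair that cannot convert."""
    unit_class = metadata.get("unit_class")
    if unit_class is None:
        return  # no conversion requested; nothing to validate
    conv = UNIT_CLASS_TO_CONVERTER.get(unit_class)
    if conv is None:
        raise ValueError(f"Unsupported unit_class: '{unit_class}'")
    if metadata["unit_of_measurement"] not in conv.VALID_UNITS:
        raise ValueError(
            f"Unsupported unit_of_measurement "
            f"'{metadata['unit_of_measurement']}' for '{unit_class}'"
        )


check_metadata({"unit_class": "energy", "unit_of_measurement": "kWh"})  # passes
try:
    check_metadata({"unit_class": "energy", "unit_of_measurement": "gal"})
except ValueError as err:
    print(err)
```

Failing fast here keeps inconsistent metadata out of `statistics_meta`, where a bad class/unit pair would silently disable unit conversion for that statistic.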
for statistic in statistics:
start = statistic["start"]
if start.tzinfo is None or start.tzinfo.utcoffset(start) is None:
@@ -2532,6 +2647,8 @@ def async_import_statistics(
hass: HomeAssistant,
metadata: StatisticMetaData,
statistics: Iterable[StatisticData],
*,
_called_from_ws_api: bool = False,
) -> None:
"""Import hourly statistics from an internal source.
@@ -2544,6 +2661,13 @@ def async_import_statistics(
if not metadata["source"] or metadata["source"] != DOMAIN:
raise HomeAssistantError("Invalid source")
if "unit_class" not in metadata and not _called_from_ws_api: # type: ignore[unreachable]
report_usage( # type: ignore[unreachable]
"doesn't specify unit_class when calling async_import_statistics",
breaks_in_ha_version="2026.11",
exclude_integrations={DOMAIN},
)
_async_import_statistics(hass, metadata, statistics)
@@ -2552,6 +2676,8 @@ def async_add_external_statistics(
hass: HomeAssistant,
metadata: StatisticMetaData,
statistics: Iterable[StatisticData],
*,
_called_from_ws_api: bool = False,
) -> None:
"""Add hourly statistics from an external source.
@@ -2566,6 +2692,13 @@ def async_add_external_statistics(
if not metadata["source"] or metadata["source"] != domain:
raise HomeAssistantError("Invalid source")
if "unit_class" not in metadata and not _called_from_ws_api: # type: ignore[unreachable]
report_usage( # type: ignore[unreachable]
"doesn't specify unit_class when calling async_add_external_statistics",
breaks_in_ha_version="2026.11",
exclude_integrations={DOMAIN},
)
_async_import_statistics(hass, metadata, statistics)
@@ -2699,9 +2832,10 @@ def adjust_statistics(
if statistic_id not in metadata:
return True
unit_class = metadata[statistic_id][1]["unit_class"]
statistic_unit = metadata[statistic_id][1]["unit_of_measurement"]
if convert := _get_display_to_statistic_unit_converter_func(
unit_class, adjustment_unit, statistic_unit
):
sum_adjustment = convert(sum_adjustment)
@@ -2769,8 +2903,9 @@ def change_statistics_unit(
return
metadata_id = metadata[0]
unit_class = metadata[1]["unit_class"]
if not (convert := _get_unit_converter_func(unit_class, old_unit, new_unit)):
_LOGGER.warning(
"Statistics unit of measurement for %s is already %s",
statistic_id,
@@ -2786,12 +2921,14 @@ def change_statistics_unit(
_change_statistics_unit_for_table(session, table, metadata_id, convert)
statistics_meta_manager.update_unit_of_measurement(
session,
statistic_id,
unit_class,
new_unit,
)
async def async_change_statistics_unit(
hass: HomeAssistant,
statistic_id: str,
*,
@@ -2799,7 +2936,17 @@ def async_change_statistics_unit(
old_unit_of_measurement: str,
) -> None:
"""Change statistics unit for a statistic_id."""
metadatas = await get_instance(hass).async_add_executor_job(
partial(get_metadata, hass, statistic_ids={statistic_id})
)
if statistic_id not in metadatas:
raise HomeAssistantError(f"No metadata found for {statistic_id}")
metadata = metadatas[statistic_id][1]
if not can_convert_units(
metadata["unit_class"], old_unit_of_measurement, new_unit_of_measurement
):
raise HomeAssistantError(
f"Can't convert {old_unit_of_measurement} to {new_unit_of_measurement}"
)

View File

@@ -13,9 +13,10 @@ from sqlalchemy.orm.session import Session
from sqlalchemy.sql.expression import true
from sqlalchemy.sql.lambdas import StatementLambdaElement
from ..const import CIRCULAR_MEAN_SCHEMA_VERSION, UNIT_CLASS_SCHEMA_VERSION
from ..db_schema import StatisticsMeta
from ..models import StatisticMeanType, StatisticMetaData
from ..statistics import STATISTIC_UNIT_TO_UNIT_CONVERTER
from ..util import execute_stmt_lambda_element
if TYPE_CHECKING:
@@ -41,6 +42,7 @@ INDEX_UNIT_OF_MEASUREMENT: Final = 3
INDEX_HAS_SUM: Final = 4
INDEX_NAME: Final = 5
INDEX_MEAN_TYPE: Final = 6
INDEX_UNIT_CLASS: Final = 7
def _generate_get_metadata_stmt(
@@ -58,6 +60,8 @@ def _generate_get_metadata_stmt(
columns.append(StatisticsMeta.mean_type)
else:
columns.append(StatisticsMeta.has_mean)
if schema_version >= UNIT_CLASS_SCHEMA_VERSION:
columns.append(StatisticsMeta.unit_class)
stmt = lambda_stmt(lambda: select(*columns))
if statistic_ids:
stmt += lambda q: q.where(StatisticsMeta.statistic_id.in_(statistic_ids))
@@ -140,6 +144,13 @@ class StatisticsMetaManager:
if row[INDEX_MEAN_TYPE]
else StatisticMeanType.NONE
)
if self.recorder.schema_version >= UNIT_CLASS_SCHEMA_VERSION:
unit_class = row[INDEX_UNIT_CLASS]
else:
conv = STATISTIC_UNIT_TO_UNIT_CONVERTER.get(
row[INDEX_UNIT_OF_MEASUREMENT]
)
unit_class = conv.UNIT_CLASS if conv else None
meta = {
"has_mean": mean_type is StatisticMeanType.ARITHMETIC,
"mean_type": mean_type,
@@ -148,6 +159,7 @@ class StatisticsMetaManager:
"source": row[INDEX_SOURCE],
"statistic_id": statistic_id,
"unit_of_measurement": row[INDEX_UNIT_OF_MEASUREMENT],
"unit_class": unit_class,
}
id_meta = (row_id, meta)
results[statistic_id] = id_meta
@@ -206,6 +218,7 @@ class StatisticsMetaManager:
old_metadata["mean_type"] != new_metadata["mean_type"]
or old_metadata["has_sum"] != new_metadata["has_sum"]
or old_metadata["name"] != new_metadata["name"]
or old_metadata["unit_class"] != new_metadata["unit_class"]
or old_metadata["unit_of_measurement"]
!= new_metadata["unit_of_measurement"]
):
@@ -217,6 +230,7 @@ class StatisticsMetaManager:
StatisticsMeta.mean_type: new_metadata["mean_type"],
StatisticsMeta.has_sum: new_metadata["has_sum"],
StatisticsMeta.name: new_metadata["name"],
StatisticsMeta.unit_class: new_metadata["unit_class"],
StatisticsMeta.unit_of_measurement: new_metadata["unit_of_measurement"],
},
synchronize_session=False,
@@ -328,7 +342,11 @@ class StatisticsMetaManager:
)
def update_unit_of_measurement(
self,
session: Session,
statistic_id: str,
new_unit_class: str | None,
new_unit: str | None,
) -> None:
"""Update the unit of measurement for a statistic_id.
@@ -338,7 +356,12 @@ class StatisticsMetaManager:
self._assert_in_recorder_thread()
session.query(StatisticsMeta).filter(
StatisticsMeta.statistic_id == statistic_id
).update(
{
StatisticsMeta.unit_of_measurement: new_unit,
StatisticsMeta.unit_class: new_unit_class,
}
)
self._clear_cache([statistic_id])
def update_statistic_id(

View File

@@ -77,6 +77,7 @@ class UpdateStatisticsMetadataTask(RecorderTask):
on_done: Callable[[], None] | None
statistic_id: str
new_statistic_id: str | None | UndefinedType
new_unit_class: str | None | UndefinedType
new_unit_of_measurement: str | None | UndefinedType
def run(self, instance: Recorder) -> None:
@@ -85,6 +86,7 @@ class UpdateStatisticsMetadataTask(RecorderTask):
instance,
self.statistic_id,
self.new_statistic_id,
self.new_unit_class,
self.new_unit_of_measurement,
)
if self.on_done:

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
import asyncio
from datetime import datetime as dt
import logging
from typing import Any, Literal, cast
import voluptuous as vol
@@ -14,11 +15,13 @@ from homeassistant.core import HomeAssistant, callback, valid_entity_id
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.json import json_bytes
from homeassistant.helpers.typing import UNDEFINED
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import (
ApparentPowerConverter,
AreaConverter,
BloodGlucoseConcentrationConverter,
CarbonMonoxideConcentrationConverter,
ConductivityConverter,
DataRateConverter,
DistanceConverter,
@@ -43,11 +46,12 @@ from homeassistant.util.unit_conversion import (
from .models import StatisticMeanType, StatisticPeriod
from .statistics import (
STATISTIC_UNIT_TO_UNIT_CONVERTER,
UNIT_CLASS_TO_UNIT_CONVERTER,
async_add_external_statistics,
async_change_statistics_unit,
async_import_statistics,
async_list_statistic_ids,
async_update_statistics_metadata,
list_statistic_ids,
statistic_during_period,
statistics_during_period,
@@ -56,6 +60,8 @@ from .statistics import (
)
from .util import PERIOD_SCHEMA, get_instance, resolve_period
_LOGGER = logging.getLogger(__name__)
CLEAR_STATISTICS_TIME_OUT = 10
UPDATE_STATISTICS_METADATA_TIME_OUT = 10
@@ -66,6 +72,9 @@ UNIT_SCHEMA = vol.Schema(
vol.Optional("blood_glucose_concentration"): vol.In(
BloodGlucoseConcentrationConverter.VALID_UNITS
),
vol.Optional("carbon_monoxide"): vol.In(
CarbonMonoxideConcentrationConverter.VALID_UNITS
),
vol.Optional("concentration"): vol.In(
MassVolumeConcentrationConverter.VALID_UNITS
),
@@ -392,6 +401,7 @@ async def ws_get_statistics_metadata(
{
vol.Required("type"): "recorder/update_statistics_metadata",
vol.Required("statistic_id"): str,
vol.Optional("unit_class"): vol.Any(str, None),
vol.Required("unit_of_measurement"): vol.Any(str, None),
}
)
@@ -401,6 +411,8 @@ async def ws_update_statistics_metadata(
) -> None:
"""Update statistics metadata for a statistic_id.
The unit_class specifies which unit conversion class to use, if applicable.
Only the normalized unit of measurement can be updated.
"""
done_event = asyncio.Event()
@@ -408,10 +420,20 @@ async def ws_update_statistics_metadata(
def update_statistics_metadata_done() -> None:
hass.loop.call_soon_threadsafe(done_event.set)
if "unit_class" not in msg:
_LOGGER.warning(
"WS command recorder/update_statistics_metadata called without "
"specifying unit_class in metadata, this is deprecated and will "
"stop working in HA Core 2026.11"
)
async_update_statistics_metadata(
hass,
msg["statistic_id"],
new_unit_class=msg.get("unit_class", UNDEFINED),
new_unit_of_measurement=msg["unit_of_measurement"],
on_done=update_statistics_metadata_done,
_called_from_ws_api=True,
)
try:
async with asyncio.timeout(UPDATE_STATISTICS_METADATA_TIME_OUT):
@@ -434,15 +456,15 @@ async def ws_update_statistics_metadata(
vol.Required("old_unit_of_measurement"): vol.Any(str, None),
}
)
@websocket_api.async_response
async def ws_change_statistics_unit(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict[str, Any]
) -> None:
"""Change the unit_of_measurement for a statistic_id.
All existing statistics will be converted to the new unit.
"""
async_change_statistics_unit(
await async_change_statistics_unit(
hass,
msg["statistic_id"],
new_unit_of_measurement=msg["new_unit_of_measurement"],
@@ -487,17 +509,23 @@ async def ws_adjust_sum_statistics(
return
metadata = metadatas[0]
def valid_units(statistics_unit: str | None, adjustment_unit: str | None) -> bool:
def valid_units(
unit_class: str | None, statistics_unit: str | None, adjustment_unit: str | None
) -> bool:
if statistics_unit == adjustment_unit:
return True
converter = STATISTIC_UNIT_TO_UNIT_CONVERTER.get(statistics_unit)
if converter is not None and adjustment_unit in converter.VALID_UNITS:
if (
(converter := UNIT_CLASS_TO_UNIT_CONVERTER.get(unit_class)) is not None
and statistics_unit in converter.VALID_UNITS
and adjustment_unit in converter.VALID_UNITS
):
return True
return False
unit_class = metadata["unit_class"]
stat_unit = metadata["statistics_unit_of_measurement"]
adjustment_unit = msg["adjustment_unit_of_measurement"]
if not valid_units(stat_unit, adjustment_unit):
if not valid_units(unit_class, stat_unit, adjustment_unit):
connection.send_error(
msg["id"],
"invalid_units",
@@ -521,6 +549,7 @@ async def ws_adjust_sum_statistics(
vol.Required("name"): vol.Any(str, None),
vol.Required("source"): str,
vol.Required("statistic_id"): str,
vol.Optional("unit_class"): vol.Any(str, None),
vol.Required("unit_of_measurement"): vol.Any(str, None),
},
vol.Required("stats"): [
@@ -540,16 +569,25 @@ async def ws_adjust_sum_statistics(
def ws_import_statistics(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict[str, Any]
) -> None:
"""Import statistics."""
"""Import statistics.
The unit_class specifies which unit conversion class to use, if applicable.
"""
metadata = msg["metadata"]
# The WS command will be changed in a follow-up PR
metadata["mean_type"] = (
StatisticMeanType.ARITHMETIC if metadata["has_mean"] else StatisticMeanType.NONE
)
if "unit_class" not in metadata:
_LOGGER.warning(
"WS command recorder/import_statistics called without specifying "
"unit_class in metadata, this is deprecated and will stop working "
"in HA Core 2026.11"
)
stats = msg["stats"]
if valid_entity_id(metadata["statistic_id"]):
async_import_statistics(hass, metadata, stats)
async_import_statistics(hass, metadata, stats, _called_from_ws_api=True)
else:
async_add_external_statistics(hass, metadata, stats)
async_add_external_statistics(hass, metadata, stats, _called_from_ws_api=True)
connection.send_result(msg["id"])


@@ -39,6 +39,23 @@ from .renault_vehicle import COORDINATORS, RenaultVehicleProxy
LOGGER = logging.getLogger(__name__)
async def _get_filtered_vehicles(account: RenaultAccount) -> list[KamereonVehiclesLink]:
"""Filter out vehicles with missing details.
This may be due to new purchases or an issue with the Renault servers.
"""
vehicles = await account.get_vehicles()
if not vehicles.vehicleLinks:
return []
result: list[KamereonVehiclesLink] = []
for link in vehicles.vehicleLinks:
if link.vehicleDetails is None:
LOGGER.warning("Ignoring vehicle with missing details: %s", link.vin)
continue
result.append(link)
return result
class RenaultHub:
"""Handle account communication with Renault servers."""
@@ -84,49 +101,48 @@ class RenaultHub:
account_id: str = config_entry.data[CONF_KAMEREON_ACCOUNT_ID]
self._account = await self._client.get_api_account(account_id)
vehicles = await self._account.get_vehicles()
if vehicles.vehicleLinks:
if any(
vehicle_link.vehicleDetails is None
for vehicle_link in vehicles.vehicleLinks
):
raise ConfigEntryNotReady(
"Failed to retrieve vehicle details from Renault servers"
)
num_call_per_scan = len(COORDINATORS) * len(vehicles.vehicleLinks)
scan_interval = timedelta(
seconds=(3600 * num_call_per_scan) / MAX_CALLS_PER_HOURS
vehicle_links = await _get_filtered_vehicles(self._account)
if not vehicle_links:
LOGGER.debug(
"No valid vehicle details found for account_id: %s", account_id
)
raise ConfigEntryNotReady(
"Failed to retrieve vehicle details from Renault servers"
)
device_registry = dr.async_get(self._hass)
await asyncio.gather(
*(
self.async_initialise_vehicle(
vehicle_link,
self._account,
scan_interval,
config_entry,
device_registry,
)
for vehicle_link in vehicles.vehicleLinks
)
)
num_call_per_scan = len(COORDINATORS) * len(vehicle_links)
scan_interval = timedelta(
seconds=(3600 * num_call_per_scan) / MAX_CALLS_PER_HOURS
)
# all vehicles have been initiated with the right number of active coordinators
num_call_per_scan = 0
for vehicle_link in vehicles.vehicleLinks:
device_registry = dr.async_get(self._hass)
await asyncio.gather(
*(
self.async_initialise_vehicle(
vehicle_link,
self._account,
scan_interval,
config_entry,
device_registry,
)
for vehicle_link in vehicle_links
)
)
# all vehicles have been initiated with the right number of active coordinators
num_call_per_scan = 0
for vehicle_link in vehicle_links:
vehicle = self._vehicles[str(vehicle_link.vin)]
num_call_per_scan += len(vehicle.coordinators)
new_scan_interval = timedelta(
seconds=(3600 * num_call_per_scan) / MAX_CALLS_PER_HOURS
)
if new_scan_interval != scan_interval:
# we need to change the vehicles with the right scan interval
for vehicle_link in vehicle_links:
vehicle = self._vehicles[str(vehicle_link.vin)]
num_call_per_scan += len(vehicle.coordinators)
new_scan_interval = timedelta(
seconds=(3600 * num_call_per_scan) / MAX_CALLS_PER_HOURS
)
if new_scan_interval != scan_interval:
# we need to change the vehicles with the right scan interval
for vehicle_link in vehicles.vehicleLinks:
vehicle = self._vehicles[str(vehicle_link.vin)]
vehicle.update_scan_interval(new_scan_interval)
vehicle.update_scan_interval(new_scan_interval)
async def async_initialise_vehicle(
self,
@@ -164,10 +180,10 @@ class RenaultHub:
"""Get Kamereon account ids."""
accounts = []
for account in await self._client.get_api_accounts():
vehicles = await account.get_vehicles()
vehicle_links = await _get_filtered_vehicles(account)
# Only add the account if it has linked vehicles.
if vehicles.vehicleLinks:
if vehicle_links:
accounts.append(account.account_id)
return accounts


@@ -533,6 +533,12 @@
"manual_record": {
"default": "mdi:record-rec"
},
"rule": {
"default": "mdi:cctv",
"state": {
"off": "mdi:cctv-off"
}
},
"pre_record": {
"default": "mdi:history"
},


@@ -19,5 +19,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.16.1"]
"requirements": ["reolink-aio==0.16.2"]
}


@@ -1019,6 +1019,9 @@
"manual_record": {
"name": "Manual record"
},
"rule": {
"name": "Surveillance rule {name}"
},
"pre_record": {
"name": "Pre-recording"
},


@@ -60,6 +60,18 @@ class ReolinkChimeSwitchEntityDescription(
value: Callable[[Chime], bool | None]
@dataclass(frozen=True, kw_only=True)
class ReolinkSwitchIndexEntityDescription(
SwitchEntityDescription,
ReolinkChannelEntityDescription,
):
"""A class that describes switch entities with an extra index."""
method: Callable[[Host, int, int, bool], Any]
value: Callable[[Host, int, int], bool | None]
placeholder: Callable[[Host, int, int], str]
SWITCH_ENTITIES = (
ReolinkSwitchEntityDescription(
key="ir_lights",
@@ -304,6 +316,15 @@ CHIME_SWITCH_ENTITIES = (
),
)
RULE_SWITCH_ENTITY = ReolinkSwitchIndexEntityDescription(
key="rule",
cmd_key="rules",
translation_key="rule",
placeholder=lambda api, ch, idx: api.baichuan.rule_name(ch, idx),
value=lambda api, ch, idx: api.baichuan.rule_enabled(ch, idx),
method=lambda api, ch, idx, value: (api.baichuan.set_rule_enabled(ch, idx, value)),
)
async def async_setup_entry(
hass: HomeAssistant,
@@ -336,6 +357,11 @@ async def async_setup_entry(
for chime in reolink_data.host.api.chime_list
if chime.channel is None
)
entities.extend(
ReolinkIndexSwitchEntity(reolink_data, channel, rule_id, RULE_SWITCH_ENTITY)
for channel in reolink_data.host.api.channels
for rule_id in reolink_data.host.api.baichuan.rule_ids(channel)
)
async_add_entities(entities)
@@ -469,3 +495,46 @@ class ReolinkHostChimeSwitchEntity(ReolinkHostChimeCoordinatorEntity, SwitchEnti
"""Turn the entity off."""
await self.entity_description.method(self._chime, False)
self.async_write_ha_state()
class ReolinkIndexSwitchEntity(ReolinkChannelCoordinatorEntity, SwitchEntity):
"""Base switch entity class for Reolink IP camera with an extra index."""
entity_description: ReolinkSwitchIndexEntityDescription
def __init__(
self,
reolink_data: ReolinkData,
channel: int,
index: int,
entity_description: ReolinkSwitchIndexEntityDescription,
) -> None:
"""Initialize Reolink switch entity."""
self.entity_description = entity_description
super().__init__(reolink_data, channel)
self._index = index
self._attr_translation_placeholders = {
"name": entity_description.placeholder(self._host.api, self._channel, index)
}
self._attr_unique_id = f"{self._attr_unique_id}_{index}"
@property
def is_on(self) -> bool | None:
"""Return true if switch is on."""
return self.entity_description.value(self._host.api, self._channel, self._index)
@raise_translated_error
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the entity on."""
await self.entity_description.method(
self._host.api, self._channel, self._index, True
)
self.async_write_ha_state()
@raise_translated_error
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the entity off."""
await self.entity_description.method(
self._host.api, self._channel, self._index, False
)
self.async_write_ha_state()
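The new `ReolinkIndexSwitchEntity` is created once per (channel, rule) pair, with the rule index appended to the channel entity's unique ID. A rough sketch of that fan-out and ID scheme, using hypothetical channel/rule data in place of the real `Host.baichuan` API (the `_rule_` infix and the data below are assumptions for this sketch, not the integration's exact format):

```python
# Hypothetical rule map standing in for api.baichuan.rule_ids(channel).
RULE_IDS_BY_CHANNEL: dict[int, list[int]] = {0: [1, 2], 1: [7]}


def unique_id(base: str, channel: int, index: int) -> str:
    """Compose a per-rule unique ID by appending the index, as in the diff's
    f"{self._attr_unique_id}_{index}" (infix here is illustrative)."""
    return f"{base}_{channel}_rule_{index}"


# One entity per (channel, rule_id) pair, mirroring the nested comprehension
# in async_setup_entry.
entities = [
    unique_id("uid123", channel, rule_id)
    for channel in RULE_IDS_BY_CHANNEL
    for rule_id in RULE_IDS_BY_CHANNEL[channel]
]
print(entities)  # ['uid123_0_rule_1', 'uid123_0_rule_2', 'uid123_1_rule_7']
```

The same index is threaded through `value` and `method`, so each switch reads and toggles only its own surveillance rule.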


@@ -245,3 +245,43 @@ async def async_unload_entry(hass: HomeAssistant, entry: SatelConfigEntry) -> bo
async def update_listener(hass: HomeAssistant, entry: SatelConfigEntry) -> None:
"""Handle options update."""
hass.config_entries.async_schedule_reload(entry.entry_id)
async def async_migrate_entry(
hass: HomeAssistant, config_entry: SatelConfigEntry
) -> bool:
"""Migrate old entry."""
_LOGGER.debug(
"Migrating configuration from version %s.%s",
config_entry.version,
config_entry.minor_version,
)
if config_entry.version > 1:
# This means the user has downgraded from a future version
return False
if config_entry.version == 1 and config_entry.minor_version == 1:
for subentry in config_entry.subentries.values():
property_map = {
SUBENTRY_TYPE_PARTITION: CONF_PARTITION_NUMBER,
SUBENTRY_TYPE_ZONE: CONF_ZONE_NUMBER,
SUBENTRY_TYPE_OUTPUT: CONF_OUTPUT_NUMBER,
SUBENTRY_TYPE_SWITCHABLE_OUTPUT: CONF_SWITCHABLE_OUTPUT_NUMBER,
}
new_title = f"{subentry.title} ({subentry.data[property_map[subentry.subentry_type]]})"
hass.config_entries.async_update_subentry(
config_entry, subentry, title=new_title
)
hass.config_entries.async_update_entry(config_entry, minor_version=2)
_LOGGER.debug(
"Migration to configuration version %s.%s successful",
config_entry.version,
config_entry.minor_version,
)
return True
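The 1.1 → 1.2 migration above appends the configured number to each subentry title. The mapping-and-formatting step can be sketched on its own; the key strings below are illustrative assumptions standing in for the integration's `SUBENTRY_TYPE_*` and `CONF_*_NUMBER` constants:

```python
# Illustrative subentry-type -> number-key map, mirroring the migration's
# property_map; the literal values are assumptions for this sketch.
PROPERTY_MAP: dict[str, str] = {
    "partition": "partition_number",
    "zone": "zone_number",
    "output": "output_number",
    "switchable_output": "switchable_output_number",
}


def migrated_title(subentry_type: str, title: str, data: dict) -> str:
    """Append the configured number, as in f"{subentry.title} ({number})"."""
    return f"{title} ({data[PROPERTY_MAP[subentry_type]]})"


print(migrated_title("zone", "Front door", {"zone_number": 12}))  # Front door (12)
```

The config flow changes below apply the same `Name (number)` format when subentries are created or renamed, so migrated and new titles stay consistent.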


@@ -91,6 +91,7 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a Satel Integra config flow."""
VERSION = 1
MINOR_VERSION = 2
@staticmethod
@callback
@@ -158,7 +159,7 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
subentries.append(
{
"subentry_type": SUBENTRY_TYPE_PARTITION,
"title": partition_data[CONF_NAME],
"title": f"{partition_data[CONF_NAME]} ({partition_number})",
"unique_id": f"{SUBENTRY_TYPE_PARTITION}_{partition_number}",
"data": {
CONF_NAME: partition_data[CONF_NAME],
@@ -174,7 +175,7 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
subentries.append(
{
"subentry_type": SUBENTRY_TYPE_ZONE,
"title": zone_data[CONF_NAME],
"title": f"{zone_data[CONF_NAME]} ({zone_number})",
"unique_id": f"{SUBENTRY_TYPE_ZONE}_{zone_number}",
"data": {
CONF_NAME: zone_data[CONF_NAME],
@@ -192,7 +193,7 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
subentries.append(
{
"subentry_type": SUBENTRY_TYPE_OUTPUT,
"title": output_data[CONF_NAME],
"title": f"{output_data[CONF_NAME]} ({output_number})",
"unique_id": f"{SUBENTRY_TYPE_OUTPUT}_{output_number}",
"data": {
CONF_NAME: output_data[CONF_NAME],
@@ -210,7 +211,7 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
subentries.append(
{
"subentry_type": SUBENTRY_TYPE_SWITCHABLE_OUTPUT,
"title": switchable_output_data[CONF_NAME],
"title": f"{switchable_output_data[CONF_NAME]} ({switchable_output_number})",
"unique_id": f"{SUBENTRY_TYPE_SWITCHABLE_OUTPUT}_{switchable_output_number}",
"data": {
CONF_NAME: switchable_output_data[CONF_NAME],
@@ -279,7 +280,9 @@ class PartitionSubentryFlowHandler(ConfigSubentryFlow):
if not errors:
return self.async_create_entry(
title=user_input[CONF_NAME], data=user_input, unique_id=unique_id
title=f"{user_input[CONF_NAME]} ({user_input[CONF_PARTITION_NUMBER]})",
data=user_input,
unique_id=unique_id,
)
return self.async_show_form(
@@ -304,7 +307,7 @@ class PartitionSubentryFlowHandler(ConfigSubentryFlow):
return self.async_update_and_abort(
self._get_entry(),
subconfig_entry,
title=user_input[CONF_NAME],
title=f"{user_input[CONF_NAME]} ({subconfig_entry.data[CONF_PARTITION_NUMBER]})",
data_updates=user_input,
)
@@ -338,7 +341,9 @@ class ZoneSubentryFlowHandler(ConfigSubentryFlow):
if not errors:
return self.async_create_entry(
title=user_input[CONF_NAME], data=user_input, unique_id=unique_id
title=f"{user_input[CONF_NAME]} ({user_input[CONF_ZONE_NUMBER]})",
data=user_input,
unique_id=unique_id,
)
return self.async_show_form(
@@ -363,7 +368,7 @@ class ZoneSubentryFlowHandler(ConfigSubentryFlow):
return self.async_update_and_abort(
self._get_entry(),
subconfig_entry,
title=user_input[CONF_NAME],
title=f"{user_input[CONF_NAME]} ({subconfig_entry.data[CONF_ZONE_NUMBER]})",
data_updates=user_input,
)
@@ -396,7 +401,9 @@ class OutputSubentryFlowHandler(ConfigSubentryFlow):
if not errors:
return self.async_create_entry(
title=user_input[CONF_NAME], data=user_input, unique_id=unique_id
title=f"{user_input[CONF_NAME]} ({user_input[CONF_OUTPUT_NUMBER]})",
data=user_input,
unique_id=unique_id,
)
return self.async_show_form(
@@ -421,7 +428,7 @@ class OutputSubentryFlowHandler(ConfigSubentryFlow):
return self.async_update_and_abort(
self._get_entry(),
subconfig_entry,
title=user_input[CONF_NAME],
title=f"{user_input[CONF_NAME]} ({subconfig_entry.data[CONF_OUTPUT_NUMBER]})",
data_updates=user_input,
)
@@ -454,7 +461,9 @@ class SwitchableOutputSubentryFlowHandler(ConfigSubentryFlow):
if not errors:
return self.async_create_entry(
title=user_input[CONF_NAME], data=user_input, unique_id=unique_id
title=f"{user_input[CONF_NAME]} ({user_input[CONF_SWITCHABLE_OUTPUT_NUMBER]})",
data=user_input,
unique_id=unique_id,
)
return self.async_show_form(
@@ -479,7 +488,7 @@ class SwitchableOutputSubentryFlowHandler(ConfigSubentryFlow):
return self.async_update_and_abort(
self._get_entry(),
subconfig_entry,
title=user_input[CONF_NAME],
title=f"{user_input[CONF_NAME]} ({subconfig_entry.data[CONF_SWITCHABLE_OUTPUT_NUMBER]})",
data_updates=user_input,
)

Some files were not shown because too many files have changed in this diff.