Compare commits

..

73 Commits

Author SHA1 Message Date
abmantis
1bf6771a54 Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-11-06 19:57:40 +00:00
Charlie Rusbridger
eb9849c411 Fix wrong BrowseError module in Kodi (#155971) 2025-11-06 19:18:07 +00:00
Joshua Peisach (ItzSwirlz)
93d48fae9d noaa_tides: define constants (#155949) 2025-11-06 19:13:37 +00:00
Matthias Alphart
d90a7b2345 Fix KNX Climate humidity DPT (#155942) 2025-11-06 19:09:51 +00:00
G Johansson
c2f6a364b8 Remove deprecated constant for volt ampere reactive (#155955) 2025-11-06 18:59:11 +00:00
G Johansson
bbadd92ffb Remove deprecated square meters constant (#155954) 2025-11-06 18:58:46 +00:00
Tom Monck JR
6a7de24a04 Fix args passed to check_config script (#155885) 2025-11-06 19:27:53 +02:00
G Johansson
67ccdd36fb Allow template in query in sql (#150287) 2025-11-06 17:11:46 +01:00
Andrea Turri
2ddf55a60d Miele time sensors 3/3 - Add absolute time sensors (#146055)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-11-06 17:09:19 +01:00
TheJulianJES
57e7bc81d4 Bump ZHA to 0.0.78 (#155937) 2025-11-06 16:30:24 +01:00
Erik Montnemery
777f09598f Fix waze_travel_time tests opening sockets (#155902) 2025-11-06 16:09:29 +01:00
G Johansson
81a9ef1df0 Fix spelling in smhi strings (#155951) 2025-11-06 15:54:30 +01:00
Erik Montnemery
d063bc87a1 Fix nam tests opening sockets (#155898) 2025-11-06 15:02:03 +01:00
epenet
5fce08de65 Remove getattr in Tuya find_dpcode function (#155941) 2025-11-06 14:51:35 +01:00
epenet
c0db966afd Move find_dpcode function out of Tuya entity (#155934) 2025-11-06 13:43:07 +01:00
G Johansson
9288995cad Add fans and battery sensor to systemmonitor (#151066) 2025-11-06 13:08:44 +01:00
Erik Montnemery
4d2abb4f65 Fix ezviz tests opening sockets (#155896) 2025-11-06 12:10:33 +01:00
Artur Pragacz
60014b6530 Rename misspelled service python files (#155909) 2025-11-06 09:59:45 +01:00
Erik Montnemery
3b57cab6b4 Revert "Allow opening sockets in logbook tests" (#155899) 2025-11-06 09:20:28 +01:00
Erik Montnemery
967467664b Disable automatic start of HTTP server in tests (#155857) 2025-11-06 08:37:04 +01:00
alexqzd
b87b5cffd8 SmartThings: Expose the entity to control the AC unit beep (#151546) 2025-11-06 07:55:51 +01:00
Artur Pragacz
bb44987af1 Clear dynamic encryption key in ESPHome on remove (#155858) 2025-11-06 02:11:32 +01:00
Christopher Fenner
8d3ef2b224 Add icons for presets in ViCare ventilation entity (#155845) 2025-11-05 20:57:02 +01:00
wollew
5e409295f9 velux: add one more missing data_description (#155854) 2025-11-05 20:56:19 +01:00
J. Nick Koston
530c189f9c Add Bluetooth WiFi provisioning for Shelly (#155822) 2025-11-05 13:20:24 -06:00
giuseppeg88
f05fef9588 Add bad code attempt event to manual alarm control panel (#146315)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-11-05 18:15:27 +00:00
Paulus Schoutsen
a257b5c54c Rename DALI Center to Sunricher DALI (#155865)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-11-05 19:15:07 +01:00
puddly
5b9f7372fc Allow hardware integrations to specify TX power for ZHA (#155855)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-11-05 19:13:54 +01:00
puddly
a4c0a9b3a5 Revert "Fix progress step recursion (#153906)" (#155866) 2025-11-05 18:46:39 +01:00
Bram Kragten
7d65b4c941 Update frontend to 20251105.0 (#155853) 2025-11-05 16:32:06 +01:00
Martin Hjelmare
abd0ee7bce Fix progress step recursion (#153906) 2025-11-05 15:48:35 +01:00
Will Moss
9e3eb20a04 Fix account link no internet on startup (#154579)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-11-05 15:23:20 +01:00
Erik Montnemery
6dc655c3b4 Allow opening sockets in logbook tests (#155840) 2025-11-05 14:58:21 +01:00
Maciej Bieniek
9f595a94fb Check if the Brother printer serial number matches (#155842) 2025-11-05 14:15:46 +01:00
Lukas
5dc215a143 Bump python-pooldose to 0.7.8 (#155307)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
2025-11-05 13:04:49 +00:00
starkillerOG
306b78ba5f Bring Reolink test coverage back to 100% (#155839) 2025-11-05 12:22:44 +01:00
Erik Montnemery
bccb646a07 Create issue to warn against using http.server_host in supervised installs (#155837) 2025-11-05 12:13:56 +01:00
Christopher Fenner
4a5dc8cdd6 Add labels to selector in AndroidTV config flow (#155660) 2025-11-05 12:05:58 +01:00
Erik Montnemery
52a751507a Revert "Deprecate http.server_host option and raise issue if used" (#155834) 2025-11-05 11:26:14 +01:00
wollew
533b9f969d velux: add missing data_descriptions in config flow (#155832) 2025-11-05 11:25:07 +01:00
G Johansson
5de7928bc0 Fix sentence casing in smhi (#155831) 2025-11-05 11:24:52 +01:00
epenet
aad9b07f86 Simplify tuya sensor code (#155835) 2025-11-05 11:24:06 +01:00
Tom Matheussen
3e2c401253 Allow multiple config entries for Satel Integra (#155833) 2025-11-05 11:21:56 +01:00
Bouwe Westerdijk
762e63d042 Bugfix: implement RestoreState and bump backend for Plugwise climate (#155126) 2025-11-05 11:18:15 +01:00
puddly
ec6d40a51c Add progress to ZHA migration steps (#155764)
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2025-11-05 11:10:10 +01:00
Erik Montnemery
47c2c61626 Deprecate http.server_host option and raise issue if used (#155828) 2025-11-05 11:08:49 +01:00
Erik Montnemery
73c941f6c5 Fix ESPHome config entry unload (#155830) 2025-11-05 10:32:29 +01:00
epenet
685edb5f76 Add Tuya test fixtures for cz category (#155827) 2025-11-05 09:54:27 +01:00
G Johansson
5987b6dcb9 Improve code formatting in System monitor (#155800) 2025-11-04 22:09:04 -08:00
Oliver Gründel
cb029e0bb0 Remove state class for rolling window in ecowitt (#155812) 2025-11-04 22:06:15 -08:00
steinmn
553ec35947 Set LG Thinq energy sensor state_class as total_increasing (#155816) 2025-11-04 22:01:38 -08:00
G Johansson
f93940bfa9 Revert "Make influxdb batch settings configurable" (#155808) 2025-11-04 22:00:02 -08:00
Foscam-wangzhengyu
486f93eb28 Bump libpyfoscamcgi to 0.0.9 (#155824) 2025-11-04 21:58:24 -08:00
cdnninja
462db36fef add update platform to vesync (#154915) 2025-11-04 21:40:35 -08:00
Nathan Spencer
485f7f45e8 Bump pylitterbot to 2025.0.0 (#155821) 2025-11-04 18:03:24 -08:00
G Johansson
a446d8a98c Add fire sensors to smhi (#153224) 2025-11-04 17:37:32 -08:00
J. Nick Koston
b4a31fc578 Bump aioshelly to 13.16.0 (#155813) 2025-11-04 22:20:00 +01:00
G Johansson
22321c22cc Bump holidays to 0.84 (#155802) 2025-11-04 22:18:02 +01:00
TheJulianJES
4419c236e2 Add ZHA migration retry steps for unplugged adapters (#155537) 2025-11-04 20:34:51 +01:00
Maciej Bieniek
1731a2534c Implement base entity class for Brother integration (#155714)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-04 20:28:52 +01:00
Bram Kragten
ec0edf47b1 Update frontend to 20251104.0 (#155799) 2025-11-04 14:08:34 -05:00
Tom Matheussen
57c69738e3 Migrate Satel Integra entities unique_id to use config flow entry_id (#154187) 2025-11-04 20:03:08 +01:00
Robert Resch
fb1f258b2b Readd deprecated archs to build wheels (#155792) 2025-11-04 19:30:19 +01:00
puddly
d419dd0c05 Fix non-unique ZHA serial port paths and migrate USB integration to always list unique paths (#155019) 2025-11-04 11:42:56 -05:00
abmantis
e7a7cb829e Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-11-04 12:28:39 +00:00
abmantis
6f6b2f1ad3 Merge branch 'dev_target_triggers_conditions' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-10-15 17:03:28 +01:00
abmantis
1cc4890f75 Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-10-15 17:03:18 +01:00
Bram Kragten
d3dd9b26c9 Fixes for triggers.yaml descriptions (#153841)
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
2025-10-09 18:00:56 +01:00
Abílio Costa
a64d61df05 Fix light trigger with new Trigger class changes (#154087) 2025-10-09 18:14:55 +02:00
abmantis
e7c6c5311d Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-10-09 15:55:39 +01:00
abmantis
72a524c868 Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-09-29 16:56:23 +01:00
abmantis
b437113f31 Merge branch 'dev' of github.com:home-assistant/core into dev_target_triggers_conditions 2025-09-29 11:18:39 +01:00
Abílio Costa
e0e263d3b5 Add state trigger to light component (#148416)
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
2025-09-18 19:53:26 +01:00
216 changed files with 12021 additions and 1626 deletions

View File

@@ -88,6 +88,10 @@ jobs:
fail-fast: false
matrix:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
exclude:
- arch: armv7
- arch: armhf
- arch: i386
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0

CODEOWNERS (generated, 4 lines changed)
View File

@@ -1539,8 +1539,8 @@ build.json @home-assistant/supervisor
/tests/components/suez_water/ @ooii @jb101010-2
/homeassistant/components/sun/ @home-assistant/core
/tests/components/sun/ @home-assistant/core
/homeassistant/components/sunricher_dali_center/ @niracler
/tests/components/sunricher_dali_center/ @niracler
/homeassistant/components/sunricher_dali/ @niracler
/tests/components/sunricher_dali/ @niracler
/homeassistant/components/supla/ @mwegrzynek
/homeassistant/components/surepetcare/ @benleb @danielhiversen
/tests/components/surepetcare/ @benleb @danielhiversen

View File

@@ -1,7 +1,10 @@
image: ghcr.io/home-assistant/{arch}-homeassistant
build_from:
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.10.1
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.10.1
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.10.1
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.10.1
i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.10.1
cosign:
base_identity: https://github.com/home-assistant/docker/.*
identity: https://github.com/home-assistant/core/.*

View File

@@ -39,11 +39,11 @@ from .const import (
CONF_TURN_OFF_COMMAND,
CONF_TURN_ON_COMMAND,
DEFAULT_ADB_SERVER_PORT,
DEFAULT_DEVICE_CLASS,
DEFAULT_EXCLUDE_UNNAMED_APPS,
DEFAULT_GET_SOURCES,
DEFAULT_PORT,
DEFAULT_SCREENCAP_INTERVAL,
DEVICE_AUTO,
DEVICE_CLASSES,
DOMAIN,
PROP_ETHMAC,
@@ -89,8 +89,14 @@ class AndroidTVFlowHandler(ConfigFlow, domain=DOMAIN):
data_schema = vol.Schema(
{
vol.Required(CONF_HOST, default=host): str,
vol.Required(CONF_DEVICE_CLASS, default=DEFAULT_DEVICE_CLASS): vol.In(
DEVICE_CLASSES
vol.Required(CONF_DEVICE_CLASS, default=DEVICE_AUTO): SelectSelector(
SelectSelectorConfig(
options=[
SelectOptionDict(value=k, label=v)
for k, v in DEVICE_CLASSES.items()
],
translation_key="device_class",
)
),
vol.Required(CONF_PORT, default=DEFAULT_PORT): cv.port,
},

View File

@@ -15,15 +15,19 @@ CONF_TURN_OFF_COMMAND = "turn_off_command"
CONF_TURN_ON_COMMAND = "turn_on_command"
DEFAULT_ADB_SERVER_PORT = 5037
DEFAULT_DEVICE_CLASS = "auto"
DEFAULT_EXCLUDE_UNNAMED_APPS = False
DEFAULT_GET_SOURCES = True
DEFAULT_PORT = 5555
DEFAULT_SCREENCAP_INTERVAL = 5
DEVICE_AUTO = "auto"
DEVICE_ANDROIDTV = "androidtv"
DEVICE_FIRETV = "firetv"
DEVICE_CLASSES = [DEFAULT_DEVICE_CLASS, DEVICE_ANDROIDTV, DEVICE_FIRETV]
DEVICE_CLASSES = {
DEVICE_AUTO: "auto",
DEVICE_ANDROIDTV: "Android TV",
DEVICE_FIRETV: "Fire TV",
}
PROP_ETHMAC = "ethmac"
PROP_SERIALNO = "serialno"

View File

@@ -65,6 +65,13 @@
}
}
},
"selector": {
"device_class": {
"options": {
"auto": "Auto-detect device type"
}
}
},
"services": {
"adb_command": {
"description": "Sends an ADB command to an Android / Fire TV device.",

View File

@@ -9,7 +9,7 @@ from brother import Brother, SnmpError
from homeassistant.components.snmp import async_get_snmp_engine
from homeassistant.const import CONF_HOST, CONF_PORT, CONF_TYPE, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryError, ConfigEntryNotReady
from .const import (
CONF_COMMUNITY,
@@ -50,6 +50,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: BrotherConfigEntry) -> b
coordinator = BrotherDataUpdateCoordinator(hass, entry, brother)
await coordinator.async_config_entry_first_refresh()
if brother.serial.lower() != entry.unique_id:
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="serial_mismatch",
translation_placeholders={
"device": entry.title,
},
)
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -0,0 +1,30 @@
"""Define the Brother entity."""
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import BrotherDataUpdateCoordinator
class BrotherPrinterEntity(CoordinatorEntity[BrotherDataUpdateCoordinator]):
"""Define a Brother Printer entity."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: BrotherDataUpdateCoordinator,
) -> None:
"""Initialize."""
super().__init__(coordinator)
self._attr_device_info = DeviceInfo(
configuration_url=f"http://{coordinator.brother.host}/",
identifiers={(DOMAIN, coordinator.brother.serial)},
connections={(CONNECTION_NETWORK_MAC, coordinator.brother.mac)},
serial_number=coordinator.brother.serial,
manufacturer="Brother",
model=coordinator.brother.model,
name=coordinator.brother.model,
sw_version=coordinator.brother.firmware,
)

View File

@@ -19,13 +19,12 @@ from homeassistant.components.sensor import (
from homeassistant.const import PERCENTAGE, EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import BrotherConfigEntry, BrotherDataUpdateCoordinator
from .entity import BrotherPrinterEntity
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -333,12 +332,9 @@ async def async_setup_entry(
)
class BrotherPrinterSensor(
CoordinatorEntity[BrotherDataUpdateCoordinator], SensorEntity
):
"""Define an Brother Printer sensor."""
class BrotherPrinterSensor(BrotherPrinterEntity, SensorEntity):
"""Define a Brother Printer sensor."""
_attr_has_entity_name = True
entity_description: BrotherSensorEntityDescription
def __init__(
@@ -348,16 +344,7 @@ class BrotherPrinterSensor(
) -> None:
"""Initialize."""
super().__init__(coordinator)
self._attr_device_info = DeviceInfo(
configuration_url=f"http://{coordinator.brother.host}/",
identifiers={(DOMAIN, coordinator.brother.serial)},
connections={(CONNECTION_NETWORK_MAC, coordinator.brother.mac)},
serial_number=coordinator.brother.serial,
manufacturer="Brother",
model=coordinator.brother.model,
name=coordinator.brother.model,
sw_version=coordinator.brother.firmware,
)
self._attr_native_value = description.value(coordinator.data)
self._attr_unique_id = f"{coordinator.brother.serial.lower()}_{description.key}"
self.entity_description = description

View File

@@ -207,6 +207,9 @@
"cannot_connect": {
"message": "An error occurred while connecting to the {device} printer: {error}"
},
"serial_mismatch": {
"message": "The serial number for {device} doesn't match the one in the configuration. It's possible that the two Brother printers have swapped IP addresses. Restore the previous IP address configuration or reconfigure the devices with Home Assistant."
},
"update_error": {
"message": "An error occurred while retrieving data from the {device} printer: {error}"
}

View File

@@ -71,8 +71,11 @@ async def _get_services(hass: HomeAssistant) -> list[dict[str, Any]]:
services = await account_link.async_fetch_available_services(
hass.data[DATA_CLOUD]
)
except (aiohttp.ClientError, TimeoutError):
return []
except (aiohttp.ClientError, TimeoutError) as err:
raise config_entry_oauth2_flow.ImplementationUnavailableError(
"Cannot provide OAuth2 implementation for cloud services. "
"Failed to fetch from account link server."
) from err
hass.data[DATA_SERVICES] = services

View File

@@ -151,14 +151,12 @@ ECOWITT_SENSORS_MAPPING: Final = {
key="RAIN_COUNT_MM",
native_unit_of_measurement=UnitOfPrecipitationDepth.MILLIMETERS,
device_class=SensorDeviceClass.PRECIPITATION,
state_class=SensorStateClass.TOTAL,
suggested_display_precision=1,
),
EcoWittSensorTypes.RAIN_COUNT_INCHES: SensorEntityDescription(
key="RAIN_COUNT_INCHES",
native_unit_of_measurement=UnitOfPrecipitationDepth.INCHES,
device_class=SensorDeviceClass.PRECIPITATION,
state_class=SensorStateClass.TOTAL,
suggested_display_precision=2,
),
EcoWittSensorTypes.RAIN_RATE_MM: SensorEntityDescription(

View File

@@ -2,7 +2,9 @@
from __future__ import annotations
from aioesphomeapi import APIClient
import logging
from aioesphomeapi import APIClient, APIConnectionError
from homeassistant.components import zeroconf
from homeassistant.components.bluetooth import async_remove_scanner
@@ -20,9 +22,12 @@ from homeassistant.helpers.typing import ConfigType
from . import assist_satellite, dashboard, ffmpeg_proxy
from .const import CONF_BLUETOOTH_MAC_ADDRESS, CONF_NOISE_PSK, DOMAIN
from .domain_data import DomainData
from .encryption_key_storage import async_get_encryption_key_storage
from .entry_data import ESPHomeConfigEntry, RuntimeEntryData
from .manager import DEVICE_CONFLICT_ISSUE_FORMAT, ESPHomeManager, cleanup_instance
_LOGGER = logging.getLogger(__name__)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
CLIENT_INFO = f"Home Assistant {ha_version}"
@@ -75,10 +80,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ESPHomeConfigEntry) -> b
async def async_unload_entry(hass: HomeAssistant, entry: ESPHomeConfigEntry) -> bool:
"""Unload an esphome config entry."""
entry_data = await cleanup_instance(entry)
return await hass.config_entries.async_unload_platforms(
entry, entry_data.loaded_platforms
unload_ok = await hass.config_entries.async_unload_platforms(
entry, entry.runtime_data.loaded_platforms
)
if unload_ok:
await cleanup_instance(entry)
return unload_ok
async def async_remove_entry(hass: HomeAssistant, entry: ESPHomeConfigEntry) -> None:
@@ -89,3 +96,57 @@ async def async_remove_entry(hass: HomeAssistant, entry: ESPHomeConfigEntry) ->
hass, DOMAIN, DEVICE_CONFLICT_ISSUE_FORMAT.format(entry.entry_id)
)
await DomainData.get(hass).get_or_create_store(hass, entry).async_remove()
await _async_clear_dynamic_encryption_key(hass, entry)
async def _async_clear_dynamic_encryption_key(
hass: HomeAssistant, entry: ESPHomeConfigEntry
) -> None:
"""Clear the dynamic encryption key on the device and from storage."""
if entry.unique_id is None or entry.data.get(CONF_NOISE_PSK) is None:
return
# Only clear the key if it's stored in our storage, meaning it was
# dynamically generated by us and not user-provided
storage = await async_get_encryption_key_storage(hass)
if await storage.async_get_key(entry.unique_id) is None:
return
host: str = entry.data[CONF_HOST]
port: int = entry.data[CONF_PORT]
password: str | None = entry.data[CONF_PASSWORD]
noise_psk: str | None = entry.data.get(CONF_NOISE_PSK)
zeroconf_instance = await zeroconf.async_get_instance(hass)
cli = APIClient(
host,
port,
password,
client_info=CLIENT_INFO,
zeroconf_instance=zeroconf_instance,
noise_psk=noise_psk,
timezone=hass.config.time_zone,
)
try:
await cli.connect()
# Clear the encryption key on the device by passing an empty key
if not await cli.noise_encryption_set_key(b""):
_LOGGER.debug(
"Could not clear dynamic encryption key for ESPHome device %s: Device rejected key removal",
entry.unique_id,
)
return
except APIConnectionError as exc:
_LOGGER.debug(
"Could not connect to ESPHome device %s to clear dynamic encryption key: %s",
entry.unique_id,
exc,
)
return
finally:
await cli.disconnect()
await storage.async_remove_key(entry.unique_id)

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/foscam",
"iot_class": "local_polling",
"loggers": ["libpyfoscamcgi"],
"requirements": ["libpyfoscamcgi==0.0.8"]
"requirements": ["libpyfoscamcgi==0.0.9"]
}

View File

@@ -744,7 +744,9 @@ class ManifestJSONView(HomeAssistantView):
@websocket_api.websocket_command(
{
"type": "frontend/get_icons",
vol.Required("category"): vol.In({"entity", "entity_component", "services"}),
vol.Required("category"): vol.In(
{"entity", "entity_component", "services", "triggers"}
),
vol.Optional("integration"): vol.All(cv.ensure_list, [str]),
}
)
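
For illustration only (not part of the diff): a request that the extended category validator above would now accept. The id is an arbitrary websocket request id and the integration filter is a made-up example.

example_get_icons_msg = {
    "id": 42,  # arbitrary websocket request id
    "type": "frontend/get_icons",
    "category": "triggers",  # newly allowed alongside entity, entity_component, services
    "integration": ["light"],  # optional filter, hypothetical value
}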

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20251103.0"]
"requirements": ["home-assistant-frontend==20251105.0"]
}

View File

@@ -15,9 +15,7 @@ from homeassistant.helpers.storage import Store
from homeassistant.util.hass_dict import HassKey
DATA_STORAGE: HassKey[dict[str, UserStore]] = HassKey("frontend_storage")
DATA_SYSTEM_STORAGE: HassKey[SystemStore] = HassKey("frontend_system_storage")
STORAGE_VERSION_USER_DATA = 1
STORAGE_VERSION_SYSTEM_DATA = 1
async def async_setup_frontend_storage(hass: HomeAssistant) -> None:
@@ -25,9 +23,6 @@ async def async_setup_frontend_storage(hass: HomeAssistant) -> None:
websocket_api.async_register_command(hass, websocket_set_user_data)
websocket_api.async_register_command(hass, websocket_get_user_data)
websocket_api.async_register_command(hass, websocket_subscribe_user_data)
websocket_api.async_register_command(hass, websocket_set_system_data)
websocket_api.async_register_command(hass, websocket_get_system_data)
websocket_api.async_register_command(hass, websocket_subscribe_system_data)
async def async_user_store(hass: HomeAssistant, user_id: str) -> UserStore:
@@ -88,62 +83,6 @@ class _UserStore(Store[dict[str, Any]]):
)
async def async_system_store(hass: HomeAssistant) -> SystemStore:
"""Access the system store."""
if DATA_SYSTEM_STORAGE not in hass.data:
store = hass.data[DATA_SYSTEM_STORAGE] = SystemStore(hass)
await store.async_load()
return hass.data[DATA_SYSTEM_STORAGE]
class SystemStore:
"""System store for frontend data."""
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the system store."""
self._store = _SystemStore(hass)
self.data: dict[str, Any] = {}
self.subscriptions: dict[str | None, list[Callable[[], None]]] = {}
async def async_load(self) -> None:
"""Load the data from the store."""
self.data = await self._store.async_load() or {}
async def async_set_item(self, key: str, value: Any) -> None:
"""Set an item and save the store."""
self.data[key] = value
await self._store.async_save(self.data)
for cb in self.subscriptions.get(None, []):
cb()
for cb in self.subscriptions.get(key, []):
cb()
@callback
def async_subscribe(
self, key: str | None, on_update_callback: Callable[[], None]
) -> Callable[[], None]:
"""Subscribe to store updates."""
self.subscriptions.setdefault(key, []).append(on_update_callback)
def unsubscribe() -> None:
"""Unsubscribe from the store."""
self.subscriptions[key].remove(on_update_callback)
return unsubscribe
class _SystemStore(Store[dict[str, Any]]):
"""System store for frontend data."""
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the system store."""
super().__init__(
hass,
STORAGE_VERSION_SYSTEM_DATA,
"frontend.system_data",
)
def with_user_store(
orig_func: Callable[
[HomeAssistant, ActiveConnection, dict[str, Any], UserStore],
@@ -168,28 +107,6 @@ def with_user_store(
return with_user_store_func
def with_system_store(
orig_func: Callable[
[HomeAssistant, ActiveConnection, dict[str, Any], SystemStore],
Coroutine[Any, Any, None],
],
) -> Callable[
[HomeAssistant, ActiveConnection, dict[str, Any]], Coroutine[Any, Any, None]
]:
"""Decorate function to provide system store."""
@wraps(orig_func)
async def with_system_store_func(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Provide system store to function."""
store = await async_system_store(hass)
await orig_func(hass, connection, msg, store)
return with_system_store_func
@websocket_api.websocket_command(
{
vol.Required("type"): "frontend/set_user_data",
@@ -252,74 +169,3 @@ async def websocket_subscribe_user_data(
connection.subscriptions[msg["id"]] = store.async_subscribe(key, on_data_update)
on_data_update()
connection.send_result(msg["id"])
@websocket_api.websocket_command(
{
vol.Required("type"): "frontend/set_system_data",
vol.Required("key"): str,
vol.Required("value"): vol.Any(bool, str, int, float, dict, list, None),
}
)
@websocket_api.async_response
@with_system_store
async def websocket_set_system_data(
hass: HomeAssistant,
connection: ActiveConnection,
msg: dict[str, Any],
store: SystemStore,
) -> None:
"""Handle set system data command."""
if not connection.user.is_admin:
connection.send_error(msg["id"], "unauthorized", "Admin access required")
return
await store.async_set_item(msg["key"], msg["value"])
connection.send_result(msg["id"])
@websocket_api.websocket_command(
{vol.Required("type"): "frontend/get_system_data", vol.Optional("key"): str}
)
@websocket_api.async_response
@with_system_store
async def websocket_get_system_data(
hass: HomeAssistant,
connection: ActiveConnection,
msg: dict[str, Any],
store: SystemStore,
) -> None:
"""Handle get system data command."""
data = store.data
connection.send_result(
msg["id"], {"value": data.get(msg["key"]) if "key" in msg else data}
)
@websocket_api.websocket_command(
{
vol.Required("type"): "frontend/subscribe_system_data",
vol.Optional("key"): str,
}
)
@websocket_api.async_response
@with_system_store
async def websocket_subscribe_system_data(
hass: HomeAssistant,
connection: ActiveConnection,
msg: dict[str, Any],
store: SystemStore,
) -> None:
"""Handle subscribe to system data command."""
key: str | None = msg.get("key")
def on_data_update() -> None:
"""Handle system data update."""
data = store.data
connection.send_event(
msg["id"], {"value": data.get(key) if key is not None else data}
)
connection.subscriptions[msg["id"]] = store.async_subscribe(key, on_data_update)
on_data_update()
connection.send_result(msg["id"])

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/holiday",
"iot_class": "local_polling",
"requirements": ["holidays==0.83", "babel==2.15.0"]
"requirements": ["holidays==0.84", "babel==2.15.0"]
}

View File

@@ -39,6 +39,8 @@ from .const import (
NABU_CASA_FIRMWARE_RELEASES_URL,
PID,
PRODUCT,
RADIO_TX_POWER_DBM_BY_COUNTRY,
RADIO_TX_POWER_DBM_DEFAULT,
SERIAL_NUMBER,
VID,
)
@@ -103,6 +105,21 @@ class ZBT2FirmwareMixin(ConfigEntryBaseFlow, FirmwareInstallFlowProtocol):
next_step_id="finish_thread_installation",
)
def _extra_zha_hardware_options(self) -> dict[str, Any]:
"""Return extra ZHA hardware options."""
country = self.hass.config.country
if country is None:
tx_power = RADIO_TX_POWER_DBM_DEFAULT
else:
tx_power = RADIO_TX_POWER_DBM_BY_COUNTRY.get(
country, RADIO_TX_POWER_DBM_DEFAULT
)
return {
"tx_power": tx_power,
}
class HomeAssistantConnectZBT2ConfigFlow(
ZBT2FirmwareMixin,

View File

@@ -1,5 +1,7 @@
"""Constants for the Home Assistant Connect ZBT-2 integration."""
from homeassistant.generated.countries import COUNTRIES
DOMAIN = "homeassistant_connect_zbt2"
NABU_CASA_FIRMWARE_RELEASES_URL = (
@@ -17,3 +19,59 @@ VID = "vid"
DEVICE = "device"
HARDWARE_NAME = "Home Assistant Connect ZBT-2"
RADIO_TX_POWER_DBM_DEFAULT = 8
RADIO_TX_POWER_DBM_BY_COUNTRY = {
# EU Member States
"AT": 10,
"BE": 10,
"BG": 10,
"HR": 10,
"CY": 10,
"CZ": 10,
"DK": 10,
"EE": 10,
"FI": 10,
"FR": 10,
"DE": 10,
"GR": 10,
"HU": 10,
"IE": 10,
"IT": 10,
"LV": 10,
"LT": 10,
"LU": 10,
"MT": 10,
"NL": 10,
"PL": 10,
"PT": 10,
"RO": 10,
"SK": 10,
"SI": 10,
"ES": 10,
"SE": 10,
# EEA Members
"IS": 10,
"LI": 10,
"NO": 10,
# Standards harmonized with RED or ETSI
"CH": 10,
"GB": 10,
"TR": 10,
"AL": 10,
"BA": 10,
"GE": 10,
"MD": 10,
"ME": 10,
"MK": 10,
"RS": 10,
"UA": 10,
# Other CEPT nations
"AD": 10,
"AZ": 10,
"MC": 10,
"SM": 10,
"VA": 10,
}
assert set(RADIO_TX_POWER_DBM_BY_COUNTRY) <= COUNTRIES
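
A minimal sketch of how the lookup above resolves, assuming the constants are importable from the integration's const module at the usual components path; the country codes are only examples.

from homeassistant.components.homeassistant_connect_zbt2.const import (
    RADIO_TX_POWER_DBM_BY_COUNTRY,
    RADIO_TX_POWER_DBM_DEFAULT,
)

# Countries in the harmonized table get 10 dBm; anything else, or an unset
# country in the Home Assistant configuration, falls back to the 8 dBm default.
assert RADIO_TX_POWER_DBM_BY_COUNTRY.get("DE", RADIO_TX_POWER_DBM_DEFAULT) == 10
assert RADIO_TX_POWER_DBM_BY_COUNTRY.get("US", RADIO_TX_POWER_DBM_DEFAULT) == 8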

View File

@@ -456,6 +456,10 @@ class BaseFirmwareInstallFlow(ConfigEntryBaseFlow, ABC):
# This step is necessary to prevent `user_input` from being passed through
return await self.async_step_continue_zigbee()
def _extra_zha_hardware_options(self) -> dict[str, Any]:
"""Return extra ZHA hardware options."""
return {}
async def async_step_continue_zigbee(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -478,6 +482,7 @@ class BaseFirmwareInstallFlow(ConfigEntryBaseFlow, ABC):
},
"radio_type": "ezsp",
"flow_strategy": self._zigbee_flow_strategy,
**self._extra_zha_hardware_options(),
},
)
return self._continue_zha_flow(result)

View File

@@ -38,6 +38,7 @@ from homeassistant.const import (
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, issue_registry as ir, storage
from homeassistant.helpers.hassio import is_hassio
from homeassistant.helpers.http import (
KEY_ALLOW_CONFIGURED_CORS,
KEY_AUTHENTICATED, # noqa: F401
@@ -109,7 +110,7 @@ HTTP_SCHEMA: Final = vol.All(
cv.deprecated(CONF_BASE_URL),
vol.Schema(
{
vol.Optional(CONF_SERVER_HOST, default=_DEFAULT_BIND): vol.All(
vol.Optional(CONF_SERVER_HOST): vol.All(
cv.ensure_list, vol.Length(min=1), [cv.string]
),
vol.Optional(CONF_SERVER_PORT, default=SERVER_PORT): cv.port,
@@ -207,7 +208,17 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
if conf is None:
conf = cast(ConfData, HTTP_SCHEMA({}))
server_host = conf[CONF_SERVER_HOST]
if CONF_SERVER_HOST in conf and is_hassio(hass):
ir.async_create_issue(
hass,
DOMAIN,
"server_host_may_break_hassio",
is_fixable=False,
severity=ir.IssueSeverity.ERROR,
translation_key="server_host_may_break_hassio",
)
server_host = conf.get(CONF_SERVER_HOST, _DEFAULT_BIND)
server_port = conf[CONF_SERVER_PORT]
ssl_certificate = conf.get(CONF_SSL_CERTIFICATE)
ssl_peer_certificate = conf.get(CONF_SSL_PEER_CERTIFICATE)

View File

@@ -1,5 +1,9 @@
{
"issues": {
"server_host_may_break_hassio": {
"description": "The `server_host` configuration option in the HTTP integration is prone to break the communication between Home Assistant Core and Supervisor, and will be removed in a future release.\n\nIf you are using this option to bind Home Assistant to specific network interfaces, please remove it from your configuration. Home Assistant will automatically bind to all available interfaces by default.\n\nIf you have specific networking requirements, consider using firewall rules or other network configuration to control access to Home Assistant.",
"title": "The `server_host` HTTP configuration may break Home Assistant Core - Supervisor communication"
},
"ssl_configured_without_configured_urls": {
"description": "Home Assistant detected that SSL has been set up on your instance, however, no custom external internet URL has been set.\n\nThis may result in unexpected behavior. Text-to-speech may fail, and integrations may not be able to connect back to your instance correctly.\n\nTo address this issue, go to Settings > System > Network; under the \"Home Assistant URL\" section, configure your new \"Internet\" and \"Local network\" addresses that match your new SSL configuration.",
"title": "SSL is configured without an external URL or internal URL"

View File

@@ -54,15 +54,14 @@ from homeassistant.helpers.typing import ConfigType
from .const import (
API_VERSION_2,
BATCH_BUFFER_SIZE,
BATCH_TIMEOUT,
CATCHING_UP_MESSAGE,
CLIENT_ERROR_V1,
CLIENT_ERROR_V2,
CODE_INVALID_INPUTS,
COMPONENT_CONFIG_SCHEMA_BATCH,
COMPONENT_CONFIG_SCHEMA_CONNECTION,
CONF_API_VERSION,
CONF_BATCH_BUFFER_SIZE,
CONF_BATCH_TIMEOUT,
CONF_BUCKET,
CONF_COMPONENT_CONFIG,
CONF_COMPONENT_CONFIG_DOMAIN,
@@ -194,12 +193,7 @@ _INFLUX_BASE_SCHEMA = INCLUDE_EXCLUDE_BASE_FILTER_SCHEMA.extend(
)
INFLUX_SCHEMA = vol.All(
_INFLUX_BASE_SCHEMA.extend(
{
**COMPONENT_CONFIG_SCHEMA_CONNECTION,
**COMPONENT_CONFIG_SCHEMA_BATCH,
}
),
_INFLUX_BASE_SCHEMA.extend(COMPONENT_CONFIG_SCHEMA_CONNECTION),
validate_version_specific_config,
create_influx_url,
)
@@ -502,9 +496,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
event_to_json = _generate_event_to_json(conf)
max_tries = conf.get(CONF_RETRY_COUNT)
instance = hass.data[DOMAIN] = InfluxThread(
hass, influx, event_to_json, max_tries, conf
)
instance = hass.data[DOMAIN] = InfluxThread(hass, influx, event_to_json, max_tries)
instance.start()
def shutdown(event):
@@ -521,7 +513,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
class InfluxThread(threading.Thread):
"""A threaded event handler class."""
def __init__(self, hass, influx, event_to_json, max_tries, config):
def __init__(self, hass, influx, event_to_json, max_tries):
"""Initialize the listener."""
threading.Thread.__init__(self, name=DOMAIN)
self.queue: queue.SimpleQueue[threading.Event | tuple[float, Event] | None] = (
@@ -532,8 +524,6 @@ class InfluxThread(threading.Thread):
self.max_tries = max_tries
self.write_errors = 0
self.shutdown = False
self._batch_timeout = config[CONF_BATCH_TIMEOUT]
self.batch_buffer_size = config[CONF_BATCH_BUFFER_SIZE]
hass.bus.listen(EVENT_STATE_CHANGED, self._event_listener)
@callback
@@ -542,31 +532,23 @@ class InfluxThread(threading.Thread):
item = (time.monotonic(), event)
self.queue.put(item)
@property
def batch_timeout(self):
@staticmethod
def batch_timeout():
"""Return number of seconds to wait for more events."""
return self._batch_timeout
return BATCH_TIMEOUT
def get_events_json(self):
"""Return a batch of events formatted for writing."""
queue_seconds = QUEUE_BACKLOG_SECONDS + self.max_tries * RETRY_DELAY
start_time = time.monotonic()
batch_timeout = self.batch_timeout()
count = 0
json = []
dropped = 0
with suppress(queue.Empty):
while len(json) < self.batch_buffer_size and not self.shutdown:
if count > 0 and time.monotonic() - start_time >= batch_timeout:
break
timeout = (
None
if count == 0
else batch_timeout - (time.monotonic() - start_time)
)
while len(json) < BATCH_BUFFER_SIZE and not self.shutdown:
timeout = None if count == 0 else self.batch_timeout()
item = self.queue.get(timeout=timeout)
count += 1

View File

@@ -47,9 +47,6 @@ CONF_FUNCTION = "function"
CONF_QUERY = "query"
CONF_IMPORTS = "imports"
CONF_BATCH_BUFFER_SIZE = "batch_buffer_size"
CONF_BATCH_TIMEOUT = "batch_timeout"
DEFAULT_DATABASE = "home_assistant"
DEFAULT_HOST_V2 = "us-west-2-1.aws.cloud2.influxdata.com"
DEFAULT_SSL_V2 = True
@@ -63,9 +60,6 @@ DEFAULT_RANGE_STOP = "now()"
DEFAULT_FUNCTION_FLUX = "|> limit(n: 1)"
DEFAULT_MEASUREMENT_ATTR = "unit_of_measurement"
DEFAULT_BATCH_BUFFER_SIZE = 100
DEFAULT_BATCH_TIMEOUT = 1
INFLUX_CONF_MEASUREMENT = "measurement"
INFLUX_CONF_TAGS = "tags"
INFLUX_CONF_TIME = "time"
@@ -82,6 +76,8 @@ TIMEOUT = 10 # seconds
RETRY_DELAY = 20
QUEUE_BACKLOG_SECONDS = 30
RETRY_INTERVAL = 60 # seconds
BATCH_TIMEOUT = 1
BATCH_BUFFER_SIZE = 100
LANGUAGE_INFLUXQL = "influxQL"
LANGUAGE_FLUX = "flux"
TEST_QUERY_V1 = "SHOW DATABASES;"
@@ -156,10 +152,3 @@ COMPONENT_CONFIG_SCHEMA_CONNECTION = {
vol.Inclusive(CONF_ORG, "v2_authentication"): cv.string,
vol.Optional(CONF_BUCKET, default=DEFAULT_BUCKET): cv.string,
}
COMPONENT_CONFIG_SCHEMA_BATCH = {
vol.Optional(
CONF_BATCH_BUFFER_SIZE, default=DEFAULT_BATCH_BUFFER_SIZE
): cv.positive_int,
vol.Optional(CONF_BATCH_TIMEOUT, default=DEFAULT_BATCH_TIMEOUT): cv.positive_float,
}

View File

@@ -359,7 +359,7 @@ CLIMATE_KNX_SCHEMA = vol.Schema(
write=False, state_required=True, valid_dpt="9.001"
),
vol.Optional(CONF_GA_HUMIDITY_CURRENT): GASelector(
write=False, valid_dpt="9.002"
write=False, valid_dpt="9.007"
),
vol.Required(CONF_TARGET_TEMPERATURE): GroupSelect(
GroupSelectOption(

View File

@@ -221,7 +221,7 @@ async def library_payload(hass):
for child in library_info.children:
child.thumbnail = "https://brands.home-assistant.io/_/kodi/logo.png"
with contextlib.suppress(media_source.BrowseError):
with contextlib.suppress(BrowseError):
item = await media_source.async_browse_media(
hass, None, content_filter=media_source_content_filter
)

View File

@@ -622,6 +622,7 @@ ENERGY_USAGE_SENSORS: tuple[ThinQEnergySensorEntityDescription, ...] = (
usage_period=USAGE_MONTHLY,
start_date_fn=lambda today: today,
end_date_fn=lambda today: today,
state_class=SensorStateClass.TOTAL_INCREASING,
),
ThinQEnergySensorEntityDescription(
key="last_month",

View File

@@ -25,5 +25,10 @@
"turn_on": {
"service": "mdi:lightbulb-on"
}
},
"triggers": {
"state": {
"trigger": "mdi:state-machine"
}
}
}

View File

@@ -132,6 +132,13 @@
}
},
"selector": {
"behavior": {
"options": {
"any": "Any",
"first": "First",
"last": "Last"
}
},
"color_name": {
"options": {
"aliceblue": "Alice blue",
@@ -289,6 +296,12 @@
"long": "Long",
"short": "Short"
}
},
"state": {
"options": {
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
}
},
"services": {
@@ -462,5 +475,22 @@
}
}
},
"title": "Light"
"title": "Light",
"triggers": {
"state": {
"description": "When the state of a light changes, such as turning on or off.",
"description_configured": "When the state of a light changes",
"fields": {
"behavior": {
"description": "The behavior of the targeted entities to trigger on.",
"name": "Behavior"
},
"state": {
"description": "The state to trigger on.",
"name": "State"
}
},
"name": "State"
}
}
}

View File

@@ -0,0 +1,152 @@
"""Provides triggers for lights."""
from typing import TYPE_CHECKING, Final, cast, override
import voluptuous as vol
from homeassistant.const import (
ATTR_ENTITY_ID,
CONF_STATE,
CONF_TARGET,
STATE_OFF,
STATE_ON,
)
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback, split_entity_id
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.event import process_state_match
from homeassistant.helpers.target import (
TargetStateChangedData,
async_track_target_selector_state_change_event,
)
from homeassistant.helpers.trigger import Trigger, TriggerActionRunner, TriggerConfig
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN
# remove when #151314 is merged
CONF_OPTIONS: Final = "options"
ATTR_BEHAVIOR: Final = "behavior"
BEHAVIOR_FIRST: Final = "first"
BEHAVIOR_LAST: Final = "last"
BEHAVIOR_ANY: Final = "any"
STATE_PLATFORM_TYPE: Final = "state"
STATE_TRIGGER_SCHEMA = vol.Schema(
{
vol.Required(CONF_OPTIONS): {
vol.Required(CONF_STATE): vol.In([STATE_ON, STATE_OFF]),
vol.Required(ATTR_BEHAVIOR, default=BEHAVIOR_ANY): vol.In(
[BEHAVIOR_FIRST, BEHAVIOR_LAST, BEHAVIOR_ANY]
),
},
vol.Required(CONF_TARGET): cv.TARGET_FIELDS,
}
)
class StateTrigger(Trigger):
"""Trigger for state changes."""
@override
@classmethod
async def async_validate_config(
cls, hass: HomeAssistant, config: ConfigType
) -> ConfigType:
"""Validate config."""
return cast(ConfigType, STATE_TRIGGER_SCHEMA(config))
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:
"""Initialize the state trigger."""
super().__init__(hass, config)
if TYPE_CHECKING:
assert config.options is not None
assert config.target is not None
self._options = config.options
self._target = config.target
@override
async def async_attach_runner(
self, run_action: TriggerActionRunner
) -> CALLBACK_TYPE:
"""Attach the trigger to an action runner."""
match_config_state = process_state_match(self._options.get(CONF_STATE))
def check_all_match(entity_ids: set[str]) -> bool:
"""Check if all entity states match."""
return all(
match_config_state(state.state)
for entity_id in entity_ids
if (state := self._hass.states.get(entity_id)) is not None
)
def check_one_match(entity_ids: set[str]) -> bool:
"""Check that only one entity state matches."""
return (
sum(
match_config_state(state.state)
for entity_id in entity_ids
if (state := self._hass.states.get(entity_id)) is not None
)
== 1
)
behavior = self._options.get(ATTR_BEHAVIOR)
@callback
def state_change_listener(
target_state_change_data: TargetStateChangedData,
) -> None:
"""Listen for state changes and call action."""
event = target_state_change_data.state_change_event
entity_id = event.data["entity_id"]
from_state = event.data["old_state"]
to_state = event.data["new_state"]
if to_state is None:
return
# This check is required for "first" behavior, to check that it went from zero
# entities matching the state to one. Otherwise, if previously there were two
# entities on CONF_STATE and one changed, this would trigger.
# For "last" behavior it is not required, but serves as a quicker fail check.
if not match_config_state(to_state.state):
return
if behavior == BEHAVIOR_LAST:
if not check_all_match(target_state_change_data.targeted_entity_ids):
return
elif behavior == BEHAVIOR_FIRST:
if not check_one_match(target_state_change_data.targeted_entity_ids):
return
run_action(
{
ATTR_ENTITY_ID: entity_id,
"from_state": from_state,
"to_state": to_state,
},
f"state of {entity_id}",
event.context,
)
def entity_filter(entities: set[str]) -> set[str]:
"""Filter entities of this domain."""
return {
entity_id
for entity_id in entities
if split_entity_id(entity_id)[0] == DOMAIN
}
return async_track_target_selector_state_change_event(
self._hass, self._target, state_change_listener, entity_filter
)
TRIGGERS: dict[str, type[Trigger]] = {
STATE_PLATFORM_TYPE: StateTrigger,
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for lights."""
return TRIGGERS
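
As a rough sketch only: an options/target payload that STATE_TRIGGER_SCHEMA above should accept. The entity IDs are hypothetical; with behavior "first" the action runs only when the first of the targeted lights reaches the configured state.

example_light_state_trigger = {
    "options": {
        "state": "on",        # STATE_ON
        "behavior": "first",  # any | first | last
    },
    "target": {
        "entity_id": ["light.kitchen", "light.hallway"],  # hypothetical entities
    },
}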

View File

@@ -0,0 +1,24 @@
state:
target:
entity:
domain: light
fields:
state:
required: true
default: "on"
selector:
select:
options:
- "off"
- "on"
translation_key: state
behavior:
required: true
default: any
selector:
select:
options:
- first
- last
- any
translation_key: behavior

View File

@@ -13,5 +13,5 @@
"iot_class": "cloud_push",
"loggers": ["pylitterbot"],
"quality_scale": "bronze",
"requirements": ["pylitterbot==2024.2.7"]
"requirements": ["pylitterbot==2025.0.0"]
}

View File

@@ -408,6 +408,20 @@ class ManualAlarm(AlarmControlPanelEntity, RestoreEntity):
if not alarm_code or code == alarm_code:
return
current_context = (
self._context if hasattr(self, "_context") and self._context else None
)
user_id_from_context = current_context.user_id if current_context else None
self.hass.bus.async_fire(
"manual_alarm_bad_code_attempt",
{
"entity_id": self.entity_id,
"user_id": user_id_from_context,
"target_state": state,
},
)
raise ServiceValidationError(
"Invalid alarm code provided",
translation_domain=DOMAIN,

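A hedged sketch of how custom code could consume the new event; the function names are illustrative, and hass is assumed to be a running HomeAssistant instance.

import logging

from homeassistant.core import Event, HomeAssistant, callback

_LOGGER = logging.getLogger(__name__)

@callback
def _handle_bad_code_attempt(event: Event) -> None:
    # The data keys mirror the payload fired above.
    _LOGGER.warning(
        "Bad alarm code on %s (user: %s, target state: %s)",
        event.data["entity_id"],
        event.data["user_id"],
        event.data["target_state"],
    )

def async_setup_bad_code_listener(hass: HomeAssistant) -> None:
    # Illustrative helper: subscribe to the event fired on a bad code attempt.
    hass.bus.async_listen("manual_alarm_bad_code_attempt", _handle_bad_code_attempt)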
View File

@@ -41,6 +41,9 @@
"energy_forecast": {
"default": "mdi:lightning-bolt-outline"
},
"finish": {
"default": "mdi:clock-end"
},
"plate": {
"default": "mdi:circle-outline",
"state": {
@@ -83,6 +86,9 @@
"spin_speed": {
"default": "mdi:sync"
},
"start": {
"default": "mdi:clock-start"
},
"start_time": {
"default": "mdi:clock-start"
},

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from collections.abc import Callable, Mapping
from dataclasses import dataclass
from datetime import datetime, timedelta
import logging
from typing import Any, Final, cast
@@ -29,6 +30,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.util import dt as dt_util
from .const import (
COFFEE_SYSTEM_PROFILE,
@@ -102,12 +104,47 @@ def _get_coffee_profile(value: MieleDevice) -> str | None:
return None
def _convert_start_timestamp(
elapsed_time_list: list[int], start_time_list: list[int]
) -> datetime | None:
"""Convert raw values representing time into start timestamp."""
now = dt_util.utcnow()
elapsed_duration = _convert_duration(elapsed_time_list)
delayed_start_duration = _convert_duration(start_time_list)
if (elapsed_duration is None or elapsed_duration == 0) and (
delayed_start_duration is None or delayed_start_duration == 0
):
return None
if elapsed_duration is not None and elapsed_duration > 0:
duration = -elapsed_duration
elif delayed_start_duration is not None and delayed_start_duration > 0:
duration = delayed_start_duration
delta = timedelta(minutes=duration)
return (now + delta).replace(second=0, microsecond=0)
def _convert_finish_timestamp(
remaining_time_list: list[int], start_time_list: list[int]
) -> datetime | None:
"""Convert raw values representing time into finish timestamp."""
now = dt_util.utcnow()
program_duration = _convert_duration(remaining_time_list)
delayed_start_duration = _convert_duration(start_time_list)
if program_duration is None or program_duration == 0:
return None
duration = program_duration + (
delayed_start_duration if delayed_start_duration is not None else 0
)
delta = timedelta(minutes=duration)
return (now + delta).replace(second=0, microsecond=0)
@dataclass(frozen=True, kw_only=True)
class MieleSensorDescription(SensorEntityDescription):
"""Class describing Miele sensor entities."""
value_fn: Callable[[MieleDevice], StateType]
end_value_fn: Callable[[StateType], StateType] | None = None
value_fn: Callable[[MieleDevice], StateType | datetime]
end_value_fn: Callable[[StateType | datetime], StateType | datetime] | None = None
extra_attributes: dict[str, Callable[[MieleDevice], StateType]] | None = None
zone: int | None = None
unique_id_fn: Callable[[str, MieleSensorDescription], str] | None = None
@@ -428,6 +465,60 @@ SENSOR_TYPES: Final[tuple[MieleSensorDefinition, ...]] = (
suggested_unit_of_measurement=UnitOfTime.HOURS,
),
),
MieleSensorDefinition(
types=(
MieleAppliance.WASHING_MACHINE,
MieleAppliance.WASHING_MACHINE_SEMI_PROFESSIONAL,
MieleAppliance.TUMBLE_DRYER,
MieleAppliance.TUMBLE_DRYER_SEMI_PROFESSIONAL,
MieleAppliance.DISHWASHER,
MieleAppliance.OVEN,
MieleAppliance.OVEN_MICROWAVE,
MieleAppliance.STEAM_OVEN,
MieleAppliance.MICROWAVE,
MieleAppliance.ROBOT_VACUUM_CLEANER,
MieleAppliance.WASHER_DRYER,
MieleAppliance.STEAM_OVEN_COMBI,
MieleAppliance.STEAM_OVEN_MICRO,
MieleAppliance.DIALOG_OVEN,
MieleAppliance.STEAM_OVEN_MK2,
),
description=MieleSensorDescription(
key="state_finish_timestamp",
translation_key="finish",
value_fn=lambda value: _convert_finish_timestamp(
value.state_remaining_time, value.state_start_time
),
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(
MieleAppliance.WASHING_MACHINE,
MieleAppliance.TUMBLE_DRYER,
MieleAppliance.DISHWASHER,
MieleAppliance.OVEN,
MieleAppliance.OVEN_MICROWAVE,
MieleAppliance.STEAM_OVEN,
MieleAppliance.MICROWAVE,
MieleAppliance.WASHER_DRYER,
MieleAppliance.STEAM_OVEN_COMBI,
MieleAppliance.STEAM_OVEN_MICRO,
MieleAppliance.DIALOG_OVEN,
MieleAppliance.ROBOT_VACUUM_CLEANER,
MieleAppliance.STEAM_OVEN_MK2,
),
description=MieleSensorDescription(
key="state_start_timestamp",
translation_key="start",
value_fn=lambda value: _convert_start_timestamp(
value.state_elapsed_time, value.state_start_time
),
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(
MieleAppliance.TUMBLE_DRYER_SEMI_PROFESSIONAL,
@@ -620,6 +711,8 @@ async def async_setup_entry(
"state_elapsed_time": MieleTimeSensor,
"state_remaining_time": MieleTimeSensor,
"state_start_time": MieleTimeSensor,
"state_start_timestamp": MieleAbsoluteTimeSensor,
"state_finish_timestamp": MieleAbsoluteTimeSensor,
"current_energy_consumption": MieleConsumptionSensor,
"current_water_consumption": MieleConsumptionSensor,
}.get(definition.description.key, MieleSensor)
@@ -743,7 +836,7 @@ class MieleSensor(MieleEntity, SensorEntity):
self._attr_unique_id = description.unique_id_fn(device_id, description)
@property
def native_value(self) -> StateType:
def native_value(self) -> StateType | datetime:
"""Return the state of the sensor."""
return self.entity_description.value_fn(self.device)
@@ -761,7 +854,7 @@ class MieleSensor(MieleEntity, SensorEntity):
class MieleRestorableSensor(MieleSensor, RestoreSensor):
"""Representation of a Sensor whose internal state can be restored."""
_attr_native_value: StateType
_attr_native_value: StateType | datetime
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
@@ -773,7 +866,7 @@ class MieleRestorableSensor(MieleSensor, RestoreSensor):
self._attr_native_value = last_data.native_value # type: ignore[assignment]
@property
def native_value(self) -> StateType:
def native_value(self) -> StateType | datetime:
"""Return the state of the sensor.
It is necessary to override `native_value` to fall back to the default
@@ -934,6 +1027,40 @@ class MieleTimeSensor(MieleRestorableSensor):
self._attr_native_value = current_value
class MieleAbsoluteTimeSensor(MieleRestorableSensor):
"""Representation of absolute time sensors handling precision correctness."""
_previous_value: StateType | datetime = None
def _update_native_value(self) -> None:
"""Update the last value of the sensor."""
current_value = self.entity_description.value_fn(self.device)
current_status = StateStatus(self.device.state_status)
# The API reports with minute precision, to avoid changing
# the value too often, we keep the cached value if it differs
# less than 90s from the new value
if (
isinstance(self._previous_value, datetime)
and isinstance(current_value, datetime)
and (
self._previous_value - timedelta(seconds=90)
< current_value
< self._previous_value + timedelta(seconds=90)
)
) or current_status == StateStatus.PROGRAM_ENDED:
return
# force unknown when appliance is not working (some devices are keeping last value until a new cycle starts)
if current_status in (StateStatus.OFF, StateStatus.ON, StateStatus.IDLE):
self._attr_native_value = None
# otherwise, cache value and return it
else:
self._attr_native_value = current_value
self._previous_value = current_value
class MieleConsumptionSensor(MieleRestorableSensor):
"""Representation of consumption sensors keeping state from cache."""

View File

@@ -216,6 +216,9 @@
"energy_forecast": {
"name": "Energy forecast"
},
"finish": {
"name": "Finish"
},
"plate": {
"name": "Plate {plate_no}",
"state": {
@@ -1015,6 +1018,9 @@
"spin_speed": {
"name": "Spin speed"
},
"start": {
"name": "Start"
},
"start_time": {
"name": "Start in"
},

View File

@@ -26,8 +26,8 @@ from homeassistant.helpers.issue_registry import (
async_delete_issue,
)
from .actions import get_music_assistant_client, register_actions
from .const import ATTR_CONF_EXPOSE_PLAYER_TO_HA, DOMAIN, LOGGER
from .services import get_music_assistant_client, register_actions
if TYPE_CHECKING:
from music_assistant_models.event import MassEvent

View File

@@ -0,0 +1,11 @@
"""Constants for the NOAA Tides integration."""
from datetime import timedelta
CONF_STATION_ID = "station_id"
DEFAULT_NAME = "NOAA Tides"
DEFAULT_PREDICTION_LENGTH = timedelta(days=2)
DEFAULT_TIMEZONE = "lst_ldt"
ATTRIBUTION = "Data provided by NOAA"

View File

@@ -2,7 +2,7 @@
from __future__ import annotations
from datetime import datetime, timedelta
from datetime import datetime
import logging
from typing import TYPE_CHECKING, Any, Literal, TypedDict
@@ -22,6 +22,13 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import (
ATTRIBUTION,
CONF_STATION_ID,
DEFAULT_NAME,
DEFAULT_PREDICTION_LENGTH,
DEFAULT_TIMEZONE,
)
from .helpers import get_station_unique_id
if TYPE_CHECKING:
@@ -29,13 +36,6 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
CONF_STATION_ID = "station_id"
DEFAULT_NAME = "NOAA Tides"
DEFAULT_TIMEZONE = "lst_ldt"
SCAN_INTERVAL = timedelta(minutes=60)
TIMEZONES = ["gmt", "lst", "lst_ldt"]
UNIT_SYSTEMS = ["english", "metric"]
@@ -63,9 +63,9 @@ def setup_platform(
if CONF_UNIT_SYSTEM in config:
unit_system = config[CONF_UNIT_SYSTEM]
elif hass.config.units is METRIC_SYSTEM:
unit_system = UNIT_SYSTEMS[1]
unit_system = "metric"
else:
unit_system = UNIT_SYSTEMS[0]
unit_system = "english"
try:
station = coops.Station(station_id, unit_system)
@@ -97,7 +97,7 @@ class NOAATidesData(TypedDict):
class NOAATidesAndCurrentsSensor(SensorEntity):
"""Representation of a NOAA Tides and Currents sensor."""
_attr_attribution = "Data provided by NOAA"
_attr_attribution = ATTRIBUTION
def __init__(self, name, station_id, timezone, unit_system, station) -> None:
"""Initialize the sensor."""
@@ -141,8 +141,8 @@ class NOAATidesAndCurrentsSensor(SensorEntity):
return attr
@property
def native_value(self):
"""Return the state of the device."""
def native_value(self) -> str | None:
"""Return the state."""
if self.data is None:
return None
api_time = self.data["time_stamp"][0]
@@ -157,8 +157,7 @@ class NOAATidesAndCurrentsSensor(SensorEntity):
def update(self) -> None:
"""Get the latest data from NOAA Tides and Currents API."""
begin = datetime.now()
delta = timedelta(days=2)
end = begin + delta
end = begin + DEFAULT_PREDICTION_LENGTH
try:
df_predictions = self._station.get_data(
begin_date=begin.strftime("%Y%m%d %H:%M"),

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from homeassistant.components.climate import (
@@ -13,18 +14,44 @@ from homeassistant.components.climate import (
HVACAction,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.const import ATTR_TEMPERATURE, STATE_OFF, STATE_ON, UnitOfTemperature
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import ExtraStoredData, RestoreEntity
from .const import DOMAIN, MASTER_THERMOSTATS
from .coordinator import PlugwiseConfigEntry, PlugwiseDataUpdateCoordinator
from .entity import PlugwiseEntity
from .util import plugwise_command
ERROR_NO_SCHEDULE = "set_schedule_first"
PARALLEL_UPDATES = 0
@dataclass
class PlugwiseClimateExtraStoredData(ExtraStoredData):
"""Object to hold extra stored data."""
last_active_schedule: str | None
previous_action_mode: str | None
def as_dict(self) -> dict[str, Any]:
"""Return a dict representation of the text data."""
return {
"last_active_schedule": self.last_active_schedule,
"previous_action_mode": self.previous_action_mode,
}
@classmethod
def from_dict(cls, restored: dict[str, Any]) -> PlugwiseClimateExtraStoredData:
"""Initialize a stored data object from a dict."""
return cls(
last_active_schedule=restored.get("last_active_schedule"),
previous_action_mode=restored.get("previous_action_mode"),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: PlugwiseConfigEntry,
@@ -56,14 +83,26 @@ async def async_setup_entry(
entry.async_on_unload(coordinator.async_add_listener(_add_entities))
class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity, RestoreEntity):
"""Representation of a Plugwise thermostat."""
_attr_name = None
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_translation_key = DOMAIN
_previous_mode: str = "heating"
_last_active_schedule: str | None = None
_previous_action_mode: str | None = HVACAction.HEATING.value
async def async_added_to_hass(self) -> None:
"""Run when entity about to be added."""
await super().async_added_to_hass()
if extra_data := await self.async_get_last_extra_data():
plugwise_extra_data = PlugwiseClimateExtraStoredData.from_dict(
extra_data.as_dict()
)
self._last_active_schedule = plugwise_extra_data.last_active_schedule
self._previous_action_mode = plugwise_extra_data.previous_action_mode
def __init__(
self,
@@ -76,7 +115,6 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
gateway_id: str = coordinator.api.gateway_id
self._gateway_data = coordinator.data[gateway_id]
self._location = device_id
if (location := self.device.get("location")) is not None:
self._location = location
@@ -105,25 +143,19 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
self.device["thermostat"]["resolution"], 0.1
)
def _previous_action_mode(self, coordinator: PlugwiseDataUpdateCoordinator) -> None:
"""Return the previous action-mode when the regulation-mode is not heating or cooling.
Helper for set_hvac_mode().
"""
# When no cooling available, _previous_mode is always heating
if (
"regulation_modes" in self._gateway_data
and "cooling" in self._gateway_data["regulation_modes"]
):
mode = self._gateway_data["select_regulation_mode"]
if mode in ("cooling", "heating"):
self._previous_mode = mode
@property
def current_temperature(self) -> float:
"""Return the current temperature."""
return self.device["sensors"]["temperature"]
@property
def extra_restore_state_data(self) -> PlugwiseClimateExtraStoredData:
"""Return text specific state data to be restored."""
return PlugwiseClimateExtraStoredData(
last_active_schedule=self._last_active_schedule,
previous_action_mode=self._previous_action_mode,
)
@property
def target_temperature(self) -> float:
"""Return the temperature we try to reach.
@@ -170,9 +202,10 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
if self.coordinator.api.cooling_present:
if "regulation_modes" in self._gateway_data:
if self._gateway_data["select_regulation_mode"] == "cooling":
selected = self._gateway_data.get("select_regulation_mode")
if selected == HVACAction.COOLING.value:
hvac_modes.append(HVACMode.COOL)
if self._gateway_data["select_regulation_mode"] == "heating":
if selected == HVACAction.HEATING.value:
hvac_modes.append(HVACMode.HEAT)
else:
hvac_modes.append(HVACMode.HEAT_COOL)
@@ -184,8 +217,16 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
@property
def hvac_action(self) -> HVACAction:
"""Return the current running hvac operation if supported."""
# Keep track of the previous action-mode
self._previous_action_mode(self.coordinator)
# Keep track of the previous hvac_action mode.
# When no cooling available, _previous_action_mode is always heating
if (
"regulation_modes" in self._gateway_data
and HVACAction.COOLING.value in self._gateway_data["regulation_modes"]
):
mode = self._gateway_data["select_regulation_mode"]
if mode in (HVACAction.COOLING.value, HVACAction.HEATING.value):
self._previous_action_mode = mode
if (action := self.device.get("control_state")) is not None:
return HVACAction(action)
@@ -219,14 +260,33 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity):
return
if hvac_mode == HVACMode.OFF:
await self.coordinator.api.set_regulation_mode(hvac_mode)
await self.coordinator.api.set_regulation_mode(hvac_mode.value)
else:
current = self.device.get("select_schedule")
desired = current
# Capture the last valid schedule
if desired and desired != "off":
self._last_active_schedule = desired
elif desired == "off":
desired = self._last_active_schedule
# Enabling HVACMode.AUTO requires a schedule: either the currently selected one or the last saved one
if hvac_mode == HVACMode.AUTO and not desired:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key=ERROR_NO_SCHEDULE,
)
await self.coordinator.api.set_schedule_state(
self._location,
"on" if hvac_mode == HVACMode.AUTO else "off",
STATE_ON if hvac_mode == HVACMode.AUTO else STATE_OFF,
desired,
)
if self.hvac_mode == HVACMode.OFF:
await self.coordinator.api.set_regulation_mode(self._previous_mode)
if self.hvac_mode == HVACMode.OFF and self._previous_action_mode:
await self.coordinator.api.set_regulation_mode(
self._previous_action_mode
)
@plugwise_command
async def async_set_preset_mode(self, preset_mode: str) -> None:
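The restore path above relies on Home Assistant's RestoreEntity extra-data mechanism: the entity exposes extra_restore_state_data, and on re-add the stored dict is rebuilt via from_dict. A minimal sketch of that round trip, reusing the dataclass defined above with hypothetical values:

# Sketch only: round-trip PlugwiseClimateExtraStoredData through its dict form.
stored = PlugwiseClimateExtraStoredData(
    last_active_schedule="Weekschema",  # hypothetical schedule name
    previous_action_mode="cooling",
)
data = stored.as_dict()
# {"last_active_schedule": "Weekschema", "previous_action_mode": "cooling"}
restored = PlugwiseClimateExtraStoredData.from_dict(data)
assert restored == stored  # dataclass equality: both fields survive the round trip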

View File

@@ -8,6 +8,6 @@
"iot_class": "local_polling",
"loggers": ["plugwise"],
"quality_scale": "platinum",
"requirements": ["plugwise==1.8.2"],
"requirements": ["plugwise==1.8.3"],
"zeroconf": ["_plugwise._tcp.local."]
}

View File

@@ -314,6 +314,9 @@
"invalid_xml_data": {
"message": "[%key:component::plugwise::config::error::response_error%]"
},
"set_schedule_first": {
"message": "Failed setting HVACMode, set a schedule first."
},
"unsupported_firmware": {
"message": "[%key:component::plugwise::config::error::unsupported%]"
}

View File

@@ -3,13 +3,15 @@
from __future__ import annotations
import logging
from typing import Any
from pooldose.client import PooldoseClient
from pooldose.request_status import RequestStatus
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import entity_registry as er
from .coordinator import PooldoseConfigEntry, PooldoseCoordinator
@@ -18,6 +20,36 @@ _LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [Platform.SENSOR]
async def async_migrate_entry(hass: HomeAssistant, entry: PooldoseConfigEntry) -> bool:
"""Migrate old entry."""
# Version 1.1 -> 1.2: Migrate entity unique IDs
# - ofa_orp_value -> ofa_orp_time
# - ofa_ph_value -> ofa_ph_time
if entry.version == 1 and entry.minor_version < 2:
@callback
def migrate_unique_id(entity_entry: er.RegistryEntry) -> dict[str, Any] | None:
"""Migrate entity unique IDs for pooldose sensors."""
new_unique_id = entity_entry.unique_id
# Check if this entry needs migration
if "_ofa_orp_value" in new_unique_id:
new_unique_id = new_unique_id.replace("_ofa_orp_value", "_ofa_orp_time")
elif "_ofa_ph_value" in new_unique_id:
new_unique_id = new_unique_id.replace("_ofa_ph_value", "_ofa_ph_time")
else:
# No migration needed
return None
return {"new_unique_id": new_unique_id}
await er.async_migrate_entries(hass, entry.entry_id, migrate_unique_id)
hass.config_entries.async_update_entry(entry, version=1, minor_version=2)
return True
async def async_setup_entry(hass: HomeAssistant, entry: PooldoseConfigEntry) -> bool:
"""Set up Seko PoolDose from a config entry."""
# Get host from config entry data (connection-critical configuration)
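The callback passed to er.async_migrate_entries follows a simple contract: return None to leave a registry entry untouched, or a dict of registry fields (here only new_unique_id) to update it. A standalone sketch of the mapping applied above, with hypothetical unique IDs:

# Sketch only: the unique-ID mapping used by the migration above.
def map_unique_id(unique_id: str) -> str | None:
    """Return the migrated unique ID, or None when no change is needed."""
    if "_ofa_orp_value" in unique_id:
        return unique_id.replace("_ofa_orp_value", "_ofa_orp_time")
    if "_ofa_ph_value" in unique_id:
        return unique_id.replace("_ofa_ph_value", "_ofa_ph_time")
    return None

assert map_unique_id("SERIAL_ofa_orp_value") == "SERIAL_ofa_orp_time"
assert map_unique_id("SERIAL_orp") is None  # unrelated sensors stay untouched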

View File

@@ -31,6 +31,7 @@ class PooldoseConfigFlow(ConfigFlow, domain=DOMAIN):
"""Config flow for the Pooldose integration including DHCP discovery."""
VERSION = 1
MINOR_VERSION = 2
def __init__(self) -> None:
"""Initialize the config flow and store the discovered IP address and MAC."""

View File

@@ -1,10 +1,10 @@
{
"entity": {
"sensor": {
"ofa_orp_value": {
"ofa_orp_time": {
"default": "mdi:clock"
},
"ofa_ph_value": {
"ofa_ph_time": {
"default": "mdi:clock"
},
"orp": {

View File

@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/pooldose",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["python-pooldose==0.7.0"]
"requirements": ["python-pooldose==0.7.8"]
}

View File

@@ -48,8 +48,8 @@ SENSOR_DESCRIPTIONS: tuple[SensorEntityDescription, ...] = (
options=["proportional", "on_off", "timed"],
),
SensorEntityDescription(
key="ofa_ph_value",
translation_key="ofa_ph_value",
key="ofa_ph_time",
translation_key="ofa_ph_time",
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.DURATION,
entity_registry_enabled_default=False,
@@ -72,8 +72,8 @@ SENSOR_DESCRIPTIONS: tuple[SensorEntityDescription, ...] = (
options=["off", "proportional", "on_off", "timed"],
),
SensorEntityDescription(
key="ofa_orp_value",
translation_key="ofa_orp_value",
key="ofa_orp_time",
translation_key="ofa_orp_time",
device_class=SensorDeviceClass.DURATION,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,

View File

@@ -34,10 +34,10 @@
},
"entity": {
"sensor": {
"ofa_orp_value": {
"ofa_orp_time": {
"name": "ORP overfeed alert time"
},
"ofa_ph_value": {
"ofa_ph_time": {
"name": "pH overfeed alert time"
},
"orp": {

View File

@@ -19,6 +19,7 @@ from homeassistant.data_entry_flow import FlowResultType
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.entity_registry import RegistryEntry, async_migrate_entries
from homeassistant.helpers.typing import ConfigType
from .const import (
@@ -257,10 +258,11 @@ async def async_migrate_entry(
config_entry.minor_version,
)
if config_entry.version > 1:
if config_entry.version > 2:
# This means the user has downgraded from a future version
return False
# 1.2 Migrate subentries to include configured numbers to title
if config_entry.version == 1 and config_entry.minor_version == 1:
for subentry in config_entry.subentries.values():
property_map = {
@@ -278,6 +280,21 @@ async def async_migrate_entry(
hass.config_entries.async_update_entry(config_entry, minor_version=2)
# 2.1 Migrate all entity unique IDs to replace "satel" prefix with config entry ID, allows multiple entries to be configured
if config_entry.version == 1:
@callback
def migrate_unique_id(entity_entry: RegistryEntry) -> dict[str, str]:
"""Migrate the unique ID to a new format."""
return {
"new_unique_id": entity_entry.unique_id.replace(
"satel", config_entry.entry_id
)
}
await async_migrate_entries(hass, config_entry.entry_id, migrate_unique_id)
hass.config_entries.async_update_entry(config_entry, version=2, minor_version=1)
_LOGGER.debug(
"Migration to configuration version %s.%s successful",
config_entry.version,

View File

@@ -52,7 +52,11 @@ async def async_setup_entry(
async_add_entities(
[
SatelIntegraAlarmPanel(
controller, zone_name, arm_home_mode, partition_num
controller,
zone_name,
arm_home_mode,
partition_num,
config_entry.entry_id,
)
],
config_subentry_id=subentry.subentry_id,
@@ -69,10 +73,12 @@ class SatelIntegraAlarmPanel(AlarmControlPanelEntity):
| AlarmControlPanelEntityFeature.ARM_AWAY
)
def __init__(self, controller, name, arm_home_mode, partition_id) -> None:
def __init__(
self, controller, name, arm_home_mode, partition_id, config_entry_id
) -> None:
"""Initialize the alarm panel."""
self._attr_name = name
self._attr_unique_id = f"satel_alarm_panel_{partition_id}"
self._attr_unique_id = f"{config_entry_id}_alarm_panel_{partition_id}"
self._arm_home_mode = arm_home_mode
self._partition_id = partition_id
self._satel = controller

View File

@@ -53,6 +53,7 @@ async def async_setup_entry(
zone_type,
CONF_ZONES,
SIGNAL_ZONES_UPDATED,
config_entry.entry_id,
)
],
config_subentry_id=subentry.subentry_id,
@@ -77,6 +78,7 @@ async def async_setup_entry(
ouput_type,
CONF_OUTPUTS,
SIGNAL_OUTPUTS_UPDATED,
config_entry.entry_id,
)
],
config_subentry_id=subentry.subentry_id,
@@ -96,10 +98,11 @@ class SatelIntegraBinarySensor(BinarySensorEntity):
zone_type,
sensor_type,
react_to_signal,
config_entry_id,
):
"""Initialize the binary_sensor."""
self._device_number = device_number
self._attr_unique_id = f"satel_{sensor_type}_{device_number}"
self._attr_unique_id = f"{config_entry_id}_{sensor_type}_{device_number}"
self._name = device_name
self._zone_type = zone_type
self._state = 0

View File

@@ -90,8 +90,8 @@ SWITCHABLE_OUTPUT_SCHEMA = vol.Schema({vol.Required(CONF_NAME): cv.string})
class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a Satel Integra config flow."""
VERSION = 1
MINOR_VERSION = 2
VERSION = 2
MINOR_VERSION = 1
@staticmethod
@callback
@@ -121,6 +121,8 @@ class SatelConfigFlow(ConfigFlow, domain=DOMAIN):
errors: dict[str, str] = {}
if user_input is not None:
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
valid = await self.test_connection(
user_input[CONF_HOST], user_input[CONF_PORT]
)

View File

@@ -7,6 +7,5 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["satel_integra"],
"requirements": ["satel-integra==0.3.7"],
"single_config_entry": true
"requirements": ["satel-integra==0.3.7"]
}

View File

@@ -4,6 +4,9 @@
"code_input_description": "Code to toggle switchable outputs"
},
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},

View File

@@ -46,6 +46,7 @@ async def async_setup_entry(
switchable_output_num,
switchable_output_name,
config_entry.options.get(CONF_CODE),
config_entry.entry_id,
),
],
config_subentry_id=subentry.subentry_id,
@@ -57,10 +58,10 @@ class SatelIntegraSwitch(SwitchEntity):
_attr_should_poll = False
def __init__(self, controller, device_number, device_name, code):
def __init__(self, controller, device_number, device_name, code, config_entry_id):
"""Initialize the binary_sensor."""
self._device_number = device_number
self._attr_unique_id = f"satel_switch_{device_number}"
self._attr_unique_id = f"{config_entry_id}_switch_{device_number}"
self._name = device_name
self._state = False
self._code = code

View File

@@ -0,0 +1,85 @@
"""BLE provisioning helpers for Shelly integration."""
from __future__ import annotations
import asyncio
from dataclasses import dataclass, field
import logging
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.util.hass_dict import HassKey
_LOGGER = logging.getLogger(__name__)
@dataclass
class ProvisioningState:
"""State for tracking zeroconf discovery during BLE provisioning."""
event: asyncio.Event = field(default_factory=asyncio.Event)
host: str | None = None
port: int | None = None
PROVISIONING_FUTURES: HassKey[dict[str, ProvisioningState]] = HassKey(
"shelly_provisioning_futures"
)
@callback
def async_get_provisioning_registry(
hass: HomeAssistant,
) -> dict[str, ProvisioningState]:
"""Get the provisioning registry, creating it if needed.
This is a helper function for internal use.
It ensures the registry exists without requiring async_setup to run first.
"""
return hass.data.setdefault(PROVISIONING_FUTURES, {})
@callback
def async_register_zeroconf_discovery(
hass: HomeAssistant, mac: str, host: str, port: int
) -> None:
"""Register a zeroconf discovery for a device that was provisioned via BLE.
Called by zeroconf discovery when it finds a device that may have been
provisioned via BLE. If BLE provisioning is waiting for this device,
the host and port will be stored (replacing any previous values).
Multiple zeroconf discoveries can happen (Shelly service, HTTP service, etc.)
and the last one wins.
Args:
hass: Home Assistant instance
mac: Device MAC address (will be normalized)
host: Device IP address/hostname from zeroconf
port: Device port from zeroconf
"""
registry = async_get_provisioning_registry(hass)
normalized_mac = format_mac(mac)
state = registry.get(normalized_mac)
if not state:
_LOGGER.debug(
"No BLE provisioning state found for %s (host %s, port %s)",
normalized_mac,
host,
port,
)
return
_LOGGER.debug(
"Registering zeroconf discovery for %s at %s:%s (replacing previous)",
normalized_mac,
host,
port,
)
# Store host and port (replacing any previous values) and signal the event
state.host = host
state.port = port
state.event.set()
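Taken together, the module implements a one-shot handshake keyed by the normalized MAC: the BLE flow creates a ProvisioningState and waits on its event, while the zeroconf path fills in host and port and sets the event. A condensed sketch of the waiter side, reusing the helpers above (the function name and timeout value are illustrative):

# Sketch only: the waiter side of the BLE/zeroconf handshake.
import asyncio

async def wait_for_provisioned_device(hass, mac, timeout=35.0):
    """Register interest in a MAC and wait until zeroconf reports host/port."""
    registry = async_get_provisioning_registry(hass)
    normalized_mac = format_mac(mac)
    state = ProvisioningState()
    registry[normalized_mac] = state
    try:
        await asyncio.wait_for(state.event.wait(), timeout=timeout)
        return state.host, state.port
    except TimeoutError:
        return None  # caller falls back to an active zeroconf lookup
    finally:
        registry.pop(normalized_mac, None)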

View File

@@ -2,9 +2,13 @@
from __future__ import annotations
from collections.abc import Mapping
from typing import Any, Final
import asyncio
from collections.abc import AsyncIterator, Mapping
from contextlib import asynccontextmanager
from typing import TYPE_CHECKING, Any, Final
from aioshelly.ble.manufacturer_data import has_rpc_over_ble
from aioshelly.ble.provisioning import async_provision_wifi, async_scan_wifi_networks
from aioshelly.block_device import BlockDevice
from aioshelly.common import ConnectionOptions, get_info
from aioshelly.const import BLOCK_GENERATIONS, DEFAULT_HTTP_PORT, RPC_GENERATIONS
@@ -14,10 +18,18 @@ from aioshelly.exceptions import (
InvalidAuthError,
InvalidHostError,
MacAddressMismatchError,
RpcCallError,
)
from aioshelly.rpc_device import RpcDevice
from aioshelly.zeroconf import async_lookup_device_by_name
from bleak.backends.device import BLEDevice
import voluptuous as vol
from homeassistant.components import zeroconf
from homeassistant.components.bluetooth import (
BluetoothServiceInfoBleak,
async_ble_device_from_address,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.const import (
CONF_HOST,
@@ -29,15 +41,27 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import SelectSelector, SelectSelectorConfig
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.selector import (
SelectSelector,
SelectSelectorConfig,
SelectSelectorMode,
)
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .ble_provisioning import (
ProvisioningState,
async_get_provisioning_registry,
async_register_zeroconf_discovery,
)
from .const import (
CONF_BLE_SCANNER_MODE,
CONF_GEN,
CONF_SLEEP_PERIOD,
CONF_SSID,
DOMAIN,
LOGGER,
PROVISIONING_TIMEOUT,
BLEScannerMode,
)
from .coordinator import ShellyConfigEntry, async_reconnect_soon
@@ -70,6 +94,10 @@ BLE_SCANNER_OPTIONS = [
INTERNAL_WIFI_AP_IP = "192.168.33.1"
# BLE provisioning flow steps that are in the finishing state
# Used to determine if a BLE flow should be aborted when zeroconf discovers the device
BLUETOOTH_FINISHING_STEPS = {"do_provision", "provision_done"}
async def validate_input(
hass: HomeAssistant,
@@ -145,6 +173,12 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
port: int = DEFAULT_HTTP_PORT
info: dict[str, Any] = {}
device_info: dict[str, Any] = {}
ble_device: BLEDevice | None = None
device_name: str = ""
wifi_networks: list[dict[str, Any]] = []
selected_ssid: str = ""
_provision_task: asyncio.Task | None = None
_provision_result: ConfigFlowResult | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
@@ -262,6 +296,45 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
step_id="credentials", data_schema=vol.Schema(schema), errors=errors
)
def _abort_idle_ble_flows(self, mac: str) -> None:
"""Abort idle BLE provisioning flows for this device.
When zeroconf discovers a device, it means the device is already on WiFi.
If there's an idle BLE flow (user hasn't started provisioning yet), abort it.
Active provisioning flows (do_provision/provision_done) should not be aborted
as they're waiting for zeroconf handoff.
"""
for flow in self._async_in_progress(include_uninitialized=True):
if (
flow["flow_id"] != self.flow_id
and flow["context"].get("unique_id") == mac
and flow["context"].get("source") == "bluetooth"
and flow.get("step_id") not in BLUETOOTH_FINISHING_STEPS
):
LOGGER.debug(
"Aborting idle BLE flow %s for %s (device discovered via zeroconf)",
flow["flow_id"],
mac,
)
self.hass.config_entries.flow.async_abort(flow["flow_id"])
async def _async_handle_zeroconf_mac_discovery(
self, mac: str, host: str, port: int
) -> None:
"""Handle MAC address discovery from zeroconf.
Registers discovery info for BLE handoff and aborts idle BLE flows.
"""
# Register this zeroconf discovery with BLE provisioning in case
# this device was just provisioned via BLE
async_register_zeroconf_discovery(self.hass, mac, host, port)
# Check for idle BLE provisioning flows and abort them since
# device is already on WiFi (discovered via zeroconf)
self._abort_idle_ble_flows(mac)
await self._async_discovered_mac(mac, host)
async def _async_discovered_mac(self, mac: str, host: str) -> None:
"""Abort and reconnect soon if the device with the mac address is already configured."""
if (
@@ -281,6 +354,313 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
else:
self._abort_if_unique_id_configured({CONF_HOST: host})
async def async_step_bluetooth(
self, discovery_info: BluetoothServiceInfoBleak
) -> ConfigFlowResult:
"""Handle bluetooth discovery."""
# Parse MAC address from the Bluetooth device name
if not (mac := mac_address_from_name(discovery_info.name)):
return self.async_abort(reason="invalid_discovery_info")
# Check if RPC-over-BLE is enabled - required for WiFi provisioning
if not has_rpc_over_ble(discovery_info.manufacturer_data):
LOGGER.debug(
"Device %s does not have RPC-over-BLE enabled, skipping provisioning",
discovery_info.name,
)
return self.async_abort(reason="invalid_discovery_info")
# Check if already configured - abort if device is already set up
await self.async_set_unique_id(mac)
self._abort_if_unique_id_configured()
# Store BLE device and name for WiFi provisioning
self.ble_device = async_ble_device_from_address(
self.hass, discovery_info.address, connectable=True
)
if not self.ble_device:
return self.async_abort(reason="cannot_connect")
self.device_name = discovery_info.name
self.context.update(
{
"title_placeholders": {"name": discovery_info.name},
}
)
return await self.async_step_bluetooth_confirm()
async def async_step_bluetooth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm bluetooth provisioning."""
if user_input is not None:
return await self.async_step_wifi_scan()
return self.async_show_form(
step_id="bluetooth_confirm",
description_placeholders={
"name": self.context["title_placeholders"]["name"]
},
)
async def async_step_wifi_scan(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Scan for WiFi networks via BLE."""
if user_input is not None:
self.selected_ssid = user_input[CONF_SSID]
return await self.async_step_wifi_credentials()
# Scan for WiFi networks via BLE
if TYPE_CHECKING:
assert self.ble_device is not None
try:
self.wifi_networks = await async_scan_wifi_networks(self.ble_device)
except (DeviceConnectionError, RpcCallError) as err:
LOGGER.debug("Failed to scan WiFi networks via BLE: %s", err)
# "Writing is not permitted" error means device rejects BLE writes
# and BLE provisioning is disabled - user must use Shelly app
if "not permitted" in str(err):
return self.async_abort(reason="ble_not_permitted")
return await self.async_step_wifi_scan_failed()
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception during WiFi scan")
return self.async_abort(reason="unknown")
# Create list of SSIDs for selection
# If no networks found, still allow custom SSID entry
ssid_options = [network["ssid"] for network in self.wifi_networks]
return self.async_show_form(
step_id="wifi_scan",
data_schema=vol.Schema(
{
vol.Required(CONF_SSID): SelectSelector(
SelectSelectorConfig(
options=ssid_options,
mode=SelectSelectorMode.DROPDOWN,
custom_value=True,
)
),
}
),
)
async def async_step_wifi_scan_failed(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle failed WiFi scan - allow retry."""
if user_input is not None:
# User wants to retry - go back to wifi_scan
return await self.async_step_wifi_scan()
return self.async_show_form(step_id="wifi_scan_failed")
@asynccontextmanager
async def _async_provision_context(
self, mac: str
) -> AsyncIterator[ProvisioningState]:
"""Context manager to register and cleanup provisioning state."""
state = ProvisioningState()
provisioning_registry = async_get_provisioning_registry(self.hass)
normalized_mac = format_mac(mac)
provisioning_registry[normalized_mac] = state
try:
yield state
finally:
provisioning_registry.pop(normalized_mac, None)
async def async_step_wifi_credentials(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Get WiFi credentials and provision device."""
if user_input is not None:
self.selected_ssid = user_input.get(CONF_SSID, self.selected_ssid)
password = user_input[CONF_PASSWORD]
return await self.async_step_do_provision({"password": password})
return self.async_show_form(
step_id="wifi_credentials",
data_schema=vol.Schema(
{
vol.Required(CONF_PASSWORD): str,
}
),
description_placeholders={"ssid": self.selected_ssid},
)
async def _async_provision_wifi_and_wait_for_zeroconf(
self, mac: str, password: str, state: ProvisioningState
) -> ConfigFlowResult | None:
"""Provision WiFi credentials via BLE and wait for zeroconf discovery.
Returns the flow result to be stored in self._provision_result, or None if failed.
"""
# Provision WiFi via BLE
if TYPE_CHECKING:
assert self.ble_device is not None
try:
await async_provision_wifi(self.ble_device, self.selected_ssid, password)
except (DeviceConnectionError, RpcCallError) as err:
LOGGER.debug("Failed to provision WiFi via BLE: %s", err)
# BLE connection/communication failed - allow retry from network selection
return None
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception during WiFi provisioning")
return self.async_abort(reason="unknown")
LOGGER.debug(
"WiFi provisioning successful for %s, waiting for zeroconf discovery",
mac,
)
# Two-phase device discovery after WiFi provisioning:
#
# Phase 1: Wait for zeroconf discovery callback (via event)
# - Callback only fires on NEW zeroconf advertisements
# - If device appears on network, we get notified immediately
# - This is the fast path for successful provisioning
#
# Phase 2: Active lookup on timeout (poll)
# - Handles case where device was factory reset and has stale zeroconf data
# - Factory reset devices don't send zeroconf goodbye, leaving stale records
# - The timeout ensures device has enough time to connect to WiFi
# - Active poll forces fresh lookup, ignoring stale cached data
#
# Why not just poll? If we polled immediately, we'd get stale data and
# try to connect right away, causing false failures before device is ready.
try:
await asyncio.wait_for(state.event.wait(), timeout=PROVISIONING_TIMEOUT)
except TimeoutError:
LOGGER.debug("Timeout waiting for zeroconf discovery, trying active lookup")
# No new discovery received - device may have stale zeroconf data
# Do active lookup to force fresh resolution
aiozc = await zeroconf.async_get_async_instance(self.hass)
result = await async_lookup_device_by_name(aiozc, self.device_name)
# If we still don't have a host, provisioning failed
if not result:
LOGGER.debug("Active lookup failed - provisioning unsuccessful")
# Store failure info and return None - provision_done will handle redirect
return None
state.host, state.port = result
else:
LOGGER.debug(
"Zeroconf discovery received for device after WiFi provisioning at %s",
state.host,
)
# Device discovered via zeroconf - get device info and set up directly
if TYPE_CHECKING:
assert state.host is not None
assert state.port is not None
self.host = state.host
self.port = state.port
try:
self.info = await self._async_get_info(self.host, self.port)
except DeviceConnectionError as err:
LOGGER.debug("Failed to connect to device after WiFi provisioning: %s", err)
# Device appeared on network but can't connect - allow retry
return None
if get_info_auth(self.info):
# Device requires authentication - show credentials step
return await self.async_step_credentials()
try:
device_info = await validate_input(
self.hass, self.host, self.port, self.info, {}
)
except DeviceConnectionError as err:
LOGGER.debug("Failed to validate device after WiFi provisioning: %s", err)
# Device info validation failed - allow retry
return None
if not device_info[CONF_MODEL]:
return self.async_abort(reason="firmware_not_fully_provisioned")
# User just provisioned this device - create entry directly without confirmation
return self.async_create_entry(
title=device_info["title"],
data={
CONF_HOST: self.host,
CONF_PORT: self.port,
CONF_SLEEP_PERIOD: device_info[CONF_SLEEP_PERIOD],
CONF_MODEL: device_info[CONF_MODEL],
CONF_GEN: device_info[CONF_GEN],
},
)
async def _do_provision(self, password: str) -> None:
"""Provision WiFi credentials to device via BLE."""
if TYPE_CHECKING:
assert self.ble_device is not None
mac = self.unique_id
if TYPE_CHECKING:
assert mac is not None
async with self._async_provision_context(mac) as state:
self._provision_result = (
await self._async_provision_wifi_and_wait_for_zeroconf(
mac, password, state
)
)
async def async_step_do_provision(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Execute WiFi provisioning via BLE."""
if not self._provision_task:
if TYPE_CHECKING:
assert user_input is not None
password = user_input["password"]
self._provision_task = self.hass.async_create_task(
self._do_provision(password), eager_start=False
)
if not self._provision_task.done():
return self.async_show_progress(
step_id="do_provision",
progress_action="provisioning",
progress_task=self._provision_task,
)
self._provision_task = None
return self.async_show_progress_done(next_step_id="provision_done")
async def async_step_provision_failed(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle failed provisioning - allow retry."""
if user_input is not None:
# User wants to retry - clear state and go back to wifi_scan
self.selected_ssid = ""
self.wifi_networks = []
return await self.async_step_wifi_scan()
return self.async_show_form(
step_id="provision_failed",
description_placeholders={"ssid": self.selected_ssid},
)
async def async_step_provision_done(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Show the result of the provision step."""
result = self._provision_result
self._provision_result = None
# If provisioning failed, redirect to provision_failed step
if result is None:
return await self.async_step_provision_failed()
return result
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
@@ -288,23 +668,25 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
if discovery_info.ip_address.version == 6:
return self.async_abort(reason="ipv6_not_supported")
host = discovery_info.host
port = discovery_info.port or DEFAULT_HTTP_PORT
# First try to get the mac address from the name
# so we can avoid making another connection to the
# device if we already have it configured
if mac := mac_address_from_name(discovery_info.name):
await self._async_discovered_mac(mac, host)
await self._async_handle_zeroconf_mac_discovery(mac, host, port)
try:
# Devices behind a range extender don't generate zeroconf packets
# so port is always the default one
self.info = await self._async_get_info(host, DEFAULT_HTTP_PORT)
self.info = await self._async_get_info(host, port)
except DeviceConnectionError:
return self.async_abort(reason="cannot_connect")
if not mac:
# We could not get the mac address from the name
# so need to check here since we just got the info
await self._async_discovered_mac(self.info[CONF_MAC], host)
mac = self.info[CONF_MAC]
await self._async_handle_zeroconf_mac_discovery(mac, host, port)
self.host = host
self.context.update(
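The do_provision/provision_done pair follows Home Assistant's progress-flow pattern: start a background task once, keep showing a progress form while it runs, then hand the stored result (or a failure redirect) to the done step. A trimmed-down sketch of that pattern with hypothetical step names and attributes (_task, _result):

# Sketch only: the show-progress pattern behind async_step_do_provision.
async def async_step_do_work(self, user_input=None):
    if not self._task:
        self._task = self.hass.async_create_task(self._do_work(), eager_start=False)
    if not self._task.done():
        # Keep showing the progress form until the task finishes.
        return self.async_show_progress(
            step_id="do_work",
            progress_action="working",
            progress_task=self._task,
        )
    self._task = None
    return self.async_show_progress_done(next_step_id="work_done")

async def async_step_work_done(self, user_input=None):
    result = self._result
    self._result = None
    if result is None:
        return await self.async_step_work_failed()  # retry form
    return result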

View File

@@ -36,6 +36,10 @@ DOMAIN: Final = "shelly"
LOGGER: Logger = getLogger(__package__)
# BLE provisioning
PROVISIONING_TIMEOUT: Final = 35 # 35 seconds to wait for device to connect to WiFi
CONF_SSID: Final = "ssid"
CONF_COAP_PORT: Final = "coap_port"
FIRMWARE_PATTERN: Final = re.compile(r"^(\d{8})")

View File

@@ -1,6 +1,11 @@
{
"domain": "shelly",
"name": "Shelly",
"bluetooth": [
{
"local_name": "Shelly*"
}
],
"codeowners": ["@bieniu", "@thecode", "@chemelli74", "@bdraco"],
"config_flow": true,
"dependencies": ["bluetooth", "http", "network"],
@@ -9,7 +14,7 @@
"iot_class": "local_push",
"loggers": ["aioshelly"],
"quality_scale": "silver",
"requirements": ["aioshelly==13.15.0"],
"requirements": ["aioshelly==13.16.0"],
"zeroconf": [
{
"name": "shelly*",

View File

@@ -2,13 +2,20 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_on_wifi": "Device is already connected to WiFi and was discovered via the network.",
"another_device": "Re-configuration was unsuccessful, the IP address/hostname of another Shelly device was used.",
"ble_not_permitted": "Device is bound to a Shelly cloud account and cannot be provisioned via Bluetooth. Please use the Shelly app to provision WiFi credentials, then add the device when it appears on your network.",
"cannot_connect": "Failed to connect to the device. Ensure the device is powered on and within range.",
"firmware_not_fully_provisioned": "Device not fully provisioned. Please contact Shelly support",
"invalid_discovery_info": "Invalid Bluetooth discovery information.",
"ipv6_not_supported": "IPv6 is not supported.",
"mac_address_mismatch": "[%key:component::shelly::config::error::mac_address_mismatch%]",
"no_wifi_networks": "No WiFi networks found during scan.",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reauth_unsuccessful": "Re-authentication was unsuccessful, please remove the integration and set it up again.",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unknown": "[%key:common::config_flow::error::unknown%]",
"wifi_provisioned": "WiFi credentials for {ssid} have been provisioned to {name}. The device is connecting to WiFi and will complete setup automatically."
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -19,7 +26,13 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "{name}",
"progress": {
"provisioning": "Provisioning WiFi credentials and waiting for device to connect"
},
"step": {
"bluetooth_confirm": {
"description": "The Shelly device {name} has been discovered via Bluetooth but is not connected to WiFi.\n\nDo you want to provision WiFi credentials to this device?"
},
"confirm_discovery": {
"description": "Do you want to set up the {model} at {host}?\n\nBattery-powered devices that are password-protected must be woken up before continuing with setting up.\nBattery-powered devices that are not password-protected will be added when the device wakes up, you can now manually wake the device up using a button on it or wait for the next data update from the device."
},
@@ -33,6 +46,9 @@
"username": "Username for the device's web panel."
}
},
"provision_failed": {
"description": "The device did not connect to {ssid}. This may be due to an incorrect password or the network being out of range. Would you like to try again?"
},
"reauth_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]",
@@ -64,6 +80,27 @@
"port": "The TCP port of the Shelly device to connect to (Gen2+)."
},
"description": "Before setup, battery-powered devices must be woken up, you can now wake the device up using a button on it."
},
"wifi_credentials": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "Password for the WiFi network."
},
"description": "Enter the password for {ssid}."
},
"wifi_scan": {
"data": {
"ssid": "WiFi network"
},
"data_description": {
"ssid": "Select a WiFi network from the list or enter a custom SSID for hidden networks."
},
"description": "Select a WiFi network from the list or enter a custom SSID for hidden networks."
},
"wifi_scan_failed": {
"description": "Failed to scan for WiFi networks via Bluetooth. The device may be out of range or Bluetooth connection failed. Would you like to try again?"
}
}
},

View File

@@ -505,6 +505,9 @@ KEEP_CAPABILITY_QUIRK: dict[
Capability.SAMSUNG_CE_AIR_CONDITIONER_LIGHTING: (
lambda status: status[Attribute.LIGHTING].value is not None
),
Capability.SAMSUNG_CE_AIR_CONDITIONER_BEEP: (
lambda status: status[Attribute.BEEP].value is not None
),
}

View File

@@ -156,6 +156,13 @@
"sanitize": {
"default": "mdi:lotion"
},
"sound_effect": {
"default": "mdi:volume-high",
"state": {
"off": "mdi:volume-off",
"on": "mdi:volume-high"
}
},
"wrinkle_prevent": {
"default": "mdi:tumble-dryer",
"state": {

View File

@@ -653,6 +653,9 @@
"sanitize": {
"name": "Sanitize"
},
"sound_effect": {
"name": "Sound effect"
},
"wrinkle_prevent": {
"name": "Wrinkle prevent"
}

View File

@@ -91,6 +91,15 @@ CAPABILITY_TO_COMMAND_SWITCHES: dict[
),
}
CAPABILITY_TO_SWITCHES: dict[Capability | str, SmartThingsSwitchEntityDescription] = {
Capability.SAMSUNG_CE_AIR_CONDITIONER_BEEP: SmartThingsSwitchEntityDescription(
key=Capability.SAMSUNG_CE_AIR_CONDITIONER_BEEP,
translation_key="sound_effect",
status_attribute=Attribute.BEEP,
on_key="on",
on_command=Command.ON,
off_command=Command.OFF,
entity_category=EntityCategory.CONFIG,
),
Capability.SAMSUNG_CE_WASHER_BUBBLE_SOAK: SmartThingsSwitchEntityDescription(
key=Capability.SAMSUNG_CE_WASHER_BUBBLE_SOAK,
translation_key="bubble_soak",

View File

@@ -9,7 +9,11 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from .coordinator import SMHIConfigEntry, SMHIDataUpdateCoordinator
from .coordinator import (
SMHIConfigEntry,
SMHIDataUpdateCoordinator,
SMHIFireDataUpdateCoordinator,
)
PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
@@ -24,7 +28,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: SMHIConfigEntry) -> bool
coordinator = SMHIDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
fire_coordinator = SMHIFireDataUpdateCoordinator(hass, entry)
await fire_coordinator.async_config_entry_first_refresh()
entry.runtime_data = (coordinator, fire_coordinator)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True

View File

@@ -5,7 +5,14 @@ from __future__ import annotations
import asyncio
from dataclasses import dataclass
from pysmhi import SMHIForecast, SmhiForecastException, SMHIPointForecast
from pysmhi import (
SMHIFireForecast,
SmhiFireForecastException,
SMHIFirePointForecast,
SMHIForecast,
SmhiForecastException,
SMHIPointForecast,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_LATITUDE, CONF_LOCATION, CONF_LONGITUDE
@@ -15,7 +22,9 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import DEFAULT_SCAN_INTERVAL, DOMAIN, LOGGER, TIMEOUT
type SMHIConfigEntry = ConfigEntry[SMHIDataUpdateCoordinator]
type SMHIConfigEntry = ConfigEntry[
tuple[SMHIDataUpdateCoordinator, SMHIFireDataUpdateCoordinator]
]
@dataclass
@@ -27,6 +36,14 @@ class SMHIForecastData:
twice_daily: list[SMHIForecast]
@dataclass
class SMHIFireForecastData:
"""Dataclass for SMHI fire data."""
fire_daily: list[SMHIFireForecast]
fire_hourly: list[SMHIFireForecast]
class SMHIDataUpdateCoordinator(DataUpdateCoordinator[SMHIForecastData]):
"""A SMHI Data Update Coordinator."""
@@ -71,3 +88,49 @@ class SMHIDataUpdateCoordinator(DataUpdateCoordinator[SMHIForecastData]):
def current(self) -> SMHIForecast:
"""Return the current metrics."""
return self.data.daily[0]
class SMHIFireDataUpdateCoordinator(DataUpdateCoordinator[SMHIFireForecastData]):
"""A SMHI Fire Data Update Coordinator."""
config_entry: SMHIConfigEntry
def __init__(self, hass: HomeAssistant, config_entry: SMHIConfigEntry) -> None:
"""Initialize the SMHI coordinator."""
super().__init__(
hass,
LOGGER,
config_entry=config_entry,
name=DOMAIN,
update_interval=DEFAULT_SCAN_INTERVAL,
)
self._smhi_fire_api = SMHIFirePointForecast(
config_entry.data[CONF_LOCATION][CONF_LONGITUDE],
config_entry.data[CONF_LOCATION][CONF_LATITUDE],
session=aiohttp_client.async_get_clientsession(hass),
)
async def _async_update_data(self) -> SMHIFireForecastData:
"""Fetch data from SMHI."""
try:
async with asyncio.timeout(TIMEOUT):
_forecast_fire_daily = (
await self._smhi_fire_api.async_get_daily_forecast()
)
_forecast_fire_hourly = (
await self._smhi_fire_api.async_get_hourly_forecast()
)
except SmhiFireForecastException as ex:
raise UpdateFailed(
"Failed to retrieve the forecast from the SMHI API"
) from ex
return SMHIFireForecastData(
fire_daily=_forecast_fire_daily,
fire_hourly=_forecast_fire_hourly,
)
@property
def fire_current(self) -> SMHIFireForecast:
"""Return the current fire metrics."""
return self.data.fire_daily[0]

View File

@@ -6,13 +6,14 @@ from abc import abstractmethod
from homeassistant.core import callback
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import SMHIDataUpdateCoordinator
from .coordinator import SMHIDataUpdateCoordinator, SMHIFireDataUpdateCoordinator
class SmhiWeatherBaseEntity(CoordinatorEntity[SMHIDataUpdateCoordinator]):
class SmhiWeatherBaseEntity(Entity):
"""Representation of a base weather entity."""
_attr_attribution = "Swedish weather institute (SMHI)"
@@ -22,10 +23,8 @@ class SmhiWeatherBaseEntity(CoordinatorEntity[SMHIDataUpdateCoordinator]):
self,
latitude: str,
longitude: str,
coordinator: SMHIDataUpdateCoordinator,
) -> None:
"""Initialize the SMHI base weather entity."""
super().__init__(coordinator)
self._attr_unique_id = f"{latitude}, {longitude}"
self._attr_device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
@@ -36,12 +35,50 @@ class SmhiWeatherBaseEntity(CoordinatorEntity[SMHIDataUpdateCoordinator]):
)
self.update_entity_data()
@abstractmethod
def update_entity_data(self) -> None:
"""Refresh the entity data."""
class SmhiWeatherEntity(
CoordinatorEntity[SMHIDataUpdateCoordinator], SmhiWeatherBaseEntity
):
"""Representation of a weather entity."""
def __init__(
self,
latitude: str,
longitude: str,
coordinator: SMHIDataUpdateCoordinator,
) -> None:
"""Initialize the SMHI base weather entity."""
super().__init__(coordinator)
SmhiWeatherBaseEntity.__init__(self, latitude, longitude)
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.update_entity_data()
super()._handle_coordinator_update()
@abstractmethod
def update_entity_data(self) -> None:
"""Refresh the entity data."""
class SmhiFireEntity(
CoordinatorEntity[SMHIFireDataUpdateCoordinator], SmhiWeatherBaseEntity
):
"""Representation of a weather entity."""
def __init__(
self,
latitude: str,
longitude: str,
coordinator: SMHIFireDataUpdateCoordinator,
) -> None:
"""Initialize the SMHI base weather entity."""
super().__init__(coordinator)
SmhiWeatherBaseEntity.__init__(self, latitude, longitude)
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.update_entity_data()
super()._handle_coordinator_update()
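Because CoordinatorEntity and the plain SmhiWeatherBaseEntity no longer share an __init__ signature, each subclass initializes one base through super() and the other explicitly. A toy sketch of that mixed-init pattern in isolation (class names are illustrative):

# Sketch only: the mixed-init pattern used by SmhiWeatherEntity and SmhiFireEntity.
class Base:
    def __init__(self, latitude: str, longitude: str) -> None:
        self.unique_id = f"{latitude}, {longitude}"

class CoordinatorBound:
    def __init__(self, coordinator) -> None:
        self.coordinator = coordinator

class Combined(CoordinatorBound, Base):
    def __init__(self, latitude: str, longitude: str, coordinator) -> None:
        super().__init__(coordinator)  # MRO resolves to CoordinatorBound.__init__
        Base.__init__(self, latitude, longitude)  # explicit call, different signature

entity = Combined("59.3", "18.1", coordinator=object())
assert entity.unique_id == "59.3, 18.1" and entity.coordinator is not None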

View File

@@ -1,12 +1,42 @@
{
"entity": {
"sensor": {
"build_up_index": {
"default": "mdi:grass"
},
"drought_code": {
"default": "mdi:grass"
},
"duff_moisture_code": {
"default": "mdi:grass"
},
"fine_fuel_moisture_code": {
"default": "mdi:grass"
},
"fire_weather_index": {
"default": "mdi:pine-tree-fire"
},
"forestdry": {
"default": "mdi:forest"
},
"frozen_precipitation": {
"default": "mdi:weather-snowy-rainy"
},
"fwi": {
"default": "mdi:pine-tree-fire"
},
"fwiindex": {
"default": "mdi:pine-tree-fire"
},
"grassfire": {
"default": "mdi:fire-circle"
},
"high_cloud": {
"default": "mdi:cloud-arrow-up"
},
"initial_spread_index": {
"default": "mdi:grass"
},
"low_cloud": {
"default": "mdi:cloud-arrow-down"
},
@@ -16,6 +46,9 @@
"precipitation_category": {
"default": "mdi:weather-pouring"
},
"rate_of_spread": {
"default": "mdi:grass"
},
"thunder": {
"default": "mdi:weather-lightning"
},

View File

@@ -10,19 +10,55 @@ from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import (
CONF_LATITUDE,
CONF_LOCATION,
CONF_LONGITUDE,
PERCENTAGE,
UnitOfSpeed,
)
from homeassistant.const import CONF_LATITUDE, CONF_LOCATION, CONF_LONGITUDE, PERCENTAGE
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from .coordinator import SMHIConfigEntry, SMHIDataUpdateCoordinator
from .entity import SmhiWeatherBaseEntity
from .coordinator import (
SMHIConfigEntry,
SMHIDataUpdateCoordinator,
SMHIFireDataUpdateCoordinator,
)
from .entity import SmhiFireEntity, SmhiWeatherEntity
PARALLEL_UPDATES = 0
FWI_INDEX_MAP = {
"1": "very_low",
"2": "low",
"3": "moderate",
"4": "high",
"5": "very_high",
"6": "extreme",
}
GRASSFIRE_MAP = {
"1": "snow_cover",
"2": "season_over",
"3": "low",
"4": "moderate",
"5": "high",
"6": "very_high",
}
FORESTDRY_MAP = {
"1": "very_wet",
"2": "wet",
"3": "moderate_wet",
"4": "dry",
"5": "very_dry",
"6": "extremely_dry",
}
def get_percentage_values(entity: SMHISensor, key: str) -> int | None:
def get_percentage_values(entity: SMHIWeatherSensor, key: str) -> int | None:
"""Return percentage values in correct range."""
value: int | None = entity.coordinator.current.get(key) # type: ignore[assignment]
if value is not None and 0 <= value <= 100:
@@ -32,49 +68,64 @@ def get_percentage_values(entity: SMHISensor, key: str) -> int | None:
return None
def get_fire_index_value(entity: SMHIFireSensor, key: str) -> str:
"""Return index value as string."""
value: int | None = entity.coordinator.fire_current.get(key) # type: ignore[assignment]
if value is not None and value > 0:
return str(int(value))
return "0"
@dataclass(frozen=True, kw_only=True)
class SMHISensorEntityDescription(SensorEntityDescription):
"""Describes SMHI sensor entity."""
class SMHIWeatherEntityDescription(SensorEntityDescription):
"""Describes SMHI weather entity."""
value_fn: Callable[[SMHISensor], StateType | datetime]
value_fn: Callable[[SMHIWeatherSensor], StateType | datetime]
SENSOR_DESCRIPTIONS: tuple[SMHISensorEntityDescription, ...] = (
SMHISensorEntityDescription(
@dataclass(frozen=True, kw_only=True)
class SMHIFireEntityDescription(SensorEntityDescription):
"""Describes SMHI fire entity."""
value_fn: Callable[[SMHIFireSensor], StateType | datetime]
WEATHER_SENSOR_DESCRIPTIONS: tuple[SMHIWeatherEntityDescription, ...] = (
SMHIWeatherEntityDescription(
key="thunder",
translation_key="thunder",
value_fn=lambda entity: get_percentage_values(entity, "thunder"),
native_unit_of_measurement=PERCENTAGE,
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="total_cloud",
translation_key="total_cloud",
value_fn=lambda entity: get_percentage_values(entity, "total_cloud"),
native_unit_of_measurement=PERCENTAGE,
entity_registry_enabled_default=False,
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="low_cloud",
translation_key="low_cloud",
value_fn=lambda entity: get_percentage_values(entity, "low_cloud"),
native_unit_of_measurement=PERCENTAGE,
entity_registry_enabled_default=False,
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="medium_cloud",
translation_key="medium_cloud",
value_fn=lambda entity: get_percentage_values(entity, "medium_cloud"),
native_unit_of_measurement=PERCENTAGE,
entity_registry_enabled_default=False,
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="high_cloud",
translation_key="high_cloud",
value_fn=lambda entity: get_percentage_values(entity, "high_cloud"),
native_unit_of_measurement=PERCENTAGE,
entity_registry_enabled_default=False,
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="precipitation_category",
translation_key="precipitation_category",
value_fn=lambda entity: str(
@@ -83,13 +134,100 @@ SENSOR_DESCRIPTIONS: tuple[SMHISensorEntityDescription, ...] = (
device_class=SensorDeviceClass.ENUM,
options=["0", "1", "2", "3", "4", "5", "6"],
),
SMHISensorEntityDescription(
SMHIWeatherEntityDescription(
key="frozen_precipitation",
translation_key="frozen_precipitation",
value_fn=lambda entity: get_percentage_values(entity, "frozen_precipitation"),
native_unit_of_measurement=PERCENTAGE,
),
)
FIRE_SENSOR_DESCRIPTIONS: tuple[SMHIFireEntityDescription, ...] = (
SMHIFireEntityDescription(
key="fwiindex",
translation_key="fwiindex",
value_fn=(
lambda entity: FWI_INDEX_MAP.get(get_fire_index_value(entity, "fwiindex"))
),
device_class=SensorDeviceClass.ENUM,
options=[*FWI_INDEX_MAP.values()],
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="fire_weather_index",
translation_key="fire_weather_index",
value_fn=lambda entity: entity.coordinator.fire_current.get("fwi"),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="initial_spread_index",
translation_key="initial_spread_index",
value_fn=lambda entity: entity.coordinator.fire_current.get("isi"),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="build_up_index",
translation_key="build_up_index",
value_fn=(
lambda entity: entity.coordinator.fire_current.get(
"bui" # codespell:ignore bui
)
),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="fine_fuel_moisture_code",
translation_key="fine_fuel_moisture_code",
value_fn=lambda entity: entity.coordinator.fire_current.get("ffmc"),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="duff_moisture_code",
translation_key="duff_moisture_code",
value_fn=lambda entity: entity.coordinator.fire_current.get("dmc"),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="drought_code",
translation_key="drought_code",
value_fn=lambda entity: entity.coordinator.fire_current.get("dc"),
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="grassfire",
translation_key="grassfire",
value_fn=(
lambda entity: GRASSFIRE_MAP.get(get_fire_index_value(entity, "grassfire"))
),
device_class=SensorDeviceClass.ENUM,
options=[*GRASSFIRE_MAP.values()],
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="rate_of_spread",
translation_key="rate_of_spread",
value_fn=lambda entity: entity.coordinator.fire_current.get("rn"),
device_class=SensorDeviceClass.SPEED,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfSpeed.METERS_PER_MINUTE,
entity_registry_enabled_default=False,
),
SMHIFireEntityDescription(
key="forestdry",
translation_key="forestdry",
value_fn=(
lambda entity: FORESTDRY_MAP.get(get_fire_index_value(entity, "forestdry"))
),
device_class=SensorDeviceClass.ENUM,
options=[*FORESTDRY_MAP.values()],
entity_registry_enabled_default=False,
),
)
async def async_setup_entry(
@@ -99,30 +237,43 @@ async def async_setup_entry(
) -> None:
"""Set up SMHI sensor platform."""
coordinator = entry.runtime_data
coordinator = entry.runtime_data[0]
fire_coordinator = entry.runtime_data[1]
location = entry.data
async_add_entities(
SMHISensor(
entities: list[SMHIWeatherSensor | SMHIFireSensor] = []
entities.extend(
SMHIWeatherSensor(
location[CONF_LOCATION][CONF_LATITUDE],
location[CONF_LOCATION][CONF_LONGITUDE],
coordinator=coordinator,
entity_description=description,
)
for description in SENSOR_DESCRIPTIONS
for description in WEATHER_SENSOR_DESCRIPTIONS
)
entities.extend(
SMHIFireSensor(
location[CONF_LOCATION][CONF_LATITUDE],
location[CONF_LOCATION][CONF_LONGITUDE],
coordinator=fire_coordinator,
entity_description=description,
)
for description in FIRE_SENSOR_DESCRIPTIONS
)
async_add_entities(entities)
class SMHISensor(SmhiWeatherBaseEntity, SensorEntity):
"""Representation of a SMHI Sensor."""
entity_description: SMHISensorEntityDescription
class SMHIWeatherSensor(SmhiWeatherEntity, SensorEntity):
"""Representation of a SMHI Weather Sensor."""
entity_description: SMHIWeatherEntityDescription
def __init__(
self,
latitude: str,
longitude: str,
coordinator: SMHIDataUpdateCoordinator,
entity_description: SMHISensorEntityDescription,
entity_description: SMHIWeatherEntityDescription,
) -> None:
"""Initiate SMHI Sensor."""
self.entity_description = entity_description
@@ -137,3 +288,30 @@ class SMHISensor(SmhiWeatherBaseEntity, SensorEntity):
"""Refresh the entity data."""
if self.coordinator.data.daily:
self._attr_native_value = self.entity_description.value_fn(self)
class SMHIFireSensor(SmhiFireEntity, SensorEntity):
"""Representation of a SMHI Weather Sensor."""
entity_description: SMHIFireEntityDescription
def __init__(
self,
latitude: str,
longitude: str,
coordinator: SMHIFireDataUpdateCoordinator,
entity_description: SMHIFireEntityDescription,
) -> None:
"""Initiate SMHI Sensor."""
self.entity_description = entity_description
super().__init__(
latitude,
longitude,
coordinator,
)
self._attr_unique_id = f"{latitude}, {longitude}-{entity_description.key}"
def update_entity_data(self) -> None:
"""Refresh the entity data."""
if self.coordinator.data.fire_daily:
self._attr_native_value = self.entity_description.value_fn(self)

View File

@@ -26,12 +26,66 @@
},
"entity": {
"sensor": {
"build_up_index": {
"name": "Build up index"
},
"drought_code": {
"name": "Drought code"
},
"duff_moisture_code": {
"name": "Duff moisture code"
},
"fine_fuel_moisture_code": {
"name": "Fine fuel moisture code"
},
"fire_weather_index": {
"name": "Fire weather index"
},
"forestdry": {
"name": "Fuel drying",
"state": {
"dry": "Dry",
"extremely_dry": "Extremely dry",
"moderate_wet": "Moderately wet",
"very_dry": "Very dry",
"very_wet": "Very wet",
"wet": "Wet"
}
},
"frozen_precipitation": {
"name": "Frozen precipitation"
},
"fwi": {
"name": "Fire weather index"
},
"fwiindex": {
"name": "FWI index",
"state": {
"extreme": "Extremely high risk",
"high": "High risk",
"low": "Low risk",
"moderate": "Moderate risk",
"very_high": "Very high risk",
"very_low": "Very low risk"
}
},
"grassfire": {
"name": "Highest grass fire risk",
"state": {
"high": "High",
"low": "Low",
"moderate": "Moderate",
"season_over": "Grass fire season over",
"snow_cover": "Snow cover",
"very_high": "Very high"
}
},
"high_cloud": {
"name": "High cloud coverage"
},
"initial_spread_index": {
"name": "Initial spread index"
},
"low_cloud": {
"name": "Low cloud coverage"
},
@@ -50,6 +104,9 @@
"6": "Freezing drizzle"
}
},
"rate_of_spread": {
"name": "Potential rate of spread"
},
"thunder": {
"name": "Thunder probability"
},

View File

@@ -55,7 +55,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import ATTR_SMHI_THUNDER_PROBABILITY, ENTITY_ID_SENSOR_FORMAT
from .coordinator import SMHIConfigEntry
from .entity import SmhiWeatherBaseEntity
from .entity import SmhiWeatherEntity
# Used to map condition from API results
CONDITION_CLASSES: Final[dict[str, list[int]]] = {
@@ -89,7 +89,7 @@ async def async_setup_entry(
"""Add a weather entity from map location."""
location = config_entry.data
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data[0]
entity = SmhiWeather(
location[CONF_LOCATION][CONF_LATITUDE],
@@ -101,7 +101,7 @@ async def async_setup_entry(
async_add_entities([entity])
class SmhiWeather(SmhiWeatherBaseEntity, SingleCoordinatorWeatherEntity):
class SmhiWeather(SmhiWeatherEntity, SingleCoordinatorWeatherEntity):
"""Representation of a weather entity."""
_attr_native_temperature_unit = UnitOfTemperature.CELSIUS

View File

@@ -49,7 +49,9 @@ QUERY_SCHEMA = vol.Schema(
{
vol.Required(CONF_COLUMN_NAME): cv.string,
vol.Required(CONF_NAME): cv.template,
vol.Required(CONF_QUERY): vol.All(cv.string, validate_sql_select),
vol.Required(CONF_QUERY): vol.All(
cv.template, ValueTemplate.from_template, validate_sql_select
),
vol.Optional(CONF_UNIT_OF_MEASUREMENT): cv.string,
vol.Optional(CONF_VALUE_TEMPLATE): vol.All(
cv.template, ValueTemplate.from_template
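With this schema change the configured query is parsed as a template first and only afterwards validated as a single read-only SELECT. A rough, standalone sketch of what that validation amounts to; str.format stands in for Home Assistant's template rendering, and the helper name is illustrative:

# Sketch only: render a templated query, then require exactly one SELECT statement.
import sqlparse

def render_and_validate(query_template: str, **variables: str) -> str:
    rendered = query_template.format(**variables)  # stand-in for Template.async_render
    statements = sqlparse.parse(rendered.lstrip().lstrip(";"))
    if len(statements) != 1:
        raise ValueError("exactly one SQL statement is allowed")
    if statements[0].get_type() != "SELECT":
        raise ValueError("only SELECT queries are allowed")
    return str(statements[0])

print(render_and_validate(
    "SELECT state FROM states WHERE entity_id = '{entity}' LIMIT 1",
    entity="sensor.demo",
))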

View File

@@ -9,8 +9,6 @@ import sqlalchemy
from sqlalchemy.engine import Engine, Result
from sqlalchemy.exc import MultipleResultsFound, NoSuchColumnError, SQLAlchemyError
from sqlalchemy.orm import Session, scoped_session, sessionmaker
import sqlparse
from sqlparse.exceptions import SQLParseError
import voluptuous as vol
from homeassistant.components.recorder import CONF_DB_URL, get_instance
@@ -31,21 +29,28 @@ from homeassistant.const import (
CONF_UNIT_OF_MEASUREMENT,
CONF_VALUE_TEMPLATE,
)
from homeassistant.core import callback
from homeassistant.core import async_get_hass, callback
from homeassistant.data_entry_flow import section
from homeassistant.exceptions import TemplateError
from homeassistant.helpers import selector
from .const import CONF_ADVANCED_OPTIONS, CONF_COLUMN_NAME, CONF_QUERY, DOMAIN
from .util import resolve_db_url
from .util import (
EmptyQueryError,
InvalidSqlQuery,
MultipleQueryError,
NotSelectQueryError,
UnknownQueryTypeError,
check_and_render_sql_query,
resolve_db_url,
)
_LOGGER = logging.getLogger(__name__)
OPTIONS_SCHEMA: vol.Schema = vol.Schema(
{
vol.Required(CONF_QUERY): selector.TextSelector(
selector.TextSelectorConfig(multiline=True)
),
vol.Required(CONF_QUERY): selector.TemplateSelector(),
vol.Required(CONF_COLUMN_NAME): selector.TextSelector(),
vol.Required(CONF_ADVANCED_OPTIONS): section(
vol.Schema(
@@ -89,14 +94,12 @@ CONFIG_SCHEMA: vol.Schema = vol.Schema(
def validate_sql_select(value: str) -> str:
"""Validate that value is a SQL SELECT query."""
if len(query := sqlparse.parse(value.lstrip().lstrip(";"))) > 1:
raise MultipleResultsFound
if len(query) == 0 or (query_type := query[0].get_type()) == "UNKNOWN":
raise ValueError
if query_type != "SELECT":
_LOGGER.debug("The SQL query %s is of type %s", query, query_type)
raise SQLParseError
return str(query[0])
hass = async_get_hass()
try:
return check_and_render_sql_query(hass, value)
except (TemplateError, InvalidSqlQuery) as err:
_LOGGER.debug("Invalid query '%s' results in '%s'", value, err.args[0])
raise
def validate_db_connection(db_url: str) -> bool:
@@ -138,7 +141,7 @@ def validate_query(db_url: str, query: str, column: str) -> bool:
if sess:
sess.close()
engine.dispose()
raise ValueError(error) from error
raise InvalidSqlQuery from error
for res in result.mappings():
if column not in res:
@@ -224,13 +227,13 @@ class SQLConfigFlow(ConfigFlow, domain=DOMAIN):
except NoSuchColumnError:
errors["column"] = "column_invalid"
description_placeholders = {"column": column}
except MultipleResultsFound:
except (MultipleResultsFound, MultipleQueryError):
errors["query"] = "multiple_queries"
except SQLAlchemyError:
errors["db_url"] = "db_url_invalid"
except SQLParseError:
except (NotSelectQueryError, UnknownQueryTypeError):
errors["query"] = "query_no_read_only"
except ValueError as err:
except (TemplateError, EmptyQueryError, InvalidSqlQuery) as err:
_LOGGER.debug("Invalid query: %s", err)
errors["query"] = "query_invalid"
@@ -282,13 +285,13 @@ class SQLOptionsFlowHandler(OptionsFlowWithReload):
except NoSuchColumnError:
errors["column"] = "column_invalid"
description_placeholders = {"column": column}
except MultipleResultsFound:
except (MultipleResultsFound, MultipleQueryError):
errors["query"] = "multiple_queries"
except SQLAlchemyError:
errors["db_url"] = "db_url_invalid"
except SQLParseError:
except (NotSelectQueryError, UnknownQueryTypeError):
errors["query"] = "query_no_read_only"
except ValueError as err:
except (TemplateError, EmptyQueryError, InvalidSqlQuery) as err:
_LOGGER.debug("Invalid query: %s", err)
errors["query"] = "query_invalid"
else:

View File

@@ -22,7 +22,7 @@ from homeassistant.const import (
MATCH_ALL,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import TemplateError
from homeassistant.exceptions import PlatformNotReady, TemplateError
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
@@ -40,7 +40,9 @@ from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import CONF_ADVANCED_OPTIONS, CONF_COLUMN_NAME, CONF_QUERY, DOMAIN
from .util import (
InvalidSqlQuery,
async_create_sessionmaker,
check_and_render_sql_query,
convert_value,
generate_lambda_stmt,
redact_credentials,
@@ -81,7 +83,7 @@ async def async_setup_platform(
return
name: Template = conf[CONF_NAME]
query_str: str = conf[CONF_QUERY]
query_template: ValueTemplate = conf[CONF_QUERY]
value_template: ValueTemplate | None = conf.get(CONF_VALUE_TEMPLATE)
column_name: str = conf[CONF_COLUMN_NAME]
unique_id: str | None = conf.get(CONF_UNIQUE_ID)
@@ -96,7 +98,7 @@ async def async_setup_platform(
await async_setup_sensor(
hass,
trigger_entity_config,
query_str,
query_template,
column_name,
value_template,
unique_id,
@@ -119,6 +121,13 @@ async def async_setup_entry(
template: str | None = entry.options[CONF_ADVANCED_OPTIONS].get(CONF_VALUE_TEMPLATE)
column_name: str = entry.options[CONF_COLUMN_NAME]
query_template: ValueTemplate | None = None
try:
query_template = ValueTemplate(query_str, hass)
query_template.ensure_valid()
except TemplateError as err:
raise PlatformNotReady("Invalid SQL query template") from err
value_template: ValueTemplate | None = None
if template is not None:
try:
@@ -137,7 +146,7 @@ async def async_setup_entry(
await async_setup_sensor(
hass,
trigger_entity_config,
query_str,
query_template,
column_name,
value_template,
entry.entry_id,
@@ -150,7 +159,7 @@ async def async_setup_entry(
async def async_setup_sensor(
hass: HomeAssistant,
trigger_entity_config: ConfigType,
query_str: str,
query_template: ValueTemplate,
column_name: str,
value_template: ValueTemplate | None,
unique_id: str | None,
@@ -166,22 +175,25 @@ async def async_setup_sensor(
) = await async_create_sessionmaker(hass, db_url)
if sessmaker is None:
return
validate_query(hass, query_str, uses_recorder_db, unique_id)
validate_query(hass, query_template, uses_recorder_db, unique_id)
query_str = check_and_render_sql_query(hass, query_template)
upper_query = query_str.upper()
# MSSQL uses TOP and not LIMIT
mod_query_template = query_template
if not ("LIMIT" in upper_query or "SELECT TOP" in upper_query):
if "mssql" in db_url:
query_str = upper_query.replace("SELECT", "SELECT TOP 1")
_query = query_template.template.replace("SELECT", "SELECT TOP 1")
else:
query_str = query_str.replace(";", "") + " LIMIT 1;"
_query = query_template.template.replace(";", "") + " LIMIT 1;"
mod_query_template = ValueTemplate(_query, hass)
async_add_entities(
[
SQLSensor(
trigger_entity_config,
sessmaker,
query_str,
mod_query_template,
column_name,
value_template,
yaml,
@@ -200,7 +212,7 @@ class SQLSensor(ManualTriggerSensorEntity):
self,
trigger_entity_config: ConfigType,
sessmaker: scoped_session,
query: str,
query: ValueTemplate,
column: str,
value_template: ValueTemplate | None,
yaml: bool,
@@ -214,7 +226,6 @@ class SQLSensor(ManualTriggerSensorEntity):
self.sessionmaker = sessmaker
self._attr_extra_state_attributes = {}
self._use_database_executor = use_database_executor
self._lambda_stmt = generate_lambda_stmt(query)
if not yaml and (unique_id := trigger_entity_config.get(CONF_UNIQUE_ID)):
self._attr_name = None
self._attr_has_entity_name = True
@@ -255,11 +266,22 @@ class SQLSensor(ManualTriggerSensorEntity):
self._attr_extra_state_attributes = {}
sess: scoped_session = self.sessionmaker()
try:
result: Result = sess.execute(self._lambda_stmt)
rendered_query = check_and_render_sql_query(self.hass, self._query)
_lambda_stmt = generate_lambda_stmt(rendered_query)
result: Result = sess.execute(_lambda_stmt)
except (TemplateError, InvalidSqlQuery) as err:
_LOGGER.error(
"Error rendering query %s: %s",
redact_credentials(self._query.template),
redact_credentials(str(err)),
)
sess.rollback()
sess.close()
return
except SQLAlchemyError as err:
_LOGGER.error(
"Error executing query %s: %s",
self._query,
rendered_query,
redact_credentials(str(err)),
)
sess.rollback()
@@ -267,7 +289,7 @@ class SQLSensor(ManualTriggerSensorEntity):
return
for res in result.mappings():
_LOGGER.debug("Query %s result in %s", self._query, res.items())
_LOGGER.debug("Query %s result in %s", rendered_query, res.items())
data = res[self._column_name]
for key, value in res.items():
self._attr_extra_state_attributes[key] = convert_value(value)
@@ -287,6 +309,6 @@ class SQLSensor(ManualTriggerSensorEntity):
self._attr_native_value = data
if data is None:
_LOGGER.warning("%s returned no results", self._query)
_LOGGER.warning("%s returned no results", rendered_query)
sess.close()

View File
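The sensor setup above now renders the query template first and only then applies the single-row guard before handing the statement to SQLAlchemy. A minimal standalone sketch of that guard (the helper name `_apply_row_limit` is illustrative, not part of the integration):

```python
def _apply_row_limit(query: str, db_url: str) -> str:
    """Append a single-row limit if the rendered query has none.

    Mirrors the setup logic above: MSSQL dialects get SELECT TOP 1,
    every other dialect gets a trailing LIMIT 1.
    """
    upper_query = query.upper()
    if "LIMIT" in upper_query or "SELECT TOP" in upper_query:
        return query
    if "mssql" in db_url:
        return query.replace("SELECT", "SELECT TOP 1")
    return query.replace(";", "") + " LIMIT 1;"


print(_apply_row_limit("SELECT state FROM states", "sqlite:///home-assistant_v2.db"))
# SELECT state FROM states LIMIT 1;
```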

@@ -19,11 +19,13 @@ from homeassistant.core import (
)
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.trigger_template_entity import ValueTemplate
from homeassistant.util.json import JsonValueType
from .const import CONF_QUERY, DOMAIN
from .util import (
async_create_sessionmaker,
check_and_render_sql_query,
convert_value,
generate_lambda_stmt,
redact_credentials,
@@ -37,7 +39,9 @@ _LOGGER = logging.getLogger(__name__)
SERVICE_QUERY = "query"
SERVICE_QUERY_SCHEMA = vol.Schema(
{
vol.Required(CONF_QUERY): vol.All(cv.string, validate_sql_select),
vol.Required(CONF_QUERY): vol.All(
cv.template, ValueTemplate.from_template, validate_sql_select
),
vol.Optional(CONF_DB_URL): cv.string,
}
)
@@ -72,8 +76,9 @@ async def _async_query_service(
def _execute_and_convert_query() -> list[JsonValueType]:
"""Execute the query and return the results with converted types."""
sess: Session = sessmaker()
rendered_query = check_and_render_sql_query(call.hass, query_str)
try:
result: Result = sess.execute(generate_lambda_stmt(query_str))
result: Result = sess.execute(generate_lambda_stmt(rendered_query))
except SQLAlchemyError as err:
_LOGGER.debug(
"Error executing query %s: %s",

View File

@@ -8,7 +8,7 @@
"db_url_invalid": "Database URL invalid",
"multiple_queries": "Multiple SQL queries are not supported",
"query_invalid": "SQL query invalid",
"query_no_read_only": "SQL query must be read-only"
"query_no_read_only": "SQL query is not a read-only SELECT query or it's of an unknown type"
},
"step": {
"options": {

View File

@@ -19,7 +19,9 @@ import voluptuous as vol
from homeassistant.components.recorder import SupportedDialect, get_instance
from homeassistant.const import EVENT_HOMEASSISTANT_STOP
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError, TemplateError
from homeassistant.helpers import issue_registry as ir
from homeassistant.helpers.template import Template
from .const import DB_URL_RE, DOMAIN
from .models import SQLData
@@ -44,16 +46,14 @@ def resolve_db_url(hass: HomeAssistant, db_url: str | None) -> str:
return get_instance(hass).db_url
def validate_sql_select(value: str) -> str:
def validate_sql_select(value: Template) -> Template:
"""Validate that value is a SQL SELECT query."""
if len(query := sqlparse.parse(value.lstrip().lstrip(";"))) > 1:
raise vol.Invalid("Multiple SQL queries are not supported")
if len(query) == 0 or (query_type := query[0].get_type()) == "UNKNOWN":
raise vol.Invalid("Invalid SQL query")
if query_type != "SELECT":
_LOGGER.debug("The SQL query %s is of type %s", query, query_type)
raise vol.Invalid("Only SELECT queries allowed")
return str(query[0])
try:
assert value.hass
check_and_render_sql_query(value.hass, value)
except (TemplateError, InvalidSqlQuery) as err:
raise vol.Invalid(str(err)) from err
return value
async def async_create_sessionmaker(
@@ -113,7 +113,7 @@ async def async_create_sessionmaker(
def validate_query(
hass: HomeAssistant,
query_str: str,
query_template: str | Template,
uses_recorder_db: bool,
unique_id: str | None = None,
) -> None:
@@ -121,7 +121,7 @@ def validate_query(
Args:
hass: The Home Assistant instance.
query_str: The SQL query string to be validated.
query_template: The SQL query string to be validated.
uses_recorder_db: A boolean indicating if the query is against the recorder database.
unique_id: The unique ID of the entity, used for creating issue registry keys.
@@ -131,6 +131,10 @@ def validate_query(
"""
if not uses_recorder_db:
return
if isinstance(query_template, Template):
query_str = query_template.async_render()
else:
query_str = Template(query_template, hass).async_render()
redacted_query = redact_credentials(query_str)
issue_key = unique_id if unique_id else redacted_query
@@ -239,3 +243,49 @@ def convert_value(value: Any) -> Any:
return f"0x{value.hex()}"
case _:
return value
def check_and_render_sql_query(hass: HomeAssistant, query: Template | str) -> str:
"""Check and render SQL query."""
if isinstance(query, str):
query = query.strip()
if not query:
raise EmptyQueryError("Query cannot be empty")
query = Template(query, hass=hass)
# Raises TemplateError if template is invalid
query.ensure_valid()
rendered_query: str = query.async_render()
if len(rendered_queries := sqlparse.parse(rendered_query.lstrip().lstrip(";"))) > 1:
raise MultipleQueryError("Multiple SQL statements are not allowed")
if (
len(rendered_queries) == 0
or (query_type := rendered_queries[0].get_type()) == "UNKNOWN"
):
raise UnknownQueryTypeError("SQL query is empty or unknown type")
if query_type != "SELECT":
_LOGGER.debug("The SQL query %s is of type %s", rendered_query, query_type)
raise NotSelectQueryError("SQL query must be of type SELECT")
return str(rendered_queries[0])
class InvalidSqlQuery(HomeAssistantError):
"""SQL query is invalid error."""
class EmptyQueryError(InvalidSqlQuery):
"""SQL query is empty error."""
class MultipleQueryError(InvalidSqlQuery):
"""SQL query is multiple error."""
class UnknownQueryTypeError(InvalidSqlQuery):
"""SQL query is of unknown type error."""
class NotSelectQueryError(InvalidSqlQuery):
"""SQL query is not a SELECT statement error."""

View File
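check_and_render_sql_query above couples template rendering (which needs a running Home Assistant instance) with a sqlparse-based read-only check. The parsing half can be exercised on its own; a minimal sketch with a simplified exception (InvalidQuery here stands in for the InvalidSqlQuery hierarchy):

```python
import sqlparse


class InvalidQuery(ValueError):
    """Raised when a rendered query is not a single SELECT statement."""


def validate_select_only(rendered_query: str) -> str:
    """Validate that the rendered query is exactly one SELECT statement."""
    statements = sqlparse.parse(rendered_query.lstrip().lstrip(";"))
    if len(statements) > 1:
        raise InvalidQuery("Multiple SQL statements are not allowed")
    if not statements or (query_type := statements[0].get_type()) == "UNKNOWN":
        raise InvalidQuery("SQL query is empty or of unknown type")
    if query_type != "SELECT":
        raise InvalidQuery("SQL query must be of type SELECT")
    return str(statements[0])


print(validate_select_only("SELECT 42 AS answer;"))  # SELECT 42 AS answer;
# validate_select_only("DELETE FROM states")         # raises InvalidQuery
```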

@@ -1,4 +1,4 @@
"""The DALI Center integration."""
"""The Sunricher DALI integration."""
from __future__ import annotations
@@ -28,7 +28,7 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: DaliCenterConfigEntry) -> bool:
"""Set up DALI Center from a config entry."""
"""Set up Sunricher DALI from a config entry."""
gateway = DaliGateway(
entry.data[CONF_SERIAL_NUMBER],

View File

@@ -1,4 +1,4 @@
"""Config flow for the DALI Center integration."""
"""Config flow for the Sunricher DALI integration."""
from __future__ import annotations
@@ -30,7 +30,7 @@ _LOGGER = logging.getLogger(__name__)
class DaliCenterConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for DALI Center."""
"""Handle a config flow for Sunricher DALI."""
VERSION = 1

View File

@@ -0,0 +1,5 @@
"""Constants for the Sunricher DALI integration."""
DOMAIN = "sunricher_dali"
MANUFACTURER = "Sunricher"
CONF_SERIAL_NUMBER = "serial_number"

View File

@@ -38,7 +38,7 @@ async def async_setup_entry(
entry: DaliCenterConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up DALI Center light entities from config entry."""
"""Set up Sunricher DALI light entities from config entry."""
runtime_data = entry.runtime_data
gateway = runtime_data.gateway
devices = runtime_data.devices
@@ -57,7 +57,7 @@ async def async_setup_entry(
class DaliCenterLight(LightEntity):
"""Representation of a DALI Center Light."""
"""Representation of a Sunricher DALI Light."""
_attr_has_entity_name = True
_attr_name = None

View File

@@ -1,9 +1,9 @@
{
"domain": "sunricher_dali_center",
"name": "DALI Center",
"domain": "sunricher_dali",
"name": "Sunricher DALI",
"codeowners": ["@niracler"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sunricher_dali_center",
"documentation": "https://www.home-assistant.io/integrations/sunricher_dali",
"iot_class": "local_push",
"quality_scale": "bronze",
"requirements": ["PySrDaliGateway==0.13.1"]

View File

@@ -5,8 +5,8 @@
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"discovery_failed": "Failed to discover DALI gateways on the network",
"no_devices_found": "No DALI gateways found on the network",
"discovery_failed": "Failed to discover Sunricher DALI gateways on the network",
"no_devices_found": "No Sunricher DALI gateways found on the network",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
@@ -17,12 +17,10 @@
"data_description": {
"selected_gateway": "Each option shows the gateway name, serial number, and IP address."
},
"description": "Select the gateway to configure.",
"title": "Select DALI gateway"
"description": "Select the gateway to configure."
},
"user": {
"description": "**Three-step process:**\n\n1. Ensure the gateway is powered and on the same network.\n2. Select **Submit** to start discovery (searches for up to 3 minutes)\n3. While discovery is in progress, press the **Reset** button on your DALI gateway device **once**.\n\nThe gateway will respond immediately after the button press.",
"title": "Set up DALI Center gateway"
"description": "**Three-step process:**\n\n1. Ensure the gateway is powered and on the same network.\n2. Select **Submit** to start discovery (searches for up to 3 minutes)\n3. While discovery is in progress, press the **Reset** button on your Sunricher DALI gateway device **once**.\n\nThe gateway will respond immediately after the button press."
}
}
}

View File

@@ -1,4 +1,4 @@
"""Type definitions for the DALI Center integration."""
"""Type definitions for the Sunricher DALI integration."""
from dataclasses import dataclass
@@ -9,7 +9,7 @@ from homeassistant.config_entries import ConfigEntry
@dataclass
class DaliCenterData:
"""Runtime data for the DALI Center integration."""
"""Runtime data for the Sunricher DALI integration."""
gateway: DaliGateway
devices: list[Device]

View File

@@ -1,5 +0,0 @@
"""Constants for the DALI Center integration."""
DOMAIN = "sunricher_dali_center"
MANUFACTURER = "Sunricher"
CONF_SERIAL_NUMBER = "serial_number"

View File

@@ -66,11 +66,11 @@ def get_process(entity: SystemMonitorSensor) -> bool:
class SysMonitorBinarySensorEntityDescription(BinarySensorEntityDescription):
"""Describes System Monitor binary sensor entities."""
value_fn: Callable[[SystemMonitorSensor], bool]
value_fn: Callable[[SystemMonitorSensor], bool | None]
add_to_update: Callable[[SystemMonitorSensor], tuple[str, str]]
SENSOR_TYPES: tuple[SysMonitorBinarySensorEntityDescription, ...] = (
PROCESS_TYPES: tuple[SysMonitorBinarySensorEntityDescription, ...] = (
SysMonitorBinarySensorEntityDescription(
key="binary_process",
translation_key="process",
@@ -81,6 +81,20 @@ SENSOR_TYPES: tuple[SysMonitorBinarySensorEntityDescription, ...] = (
),
)
BINARY_SENSOR_TYPES: tuple[SysMonitorBinarySensorEntityDescription, ...] = (
SysMonitorBinarySensorEntityDescription(
key="battery_plugged",
value_fn=(
lambda entity: entity.coordinator.data.battery.power_plugged
if entity.coordinator.data.battery
else None
),
device_class=BinarySensorDeviceClass.BATTERY_CHARGING,
add_to_update=lambda entity: ("battery", ""),
entity_registry_enabled_default=False,
),
)
async def async_setup_entry(
hass: HomeAssistant,
@@ -90,18 +104,30 @@ async def async_setup_entry(
"""Set up System Monitor binary sensors based on a config entry."""
coordinator = entry.runtime_data.coordinator
async_add_entities(
entities: list[SystemMonitorSensor] = []
entities.extend(
SystemMonitorSensor(
coordinator,
sensor_description,
entry.entry_id,
argument,
)
for sensor_description in SENSOR_TYPES
for sensor_description in PROCESS_TYPES
for argument in entry.options.get(BINARY_SENSOR_DOMAIN, {}).get(
CONF_PROCESS, []
)
)
entities.extend(
SystemMonitorSensor(
coordinator,
sensor_description,
entry.entry_id,
"",
)
for sensor_description in BINARY_SENSOR_TYPES
)
async_add_entities(entities)
class SystemMonitorSensor(

View File

@@ -9,7 +9,7 @@ import os
from typing import TYPE_CHECKING, Any, NamedTuple
from psutil import Process
from psutil._common import sdiskusage, shwtemp, snetio, snicaddr, sswap
from psutil._common import sbattery, sdiskusage, shwtemp, snetio, snicaddr, sswap
import psutil_home_assistant as ha_psutil
from homeassistant.components.binary_sensor import DOMAIN as BINARY_SENSOR_DOMAIN
@@ -22,6 +22,7 @@ from .const import CONF_PROCESS, PROCESS_ERRORS
if TYPE_CHECKING:
from . import SystemMonitorConfigEntry
from .util import read_fan_speed
_LOGGER = logging.getLogger(__name__)
@@ -30,44 +31,52 @@ _LOGGER = logging.getLogger(__name__)
class SensorData:
"""Sensor data."""
disk_usage: dict[str, sdiskusage]
swap: sswap
memory: VirtualMemory
io_counters: dict[str, snetio]
addresses: dict[str, list[snicaddr]]
load: tuple[float, float, float]
cpu_percent: float | None
battery: sbattery | None
boot_time: datetime
processes: list[Process]
temperatures: dict[str, list[shwtemp]]
cpu_percent: float | None
disk_usage: dict[str, sdiskusage]
fan_speed: dict[str, int]
io_counters: dict[str, snetio]
load: tuple[float, float, float]
memory: VirtualMemory
process_fds: dict[str, int]
processes: list[Process]
swap: sswap
temperatures: dict[str, list[shwtemp]]
def as_dict(self) -> dict[str, Any]:
"""Return as dict."""
disk_usage = None
if self.disk_usage:
disk_usage = {k: str(v) for k, v in self.disk_usage.items()}
io_counters = None
if self.io_counters:
io_counters = {k: str(v) for k, v in self.io_counters.items()}
addresses = None
if self.addresses:
addresses = {k: str(v) for k, v in self.addresses.items()}
disk_usage = None
if self.disk_usage:
disk_usage = {k: str(v) for k, v in self.disk_usage.items()}
fan_speed = None
if self.fan_speed:
fan_speed = {k: str(v) for k, v in self.fan_speed.items()}
io_counters = None
if self.io_counters:
io_counters = {k: str(v) for k, v in self.io_counters.items()}
temperatures = None
if self.temperatures:
temperatures = {k: str(v) for k, v in self.temperatures.items()}
return {
"disk_usage": disk_usage,
"swap": str(self.swap),
"memory": str(self.memory),
"io_counters": io_counters,
"addresses": addresses,
"load": str(self.load),
"cpu_percent": str(self.cpu_percent),
"battery": str(self.battery),
"boot_time": str(self.boot_time),
"processes": str(self.processes),
"temperatures": temperatures,
"cpu_percent": str(self.cpu_percent),
"disk_usage": disk_usage,
"fan_speed": fan_speed,
"io_counters": io_counters,
"load": str(self.load),
"memory": str(self.memory),
"process_fds": self.process_fds,
"processes": str(self.processes),
"swap": str(self.swap),
"temperatures": temperatures,
}
@@ -124,14 +133,16 @@ class SystemMonitorCoordinator(TimestampDataUpdateCoordinator[SensorData]):
_disk_defaults[("disks", argument)] = set()
return {
**_disk_defaults,
("swap", ""): set(),
("memory", ""): set(),
("io_counters", ""): set(),
("addresses", ""): set(),
("load", ""): set(),
("cpu_percent", ""): set(),
("battery", ""): set(),
("boot", ""): set(),
("cpu_percent", ""): set(),
("fan_speed", ""): set(),
("io_counters", ""): set(),
("load", ""): set(),
("memory", ""): set(),
("processes", ""): set(),
("swap", ""): set(),
("temperatures", ""): set(),
}
@@ -153,17 +164,19 @@ class SystemMonitorCoordinator(TimestampDataUpdateCoordinator[SensorData]):
self._initial_update = False
return SensorData(
disk_usage=_data["disks"],
swap=_data["swap"],
memory=_data["memory"],
io_counters=_data["io_counters"],
addresses=_data["addresses"],
load=load,
cpu_percent=cpu_percent,
battery=_data["battery"],
boot_time=_data["boot_time"],
processes=_data["processes"],
temperatures=_data["temperatures"],
cpu_percent=cpu_percent,
disk_usage=_data["disks"],
fan_speed=_data["fan_speed"],
io_counters=_data["io_counters"],
load=load,
memory=_data["memory"],
process_fds=_data["process_fds"],
processes=_data["processes"],
swap=_data["swap"],
temperatures=_data["temperatures"],
)
def update_data(self) -> dict[str, Any]:
@@ -255,14 +268,33 @@ class SystemMonitorCoordinator(TimestampDataUpdateCoordinator[SensorData]):
except AttributeError:
_LOGGER.debug("OS does not provide temperature sensors")
fan_speed: dict[str, int] = {}
if self.update_subscribers[("fan_speed", "")] or self._initial_update:
try:
fan_sensors = self._psutil.sensors_fans()
fan_speed = read_fan_speed(fan_sensors)
_LOGGER.debug("fan_speed: %s", fan_speed)
except AttributeError:
_LOGGER.debug("OS does not provide fan sensors")
battery: sbattery | None = None
if self.update_subscribers[("battery", "")] or self._initial_update:
try:
battery = self._psutil.sensors_battery()
_LOGGER.debug("battery: %s", battery)
except AttributeError:
_LOGGER.debug("OS does not provide battery sensors")
return {
"disks": disks,
"swap": swap,
"memory": memory,
"io_counters": io_counters,
"addresses": addresses,
"battery": battery,
"boot_time": self.boot_time,
"processes": selected_processes,
"temperatures": temps,
"disks": disks,
"fan_speed": fan_speed,
"io_counters": io_counters,
"memory": memory,
"process_fds": process_fds,
"processes": selected_processes,
"swap": swap,
"temperatures": temps,
}

View File
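The coordinator above only queries psutil's battery and fan APIs when a subscriber needs them, and treats a missing API as "not provided by this OS" rather than an error. A self-contained sketch of that pattern with plain psutil (without the psutil-home-assistant wrapper the integration uses):

```python
import psutil


def read_optional_sensors() -> dict[str, object]:
    """Read battery and fan data, tolerating platforms that expose neither."""
    data: dict[str, object] = {"battery": None, "fans": {}}

    try:
        # Returns an sbattery namedtuple, or None when no battery is present.
        data["battery"] = psutil.sensors_battery()
    except AttributeError:
        pass  # OS does not provide battery sensors

    try:
        # Linux-only; returns {device: [sfan(label, current), ...]}.
        data["fans"] = psutil.sensors_fans()
    except AttributeError:
        pass  # OS does not provide fan sensors

    return data


print(read_optional_sensors())
```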

@@ -1,6 +1,9 @@
{
"entity": {
"sensor": {
"battery_empty": {
"default": "mdi:battery-clock"
},
"disk_free": {
"default": "mdi:harddisk"
},
@@ -10,6 +13,9 @@
"disk_use_percent": {
"default": "mdi:harddisk"
},
"fan_speed": {
"default": "mdi:fan"
},
"ipv4_address": {
"default": "mdi:ip-network"
},

View File

@@ -5,7 +5,7 @@ from __future__ import annotations
from collections.abc import Callable
import contextlib
from dataclasses import dataclass
from datetime import datetime
from datetime import datetime, timedelta
from functools import lru_cache
import ipaddress
import logging
@@ -14,6 +14,8 @@ import sys
import time
from typing import Any, Literal
from psutil._common import POWER_TIME_UNKNOWN, POWER_TIME_UNLIMITED
from homeassistant.components.sensor import (
DOMAIN as SENSOR_DOMAIN,
SensorDeviceClass,
@@ -23,6 +25,7 @@ from homeassistant.components.sensor import (
)
from homeassistant.const import (
PERCENTAGE,
REVOLUTIONS_PER_MINUTE,
EntityCategory,
UnitOfDataRate,
UnitOfInformation,
@@ -34,7 +37,7 @@ from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import slugify
from homeassistant.util import dt as dt_util, slugify
from . import SystemMonitorConfigEntry
from .binary_sensor import BINARY_SENSOR_DOMAIN
@@ -55,12 +58,23 @@ SENSOR_TYPE_MANDATORY_ARG = 4
SIGNAL_SYSTEMMONITOR_UPDATE = "systemmonitor_update"
SENSORS_NO_ARG = ("load_", "memory_", "processor_use", "swap_", "last_boot")
BATTERY_REMAIN_UNKNOWNS = (POWER_TIME_UNKNOWN, POWER_TIME_UNLIMITED)
SENSORS_NO_ARG = (
"battery_empty",
"battery",
"last_boot",
"load_",
"memory_",
"processor_use",
"swap_",
)
SENSORS_WITH_ARG = {
"disk_": "disk_arguments",
"fan_speed": "fan_speed_arguments",
"ipv": "network_arguments",
**dict.fromkeys(NET_IO_TYPES, "network_arguments"),
"process_num_fds": "processes",
**dict.fromkeys(NET_IO_TYPES, "network_arguments"),
}
@@ -133,6 +147,17 @@ def get_process_num_fds(entity: SystemMonitorSensor) -> int | None:
return process_fds.get(entity.argument)
def battery_time_ends(entity: SystemMonitorSensor) -> datetime | None:
"""Return when battery runs out, rounded to minute."""
battery = entity.coordinator.data.battery
if not battery or battery.secsleft in BATTERY_REMAIN_UNKNOWNS:
return None
return (dt_util.utcnow() + timedelta(seconds=battery.secsleft)).replace(
second=0, microsecond=0
)
@dataclass(frozen=True, kw_only=True)
class SysMonitorSensorEntityDescription(SensorEntityDescription):
"""Describes System Monitor sensor entities."""
@@ -145,6 +170,28 @@ class SysMonitorSensorEntityDescription(SensorEntityDescription):
SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
"battery": SysMonitorSensorEntityDescription(
key="battery",
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.BATTERY,
state_class=SensorStateClass.MEASUREMENT,
value_fn=(
lambda entity: entity.coordinator.data.battery.percent
if entity.coordinator.data.battery
else None
),
none_is_unavailable=True,
add_to_update=lambda entity: ("battery", ""),
),
"battery_empty": SysMonitorSensorEntityDescription(
key="battery_empty",
translation_key="battery_empty",
device_class=SensorDeviceClass.TIMESTAMP,
state_class=SensorStateClass.MEASUREMENT,
value_fn=battery_time_ends,
none_is_unavailable=True,
add_to_update=lambda entity: ("battery", ""),
),
"disk_free": SysMonitorSensorEntityDescription(
key="disk_free",
translation_key="disk_free",
@@ -152,11 +199,13 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
native_unit_of_measurement=UnitOfInformation.GIBIBYTES,
device_class=SensorDeviceClass.DATA_SIZE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(
entity.coordinator.data.disk_usage[entity.argument].free / 1024**3, 1
)
if entity.argument in entity.coordinator.data.disk_usage
else None,
value_fn=(
lambda entity: round(
entity.coordinator.data.disk_usage[entity.argument].free / 1024**3, 1
)
if entity.argument in entity.coordinator.data.disk_usage
else None
),
none_is_unavailable=True,
add_to_update=lambda entity: ("disks", entity.argument),
),
@@ -167,11 +216,13 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
native_unit_of_measurement=UnitOfInformation.GIBIBYTES,
device_class=SensorDeviceClass.DATA_SIZE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(
entity.coordinator.data.disk_usage[entity.argument].used / 1024**3, 1
)
if entity.argument in entity.coordinator.data.disk_usage
else None,
value_fn=(
lambda entity: round(
entity.coordinator.data.disk_usage[entity.argument].used / 1024**3, 1
)
if entity.argument in entity.coordinator.data.disk_usage
else None
),
none_is_unavailable=True,
add_to_update=lambda entity: ("disks", entity.argument),
),
@@ -181,14 +232,24 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
placeholder="mount_point",
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: entity.coordinator.data.disk_usage[
entity.argument
].percent
if entity.argument in entity.coordinator.data.disk_usage
else None,
value_fn=(
lambda entity: entity.coordinator.data.disk_usage[entity.argument].percent
if entity.argument in entity.coordinator.data.disk_usage
else None
),
none_is_unavailable=True,
add_to_update=lambda entity: ("disks", entity.argument),
),
"fan_speed": SysMonitorSensorEntityDescription(
key="fan_speed",
translation_key="fan_speed",
placeholder="fan_name",
native_unit_of_measurement=REVOLUTIONS_PER_MINUTE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: entity.coordinator.data.fan_speed[entity.argument],
none_is_unavailable=True,
add_to_update=lambda entity: ("fan_speed", ""),
),
"ipv4_address": SysMonitorSensorEntityDescription(
key="ipv4_address",
translation_key="ipv4_address",
@@ -212,14 +273,6 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
value_fn=lambda entity: entity.coordinator.data.boot_time,
add_to_update=lambda entity: ("boot", ""),
),
"load_15m": SysMonitorSensorEntityDescription(
key="load_15m",
translation_key="load_15m",
icon=get_cpu_icon(),
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(entity.coordinator.data.load[2], 2),
add_to_update=lambda entity: ("load", ""),
),
"load_1m": SysMonitorSensorEntityDescription(
key="load_1m",
translation_key="load_1m",
@@ -236,14 +289,22 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
value_fn=lambda entity: round(entity.coordinator.data.load[1], 2),
add_to_update=lambda entity: ("load", ""),
),
"load_15m": SysMonitorSensorEntityDescription(
key="load_15m",
translation_key="load_15m",
icon=get_cpu_icon(),
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(entity.coordinator.data.load[2], 2),
add_to_update=lambda entity: ("load", ""),
),
"memory_free": SysMonitorSensorEntityDescription(
key="memory_free",
translation_key="memory_free",
native_unit_of_measurement=UnitOfInformation.MEBIBYTES,
device_class=SensorDeviceClass.DATA_SIZE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(
entity.coordinator.data.memory.available / 1024**2, 1
value_fn=(
lambda entity: round(entity.coordinator.data.memory.available / 1024**2, 1)
),
add_to_update=lambda entity: ("memory", ""),
),
@@ -253,13 +314,15 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
native_unit_of_measurement=UnitOfInformation.MEBIBYTES,
device_class=SensorDeviceClass.DATA_SIZE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: round(
(
entity.coordinator.data.memory.total
- entity.coordinator.data.memory.available
value_fn=(
lambda entity: round(
(
entity.coordinator.data.memory.total
- entity.coordinator.data.memory.available
)
/ 1024**2,
1,
)
/ 1024**2,
1,
),
add_to_update=lambda entity: ("memory", ""),
),
@@ -311,27 +374,15 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
value_fn=get_packets,
add_to_update=lambda entity: ("io_counters", ""),
),
"throughput_network_in": SysMonitorSensorEntityDescription(
key="throughput_network_in",
translation_key="throughput_network_in",
placeholder="interface",
native_unit_of_measurement=UnitOfDataRate.MEGABYTES_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
"process_num_fds": SysMonitorSensorEntityDescription(
key="process_num_fds",
translation_key="process_num_fds",
placeholder="process",
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
mandatory_arg=True,
value_fn=get_throughput,
add_to_update=lambda entity: ("io_counters", ""),
),
"throughput_network_out": SysMonitorSensorEntityDescription(
key="throughput_network_out",
translation_key="throughput_network_out",
placeholder="interface",
native_unit_of_measurement=UnitOfDataRate.MEGABYTES_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
mandatory_arg=True,
value_fn=get_throughput,
add_to_update=lambda entity: ("io_counters", ""),
value_fn=get_process_num_fds,
add_to_update=lambda entity: ("processes", ""),
),
"processor_use": SysMonitorSensorEntityDescription(
key="processor_use",
@@ -339,10 +390,12 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
native_unit_of_measurement=PERCENTAGE,
icon=get_cpu_icon(),
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: (
round(entity.coordinator.data.cpu_percent)
if entity.coordinator.data.cpu_percent
else None
value_fn=(
lambda entity: (
round(entity.coordinator.data.cpu_percent)
if entity.coordinator.data.cpu_percent
else None
)
),
add_to_update=lambda entity: ("cpu_percent", ""),
),
@@ -352,8 +405,8 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda entity: read_cpu_temperature(
entity.coordinator.data.temperatures
value_fn=(
lambda entity: read_cpu_temperature(entity.coordinator.data.temperatures)
),
none_is_unavailable=True,
add_to_update=lambda entity: ("temperatures", ""),
@@ -384,15 +437,27 @@ SENSOR_TYPES: dict[str, SysMonitorSensorEntityDescription] = {
value_fn=lambda entity: entity.coordinator.data.swap.percent,
add_to_update=lambda entity: ("swap", ""),
),
"process_num_fds": SysMonitorSensorEntityDescription(
key="process_num_fds",
translation_key="process_num_fds",
placeholder="process",
"throughput_network_in": SysMonitorSensorEntityDescription(
key="throughput_network_in",
translation_key="throughput_network_in",
placeholder="interface",
native_unit_of_measurement=UnitOfDataRate.MEGABYTES_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
mandatory_arg=True,
value_fn=get_process_num_fds,
add_to_update=lambda entity: ("processes", ""),
value_fn=get_throughput,
add_to_update=lambda entity: ("io_counters", ""),
),
"throughput_network_out": SysMonitorSensorEntityDescription(
key="throughput_network_out",
translation_key="throughput_network_out",
placeholder="interface",
native_unit_of_measurement=UnitOfDataRate.MEGABYTES_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
mandatory_arg=True,
value_fn=get_throughput,
add_to_update=lambda entity: ("io_counters", ""),
),
}
@@ -409,14 +474,17 @@ def check_legacy_resource(resource: str, resources: set[str]) -> bool:
IO_COUNTER = {
"network_out": 0,
"network_in": 1,
"packets_out": 2,
"network_out": 0,
"packets_in": 3,
"throughput_network_out": 0,
"packets_out": 2,
"throughput_network_in": 1,
"throughput_network_out": 0,
}
IF_ADDRS_FAMILY = {
"ipv4_address": socket.AF_INET,
"ipv6_address": socket.AF_INET6,
}
IF_ADDRS_FAMILY = {"ipv4_address": socket.AF_INET, "ipv6_address": socket.AF_INET6}
async def async_setup_entry(
@@ -437,6 +505,7 @@ async def async_setup_entry(
return {
"disk_arguments": get_all_disk_mounts(hass, psutil_wrapper),
"network_arguments": get_all_network_interfaces(hass, psutil_wrapper),
"fan_speed_arguments": list(sensor_data.fan_speed),
}
cpu_temperature: float | None = None

View File
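battery_time_ends above projects psutil's secsleft counter onto an absolute timestamp, ignoring the unknown/unlimited sentinels and rounding to the minute so the timestamp sensor does not flap on every refresh. The same arithmetic as a standalone sketch:

```python
from datetime import datetime, timedelta, timezone

from psutil._common import POWER_TIME_UNKNOWN, POWER_TIME_UNLIMITED

BATTERY_REMAIN_UNKNOWNS = (POWER_TIME_UNKNOWN, POWER_TIME_UNLIMITED)


def battery_empty_at(secsleft: int) -> datetime | None:
    """Return the projected battery-empty time, rounded down to the minute."""
    if secsleft in BATTERY_REMAIN_UNKNOWNS:
        return None
    return (datetime.now(timezone.utc) + timedelta(seconds=secsleft)).replace(
        second=0, microsecond=0
    )


print(battery_empty_at(3600))                # about one hour from now
print(battery_empty_at(POWER_TIME_UNKNOWN))  # None
```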

@@ -16,6 +16,9 @@
}
},
"sensor": {
"battery_empty": {
"name": "Battery empty"
},
"disk_free": {
"name": "Disk free {mount_point}"
},
@@ -25,6 +28,9 @@
"disk_use_percent": {
"name": "Disk usage {mount_point}"
},
"fan_speed": {
"name": "{fan_name} fan speed"
},
"ipv4_address": {
"name": "IPv4 address {ip_address}"
},

View File

@@ -3,7 +3,7 @@
import logging
import os
from psutil._common import shwtemp
from psutil._common import sfan, shwtemp
import psutil_home_assistant as ha_psutil
from homeassistant.core import HomeAssistant
@@ -89,3 +89,19 @@ def read_cpu_temperature(temps: dict[str, list[shwtemp]]) -> float | None:
return round(entry.current, 1)
return None
def read_fan_speed(fans: dict[str, list[sfan]]) -> dict[str, int]:
"""Attempt to read fan speed."""
entry: sfan
_LOGGER.debug("Fan speed: %s", fans)
if not fans:
return {}
sensor_fans: dict[str, int] = {}
for name, entries in fans.items():
for entry in entries:
_label = name if not entry.label else entry.label
sensor_fans[_label] = round(entry.current, 0)
return sensor_fans

View File
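read_fan_speed above flattens psutil's per-device fan lists into a single label-to-RPM mapping, falling back to the device name when an entry has no label. A standalone sketch of the same flattening (flatten_fans is an illustrative stand-in, not the integration's function):

```python
from psutil._common import sfan


def flatten_fans(fans: dict[str, list[sfan]]) -> dict[str, int]:
    """Flatten per-device fan readings into a label -> RPM mapping."""
    speeds: dict[str, int] = {}
    for name, entries in fans.items():
        for entry in entries:
            label = entry.label or name  # unlabelled fans use the device name
            speeds[label] = round(entry.current)
    return speeds


print(
    flatten_fans(
        {"nct6775": [sfan(label="CPU Fan", current=1243), sfan(label="", current=861)]}
    )
)
# {'CPU Fan': 1243, 'nct6775': 861}
```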

@@ -21,7 +21,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, DPType
from .entity import TuyaEntity
from .models import EnumTypeData
from .models import EnumTypeData, find_dpcode
from .util import get_dpcode
@@ -118,8 +118,8 @@ class TuyaAlarmEntity(TuyaEntity, AlarmControlPanelEntity):
self._attr_unique_id = f"{super().unique_id}{description.key}"
# Determine supported modes
if supported_modes := self.find_dpcode(
description.key, dptype=DPType.ENUM, prefer_function=True
if supported_modes := find_dpcode(
self.device, description.key, dptype=DPType.ENUM, prefer_function=True
):
if Mode.HOME in supported_modes.range:
self._attr_supported_features |= AlarmControlPanelEntityFeature.ARM_HOME
@@ -131,8 +131,11 @@ class TuyaAlarmEntity(TuyaEntity, AlarmControlPanelEntity):
self._attr_supported_features |= AlarmControlPanelEntityFeature.TRIGGER
# Determine master state
if enum_type := self.find_dpcode(
description.master_state, dptype=DPType.ENUM, prefer_function=True
if enum_type := find_dpcode(
self.device,
description.master_state,
dptype=DPType.ENUM,
prefer_function=True,
):
self._master_state = enum_type

View File

@@ -26,7 +26,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, DPType
from .entity import TuyaEntity
from .models import IntegerTypeData
from .models import IntegerTypeData, find_dpcode
from .util import get_dpcode
TUYA_HVAC_TO_HA = {
@@ -153,11 +153,13 @@ class TuyaClimateEntity(TuyaEntity, ClimateEntity):
self._attr_temperature_unit = system_temperature_unit
# Figure out current temperature, use preferred unit or what is available
celsius_type = self.find_dpcode(
(DPCode.TEMP_CURRENT, DPCode.UPPER_TEMP), dptype=DPType.INTEGER
celsius_type = find_dpcode(
self.device, (DPCode.TEMP_CURRENT, DPCode.UPPER_TEMP), dptype=DPType.INTEGER
)
fahrenheit_type = self.find_dpcode(
(DPCode.TEMP_CURRENT_F, DPCode.UPPER_TEMP_F), dptype=DPType.INTEGER
fahrenheit_type = find_dpcode(
self.device,
(DPCode.TEMP_CURRENT_F, DPCode.UPPER_TEMP_F),
dptype=DPType.INTEGER,
)
if fahrenheit_type and (
prefered_temperature_unit == UnitOfTemperature.FAHRENHEIT
@@ -173,11 +175,11 @@ class TuyaClimateEntity(TuyaEntity, ClimateEntity):
self._current_temperature = celsius_type
# Figure out setting temperature, use preferred unit or what is available
celsius_type = self.find_dpcode(
DPCode.TEMP_SET, dptype=DPType.INTEGER, prefer_function=True
celsius_type = find_dpcode(
self.device, DPCode.TEMP_SET, dptype=DPType.INTEGER, prefer_function=True
)
fahrenheit_type = self.find_dpcode(
DPCode.TEMP_SET_F, dptype=DPType.INTEGER, prefer_function=True
fahrenheit_type = find_dpcode(
self.device, DPCode.TEMP_SET_F, dptype=DPType.INTEGER, prefer_function=True
)
if fahrenheit_type and (
prefered_temperature_unit == UnitOfTemperature.FAHRENHEIT
@@ -201,8 +203,8 @@ class TuyaClimateEntity(TuyaEntity, ClimateEntity):
# Determine HVAC modes
self._attr_hvac_modes: list[HVACMode] = []
self._hvac_to_tuya = {}
if enum_type := self.find_dpcode(
DPCode.MODE, dptype=DPType.ENUM, prefer_function=True
if enum_type := find_dpcode(
self.device, DPCode.MODE, dptype=DPType.ENUM, prefer_function=True
):
self._attr_hvac_modes = [HVACMode.OFF]
unknown_hvac_modes: list[str] = []
@@ -225,8 +227,11 @@ class TuyaClimateEntity(TuyaEntity, ClimateEntity):
]
# Determine dpcode to use for setting the humidity
if int_type := self.find_dpcode(
DPCode.HUMIDITY_SET, dptype=DPType.INTEGER, prefer_function=True
if int_type := find_dpcode(
self.device,
DPCode.HUMIDITY_SET,
dptype=DPType.INTEGER,
prefer_function=True,
):
self._attr_supported_features |= ClimateEntityFeature.TARGET_HUMIDITY
self._set_humidity = int_type
@@ -234,13 +239,14 @@ class TuyaClimateEntity(TuyaEntity, ClimateEntity):
self._attr_max_humidity = int(int_type.max_scaled)
# Determine dpcode to use for getting the current humidity
self._current_humidity = self.find_dpcode(
DPCode.HUMIDITY_CURRENT, dptype=DPType.INTEGER
self._current_humidity = find_dpcode(
self.device, DPCode.HUMIDITY_CURRENT, dptype=DPType.INTEGER
)
# Determine fan modes
self._fan_mode_dp_code: str | None = None
if enum_type := self.find_dpcode(
if enum_type := find_dpcode(
self.device,
(DPCode.FAN_SPEED_ENUM, DPCode.LEVEL, DPCode.WINDSPEED),
dptype=DPType.ENUM,
prefer_function=True,

View File

@@ -22,7 +22,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, DPType
from .entity import TuyaEntity
from .models import EnumTypeData, IntegerTypeData
from .models import EnumTypeData, IntegerTypeData, find_dpcode
from .util import get_dpcode
@@ -204,8 +204,8 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
self._attr_supported_features |= (
CoverEntityFeature.OPEN | CoverEntityFeature.CLOSE
)
elif enum_type := self.find_dpcode(
description.key, dptype=DPType.ENUM, prefer_function=True
elif enum_type := find_dpcode(
self.device, description.key, dptype=DPType.ENUM, prefer_function=True
):
if description.open_instruction_value in enum_type.range:
self._attr_supported_features |= CoverEntityFeature.OPEN
@@ -217,8 +217,11 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
self._current_state = get_dpcode(self.device, description.current_state)
# Determine type to use for setting the position
if int_type := self.find_dpcode(
description.set_position, dptype=DPType.INTEGER, prefer_function=True
if int_type := find_dpcode(
self.device,
description.set_position,
dptype=DPType.INTEGER,
prefer_function=True,
):
self._attr_supported_features |= CoverEntityFeature.SET_POSITION
self._set_position = int_type
@@ -226,13 +229,17 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
self._current_position = int_type
# Determine type for getting the position
if int_type := self.find_dpcode(
description.current_position, dptype=DPType.INTEGER, prefer_function=True
if int_type := find_dpcode(
self.device,
description.current_position,
dptype=DPType.INTEGER,
prefer_function=True,
):
self._current_position = int_type
# Determine type to use for setting the tilt
if int_type := self.find_dpcode(
if int_type := find_dpcode(
self.device,
(DPCode.ANGLE_HORIZONTAL, DPCode.ANGLE_VERTICAL),
dptype=DPType.INTEGER,
prefer_function=True,
@@ -242,7 +249,8 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
# Determine type to use for checking motor reverse mode
if (motor_mode := description.motor_reverse_mode) and (
enum_type := self.find_dpcode(
enum_type := find_dpcode(
self.device,
motor_mode,
dptype=DPType.ENUM,
prefer_function=True,
@@ -311,8 +319,11 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
def open_cover(self, **kwargs: Any) -> None:
"""Open the cover."""
value: bool | str = True
if self.find_dpcode(
self.entity_description.key, dptype=DPType.ENUM, prefer_function=True
if find_dpcode(
self.device,
self.entity_description.key,
dptype=DPType.ENUM,
prefer_function=True,
):
value = self.entity_description.open_instruction_value
@@ -337,8 +348,11 @@ class TuyaCoverEntity(TuyaEntity, CoverEntity):
def close_cover(self, **kwargs: Any) -> None:
"""Close cover."""
value: bool | str = False
if self.find_dpcode(
self.entity_description.key, dptype=DPType.ENUM, prefer_function=True
if find_dpcode(
self.device,
self.entity_description.key,
dptype=DPType.ENUM,
prefer_function=True,
):
value = self.entity_description.close_instruction_value

View File

@@ -2,7 +2,7 @@
from __future__ import annotations
from typing import Any, Literal, overload
from typing import Any
from tuya_sharing import CustomerDevice, Manager
@@ -10,8 +10,7 @@ from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity import Entity
from .const import DOMAIN, LOGGER, TUYA_HA_SIGNAL_UPDATE_ENTITY, DPCode, DPType
from .models import EnumTypeData, IntegerTypeData
from .const import DOMAIN, LOGGER, TUYA_HA_SIGNAL_UPDATE_ENTITY
class TuyaEntity(Entity):
@@ -44,77 +43,6 @@ class TuyaEntity(Entity):
"""Return if the device is available."""
return self.device.online
@overload
def find_dpcode(
self,
dpcodes: str | DPCode | tuple[DPCode, ...] | None,
*,
prefer_function: bool = False,
dptype: Literal[DPType.ENUM],
) -> EnumTypeData | None: ...
@overload
def find_dpcode(
self,
dpcodes: str | DPCode | tuple[DPCode, ...] | None,
*,
prefer_function: bool = False,
dptype: Literal[DPType.INTEGER],
) -> IntegerTypeData | None: ...
def find_dpcode(
self,
dpcodes: str | DPCode | tuple[DPCode, ...] | None,
*,
prefer_function: bool = False,
dptype: DPType,
) -> EnumTypeData | IntegerTypeData | None:
"""Find type information for a matching DP code available for this device."""
if dptype not in (DPType.ENUM, DPType.INTEGER):
raise NotImplementedError("Only ENUM and INTEGER types are supported")
if dpcodes is None:
return None
if isinstance(dpcodes, str):
dpcodes = (DPCode(dpcodes),)
elif not isinstance(dpcodes, tuple):
dpcodes = (dpcodes,)
order = ["status_range", "function"]
if prefer_function:
order = ["function", "status_range"]
for dpcode in dpcodes:
for key in order:
if dpcode not in getattr(self.device, key):
continue
if (
dptype == DPType.ENUM
and getattr(self.device, key)[dpcode].type == DPType.ENUM
):
if not (
enum_type := EnumTypeData.from_json(
dpcode, getattr(self.device, key)[dpcode].values
)
):
continue
return enum_type
if (
dptype == DPType.INTEGER
and getattr(self.device, key)[dpcode].type == DPType.INTEGER
):
if not (
integer_type := IntegerTypeData.from_json(
dpcode, getattr(self.device, key)[dpcode].values
)
):
continue
return integer_type
return None
async def async_added_to_hass(self) -> None:
"""Call when entity is added to hass."""
self.async_on_remove(

View File
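The removed TuyaEntity.find_dpcode searched the device's status_range and function tables in a configurable order; the platform diffs above show its replacement living in tuya.models and taking the device as its first argument. A deliberately simplified, self-contained sketch of just that lookup order (FakeDevice and find_first_dpcode are illustrative names, not the Tuya models API):

```python
from dataclasses import dataclass, field


@dataclass
class FakeDevice:
    """Stand-in for a Tuya device exposing the two DP code tables."""

    status_range: dict[str, str] = field(default_factory=dict)
    function: dict[str, str] = field(default_factory=dict)


def find_first_dpcode(
    device: FakeDevice, dpcodes: tuple[str, ...], *, prefer_function: bool = False
) -> str | None:
    """Return the first DP code present on the device, honouring the table order."""
    order = ("function", "status_range") if prefer_function else ("status_range", "function")
    for dpcode in dpcodes:
        for table in order:
            if dpcode in getattr(device, table):
                return dpcode
    return None


device = FakeDevice(status_range={"mode": "enum"}, function={"temp_set": "Integer"})
print(find_first_dpcode(device, ("temp_set", "mode"), prefer_function=True))  # temp_set
```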

@@ -16,6 +16,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, DPType
from .entity import TuyaEntity
from .models import find_dpcode
# All descriptions can be found here. Mostly the Enum data types in the
# default status set of each category (that don't have a set instruction)
@@ -125,7 +126,7 @@ class TuyaEventEntity(TuyaEntity, EventEntity):
self.entity_description = description
self._attr_unique_id = f"{super().unique_id}{description.key}"
if dpcode := self.find_dpcode(description.key, dptype=DPType.ENUM):
if dpcode := find_dpcode(self.device, description.key, dptype=DPType.ENUM):
self._attr_event_types: list[str] = dpcode.range
async def _handle_state_update(

View File

@@ -23,7 +23,7 @@ from homeassistant.util.percentage import (
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, DPType
from .entity import TuyaEntity
from .models import EnumTypeData, IntegerTypeData
from .models import EnumTypeData, IntegerTypeData, find_dpcode
from .util import get_dpcode
_DIRECTION_DPCODES = (DPCode.FAN_DIRECTION,)
@@ -106,21 +106,24 @@ class TuyaFanEntity(TuyaEntity, FanEntity):
self._switch = get_dpcode(self.device, _SWITCH_DPCODES)
self._attr_preset_modes = []
if enum_type := self.find_dpcode(
(DPCode.FAN_MODE, DPCode.MODE), dptype=DPType.ENUM, prefer_function=True
if enum_type := find_dpcode(
self.device,
(DPCode.FAN_MODE, DPCode.MODE),
dptype=DPType.ENUM,
prefer_function=True,
):
self._presets = enum_type
self._attr_supported_features |= FanEntityFeature.PRESET_MODE
self._attr_preset_modes = enum_type.range
# Find speed controls, can be either percentage or a set of speeds
if int_type := self.find_dpcode(
_SPEED_DPCODES, dptype=DPType.INTEGER, prefer_function=True
if int_type := find_dpcode(
self.device, _SPEED_DPCODES, dptype=DPType.INTEGER, prefer_function=True
):
self._attr_supported_features |= FanEntityFeature.SET_SPEED
self._speed = int_type
elif enum_type := self.find_dpcode(
_SPEED_DPCODES, dptype=DPType.ENUM, prefer_function=True
elif enum_type := find_dpcode(
self.device, _SPEED_DPCODES, dptype=DPType.ENUM, prefer_function=True
):
self._attr_supported_features |= FanEntityFeature.SET_SPEED
self._speeds = enum_type
@@ -129,8 +132,8 @@ class TuyaFanEntity(TuyaEntity, FanEntity):
self._oscillate = dpcode
self._attr_supported_features |= FanEntityFeature.OSCILLATE
if enum_type := self.find_dpcode(
_DIRECTION_DPCODES, dptype=DPType.ENUM, prefer_function=True
if enum_type := find_dpcode(
self.device, _DIRECTION_DPCODES, dptype=DPType.ENUM, prefer_function=True
):
self._direction = enum_type
self._attr_supported_features |= FanEntityFeature.DIRECTION

Some files were not shown because too many files have changed in this diff.