Compare commits

..

87 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
3bae7760d8 Remove EVENT_LABS_UPDATED re-export from labs/const.py
Import EVENT_LABS_UPDATED directly from homeassistant.const in
labs/__init__.py and labs/websocket_api.py instead of re-exporting
from labs/const.py.

Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2025-11-30 04:21:22 +00:00
copilot-swe-agent[bot]
512cbd17b1 Move EVENT_LABS_UPDATED to homeassistant/const.py
Move the constant to the central const.py file to avoid circular
import issues. Update labs/const.py to re-export from homeassistant.const
and update events.py to import from homeassistant.const directly.

Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2025-11-30 04:17:22 +00:00
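The move described in this commit can be sketched as follows (the constant's value is taken from the `labs_updated` event name mentioned later in this log; the file contents are otherwise an assumption, not the actual Home Assistant source):

```python
# homeassistant/const.py (sketch): the central module now owns the event name.
EVENT_LABS_UPDATED = "labs_updated"

# labs/const.py previously did:
#     from homeassistant.const import EVENT_LABS_UPDATED
# and other modules imported it from there. With the definition living in the
# central constants module and consumers importing it directly, labs/const.py
# no longer needs the import at all, which breaks the circular-import chain.
```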
copilot-swe-agent[bot]
299c1a9c2d Fix import to use component root instead of submodule
Import EVENT_LABS_UPDATED from homeassistant.components.labs
instead of homeassistant.components.labs.const per
hass-component-root-import rule.

Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2025-11-30 03:59:51 +00:00
copilot-swe-agent[bot]
c9b69ee297 Add EVENT_LABS_UPDATED to event subscription allowlist
This fixes the authorization error for non-admin users subscribing
to the labs_updated event, which was causing issues with kiosk mode
and dedicated dashboards in the beta version.

Fixes home-assistant/core#145771

Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2025-11-29 20:51:44 +00:00
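The fix amounts to adding one entry to a subscription allowlist. A minimal sketch of how such a check behaves (function name and logic are assumptions; only the allowlisted event names appear in this compare):

```python
# Events that non-admin users may subscribe to over the websocket API.
SUBSCRIBE_ALLOWLIST: set[str] = {
    "labs_updated",
    "state_changed",
    "themes_updated",
}


def can_subscribe(event_type: str, is_admin: bool) -> bool:
    """Admins may subscribe to any event; other users only to allowlisted ones."""
    return is_admin or event_type in SUBSCRIBE_ALLOWLIST


# Before the fix, "labs_updated" was missing from the set, so non-admin
# kiosk/dashboard users got an authorization error; with it present the
# subscription is accepted.
print(can_subscribe("labs_updated", is_admin=False))
```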
copilot-swe-agent[bot]
550ef72437 Initial plan 2025-11-29 20:45:35 +00:00
Raphael Hehl
0174bad182 Add PARALLEL_UPDATES to UniFi Protect platforms (#157504) 2025-11-29 19:48:43 +01:00
Allen Porter
d5be623684 Bump python-roborock to 3.8.4 (#157538) 2025-11-29 20:34:27 +02:00
Raphael Hehl
d006b044c8 Bump uiprotect to 7.31.0 (#157543) 2025-11-29 20:33:09 +02:00
Jan Bouwhuis
fdd9571623 Fix MQTT entity cannot be renamed (#157540) 2025-11-29 19:29:54 +01:00
Shay Levy
4f4c5152b9 Refactor Shelly setup to use async_setup_entry_block for block entities (#157517) 2025-11-29 18:08:12 +02:00
Denis Shulyaka
b031a082cd Bump anthropic to 0.75.0 (#157491) 2025-11-29 14:35:30 +01:00
Shay Levy
a1132195fd Refactor Shelly RPC event platform to use base class (#157499) 2025-11-29 13:09:32 +02:00
Jordan Harvey
708b3dc8b2 Disable cookie quotes for Anglian Water (#157518) 2025-11-29 11:52:55 +01:00
J. Nick Koston
8ae0216135 Bump ESPHome stable BLE version to 2025.11.0 (#157511) 2025-11-29 03:40:22 -06:00
David Woodhouse
1472281cd5 Clarify percentage_command_topic and percentage_state_topic for MQTT fan (#157460)
Co-authored-by: Jan Bouwhuis <jbouwh@users.noreply.github.com>
2025-11-29 02:47:34 -06:00
Allen Porter
ceaa71d198 Bump python-roborock to 3.8.3 (#157512) 2025-11-29 09:34:22 +01:00
Arie Catsman
7f0d0c555a Bump pyenphase to 2.4.2 (#157500) 2025-11-28 21:58:57 +01:00
steaura
3b94b2491a Update bootstrap.py for grammar in slow startup error log (#157458) 2025-11-28 19:34:30 +01:00
Sebastian Schneider
8c8708d5bc Support UniFi LED control for devices without RGB (#156812) 2025-11-28 17:33:15 +01:00
Maciej Bieniek
ca35102138 Remove name for Shelly gas valve (gen1) entity (#157490) 2025-11-28 15:26:17 +01:00
Maciej Bieniek
1a1b50ef1a Add missing string for Shelly away mode switch (#157488) 2025-11-28 16:07:40 +02:00
epenet
5a4d51e57a Mark config-flow-test-coverage as done in SFR Box IQS (#157485) 2025-11-28 12:46:01 +01:00
epenet
9e1bc637e2 Improve diagnostics tests in SFR Box API (#157483) 2025-11-28 11:58:33 +01:00
Joakim Plate
ab879c07ca Add logbook support for args same as params for zha (#154997) 2025-11-28 11:15:49 +01:00
Hem Bhagat
488c97531e Move translatable URLs out of strings.json for opentherm_gw integration (#157437) 2025-11-28 10:45:45 +01:00
epenet
3b52c5df79 Use snapshot_platform helper in SFR Box tests (#157481) 2025-11-28 10:44:39 +01:00
Shay Levy
7f4b56104d Update Shelly utils coverage to 100% (#157478) 2025-11-28 11:32:41 +02:00
Åke Strandberg
ab8135ba1a Add loggers to senz manifest (#157479) 2025-11-28 10:19:28 +01:00
epenet
a88599bc09 Improve tests in SFR Box (#157444)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-28 10:10:38 +01:00
Manuel Stahl
45034279c8 Update pystiebeleltron to 0.2.5 (#157450) 2025-11-28 09:48:51 +01:00
Artur Pragacz
9f3dae6254 Add tools in default agent also in fallback pipeline (#157441) 2025-11-28 09:47:52 +01:00
epenet
ef36d7b1e5 Fix blocking call in Tuya initialisation (#157477) 2025-11-28 09:45:28 +01:00
dependabot[bot]
e5346ba017 Bump home-assistant/builder from 2025.09.0 to 2025.11.0 (#157468)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-28 09:30:37 +01:00
dependabot[bot]
68d41d2a48 Bump docker/metadata-action from 5.9.0 to 5.10.0 (#157467)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-28 09:30:22 +01:00
dependabot[bot]
00a882c20a Bump actions/ai-inference from 2.0.2 to 2.0.3 (#157466)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-28 09:28:59 +01:00
Jordan Harvey
44a6772947 Fix Anglian Water sensor setup (#157457) 2025-11-28 07:25:04 +01:00
cdnninja
f874ba1355 Move device_info to attribute in vesync (#157462) 2025-11-28 07:20:50 +01:00
Allen Porter
4fc125c49a Improve Nest error message wording in test before setup (#157465) 2025-11-28 07:19:54 +01:00
Artur Pragacz
8c59196e19 Provide log info for discovered flows in logger (#157454) 2025-11-28 02:13:10 +01:00
Shay Levy
326f7f0559 Add coverage to Shelly utils (#157455) 2025-11-28 00:29:47 +02:00
ElectricSteve
11afda8c22 bump: youtubeaio to 2.1.1 (#157452) 2025-11-27 22:42:57 +01:00
StaleLoafOfBread
f1ee0e4ac9 Add support for gallons per day as a unit of volume flow rate (#157394) 2025-11-27 20:42:16 +01:00
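For reference, the arithmetic behind a gallons-per-day flow rate, assuming US liquid gallons (the conversion factor and helper name are illustrative, not taken from the PR):

```python
LITERS_PER_US_GALLON = 3.785411784  # exact by definition of the US gallon
MINUTES_PER_DAY = 24 * 60  # 1440


def gallons_per_day_to_liters_per_minute(gpd: float) -> float:
    """Convert a volume flow rate from gal/day to L/min."""
    return gpd * LITERS_PER_US_GALLON / MINUTES_PER_DAY


# 1440 gal/day is one gallon per minute, i.e. about 3.785 L/min.
print(round(gallons_per_day_to_liters_per_minute(1440), 6))
```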
Joakim Plate
5f522e5afa Fix cancel propagation in update coordinator and config entry (#153504) 2025-11-27 19:48:45 +01:00
Thomas55555
4f6624d0aa Fix strings in Google Air Quality (#157297) 2025-11-27 19:26:33 +01:00
epenet
70990645a7 Mark config-flow as done in SFR Box IQS (#157439) 2025-11-27 19:14:13 +01:00
Andrew Jackson
2f7d74ff62 Add icons to transmission entities (#157436) 2025-11-27 18:38:32 +01:00
epenet
885667832b Add initial IQS to sfr_box (#155419) 2025-11-27 18:36:51 +01:00
Petro31
4646929987 Avoid custom template platform deprecations (#157415) 2025-11-27 18:06:29 +01:00
Petro31
010aea952c Reload templates when labs flag automation.new_triggers_conditions is set (#157368) 2025-11-27 18:05:33 +01:00
Bram Kragten
563678dc47 Update frontend to 20251127.0 (#157431) 2025-11-27 18:05:18 +01:00
epenet
a48f01f213 Raise UpdateFailed if API returns None in sfr_box (#157434) 2025-11-27 18:01:56 +01:00
Andrew Jackson
08b758b0d2 Add device info and parallel_updates to Transmission (#157423) 2025-11-27 17:37:27 +01:00
Allen Porter
4306fbea52 Fix regression in roborock image entity naming (#157432) 2025-11-27 17:36:18 +01:00
Robert Resch
6f4c479f8f Use same cosign version in build workflow (#157365) 2025-11-27 17:13:04 +01:00
Shay Levy
1d9c06264e Fix Shelly support for button5 trigger (#157422) 2025-11-27 16:38:45 +01:00
epenet
d045ecaf13 Add parallel_updates to SFR Box (#157426) 2025-11-27 16:04:25 +01:00
Markus Jacobsen
f7c41e694c Add media content id attribute to Bang & Olufsen (#156597) 2025-11-27 15:53:43 +01:00
Kamil Breguła
9ee7ed5cdb Fix MAC address mix-ups between WLED devices (#155491)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-11-27 15:10:32 +01:00
Denis Shulyaka
83c4e2abc9 Fix Anthropic init with incorrect model (#157421) 2025-11-27 14:16:46 +01:00
Erik Montnemery
a7dbf551a3 Add climate started_cooling and started_drying triggers (#156945) 2025-11-27 12:41:08 +01:00
Petro31
0b2bb9f6bf Modernize template binary sensor (#157279) 2025-11-27 12:28:16 +01:00
tan-lawrence
0769163b67 Use "medium" instead of "med" for the medium fan mode in Coolmaster (#157253) 2025-11-27 12:27:49 +01:00
Robert Resch
2bb51e1146 Reduce Devcontainer docker layers (#157412) 2025-11-27 12:27:18 +01:00
Paulus Schoutsen
d2248d282c Default conversation agent to store tool calls in chat log (#157377) 2025-11-27 12:27:03 +01:00
Jan Čermák
8fe79a88ca Fix state classes of Ecowitt rain sensors (#157409) 2025-11-27 12:24:28 +01:00
Jaap Pieroen
7a328539b2 Bugfix: Essent remove average gas price today (#157317) 2025-11-27 12:24:07 +01:00
abelyliu
ec69efee4d Fix parsing of Tuya electricity RAW values (#157039) 2025-11-27 12:23:33 +01:00
Shay Levy
dbcde549d4 Update Shelly coordinator coverage to 100% (#157380) 2025-11-27 12:22:19 +01:00
Michael
988355e138 Add tests for the switch platform to the AdGuard Home integration (#157105) 2025-11-27 12:21:23 +01:00
victorigualada
7711eac607 Return early when setting cloud ai_task and conversation and not logged in to cloud (#157402) 2025-11-27 12:20:42 +01:00
Denis Shulyaka
32fe53cceb Add anthropic model to the device info (#157413) 2025-11-27 12:16:05 +01:00
Andrew Jackson
3a65d3c0dc Add tests to Transmission (#157355) 2025-11-27 12:15:10 +01:00
epenet
7fe26223ac Bump renault-api to 0.5.1 (#157411) 2025-11-27 12:06:57 +01:00
victorigualada
7e8496afb2 Bump hass-nabucasa from 1.6.1 to 1.6.2 (#157405) 2025-11-27 11:40:50 +01:00
Paulus Schoutsen
2ec5190243 Install requirements_test_all in dev (#157392) 2025-11-27 10:30:50 +01:00
Erik Montnemery
a706db8fdb Minor polish of cover trigger tests (#157397) 2025-11-27 09:57:03 +01:00
starkillerOG
a00923c48b Bump reolink-aio to 0.16.6 (#157399) 2025-11-27 09:53:25 +01:00
Sarah Seidman
7480d59f0f Normalize input for Droplet pairing code (#157361) 2025-11-27 08:36:30 +01:00
Erik Montnemery
4c8d9ed401 Adjust type hints in sensor group (#157373) 2025-11-27 08:34:16 +01:00
Lukas
eef10c59db Pooldose bump api 0.8.0 (new) (#157381) 2025-11-27 08:33:32 +01:00
dependabot[bot]
a1a1f8dd77 Bump docker/metadata-action from 5.5.1 to 5.9.0 (#157395) 2025-11-27 07:26:58 +01:00
dependabot[bot]
c75a5c5151 Bump docker/setup-buildx-action from 3.5.0 to 3.11.1 (#157396) 2025-11-27 07:25:16 +01:00
Allen Porter
cdaaa2bd8f Update fitbit to use new asyncio client library for device list (#157308)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-27 00:23:49 -05:00
Allen Porter
bd84dac8fb Update roborock test typing (#157370) 2025-11-27 00:21:48 -05:00
Allen Porter
42cbeca5b0 Remove old roborock map storage (#157379) 2025-11-27 00:21:04 -05:00
Allen Porter
ad0a498d10 Bump python-roborock to 3.8.1 (#157376) 2025-11-26 16:12:19 -08:00
Jan Bouwhuis
973405822b Move translatable URL out of strings.json for knx integration (#155244) 2025-11-26 23:09:59 +01:00
201 changed files with 5651 additions and 2393 deletions


@@ -190,7 +190,8 @@ jobs:
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- - name: Install Cosign
+ - &install_cosign
+   name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
with:
cosign-release: "v2.5.3"
@@ -294,7 +295,7 @@ jobs:
# home-assistant/builder doesn't support sha pinning
- name: Build base image
- uses: home-assistant/builder@2025.09.0
+ uses: home-assistant/builder@2025.11.0
with:
args: |
$BUILD_ARGS \
@@ -353,10 +354,7 @@ jobs:
matrix:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- - name: Install Cosign
-   uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
-   with:
-     cosign-release: "v2.2.3"
+ - *install_cosign
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
@@ -393,7 +391,7 @@ jobs:
# 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
- name: Generate Docker metadata
id: meta
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81 # v5.5.1
+ uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: ${{ matrix.registry }}/home-assistant
sep-tags: ","
@@ -407,7 +405,7 @@ jobs:
type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
- name: Set up Docker Buildx
- uses: docker/setup-buildx-action@aa33708b10e362ff993539393ff100fa93ed6a27 # v3.7.1
+ uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.7.1
- name: Copy architecture images to DockerHub
if: matrix.registry == 'docker.io/homeassistant'


@@ -231,7 +231,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
- uses: actions/ai-inference@5022b33bc1431add9b2831934daf8147a2ad9331 # v2.0.2
+ uses: actions/ai-inference@02c6cc30ae592ce65ee356387748dfc2fd5f7993 # v2.0.3
with:
model: openai/gpt-4o
system-prompt: |


@@ -57,7 +57,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
- uses: actions/ai-inference@5022b33bc1431add9b2831934daf8147a2ad9331 # v2.0.2
+ uses: actions/ai-inference@02c6cc30ae592ce65ee356387748dfc2fd5f7993 # v2.0.3
with:
model: openai/gpt-4o-mini
system-prompt: |


@@ -19,9 +19,9 @@ on:
env:
DEFAULT_PYTHON: "3.13"
- # concurrency:
- #   group: ${{ github.workflow }}-${{ github.ref_name}}
- #   cancel-in-progress: true
+ concurrency:
+   group: ${{ github.workflow }}-${{ github.ref_name}}
+   cancel-in-progress: true
jobs:
init:
@@ -101,7 +101,7 @@ jobs:
core:
name: Build Core wheels ${{ matrix.abi }} for ${{ matrix.arch }} (musllinux_1_2)
- if: false && github.repository_owner == 'home-assistant'
+ if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
strategy:
@@ -177,33 +177,8 @@ jobs:
sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt
- name: Create requirements file for custom build
run: |
touch requirements_custom.txt
echo -n "cython==3.2.1" >> requirements_custom.txt
- name: Modify requirements file for custom build
id: modify-requirements
run: |
echo " # force update" >> requirements_custom.txt
echo "skip_binary=cython" >> $GITHUB_OUTPUT
- name: Build wheels (custom)
uses: cdce8p/wheels@master
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
arch: ${{ matrix.arch }}
wheels-key: ${{ secrets.WHEELS_KEY }}
env-file: true
skip-binary: ${{ steps.modify-requirements.outputs.skip_binary }}
constraints: "homeassistant/package_constraints.txt"
requirements: "requirements_custom.txt"
verbose: true
- name: Build wheels
uses: *home-assistant-wheels
if: false
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2


@@ -35,25 +35,22 @@ COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
USER vscode
COPY .python-version ./
RUN uv python install
ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv"
RUN uv venv $VIRTUAL_ENV
RUN --mount=type=bind,source=.python-version,target=.python-version \
uv python install \
&& uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
WORKDIR /tmp
# Setup hass-release
RUN git clone --depth 1 https://github.com/home-assistant/hass-release ~/hass-release \
&& uv pip install -e ~/hass-release/
# Install Python dependencies from requirements
COPY requirements.txt ./
COPY homeassistant/package_constraints.txt homeassistant/package_constraints.txt
RUN uv pip install -r requirements.txt
COPY requirements_test.txt requirements_test_pre_commit.txt ./
RUN uv pip install -r requirements_test.txt
RUN --mount=type=bind,source=requirements.txt,target=requirements.txt \
--mount=type=bind,source=homeassistant/package_constraints.txt,target=homeassistant/package_constraints.txt \
--mount=type=bind,source=requirements_test.txt,target=requirements_test.txt \
--mount=type=bind,source=requirements_test_pre_commit.txt,target=requirements_test_pre_commit.txt \
uv pip install -r requirements.txt -r requirements_test.txt
WORKDIR /workspaces


@@ -7,6 +7,7 @@ from typing import Any, Final
from homeassistant.const import (
EVENT_COMPONENT_LOADED,
EVENT_CORE_CONFIG_UPDATE,
+ EVENT_LABS_UPDATED,
EVENT_LOVELACE_UPDATED,
EVENT_PANELS_UPDATED,
EVENT_RECORDER_5MIN_STATISTICS_GENERATED,
@@ -45,6 +46,7 @@ SUBSCRIBE_ALLOWLIST: Final[set[EventType[Any] | str]] = {
EVENT_STATE_CHANGED,
EVENT_THEMES_UPDATED,
EVENT_LABEL_REGISTRY_UPDATED,
+ EVENT_LABS_UPDATED,
EVENT_CATEGORY_REGISTRY_UPDATED,
EVENT_FLOOR_REGISTRY_UPDATED,
}


@@ -1000,7 +1000,7 @@ class _WatchPendingSetups:
# We log every LOG_SLOW_STARTUP_INTERVAL until all integrations are done
# once we take over LOG_SLOW_STARTUP_INTERVAL (60s) to start up
_LOGGER.warning(
- "Waiting on integrations to complete setup: %s",
+ "Waiting for integrations to complete setup: %s",
self._setup_started,
)


@@ -2,6 +2,7 @@
from __future__ import annotations
+ from aiohttp import CookieJar
from pyanglianwater import AnglianWater
from pyanglianwater.auth import MSOB2CAuth
from pyanglianwater.exceptions import (
@@ -18,7 +19,7 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryError
- from homeassistant.helpers.aiohttp_client import async_get_clientsession
+ from homeassistant.helpers.aiohttp_client import async_create_clientsession
from .const import CONF_ACCOUNT_NUMBER, DOMAIN
from .coordinator import AnglianWaterConfigEntry, AnglianWaterUpdateCoordinator
@@ -33,7 +34,10 @@ async def async_setup_entry(
auth = MSOB2CAuth(
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
- session=async_get_clientsession(hass),
+ session=async_create_clientsession(
+     hass,
+     cookie_jar=CookieJar(quote_cookie=False),
+ ),
refresh_token=entry.data[CONF_ACCESS_TOKEN],
account_number=entry.data[CONF_ACCOUNT_NUMBER],
)
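The `quote_cookie=False` change above disables cookie-value quoting in aiohttp's `CookieJar`, presumably because the upstream API rejects quoted values. The standard library's `http.cookies` module illustrates the quoting behavior in question (stdlib-only sketch; aiohttp's jar applies the same RFC quoting rules):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["token"] = 'abc"def'  # a value containing a character that forces quoting

# On serialization the value is escaped and wrapped in double quotes,
# which is exactly the behavior quote_cookie=False turns off in aiohttp.
print(cookie.output())
```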


@@ -18,17 +18,21 @@ _LOGGER = logging.getLogger(__name__)
class AnglianWaterEntity(CoordinatorEntity[AnglianWaterUpdateCoordinator]):
"""Defines a Anglian Water entity."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: AnglianWaterUpdateCoordinator,
smart_meter: SmartMeter,
key: str,
) -> None:
"""Initialize Anglian Water entity."""
super().__init__(coordinator)
self.smart_meter = smart_meter
self._attr_unique_id = f"{smart_meter.serial_number}_{key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, smart_meter.serial_number)},
- name="Smart Water Meter",
+ name=smart_meter.serial_number,
manufacturer="Anglian Water",
serial_number=smart_meter.serial_number,
)


@@ -108,9 +108,8 @@ class AnglianWaterSensorEntity(AnglianWaterEntity, SensorEntity):
description: AnglianWaterSensorEntityDescription,
) -> None:
"""Initialize Anglian Water sensor."""
- super().__init__(coordinator, smart_meter)
+ super().__init__(coordinator, smart_meter, description.key)
self.entity_description = description
- self._attr_unique_id = f"{smart_meter.serial_number}_{description.key}"
@property
def native_value(self) -> float | None:


@@ -17,7 +17,7 @@ from homeassistant.helpers import (
)
from homeassistant.helpers.typing import ConfigType
- from .const import CONF_CHAT_MODEL, DEFAULT, DEFAULT_CONVERSATION_NAME, DOMAIN, LOGGER
+ from .const import DEFAULT_CONVERSATION_NAME, DOMAIN, LOGGER
PLATFORMS = (Platform.AI_TASK, Platform.CONVERSATION)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
@@ -37,14 +37,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: AnthropicConfigEntry) ->
partial(anthropic.AsyncAnthropic, api_key=entry.data[CONF_API_KEY])
)
try:
- # Use model from first conversation subentry for validation
- subentries = list(entry.subentries.values())
- if subentries:
-     model_id = subentries[0].data.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
- else:
-     model_id = DEFAULT[CONF_CHAT_MODEL]
- model = await client.models.retrieve(model_id=model_id, timeout=10.0)
- LOGGER.debug("Anthropic model: %s", model.display_name)
+ await client.models.list(timeout=10.0)
except anthropic.AuthenticationError as err:
LOGGER.error("Invalid API key: %s", err)
return False


@@ -421,6 +421,8 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
)
if short_form.search(model_alias):
model_alias += "-0"
+ if model_alias.endswith(("haiku", "opus", "sonnet")):
+     model_alias += "-latest"
model_options.append(
SelectOptionDict(
label=model_info.display_name,
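The two appended lines in the hunk above can be read as a small alias-normalization step (the wrapper function below is hypothetical; only the `endswith`/`-latest` logic comes from the diff):

```python
def normalize_model_alias(model_alias: str) -> str:
    """Append '-latest' to aliases that end in a bare model family name."""
    if model_alias.endswith(("haiku", "opus", "sonnet")):
        model_alias += "-latest"
    return model_alias


print(normalize_model_alias("claude-3-5-haiku"))
```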


@@ -583,7 +583,7 @@ class AnthropicBaseLLMEntity(Entity):
identifiers={(DOMAIN, subentry.subentry_id)},
name=subentry.title,
manufacturer="Anthropic",
- model="Claude",
+ model=subentry.data.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL]),
entry_type=dr.DeviceEntryType.SERVICE,
)


@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/anthropic",
"integration_type": "service",
"iot_class": "cloud_polling",
- "requirements": ["anthropic==0.73.0"]
+ "requirements": ["anthropic==0.75.0"]
}


@@ -1123,63 +1123,6 @@ class PipelineRun:
)
try:
user_input = conversation.ConversationInput(
text=intent_input,
context=self.context,
conversation_id=conversation_id,
device_id=self._device_id,
satellite_id=self._satellite_id,
language=input_language,
agent_id=self.intent_agent.id,
extra_system_prompt=conversation_extra_system_prompt,
)
agent_id = self.intent_agent.id
processed_locally = agent_id == conversation.HOME_ASSISTANT_AGENT
all_targets_in_satellite_area = False
intent_response: intent.IntentResponse | None = None
if not processed_locally and not self._intent_agent_only:
# Sentence triggers override conversation agent
if (
trigger_response_text
:= await conversation.async_handle_sentence_triggers(
self.hass, user_input
)
) is not None:
# Sentence trigger matched
agent_id = "sentence_trigger"
processed_locally = True
intent_response = intent.IntentResponse(
self.pipeline.conversation_language
)
intent_response.async_set_speech(trigger_response_text)
intent_filter: Callable[[RecognizeResult], bool] | None = None
# If the LLM has API access, we filter out some sentences that are
# interfering with LLM operation.
if (
intent_agent_state := self.hass.states.get(self.intent_agent.id)
) and intent_agent_state.attributes.get(
ATTR_SUPPORTED_FEATURES, 0
) & conversation.ConversationEntityFeature.CONTROL:
intent_filter = _async_local_fallback_intent_filter
# Try local intents
if (
intent_response is None
and self.pipeline.prefer_local_intents
and (
intent_response := await conversation.async_handle_intents(
self.hass,
user_input,
intent_filter=intent_filter,
)
)
):
# Local intent matched
agent_id = conversation.HOME_ASSISTANT_AGENT
processed_locally = True
if self.tts_stream and self.tts_stream.supports_streaming_input:
tts_input_stream: asyncio.Queue[str | None] | None = asyncio.Queue()
else:
@@ -1265,6 +1208,17 @@ class PipelineRun:
assert self.tts_stream is not None
self.tts_stream.async_set_message_stream(tts_input_stream_generator())
user_input = conversation.ConversationInput(
text=intent_input,
context=self.context,
conversation_id=conversation_id,
device_id=self._device_id,
satellite_id=self._satellite_id,
language=input_language,
agent_id=self.intent_agent.id,
extra_system_prompt=conversation_extra_system_prompt,
)
with (
chat_session.async_get_chat_session(
self.hass, user_input.conversation_id
@@ -1276,6 +1230,53 @@ class PipelineRun:
chat_log_delta_listener=chat_log_delta_listener,
) as chat_log,
):
agent_id = self.intent_agent.id
processed_locally = agent_id == conversation.HOME_ASSISTANT_AGENT
all_targets_in_satellite_area = False
intent_response: intent.IntentResponse | None = None
if not processed_locally and not self._intent_agent_only:
# Sentence triggers override conversation agent
if (
trigger_response_text
:= await conversation.async_handle_sentence_triggers(
self.hass, user_input, chat_log
)
) is not None:
# Sentence trigger matched
agent_id = "sentence_trigger"
processed_locally = True
intent_response = intent.IntentResponse(
self.pipeline.conversation_language
)
intent_response.async_set_speech(trigger_response_text)
intent_filter: Callable[[RecognizeResult], bool] | None = None
# If the LLM has API access, we filter out some sentences that are
# interfering with LLM operation.
if (
intent_agent_state := self.hass.states.get(self.intent_agent.id)
) and intent_agent_state.attributes.get(
ATTR_SUPPORTED_FEATURES, 0
) & conversation.ConversationEntityFeature.CONTROL:
intent_filter = _async_local_fallback_intent_filter
# Try local intents
if (
intent_response is None
and self.pipeline.prefer_local_intents
and (
intent_response := await conversation.async_handle_intents(
self.hass,
user_input,
chat_log,
intent_filter=intent_filter,
)
)
):
# Local intent matched
agent_id = conversation.HOME_ASSISTANT_AGENT
processed_locally = True
# It was already handled, create response and add to chat history
if intent_response is not None:
speech: str = intent_response.speech.get("plain", {}).get(


@@ -17,8 +17,12 @@ from homeassistant.components.media_player import (
class BangOlufsenSource:
"""Class used for associating device source ids with friendly names. May not include all sources."""
DEEZER: Final[Source] = Source(name="Deezer", id="deezer")
LINE_IN: Final[Source] = Source(name="Line-In", id="lineIn")
NET_RADIO: Final[Source] = Source(name="B&O Radio", id="netRadio")
SPDIF: Final[Source] = Source(name="Optical", id="spdif")
TIDAL: Final[Source] = Source(name="Tidal", id="tidal")
UNKNOWN: Final[Source] = Source(name="Unknown Source", id="unknown")
URI_STREAMER: Final[Source] = Source(name="Audio Streamer", id="uriStreamer")
@@ -78,6 +82,16 @@ class BangOlufsenModel(StrEnum):
BEOREMOTE_ONE = "Beoremote One"
class BangOlufsenAttribute(StrEnum):
"""Enum for extra_state_attribute keys."""
BEOLINK = "beolink"
BEOLINK_PEERS = "peers"
BEOLINK_SELF = "self"
BEOLINK_LEADER = "leader"
BEOLINK_LISTENERS = "listeners"
# Physical "buttons" on devices
class BangOlufsenButtons(StrEnum):
"""Enum for device buttons."""


@@ -82,6 +82,7 @@ from .const import (
FALLBACK_SOURCES,
MANUFACTURER,
VALID_MEDIA_TYPES,
BangOlufsenAttribute,
BangOlufsenMediaType,
BangOlufsenSource,
WebsocketNotification,
@@ -224,7 +225,8 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
# Beolink compatible sources
self._beolink_sources: dict[str, bool] = {}
self._remote_leader: BeolinkLeader | None = None
- # Extra state attributes for showing Beolink: peer(s), listener(s), leader and self
+ # Extra state attributes:
+ # Beolink: peer(s), listener(s), leader and self
self._beolink_attributes: dict[str, dict[str, dict[str, str]]] = {}
async def async_added_to_hass(self) -> None:
@@ -436,7 +438,10 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
await self._async_update_beolink()
async def _async_update_beolink(self) -> None:
- """Update the current Beolink leader, listeners, peers and self."""
+ """Update the current Beolink leader, listeners, peers and self.
+ Updates Home Assistant state.
+ """
self._beolink_attributes = {}
@@ -445,18 +450,24 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
# Add Beolink self
self._beolink_attributes = {
- "beolink": {"self": {self.device_entry.name: self._beolink_jid}}
+ BangOlufsenAttribute.BEOLINK: {
+     BangOlufsenAttribute.BEOLINK_SELF: {
+         self.device_entry.name: self._beolink_jid
+     }
+ }
}
# Add Beolink peers
peers = await self._client.get_beolink_peers()
if len(peers) > 0:
- self._beolink_attributes["beolink"]["peers"] = {}
+ self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
+     BangOlufsenAttribute.BEOLINK_PEERS
+ ] = {}
for peer in peers:
- self._beolink_attributes["beolink"]["peers"][peer.friendly_name] = (
-     peer.jid
- )
+ self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
+     BangOlufsenAttribute.BEOLINK_PEERS
+ ][peer.friendly_name] = peer.jid
# Add Beolink listeners / leader
self._remote_leader = self._playback_metadata.remote_leader
@@ -477,7 +488,9 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
# Add self
group_members.append(self.entity_id)
- self._beolink_attributes["beolink"]["leader"] = {
+ self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
+     BangOlufsenAttribute.BEOLINK_LEADER
+ ] = {
self._remote_leader.friendly_name: self._remote_leader.jid,
}
@@ -514,9 +527,9 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
beolink_listener.jid
)
break
- self._beolink_attributes["beolink"]["listeners"] = (
-     beolink_listeners_attribute
- )
+ self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
+     BangOlufsenAttribute.BEOLINK_LISTENERS
+ ] = beolink_listeners_attribute
self._attr_group_members = group_members
@@ -615,11 +628,18 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
return None
@property
- def media_content_type(self) -> str:
+ def media_content_type(self) -> MediaType | str | None:
"""Return the current media type."""
- # Hard to determine content type
- if self._source_change.id == BangOlufsenSource.URI_STREAMER.id:
-     return MediaType.URL
+ content_type = {
+     BangOlufsenSource.URI_STREAMER.id: MediaType.URL,
+     BangOlufsenSource.DEEZER.id: BangOlufsenMediaType.DEEZER,
+     BangOlufsenSource.TIDAL.id: BangOlufsenMediaType.TIDAL,
+     BangOlufsenSource.NET_RADIO.id: BangOlufsenMediaType.RADIO,
+ }
+ # Hard to determine content type.
+ if self._source_change.id in content_type:
+     return content_type[self._source_change.id]
return MediaType.MUSIC
@property
@@ -632,6 +652,11 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
"""Return the current playback progress."""
return self._playback_progress.progress
@property
def media_content_id(self) -> str | None:
"""Return internal ID of Deezer, Tidal and radio stations."""
return self._playback_metadata.source_internal_id
@property
def media_image_url(self) -> str | None:
"""Return URL of the currently playing music."""


@@ -98,6 +98,12 @@
}
},
"triggers": {
"started_cooling": {
"trigger": "mdi:snowflake"
},
"started_drying": {
"trigger": "mdi:water-percent"
},
"started_heating": {
"trigger": "mdi:fire"
},


@@ -298,6 +298,28 @@
},
"title": "Climate",
"triggers": {
"started_cooling": {
"description": "Triggers when a climate started cooling.",
"description_configured": "[%key:component::climate::triggers::started_cooling::description%]",
"fields": {
"behavior": {
"description": "[%key:component::climate::common::trigger_behavior_description%]",
"name": "[%key:component::climate::common::trigger_behavior_name%]"
}
},
"name": "When a climate started cooling"
},
"started_drying": {
"description": "Triggers when a climate started drying.",
"description_configured": "[%key:component::climate::triggers::started_drying::description%]",
"fields": {
"behavior": {
"description": "[%key:component::climate::common::trigger_behavior_description%]",
"name": "[%key:component::climate::common::trigger_behavior_name%]"
}
},
"name": "When a climate started drying"
},
"started_heating": {
"description": "Triggers when a climate starts to heat.",
"description_configured": "[%key:component::climate::triggers::started_heating::description%]",


@@ -11,6 +11,12 @@ from homeassistant.helpers.trigger import (
from .const import ATTR_HVAC_ACTION, DOMAIN, HVACAction, HVACMode
TRIGGERS: dict[str, type[Trigger]] = {
"started_cooling": make_entity_state_attribute_trigger(
DOMAIN, ATTR_HVAC_ACTION, HVACAction.COOLING
),
"started_drying": make_entity_state_attribute_trigger(
DOMAIN, ATTR_HVAC_ACTION, HVACAction.DRYING
),
"turned_off": make_entity_state_trigger(DOMAIN, HVACMode.OFF),
"turned_on": make_conditional_entity_state_trigger(
DOMAIN,


@@ -14,6 +14,8 @@
- last
- any
started_cooling: *trigger_common
started_drying: *trigger_common
started_heating: *trigger_common
turned_off: *trigger_common
turned_on: *trigger_common


@@ -6,6 +6,7 @@ import io
from json import JSONDecodeError
import logging
+ from hass_nabucasa import NabuCasaBaseError
from hass_nabucasa.llm import (
LLMAuthenticationError,
LLMError,
@@ -93,10 +94,11 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Home Assistant Cloud AI Task entity."""
- cloud = hass.data[DATA_CLOUD]
+ if not (cloud := hass.data[DATA_CLOUD]).is_logged_in:
+     return
try:
await cloud.llm.async_ensure_token()
except LLMError:
except (LLMError, NabuCasaBaseError):
return
async_add_entities([CloudLLMTaskEntity(cloud, config_entry)])

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from typing import Literal
from hass_nabucasa import NabuCasaBaseError
from hass_nabucasa.llm import LLMError
from homeassistant.components import conversation
@@ -23,10 +24,11 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Home Assistant Cloud conversation entity."""
cloud = hass.data[DATA_CLOUD]
if not (cloud := hass.data[DATA_CLOUD]).is_logged_in:
return
try:
await cloud.llm.async_ensure_token()
except LLMError:
except (LLMError, NabuCasaBaseError):
return
async_add_entities([CloudConversationEntity(cloud, config_entry)])

View File
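The two cloud hunks above share one pattern: bail out of entity setup when the user is not logged in, and broaden the token-refresh `except` clause to also catch `NabuCasaBaseError`. A minimal synchronous sketch of that guard (the real code is async and the cloud/exception classes here are stand-in stubs for `hass_nabucasa`):

```python
class NabuCasaBaseError(Exception):
    """Stand-in stub for the hass_nabucasa base error."""


class FakeCloud:
    """Stand-in stub for the cloud object stored in hass.data."""

    def __init__(self, logged_in: bool, token_ok: bool = True) -> None:
        self.is_logged_in = logged_in
        self._token_ok = token_ok

    def ensure_token(self) -> None:
        if not self._token_ok:
            raise NabuCasaBaseError("token refresh failed")


def setup_entities(cloud: FakeCloud) -> list[str]:
    """Mirror the guard order from the hunks: login check, then token check."""
    if not cloud.is_logged_in:
        # Not logged in: skip entity setup entirely.
        return []
    try:
        cloud.ensure_token()
    except NabuCasaBaseError:
        # Broadened catch: any cloud-level error also skips setup.
        return []
    return ["cloud_entity"]
```

In the diff the login check uses a walrus expression (`if not (cloud := hass.data[DATA_CLOUD]).is_logged_in`) so the lookup and the guard share one statement.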

@@ -13,6 +13,6 @@
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["acme", "hass_nabucasa", "snitun"],
"requirements": ["hass-nabucasa==1.6.1"],
"requirements": ["hass-nabucasa==1.6.2"],
"single_config_entry": true
}

View File

@@ -236,7 +236,9 @@ async def async_prepare_agent(
async def async_handle_sentence_triggers(
hass: HomeAssistant, user_input: ConversationInput
hass: HomeAssistant,
user_input: ConversationInput,
chat_log: ChatLog,
) -> str | None:
"""Try to match input against sentence triggers and return response text.
@@ -245,12 +247,13 @@ async def async_handle_sentence_triggers(
agent = get_agent_manager(hass).default_agent
assert agent is not None
return await agent.async_handle_sentence_triggers(user_input)
return await agent.async_handle_sentence_triggers(user_input, chat_log)
async def async_handle_intents(
hass: HomeAssistant,
user_input: ConversationInput,
chat_log: ChatLog,
*,
intent_filter: Callable[[RecognizeResult], bool] | None = None,
) -> intent.IntentResponse | None:
@@ -261,7 +264,9 @@ async def async_handle_intents(
agent = get_agent_manager(hass).default_agent
assert agent is not None
return await agent.async_handle_intents(user_input, intent_filter=intent_filter)
return await agent.async_handle_intents(
user_input, chat_log, intent_filter=intent_filter
)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:

View File

@@ -66,6 +66,7 @@ from homeassistant.helpers import (
entity_registry as er,
floor_registry as fr,
intent,
llm,
start as ha_start,
template,
translation,
@@ -76,7 +77,7 @@ from homeassistant.util import language as language_util
from homeassistant.util.json import JsonObjectType, json_loads_object
from .agent_manager import get_agent_manager
from .chat_log import AssistantContent, ChatLog
from .chat_log import AssistantContent, ChatLog, ToolResultContent
from .const import (
DOMAIN,
METADATA_CUSTOM_FILE,
@@ -435,7 +436,7 @@ class DefaultAgent(ConversationEntity):
if trigger_result := await self.async_recognize_sentence_trigger(user_input):
# Process callbacks and get response
response_text = await self._handle_trigger_result(
trigger_result, user_input
trigger_result, user_input, chat_log
)
# Convert to conversation result
@@ -447,8 +448,9 @@ class DefaultAgent(ConversationEntity):
if response is None:
# Match intents
intent_result = await self.async_recognize_intent(user_input)
response = await self._async_process_intent_result(
intent_result, user_input
intent_result, user_input, chat_log
)
speech: str = response.speech.get("plain", {}).get("speech", "")
@@ -467,6 +469,7 @@ class DefaultAgent(ConversationEntity):
self,
result: RecognizeResult | None,
user_input: ConversationInput,
chat_log: ChatLog,
) -> intent.IntentResponse:
"""Process user input with intents."""
language = user_input.language or self.hass.config.language
@@ -529,12 +532,21 @@ class DefaultAgent(ConversationEntity):
ConversationTraceEventType.TOOL_CALL,
{
"intent_name": result.intent.name,
"slots": {
entity.name: entity.value or entity.text
for entity in result.entities_list
},
"slots": {entity.name: entity.value for entity in result.entities_list},
},
)
tool_input = llm.ToolInput(
tool_name=result.intent.name,
tool_args={entity.name: entity.value for entity in result.entities_list},
external=True,
)
chat_log.async_add_assistant_content_without_tools(
AssistantContent(
agent_id=user_input.agent_id,
content=None,
tool_calls=[tool_input],
)
)
try:
intent_response = await intent.async_handle(
@@ -597,6 +609,16 @@ class DefaultAgent(ConversationEntity):
)
intent_response.async_set_speech(speech)
tool_result = llm.IntentResponseDict(intent_response)
chat_log.async_add_assistant_content_without_tools(
ToolResultContent(
agent_id=user_input.agent_id,
tool_call_id=tool_input.id,
tool_name=tool_input.tool_name,
tool_result=tool_result,
)
)
return intent_response
def _recognize(
@@ -1523,16 +1545,31 @@ class DefaultAgent(ConversationEntity):
)
async def _handle_trigger_result(
self, result: SentenceTriggerResult, user_input: ConversationInput
self,
result: SentenceTriggerResult,
user_input: ConversationInput,
chat_log: ChatLog,
) -> str:
"""Run sentence trigger callbacks and return response text."""
# Gather callback responses in parallel
trigger_callbacks = [
self._triggers_details[trigger_id].callback(user_input, trigger_result)
for trigger_id, trigger_result in result.matched_triggers.items()
]
tool_input = llm.ToolInput(
tool_name="trigger_sentence",
tool_args={},
external=True,
)
chat_log.async_add_assistant_content_without_tools(
AssistantContent(
agent_id=user_input.agent_id,
content=None,
tool_calls=[tool_input],
)
)
# Use first non-empty result as response.
#
# There may be multiple copies of a trigger running when editing in
@@ -1561,23 +1598,38 @@ class DefaultAgent(ConversationEntity):
f"component.{DOMAIN}.conversation.agent.done", "Done"
)
tool_result: dict[str, Any] = {"response": response_text}
chat_log.async_add_assistant_content_without_tools(
ToolResultContent(
agent_id=user_input.agent_id,
tool_call_id=tool_input.id,
tool_name=tool_input.tool_name,
tool_result=tool_result,
)
)
return response_text
async def async_handle_sentence_triggers(
self, user_input: ConversationInput
self,
user_input: ConversationInput,
chat_log: ChatLog,
) -> str | None:
"""Try to match input against sentence triggers and return response text.
Returns None if no match occurred.
"""
if trigger_result := await self.async_recognize_sentence_trigger(user_input):
return await self._handle_trigger_result(trigger_result, user_input)
return await self._handle_trigger_result(
trigger_result, user_input, chat_log
)
return None
async def async_handle_intents(
self,
user_input: ConversationInput,
chat_log: ChatLog,
*,
intent_filter: Callable[[RecognizeResult], bool] | None = None,
) -> intent.IntentResponse | None:
@@ -1593,7 +1645,7 @@ class DefaultAgent(ConversationEntity):
# No error message on failed match
return None
response = await self._async_process_intent_result(result, user_input)
response = await self._async_process_intent_result(result, user_input, chat_log)
if (
response.response_type == intent.IntentResponseType.ERROR
and response.error_code

View File
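The default-agent hunks above thread a `ChatLog` through intent handling so that each intent is recorded as a paired external tool call and tool result. A simplified sketch of that bookkeeping (these classes are stripped-down stand-ins for `llm.ToolInput`, `ToolResultContent`, and the chat log; the real log method is `async_add_assistant_content_without_tools`):

```python
from dataclasses import dataclass, field
import uuid


@dataclass
class ToolInput:
    """Stand-in for llm.ToolInput; auto-generates a call id like the real one."""

    tool_name: str
    tool_args: dict
    external: bool = True
    id: str = field(default_factory=lambda: uuid.uuid4().hex)


@dataclass
class ToolResultContent:
    """Stand-in for the chat-log tool-result entry."""

    tool_call_id: str
    tool_name: str
    tool_result: dict


class ChatLog:
    """Minimal stand-in chat log that just accumulates entries."""

    def __init__(self) -> None:
        self.content: list = []

    def add(self, item) -> None:
        self.content.append(item)


def handle_intent(chat_log: ChatLog, intent_name: str, slots: dict) -> dict:
    """Record the tool call, run the intent, record the matching result."""
    tool_input = ToolInput(tool_name=intent_name, tool_args=slots)
    chat_log.add(tool_input)
    result = {"response": f"handled {intent_name}"}  # pretend intent handler
    chat_log.add(
        ToolResultContent(
            tool_call_id=tool_input.id,
            tool_name=tool_input.tool_name,
            tool_result=result,
        )
    )
    return result
```

Pairing the result with `tool_input.id` is what lets a downstream LLM replay the conversation with the tool call and its outcome correctly linked.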

@@ -8,6 +8,10 @@ from typing import Any
from pycoolmasternet_async import SWING_MODES
from homeassistant.components.climate import (
FAN_AUTO,
FAN_HIGH,
FAN_LOW,
FAN_MEDIUM,
ClimateEntity,
ClimateEntityFeature,
HVACMode,
@@ -31,7 +35,16 @@ CM_TO_HA_STATE = {
HA_STATE_TO_CM = {value: key for key, value in CM_TO_HA_STATE.items()}
FAN_MODES = ["low", "med", "high", "auto"]
CM_TO_HA_FAN = {
"low": FAN_LOW,
"med": FAN_MEDIUM,
"high": FAN_HIGH,
"auto": FAN_AUTO,
}
HA_FAN_TO_CM = {value: key for key, value in CM_TO_HA_FAN.items()}
FAN_MODES = list(CM_TO_HA_FAN.values())
_LOGGER = logging.getLogger(__name__)
@@ -111,7 +124,7 @@ class CoolmasterClimate(CoolmasterEntity, ClimateEntity):
@property
def fan_mode(self):
"""Return the fan setting."""
return self._unit.fan_speed
return CM_TO_HA_FAN[self._unit.fan_speed]
@property
def fan_modes(self):
@@ -138,7 +151,7 @@ class CoolmasterClimate(CoolmasterEntity, ClimateEntity):
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set new fan mode."""
_LOGGER.debug("Setting fan mode of %s to %s", self.unique_id, fan_mode)
self._unit = await self._unit.set_fan_speed(fan_mode)
self._unit = await self._unit.set_fan_speed(HA_FAN_TO_CM[fan_mode])
self.async_write_ha_state()
async def async_set_swing_mode(self, swing_mode: str) -> None:

View File
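The coolmaster hunk above stops exposing raw device fan-speed strings and instead translates through a bidirectional mapping. A sketch of the pattern (the `FAN_*` values here are stand-ins for the Home Assistant climate constants; the mapping itself matches the diff):

```python
# Stand-ins for the homeassistant.components.climate fan constants.
FAN_LOW, FAN_MEDIUM, FAN_HIGH, FAN_AUTO = "low", "medium", "high", "auto"

# Device vocabulary -> Home Assistant vocabulary, as in the diff.
CM_TO_HA_FAN = {
    "low": FAN_LOW,
    "med": FAN_MEDIUM,
    "high": FAN_HIGH,
    "auto": FAN_AUTO,
}
# Inverting the dict gives the reverse translation for free.
HA_FAN_TO_CM = {value: key for key, value in CM_TO_HA_FAN.items()}
FAN_MODES = list(CM_TO_HA_FAN.values())


def fan_mode_from_device(raw: str) -> str:
    """Translate the unit's fan speed into the HA fan mode (property path)."""
    return CM_TO_HA_FAN[raw]


def fan_mode_to_device(mode: str) -> str:
    """Translate an HA fan mode back to the unit's vocabulary (set path)."""
    return HA_FAN_TO_CM[mode]
```

Deriving `FAN_MODES` from the mapping keeps the advertised modes and the translation table from drifting apart.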

@@ -15,6 +15,11 @@ from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import DOMAIN
def normalize_pairing_code(code: str) -> str:
"""Normalize pairing code by removing spaces and capitalizing."""
return code.replace(" ", "").upper()
class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle Droplet config flow."""
@@ -52,14 +57,13 @@ class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
# Test if we can connect before returning
session = async_get_clientsession(self.hass)
if await self._droplet_discovery.try_connect(
session, user_input[CONF_CODE]
):
code = normalize_pairing_code(user_input[CONF_CODE])
if await self._droplet_discovery.try_connect(session, code):
device_data = {
CONF_IP_ADDRESS: self._droplet_discovery.host,
CONF_PORT: self._droplet_discovery.port,
CONF_DEVICE_ID: device_id,
CONF_CODE: user_input[CONF_CODE],
CONF_CODE: code,
}
return self.async_create_entry(
@@ -90,14 +94,15 @@ class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
user_input[CONF_IP_ADDRESS], DropletConnection.DEFAULT_PORT, ""
)
session = async_get_clientsession(self.hass)
if await self._droplet_discovery.try_connect(
session, user_input[CONF_CODE]
) and (device_id := await self._droplet_discovery.get_device_id()):
code = normalize_pairing_code(user_input[CONF_CODE])
if await self._droplet_discovery.try_connect(session, code) and (
device_id := await self._droplet_discovery.get_device_id()
):
device_data = {
CONF_IP_ADDRESS: self._droplet_discovery.host,
CONF_PORT: self._droplet_discovery.port,
CONF_DEVICE_ID: device_id,
CONF_CODE: user_input[CONF_CODE],
CONF_CODE: code,
}
await self.async_set_unique_id(device_id, raise_on_progress=False)
self._abort_if_unique_id_configured(

View File
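The helper added in the droplet config flow above is small enough to reproduce verbatim; both the user step and the reconfigure step now run the entered code through it before calling `try_connect` and before storing `CONF_CODE`:

```python
def normalize_pairing_code(code: str) -> str:
    """Normalize pairing code by removing spaces and capitalizing."""
    return code.replace(" ", "").upper()
```

This makes codes typed as `"ab cd 12"` and `"ABCD12"` equivalent, so a stray space or lowercase entry no longer fails pairing.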

@@ -285,16 +285,14 @@ async def async_setup_entry(
name=sensor.name,
)
# Hourly rain doesn't reset to fixed hours, it must be measurement state classes
# Only total rain needs state class for long-term statistics
if sensor.key in (
"hrain_piezomm",
"hrain_piezo",
"hourlyrainmm",
"hourlyrainin",
"totalrainin",
"totalrainmm",
):
description = dataclasses.replace(
description,
state_class=SensorStateClass.MEASUREMENT,
state_class=SensorStateClass.TOTAL_INCREASING,
)
async_add_entities([EcowittSensorEntity(sensor, description)])

View File

@@ -7,7 +7,7 @@
"iot_class": "local_polling",
"loggers": ["pyenphase"],
"quality_scale": "platinum",
"requirements": ["pyenphase==2.4.0"],
"requirements": ["pyenphase==2.4.2"],
"zeroconf": [
{
"type": "_enphase-envoy._tcp.local."

View File

@@ -17,7 +17,7 @@ DEFAULT_NEW_CONFIG_ALLOW_ALLOW_SERVICE_CALLS = False
DEFAULT_PORT: Final = 6053
STABLE_BLE_VERSION_STR = "2025.8.0"
STABLE_BLE_VERSION_STR = "2025.11.0"
STABLE_BLE_VERSION = AwesomeVersion(STABLE_BLE_VERSION_STR)
PROJECT_URLS = {
"esphome.bluetooth-proxy": "https://esphome.github.io/bluetooth-proxies/",

View File

@@ -157,7 +157,7 @@
"title": "[%key:component::assist_pipeline::issues::assist_in_progress_deprecated::title%]"
},
"ble_firmware_outdated": {
"description": "To improve Bluetooth reliability and performance, we highly recommend updating {name} with ESPHome {version} or later. When updating the device from ESPHome earlier than 2022.12.0, it is recommended to use a serial cable instead of an over-the-air update to take advantage of the new partition scheme.",
"description": "ESPHome {version} introduces ultra-low latency event processing, reducing BLE event delays from 0-16 milliseconds to approximately 12 microseconds. This resolves stability issues when pairing, connecting, or handshaking with devices that require low latency, and makes Bluetooth proxy operations rival or exceed local adapters. We highly recommend updating {name} to take advantage of these improvements.",
"title": "Update {name} with ESPHome {version} or later"
},
"device_conflict": {

View File

@@ -102,6 +102,7 @@ SENSORS: tuple[EssentSensorEntityDescription, ...] = (
key="average_today",
translation_key="average_today",
value_fn=lambda energy_data: energy_data.avg_price,
energy_types=(EnergyType.ELECTRICITY,),
),
EssentSensorEntityDescription(
key="lowest_price_today",

View File

@@ -44,9 +44,6 @@
"electricity_next_price": {
"name": "Next electricity price"
},
"gas_average_today": {
"name": "Average gas price today"
},
"gas_current_price": {
"name": "Current gas price"
},

View File

@@ -1,22 +1,30 @@
"""API for fitbit bound to Home Assistant OAuth."""
from abc import ABC, abstractmethod
from collections.abc import Callable
from collections.abc import Awaitable, Callable
import logging
from typing import Any, cast
from fitbit import Fitbit
from fitbit.exceptions import HTTPException, HTTPUnauthorized
from fitbit_web_api import ApiClient, Configuration, DevicesApi
from fitbit_web_api.exceptions import (
ApiException,
OpenApiException,
UnauthorizedException,
)
from fitbit_web_api.models.device import Device
from requests.exceptions import ConnectionError as RequestsConnectionError
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import FitbitUnitSystem
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice, FitbitProfile
from .model import FitbitProfile
_LOGGER = logging.getLogger(__name__)
@@ -58,6 +66,14 @@ class FitbitApi(ABC):
expires_at=float(token[CONF_EXPIRES_AT]),
)
async def _async_get_fitbit_web_api(self) -> ApiClient:
"""Create and return an ApiClient configured with the current access token."""
token = await self.async_get_access_token()
configuration = Configuration()
configuration.pool_manager = async_get_clientsession(self._hass)
configuration.access_token = token[CONF_ACCESS_TOKEN]
return ApiClient(configuration)
async def async_get_user_profile(self) -> FitbitProfile:
"""Return the user profile from the API."""
if self._profile is None:
@@ -94,21 +110,13 @@ class FitbitApi(ABC):
return FitbitUnitSystem.METRIC
return FitbitUnitSystem.EN_US
async def async_get_devices(self) -> list[FitbitDevice]:
"""Return available devices."""
client = await self._async_get_client()
devices: list[dict[str, str]] = await self._run(client.get_devices)
async def async_get_devices(self) -> list[Device]:
"""Return available devices using fitbit-web-api."""
client = await self._async_get_fitbit_web_api()
devices_api = DevicesApi(client)
devices: list[Device] = await self._run_async(devices_api.get_devices)
_LOGGER.debug("get_devices=%s", devices)
return [
FitbitDevice(
id=device["id"],
device_version=device["deviceVersion"],
battery_level=int(device["batteryLevel"]),
battery=device["battery"],
type=device["type"],
)
for device in devices
]
return devices
async def async_get_latest_time_series(self, resource_type: str) -> dict[str, Any]:
"""Return the most recent value from the time series for the specified resource type."""
@@ -140,6 +148,20 @@ class FitbitApi(ABC):
_LOGGER.debug("Error from fitbit API: %s", err)
raise FitbitApiException("Error from fitbit API") from err
async def _run_async[_T](self, func: Callable[[], Awaitable[_T]]) -> _T:
"""Run client command."""
try:
return await func()
except UnauthorizedException as err:
_LOGGER.debug("Unauthorized error from fitbit API: %s", err)
raise FitbitAuthException("Authentication error from fitbit API") from err
except ApiException as err:
_LOGGER.debug("Error from fitbit API: %s", err)
raise FitbitApiException("Error from fitbit API") from err
except OpenApiException as err:
_LOGGER.debug("Error communicating with fitbit API: %s", err)
raise FitbitApiException("Communication error from fitbit API") from err
class OAuthFitbitApi(FitbitApi):
"""Provide fitbit authentication tied to an OAuth2 based config entry."""

View File

@@ -6,6 +6,8 @@ import datetime
import logging
from typing import Final
from fitbit_web_api.models.device import Device
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
@@ -13,7 +15,6 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .api import FitbitApi
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice
_LOGGER = logging.getLogger(__name__)
@@ -23,7 +24,7 @@ TIMEOUT = 10
type FitbitConfigEntry = ConfigEntry[FitbitData]
class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, Device]]):
"""Coordinator for fetching fitbit devices from the API."""
config_entry: FitbitConfigEntry
@@ -41,7 +42,7 @@ class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
)
self._api = api
async def _async_update_data(self) -> dict[str, FitbitDevice]:
async def _async_update_data(self) -> dict[str, Device]:
"""Fetch data from API endpoint."""
async with asyncio.timeout(TIMEOUT):
try:
@@ -50,7 +51,7 @@ class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
raise ConfigEntryAuthFailed(err) from err
except FitbitApiException as err:
raise UpdateFailed(err) from err
return {device.id: device for device in devices}
return {device.id: device for device in devices if device.id is not None}
@dataclass

View File

@@ -6,6 +6,6 @@
"dependencies": ["application_credentials", "http"],
"documentation": "https://www.home-assistant.io/integrations/fitbit",
"iot_class": "cloud_polling",
"loggers": ["fitbit"],
"requirements": ["fitbit==0.3.1"]
"loggers": ["fitbit", "fitbit_web_api"],
"requirements": ["fitbit==0.3.1", "fitbit-web-api==2.13.5"]
}

View File

@@ -21,26 +21,6 @@ class FitbitProfile:
"""The locale defined in the user's Fitbit account settings."""
@dataclass
class FitbitDevice:
"""Device from the Fitbit API response."""
id: str
"""The device ID."""
device_version: str
"""The product name of the device."""
battery_level: int
"""The battery level as a percentage."""
battery: str
"""Returns the battery level of the device."""
type: str
"""The type of the device such as TRACKER or SCALE."""
@dataclass
class FitbitConfig:
"""Information from the fitbit ConfigEntry data."""

View File

@@ -8,6 +8,8 @@ import datetime
import logging
from typing import Any, Final, cast
from fitbit_web_api.models.device import Device
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -32,7 +34,7 @@ from .api import FitbitApi
from .const import ATTRIBUTION, BATTERY_LEVELS, DOMAIN, FitbitScope, FitbitUnitSystem
from .coordinator import FitbitConfigEntry, FitbitDeviceCoordinator
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice, config_from_entry_data
from .model import config_from_entry_data
_LOGGER: Final = logging.getLogger(__name__)
@@ -657,7 +659,7 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
coordinator: FitbitDeviceCoordinator,
user_profile_id: str,
description: FitbitSensorEntityDescription,
device: FitbitDevice,
device: Device,
enable_default_override: bool,
) -> None:
"""Initialize the Fitbit sensor."""
@@ -677,7 +679,9 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
@property
def icon(self) -> str | None:
"""Icon to use in the frontend, if any."""
if battery_level := BATTERY_LEVELS.get(self.device.battery):
if self.device.battery is not None and (
battery_level := BATTERY_LEVELS.get(self.device.battery)
):
return icon_for_battery_level(battery_level=battery_level)
return self.entity_description.icon
@@ -697,7 +701,7 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.device = self.coordinator.data[self.device.id]
self.device = self.coordinator.data[cast(str, self.device.id)]
self._attr_native_value = self.device.battery
self.async_write_ha_state()
@@ -715,7 +719,7 @@ class FitbitBatteryLevelSensor(
coordinator: FitbitDeviceCoordinator,
user_profile_id: str,
description: FitbitSensorEntityDescription,
device: FitbitDevice,
device: Device,
) -> None:
"""Initialize the Fitbit sensor."""
super().__init__(coordinator)
@@ -736,6 +740,6 @@ class FitbitBatteryLevelSensor(
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.device = self.coordinator.data[self.device.id]
self.device = self.coordinator.data[cast(str, self.device.id)]
self._attr_native_value = self.device.battery_level
self.async_write_ha_state()

View File

@@ -23,5 +23,5 @@
"winter_mode": {}
},
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20251126.0"]
"requirements": ["home-assistant-frontend==20251127.0"]
}

View File

@@ -132,7 +132,6 @@
"heavily_polluted": "Heavily polluted",
"heavy_air_pollution": "Heavy air pollution",
"high_air_pollution": "High air pollution",
"high_air_quality": "High air pollution",
"high_health_risk": "High health risk",
"horrible_air_quality": "Horrible air quality",
"light_air_pollution": "Light air pollution",
@@ -165,20 +164,18 @@
"slightly_polluted": "Slightly polluted",
"sufficient_air_quality": "Sufficient air quality",
"unfavorable_air_quality": "Unfavorable air quality",
"unfavorable_sensitive": "Unfavorable air quality for sensitive groups",
"unfavorable_air_quality_for_sensitive_groups": "Unfavorable air quality for sensitive groups",
"unhealthy_air_quality": "Unhealthy air quality",
"unhealthy_sensitive": "Unhealthy air quality for sensitive groups",
"unsatisfactory_air_quality": "Unsatisfactory air quality",
"very_bad_air_quality": "Very bad air quality",
"very_good_air_quality": "Very good air quality",
"very_high_air_pollution": "Very high air pollution",
"very_high_air_quality": "Very High air pollution",
"very_high_health_risk": "Very high health risk",
"very_low_air_pollution": "Very low air pollution",
"very_polluted": "Very polluted",
"very_poor_air_quality": "Very poor air quality",
"very_unfavorable_air_quality": "Very unfavorable air quality",
"very_unhealthy": "Very unhealthy air quality",
"very_unhealthy_air_quality": "Very unhealthy air quality",
"warning_air_pollution": "Warning level air pollution"
}

View File

@@ -53,7 +53,7 @@ from homeassistant.helpers.issue_registry import (
async_create_issue,
async_delete_issue,
)
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import CONF_IGNORE_NON_NUMERIC, DOMAIN
from .entity import GroupEntity
@@ -374,7 +374,7 @@ class SensorGroup(GroupEntity, SensorEntity):
def async_update_group_state(self) -> None:
"""Query all members and determine the sensor group state."""
self.calculate_state_attributes(self._get_valid_entities())
states: list[StateType] = []
states: list[str] = []
valid_units = self._valid_units
valid_states: list[bool] = []
sensor_values: list[tuple[str, float, State]] = []

View File

@@ -211,7 +211,7 @@ async def ws_start_preview(
@callback
def async_preview_updated(
last_exception: Exception | None, state: str, attributes: Mapping[str, Any]
last_exception: BaseException | None, state: str, attributes: Mapping[str, Any]
) -> None:
"""Forward config entry state events to websocket."""
if last_exception:

View File

@@ -241,7 +241,9 @@ class HistoryStatsSensor(HistoryStatsSensorBase):
async def async_start_preview(
self,
preview_callback: Callable[[Exception | None, str, Mapping[str, Any]], None],
preview_callback: Callable[
[BaseException | None, str, Mapping[str, Any]], None
],
) -> CALLBACK_TYPE:
"""Render a preview."""

View File

@@ -39,6 +39,10 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
_DESCRIPTION_PLACEHOLDERS = {
"sensor_value_types_url": "https://www.home-assistant.io/integrations/knx/#value-types"
}
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -48,6 +52,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_SEND,
service_send_to_knx_bus,
schema=SERVICE_KNX_SEND_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
hass.services.async_register(
@@ -63,6 +68,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_EVENT_REGISTER,
service_event_register_modify,
schema=SERVICE_KNX_EVENT_REGISTER_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
async_register_admin_service(
@@ -71,6 +77,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_EXPOSURE_REGISTER,
service_exposure_register_modify,
schema=SERVICE_KNX_EXPOSURE_REGISTER_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
async_register_admin_service(

View File

@@ -674,7 +674,7 @@
"name": "Remove event registration"
},
"type": {
"description": "If set, the payload will be decoded as given DPT in the event data `value` key. KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "If set, the payload will be decoded as given DPT in the event data `value` key. KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},
@@ -704,7 +704,7 @@
"name": "Remove exposure"
},
"type": {
"description": "Telegrams will be encoded as given DPT. 'binary' and all KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "Telegrams will be encoded as given DPT. 'binary' and all KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},
@@ -740,7 +740,7 @@
"name": "Send as Response"
},
"type": {
"description": "If set, the payload will not be sent as raw bytes, but encoded as given DPT. KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "If set, the payload will not be sent as raw bytes, but encoded as given DPT. KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},

View File
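The KNX hunks above move the hard-coded documentation URL out of three service descriptions and into a placeholder dict passed as `description_placeholders` at registration time. The substitution itself is ordinary `str.format` semantics, which can be sketched directly with the strings from the diff:

```python
# Placeholder dict as added in knx/services.py.
_DESCRIPTION_PLACEHOLDERS = {
    "sensor_value_types_url": "https://www.home-assistant.io/integrations/knx/#value-types"
}

# One of the updated descriptions from strings.json.
description = (
    "Telegrams will be encoded as given DPT. 'binary' and all KNX sensor "
    "types are valid values (see {sensor_value_types_url})."
)

# Home Assistant performs this substitution when rendering the service
# description; the URL now lives in exactly one place.
rendered = description.format(**_DESCRIPTION_PLACEHOLDERS)
```

Changing the docs URL later then means editing one constant instead of three translated strings.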

@@ -10,6 +10,7 @@ from __future__ import annotations
from collections.abc import Callable
import logging
from homeassistant.const import EVENT_LABS_UPDATED
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.generated.labs import LABS_PREVIEW_FEATURES
from homeassistant.helpers import config_validation as cv
@@ -17,7 +18,7 @@ from homeassistant.helpers.storage import Store
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import async_get_custom_components
from .const import DOMAIN, EVENT_LABS_UPDATED, LABS_DATA, STORAGE_KEY, STORAGE_VERSION
from .const import DOMAIN, LABS_DATA, STORAGE_KEY, STORAGE_VERSION
from .models import (
EventLabsUpdatedData,
LabPreviewFeature,

View File

@@ -11,6 +11,4 @@ DOMAIN = "labs"
STORAGE_KEY = "core.labs"
STORAGE_VERSION = 1
EVENT_LABS_UPDATED = "labs_updated"
LABS_DATA: HassKey[LabsData] = HassKey(DOMAIN)

View File

@@ -8,9 +8,10 @@ import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components.backup import async_get_manager
from homeassistant.const import EVENT_LABS_UPDATED
from homeassistant.core import HomeAssistant, callback
from .const import EVENT_LABS_UPDATED, LABS_DATA
from .const import LABS_DATA
from .models import EventLabsUpdatedData

View File

@@ -181,6 +181,16 @@ class LoggerSettings:
"""Save settings."""
self._store.async_delay_save(self._async_data_to_save, delay)
@callback
def async_get_integration_domains(self) -> set[str]:
"""Get domains that have integration-level log settings."""
stored_log_config = self._stored_config[STORAGE_LOG_KEY]
return {
domain
for domain, setting in stored_log_config.items()
if setting.type == LogSettingsType.INTEGRATION
}
@callback
def _async_get_logger_logs(self) -> dict[str, int]:
"""Get the logger logs."""

View File

@@ -6,6 +6,7 @@ import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components.websocket_api import ActiveConnection
from homeassistant.config_entries import DISCOVERY_SOURCES
from homeassistant.core import HomeAssistant, callback
from homeassistant.loader import IntegrationNotFound, async_get_integration
from homeassistant.setup import async_get_loaded_integrations
@@ -34,6 +35,16 @@ def handle_integration_log_info(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle integrations logger info."""
integrations = set(async_get_loaded_integrations(hass))
# Add discovered config flows that are not yet loaded
for flow in hass.config_entries.flow.async_progress():
if flow["context"].get("source") in DISCOVERY_SOURCES:
integrations.add(flow["handler"])
# Add integrations with custom log settings
integrations.update(hass.data[DATA_LOGGER].settings.async_get_integration_domains())
connection.send_result(
msg["id"],
[
@@ -43,7 +54,7 @@ def handle_integration_log_info(
f"homeassistant.components.{integration}"
).getEffectiveLevel(),
}
for integration in async_get_loaded_integrations(hass)
for integration in integrations
],
)

View File
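The logger hunks above grow the integration list returned over websocket from "loaded integrations only" to a union of three sources: loaded integrations, discovered-but-not-yet-loaded config flows, and domains with custom log settings. A sketch of that set-building, with plain arguments standing in for the real helpers (`async_get_loaded_integrations`, the flow manager, and the new `async_get_integration_domains`):

```python
DISCOVERY_SOURCES = {"zeroconf", "dhcp"}  # illustrative subset


def integrations_to_list(loaded, flows_in_progress, custom_log_domains):
    """Union the three sources the websocket handler now reports."""
    integrations = set(loaded)
    # Discovered config flows that are not yet loaded.
    for flow in flows_in_progress:
        if flow["context"].get("source") in DISCOVERY_SOURCES:
            integrations.add(flow["handler"])
    # Domains with integration-level log settings.
    integrations.update(custom_log_domains)
    return integrations
```

Note that a user-initiated flow (`source == "user"`) is deliberately excluded: only discovery sources are added.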

@@ -1486,6 +1486,7 @@ class MqttEntity(
entity_registry.async_update_entity(
self.entity_id, new_entity_id=self._update_registry_entity_id
)
self._update_registry_entity_id = None
await super().async_added_to_hass()
self._subscriptions = {}

View File

@@ -729,8 +729,8 @@
"data_description": {
"payload_reset_percentage": "A special payload that resets the fan speed percentage state attribute to unknown when received at the percentage state topic.",
"percentage_command_template": "A [template]({command_templating_url}) to compose the payload to be published at the percentage command topic.",
"percentage_command_topic": "The MQTT topic to publish commands to change the fan speed state based on a percentage. [Learn more.]({url}#percentage_command_topic)",
"percentage_state_topic": "The MQTT topic subscribed to receive fan speed based on percentage. [Learn more.]({url}#percentage_state_topic)",
"percentage_command_topic": "The MQTT topic to publish commands to change the fan speed state based on a percentage setting. The value shall be in the range from \"speed range min\" to \"speed range max\". [Learn more.]({url}#percentage_command_topic)",
"percentage_state_topic": "The MQTT topic subscribed to receive fan speed state. This is a value in the range from \"speed range min\" to \"speed range max\". [Learn more.]({url}#percentage_state_topic)",
"percentage_value_template": "Defines a [template]({value_templating_url}) to extract the speed percentage value.",
"speed_range_max": "The maximum of numeric output range (representing 100 %). The percentage step is 100 / number of speeds within the \"speed range\".",
"speed_range_min": "The minimum of numeric output range (off not included, so speed_range_min - 1 represents 0 %). The percentage step is 100 / the number of speeds within the \"speed range\"."

View File

@@ -19,6 +19,7 @@ from google_nest_sdm.exceptions import (
ConfigurationException,
DecodeException,
SubscriberException,
SubscriberTimeoutException,
)
from google_nest_sdm.traits import TraitType
import voluptuous as vol
@@ -203,10 +204,16 @@ async def async_setup_entry(hass: HomeAssistant, entry: NestConfigEntry) -> bool
await auth.async_get_access_token()
except ClientResponseError as err:
if 400 <= err.status < 500:
raise ConfigEntryAuthFailed from err
raise ConfigEntryNotReady from err
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="reauth_required"
) from err
raise ConfigEntryNotReady(
translation_domain=DOMAIN, translation_key="auth_server_error"
) from err
except ClientError as err:
raise ConfigEntryNotReady from err
raise ConfigEntryNotReady(
translation_domain=DOMAIN, translation_key="auth_client_error"
) from err
subscriber = await api.new_subscriber(hass, entry, auth)
if not subscriber:
@@ -227,19 +234,32 @@ async def async_setup_entry(hass: HomeAssistant, entry: NestConfigEntry) -> bool
unsub = await subscriber.start_async()
except AuthException as err:
raise ConfigEntryAuthFailed(
f"Subscriber authentication error: {err!s}"
translation_domain=DOMAIN,
translation_key="reauth_required",
) from err
except ConfigurationException as err:
_LOGGER.error("Configuration error: %s", err)
return False
except SubscriberTimeoutException as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="subscriber_timeout",
) from err
except SubscriberException as err:
raise ConfigEntryNotReady(f"Subscriber error: {err!s}") from err
_LOGGER.error("Subscriber error: %s", err)
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="subscriber_error",
) from err
try:
device_manager = await subscriber.async_get_device_manager()
except ApiException as err:
unsub()
raise ConfigEntryNotReady(f"Device manager error: {err!s}") from err
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="device_api_error",
) from err
@callback
def on_hass_stop(_: Event) -> None:
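The hunks above replace hard-coded exception messages with `translation_domain`/`translation_key` pairs that are resolved against the `exceptions` block added to the Nest `strings.json`. The mechanics can be sketched with simplified stand-ins (these are not the real `homeassistant.exceptions` types, and the lookup here is a rough approximation of Home Assistant's translation helpers):

```python
class ConfigEntryNotReady(Exception):
    """Simplified stand-in for homeassistant.exceptions.ConfigEntryNotReady."""

    def __init__(self, message="", *, translation_domain=None, translation_key=None):
        super().__init__(message or translation_key or "")
        self.translation_domain = translation_domain
        self.translation_key = translation_key


# Mirrors one entry of the "exceptions" block this PR adds to strings.json.
NEST_EXCEPTION_STRINGS = {
    "subscriber_timeout": (
        "Subscriber timed out while attempting to connect to Google. Please "
        "check your network connection and IPv6 configuration if applicable."
    ),
}


def localized_message(err: ConfigEntryNotReady) -> str:
    """Resolve the user-facing message for a raised config-entry exception."""
    return NEST_EXCEPTION_STRINGS.get(err.translation_key, str(err))
```

The payoff is that the repair/notification UI shows a translated message instead of a raw English f-string.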


@@ -23,12 +23,7 @@ rules:
entity-unique-id: done
docs-installation-instructions: done
docs-removal-instructions: todo
test-before-setup:
status: todo
comment: |
The integration does tests on setup, however the most common issues
observed are related to ipv6 misconfigurations and the error messages
are not self explanatory and can be improved.
test-before-setup: done
docs-high-level-description: done
config-flow-test-coverage: done
docs-actions: done


@@ -131,6 +131,26 @@
}
}
},
"exceptions": {
"auth_client_error": {
"message": "Client error during authentication, please check your network connection."
},
"auth_server_error": {
"message": "Error response from authentication server, please see logs for details."
},
"device_api_error": {
"message": "Error communicating with the Device Access API, please see logs for details."
},
"reauth_required": {
"message": "Reauthentication is required, please follow the instructions in the UI to reauthenticate your account."
},
"subscriber_error": {
"message": "Subscriber failed to connect to Google, please see logs for details."
},
"subscriber_timeout": {
"message": "Subscriber timed out while attempting to connect to Google. Please check your network connection and IPv6 configuration if applicable."
}
},
"selector": {
"subscription_name": {
"options": {


@@ -432,7 +432,7 @@ class NumberDeviceClass(StrEnum):
Unit of measurement: UnitOfVolumeFlowRate
- SI / metric: `m³/h`, `m³/min`, `m³/s`, `L/h`, `L/min`, `L/s`, `mL/s`
- USCS / imperial: `ft³/min`, `gal/min`
- USCS / imperial: `ft³/min`, `gal/min`, `gal/d`
"""
WATER = "water"


@@ -237,7 +237,13 @@ def async_setup_services(hass: HomeAssistant) -> None:
await gw_hub.gateway.set_gpio_mode(gpio_id, gpio_mode)
hass.services.async_register(
DOMAIN, SERVICE_SET_GPIO_MODE, set_gpio_mode, service_set_gpio_mode_schema
DOMAIN,
SERVICE_SET_GPIO_MODE,
set_gpio_mode,
service_set_gpio_mode_schema,
description_placeholders={
"gpio_modes_documentation_url": "https://www.home-assistant.io/integrations/opentherm_gw/#gpio-modes"
},
)
async def set_led_mode(call: ServiceCall) -> None:
@@ -248,7 +254,13 @@ def async_setup_services(hass: HomeAssistant) -> None:
await gw_hub.gateway.set_led_mode(led_id, led_mode)
hass.services.async_register(
DOMAIN, SERVICE_SET_LED_MODE, set_led_mode, service_set_led_mode_schema
DOMAIN,
SERVICE_SET_LED_MODE,
set_led_mode,
service_set_led_mode_schema,
description_placeholders={
"led_modes_documentation_url": "https://www.home-assistant.io/integrations/opentherm_gw/#led-modes"
},
)
async def set_max_mod(call: ServiceCall) -> None:
@@ -294,4 +306,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_SEND_TRANSP_CMD,
send_transparent_cmd,
service_send_transp_cmd_schema,
description_placeholders={
"opentherm_gateway_firmware_url": "https://otgw.tclcode.com/firmware.html"
},
)
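The `description_placeholders` registered above are substituted into the matching `{...}` tokens in `strings.json`. The substitution itself is essentially plain string formatting — a simplified rendering (the real lookup and merge live in Home Assistant's translation helpers):

```python
# Hypothetical, simplified rendering of a service description with placeholders.
description = (
    "Sends custom OTGW commands ({opentherm_gateway_firmware_url}) "
    "through a transparent interface."
)
placeholders = {
    "opentherm_gateway_firmware_url": "https://otgw.tclcode.com/firmware.html"
}
rendered = description.format(**placeholders)
```

Moving the URLs out of the translated strings means translators never have to touch them.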


@@ -386,7 +386,7 @@
"name": "Reset gateway"
},
"send_transparent_command": {
"description": "Sends custom otgw commands (https://otgw.tclcode.com/firmware.html) through a transparent interface.",
"description": "Sends custom OTGW commands ({opentherm_gateway_firmware_url}) through a transparent interface.",
"fields": {
"gateway_id": {
"description": "[%key:component::opentherm_gw::services::reset_gateway::fields::gateway_id::description%]",
@@ -461,7 +461,7 @@
"name": "ID"
},
"mode": {
"description": "Mode to set on the GPIO pin. Values 0 through 6 are accepted for both GPIOs, 7 is only accepted for GPIO \"B\". See https://www.home-assistant.io/integrations/opentherm_gw/#gpio-modes for an explanation of the values.",
"description": "Mode to set on the GPIO pin. Values 0 through 6 are accepted for both GPIOs, 7 is only accepted for GPIO \"B\". See {gpio_modes_documentation_url} for an explanation of the values.",
"name": "[%key:common::config_flow::data::mode%]"
}
},
@@ -507,7 +507,7 @@
"name": "ID"
},
"mode": {
"description": "The function to assign to the LED. See https://www.home-assistant.io/integrations/opentherm_gw/#led-modes for an explanation of the values.",
"description": "The function to assign to the LED. See {led_modes_documentation_url} for an explanation of the values.",
"name": "[%key:common::config_flow::data::mode%]"
}
},


@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/pooldose",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["python-pooldose==0.7.8"]
"requirements": ["python-pooldose==0.8.0"]
}


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["renault_api"],
"quality_scale": "silver",
"requirements": ["renault-api==0.5.0"]
"requirements": ["renault-api==0.5.1"]
}


@@ -19,5 +19,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.16.5"]
"requirements": ["reolink-aio==0.16.6"]
}


@@ -33,7 +33,7 @@ from .coordinator import (
RoborockWashingMachineUpdateCoordinator,
RoborockWetDryVacUpdateCoordinator,
)
from .roborock_storage import CacheStore, async_remove_map_storage
from .roborock_storage import CacheStore, async_cleanup_map_storage
SCAN_INTERVAL = timedelta(seconds=30)
@@ -42,6 +42,7 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> bool:
"""Set up roborock from a config entry."""
await async_cleanup_map_storage(hass, entry.entry_id)
user_data = UserData.from_dict(entry.data[CONF_USER_DATA])
user_params = UserParams(
@@ -245,6 +246,5 @@ async def async_unload_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
async def async_remove_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> None:
"""Handle removal of an entry."""
await async_remove_map_storage(hass, entry.entry_id)
store = CacheStore(hass, entry.entry_id)
await store.async_remove()


@@ -32,7 +32,6 @@ async def async_setup_entry(
(
RoborockMap(
config_entry,
f"{coord.duid_slug}_map_{map_info.name}",
coord,
coord.properties_api.home,
map_info.map_flag,
@@ -55,13 +54,17 @@ class RoborockMap(RoborockCoordinatedEntityV1, ImageEntity):
def __init__(
self,
config_entry: ConfigEntry,
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
home_trait: HomeTrait,
map_flag: int,
map_name: str,
) -> None:
"""Initialize a Roborock map."""
map_name = map_name or f"Map {map_flag}"
# Note: Map names are not a valid unique id since they can be changed
# in the roborock app. This should be migrated to use map flag for
# the unique id.
unique_id = f"{coordinator.duid_slug}_map_{map_name}"
RoborockCoordinatedEntityV1.__init__(self, unique_id, coordinator)
ImageEntity.__init__(self, coordinator.hass)
self.config_entry = config_entry


@@ -19,7 +19,7 @@
"loggers": ["roborock"],
"quality_scale": "silver",
"requirements": [
"python-roborock==3.7.1",
"python-roborock==3.8.4",
"vacuum-map-parser-roborock==0.1.4"
]
}


@@ -25,8 +25,8 @@ def _storage_path_prefix(hass: HomeAssistant, entry_id: str) -> Path:
return Path(hass.config.path(STORAGE_PATH)) / entry_id
async def async_remove_map_storage(hass: HomeAssistant, entry_id: str) -> None:
"""Remove all map storage associated with a config entry.
async def async_cleanup_map_storage(hass: HomeAssistant, entry_id: str) -> None:
"""Remove map storage in the old format, if any.
This removes all on-disk map files for the given config entry. This is the
old format that was replaced by the `CacheStore` implementation.
@@ -34,13 +34,13 @@ async def async_remove_map_storage(hass: HomeAssistant, entry_id: str) -> None:
def remove(path_prefix: Path) -> None:
try:
if path_prefix.exists():
if path_prefix.exists() and path_prefix.is_dir():
_LOGGER.debug("Removing maps from disk store: %s", path_prefix)
shutil.rmtree(path_prefix, ignore_errors=True)
except OSError as err:
_LOGGER.error("Unable to remove map files in %s: %s", path_prefix, err)
path_prefix = _storage_path_prefix(hass, entry_id)
_LOGGER.debug("Removing maps from disk store: %s", path_prefix)
await hass.async_add_executor_job(remove, path_prefix)


@@ -468,7 +468,7 @@ class SensorDeviceClass(StrEnum):
Unit of measurement: UnitOfVolumeFlowRate
- SI / metric: `m³/h`, `m³/min`, `m³/s`, `L/h`, `L/min`, `L/s`, `mL/s`
- USCS / imperial: `ft³/min`, `gal/min`
- USCS / imperial: `ft³/min`, `gal/min`, `gal/d`
"""
WATER = "water"


@@ -6,5 +6,6 @@
"dependencies": ["application_credentials"],
"documentation": "https://www.home-assistant.io/integrations/senz",
"iot_class": "cloud_polling",
"loggers": ["aiosenz"],
"requirements": ["aiosenz==1.0.0"]
}


@@ -20,6 +20,9 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import SFRConfigEntry
from .entity import SFRCoordinatorEntity
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class SFRBoxBinarySensorEntityDescription[_T](BinarySensorEntityDescription):
@@ -94,6 +97,4 @@ class SFRBoxBinarySensor[_T](SFRCoordinatorEntity[_T], BinarySensorEntity):
@property
def is_on(self) -> bool | None:
"""Return the native value of the device."""
if self.coordinator.data is None:
return None
return self.entity_description.value_fn(self.coordinator.data)


@@ -24,6 +24,10 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import SFRConfigEntry
from .entity import SFREntity
# Coordinator is used to centralize the data updates
# but better to queue action calls to avoid conflicts
PARALLEL_UPDATES = 1
def with_error_wrapping[**_P, _R](
func: Callable[Concatenate[SFRBoxButton, _P], Awaitable[_R]],


@@ -39,7 +39,10 @@ class SFRBoxFlowHandler(ConfigFlow, domain=DOMAIN):
VERSION = 1
_box: SFRBox
_config: dict[str, Any] = {}
def __init__(self) -> None:
"""Initialize SFR Box flow."""
self._config: dict[str, Any] = {}
async def async_step_user(
self, user_input: dict[str, str] | None = None
@@ -47,6 +50,7 @@ class SFRBoxFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a flow initialized by the user."""
errors = {}
if user_input is not None:
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
box = SFRBox(
ip=user_input[CONF_HOST], client=async_get_clientsession(self.hass)
)
@@ -60,7 +64,6 @@ class SFRBoxFlowHandler(ConfigFlow, domain=DOMAIN):
assert system_info is not None
await self.async_set_unique_id(system_info.mac_addr)
self._abort_if_unique_id_configured()
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
self._box = box
self._config.update(user_input)
return await self.async_step_choose_auth()


@@ -33,7 +33,7 @@ class SFRRuntimeData:
wan: SFRDataUpdateCoordinator[WanInfo]
class SFRDataUpdateCoordinator[_DataT](DataUpdateCoordinator[_DataT | None]):
class SFRDataUpdateCoordinator[_DataT](DataUpdateCoordinator[_DataT]):
"""Coordinator to manage data updates."""
config_entry: SFRConfigEntry
@@ -57,9 +57,11 @@ class SFRDataUpdateCoordinator[_DataT](DataUpdateCoordinator[_DataT | None]):
update_interval=_SCAN_INTERVAL,
)
async def _async_update_data(self) -> _DataT | None:
async def _async_update_data(self) -> _DataT:
"""Update data."""
try:
return await self._method(self.box)
if data := await self._method(self.box):
return data
except SFRBoxError as err:
raise UpdateFailed from err
raise UpdateFailed("No data received from SFR Box")


@@ -6,5 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/sfr_box",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["sfrbox-api==0.1.0"]
}


@@ -0,0 +1,101 @@
rules:
## Bronze
config-flow: done
test-before-configure: done
unique-config-entry: done
config-flow-test-coverage: done
runtime-data: done
test-before-setup: done
appropriate-polling: done
entity-unique-id: done
has-entity-name: done
entity-event-setup:
status: exempt
comment: local_polling without events
dependency-transparency: done
action-setup:
status: exempt
comment: There are no service actions
common-modules: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
docs-actions:
status: exempt
comment: There are no service actions
brands: done
## Silver
config-entry-unloading: done
log-when-unavailable: done
entity-unavailable: done
action-exceptions: done
reauthentication-flow: done
parallel-updates: done
test-coverage: done
integration-owner: done
docs-installation-parameters:
status: todo
comment: not yet documented
docs-configuration-parameters:
status: exempt
comment: No options flow
## Gold
entity-translations: done
entity-device-class:
status: todo
comment: |
What does DSL counter count?
What is the state of CRC?
line_status and training and net_infra and mode -> unknown shouldn't be an option and the entity should return None instead
devices:
status: todo
comment: MAC address can be set to the connections
entity-category: done
entity-disabled-by-default: done
discovery:
status: todo
comment: Should be possible
stale-devices: done
diagnostics: done
exception-translations:
status: todo
comment: not yet documented
icon-translations: done
reconfiguration-flow:
status: todo
comment: Need to be able to manually change the IP address
dynamic-devices: done
discovery-update-info:
status: todo
comment: Discovery is not yet implemented
repair-issues: done
docs-use-cases:
status: todo
comment: not yet documented
docs-supported-devices: done
docs-supported-functions: done
docs-data-update:
status: todo
comment: not yet documented
docs-known-limitations:
status: todo
comment: not yet documented
docs-troubleshooting:
status: todo
comment: not yet documented
docs-examples:
status: todo
comment: not yet documented
## Platinum
async-dependency:
status: done
comment: sfrbox-api is asynchronous
inject-websession:
status: done
comment: sfrbox-api uses injected aiohttp websession
strict-typing:
status: done
comment: sfrbox-api is fully typed, and integration uses strict typing


@@ -26,6 +26,9 @@ from homeassistant.helpers.typing import StateType
from .coordinator import SFRConfigEntry
from .entity import SFRCoordinatorEntity
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class SFRBoxSensorEntityDescription[_T](SensorEntityDescription):
@@ -250,6 +253,4 @@ class SFRBoxSensor[_T](SFRCoordinatorEntity[_T], SensorEntity):
@property
def native_value(self) -> StateType:
"""Return the native value of the device."""
if self.coordinator.data is None:
return None
return self.entity_description.value_fn(self.coordinator.data)


@@ -30,7 +30,7 @@ from .entity import (
ShellyRpcAttributeEntity,
ShellySleepingBlockAttributeEntity,
ShellySleepingRpcAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rest,
async_setup_entry_rpc,
)
@@ -372,7 +372,7 @@ def _async_setup_block_entry(
) -> None:
"""Set up entities for BLOCK device."""
if config_entry.data[CONF_SLEEP_PERIOD]:
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,
@@ -380,7 +380,7 @@ def _async_setup_block_entry(
BlockSleepingBinarySensor,
)
else:
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,


@@ -168,6 +168,7 @@ INPUTS_EVENTS_SUBTYPES: Final = {
"button2": 2,
"button3": 3,
"button4": 4,
"button5": 5,
}
SHBTN_MODELS: Final = [MODEL_BUTTON1, MODEL_BUTTON1_V2]


@@ -79,6 +79,7 @@ from .utils import (
get_rpc_device_wakeup_period,
get_rpc_ws_url,
get_shelly_model_name,
is_rpc_ble_scanner_supported,
update_device_fw_info,
)
@@ -726,6 +727,7 @@ class ShellyRpcCoordinator(ShellyCoordinatorBase[RpcDevice]):
"""Handle device connected."""
async with self._connection_lock:
if self.connected: # Already connected
LOGGER.debug("Device %s already connected", self.name)
return
self.connected = True
try:
@@ -743,10 +745,7 @@ class ShellyRpcCoordinator(ShellyCoordinatorBase[RpcDevice]):
is updated.
"""
if not self.sleep_period:
if (
self.config_entry.runtime_data.rpc_supports_scripts
and not self.config_entry.runtime_data.rpc_zigbee_firmware
):
if is_rpc_ble_scanner_supported(self.config_entry):
await self._async_connect_ble_scanner()
else:
await self._async_setup_outbound_websocket()
@@ -776,6 +775,10 @@ class ShellyRpcCoordinator(ShellyCoordinatorBase[RpcDevice]):
if await async_ensure_ble_enabled(self.device):
# BLE enable required a reboot, don't bother connecting
# the scanner since it will be disconnected anyway
LOGGER.debug(
"Device %s BLE enable required a reboot, skipping scanner connect",
self.name,
)
return
assert self.device_id is not None
self._disconnected_callbacks.append(
@@ -844,21 +847,14 @@ class ShellyRpcCoordinator(ShellyCoordinatorBase[RpcDevice]):
"""Shutdown the coordinator."""
if self.device.connected:
try:
if not self.sleep_period:
if not self.sleep_period and is_rpc_ble_scanner_supported(
self.config_entry
):
await async_stop_scanner(self.device)
await super().shutdown()
except InvalidAuthError:
self.config_entry.async_start_reauth(self.hass)
return
except RpcCallError as err:
# Ignore 404 (No handler for) error
if err.code != 404:
LOGGER.debug(
"Error during shutdown for device %s: %s",
self.name,
err.message,
)
return
except DeviceConnectionError as err:
# If the device is restarting or has gone offline before
# the ping/pong timeout happens, the shutdown command


@@ -27,7 +27,7 @@ from .entity import (
RpcEntityDescription,
ShellyBlockAttributeEntity,
ShellyRpcAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rpc,
rpc_call,
)
@@ -81,7 +81,7 @@ def _async_setup_block_entry(
coordinator = config_entry.runtime_data.block
assert coordinator
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass, config_entry, async_add_entities, BLOCK_COVERS, BlockShellyCover
)


@@ -34,14 +34,14 @@ from .utils import (
@callback
def async_setup_entry_attribute_entities(
def async_setup_entry_block(
hass: HomeAssistant,
config_entry: ShellyConfigEntry,
async_add_entities: AddEntitiesCallback,
sensors: Mapping[tuple[str, str], BlockEntityDescription],
sensor_class: Callable,
) -> None:
"""Set up entities for attributes."""
"""Set up block entities."""
coordinator = config_entry.runtime_data.block
assert coordinator
if coordinator.device.initialized:
@@ -150,7 +150,7 @@ def async_setup_entry_rpc(
sensors: Mapping[str, RpcEntityDescription],
sensor_class: Callable,
) -> None:
"""Set up entities for RPC sensors."""
"""Set up RPC entities."""
coordinator = config_entry.runtime_data.rpc
assert coordinator


@@ -18,7 +18,6 @@ from homeassistant.components.event import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import (
BASIC_INPUTS_EVENTS_TYPES,
@@ -26,12 +25,11 @@ from .const import (
SHIX3_1_INPUTS_EVENTS_TYPES,
)
from .coordinator import ShellyBlockCoordinator, ShellyConfigEntry, ShellyRpcCoordinator
from .entity import ShellyBlockEntity, get_entity_rpc_device_info
from .entity import ShellyBlockEntity, ShellyRpcEntity
from .utils import (
async_remove_orphaned_entities,
async_remove_shelly_entity,
get_block_channel,
get_block_channel_name,
get_block_custom_name,
get_block_number_of_channels,
get_device_entry_gen,
@@ -137,7 +135,7 @@ def _async_setup_rpc_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up entities for RPC device."""
entities: list[ShellyRpcEvent] = []
entities: list[ShellyRpcEvent | ShellyRpcScriptEvent] = []
coordinator = config_entry.runtime_data.rpc
if TYPE_CHECKING:
@@ -163,7 +161,9 @@ def _async_setup_rpc_entry(
continue
if script_events and (event_types := script_events[get_rpc_key_id(script)]):
entities.append(ShellyRpcScriptEvent(coordinator, script, event_types))
entities.append(
ShellyRpcScriptEvent(coordinator, script, SCRIPT_EVENT, event_types)
)
# If a script is removed from the device configuration, we need to remove orphaned entities
async_remove_orphaned_entities(
@@ -211,7 +211,7 @@ class ShellyBlockEvent(ShellyBlockEntity, EventEntity):
else ""
}
else:
self._attr_name = get_block_channel_name(coordinator.device, block)
self._attr_name = get_block_custom_name(coordinator.device, block)
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
@@ -228,7 +228,7 @@ class ShellyBlockEvent(ShellyBlockEntity, EventEntity):
self.async_write_ha_state()
class ShellyRpcEvent(CoordinatorEntity[ShellyRpcCoordinator], EventEntity):
class ShellyRpcEvent(ShellyRpcEntity, EventEntity):
"""Represent RPC event entity."""
_attr_has_entity_name = True
@@ -241,25 +241,19 @@ class ShellyRpcEvent(CoordinatorEntity[ShellyRpcCoordinator], EventEntity):
description: ShellyRpcEventDescription,
) -> None:
"""Initialize Shelly entity."""
super().__init__(coordinator)
self._attr_device_info = get_entity_rpc_device_info(coordinator, key)
self._attr_unique_id = f"{coordinator.mac}-{key}"
super().__init__(coordinator, key)
self.entity_description = description
if description.key == "input":
_, component, component_id = get_rpc_key(key)
if custom_name := get_rpc_custom_name(coordinator.device, key):
self._attr_name = custom_name
else:
self._attr_translation_placeholders = {
"input_number": component_id
if get_rpc_number_of_channels(coordinator.device, component) > 1
else ""
}
self.event_id = int(component_id)
elif description.key == "script":
self._attr_name = get_rpc_custom_name(coordinator.device, key)
self.event_id = get_rpc_key_id(key)
_, component, component_id = get_rpc_key(key)
if custom_name := get_rpc_custom_name(coordinator.device, key):
self._attr_name = custom_name
else:
self._attr_translation_placeholders = {
"input_number": component_id
if get_rpc_number_of_channels(coordinator.device, component) > 1
else ""
}
self.event_id = int(component_id)
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
@@ -271,30 +265,36 @@ class ShellyRpcEvent(CoordinatorEntity[ShellyRpcCoordinator], EventEntity):
@callback
def _async_handle_event(self, event: dict[str, Any]) -> None:
"""Handle the demo button event."""
"""Handle the event."""
if event["id"] == self.event_id:
self._trigger_event(event["event"])
self.async_write_ha_state()
class ShellyRpcScriptEvent(ShellyRpcEvent):
class ShellyRpcScriptEvent(ShellyRpcEntity, EventEntity):
"""Represent RPC script event entity."""
_attr_has_entity_name = True
entity_description: ShellyRpcEventDescription
def __init__(
self,
coordinator: ShellyRpcCoordinator,
key: str,
description: ShellyRpcEventDescription,
event_types: list[str],
) -> None:
"""Initialize Shelly script event entity."""
super().__init__(coordinator, key, SCRIPT_EVENT)
self.component = key
super().__init__(coordinator, key)
self.entity_description = description
self._attr_event_types = event_types
self._attr_name = get_rpc_custom_name(coordinator.device, key)
self.event_id = get_rpc_key_id(key)
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
await super(CoordinatorEntity, self).async_added_to_hass()
await super().async_added_to_hass()
self.async_on_remove(
self.coordinator.async_subscribe_events(self._async_handle_event)
@@ -303,7 +303,7 @@ class ShellyRpcScriptEvent(ShellyRpcEvent):
@callback
def _async_handle_event(self, event: dict[str, Any]) -> None:
"""Handle script event."""
if event.get("component") == self.component:
if event.get("component") == self.key:
event_type = event.get("event")
if event_type not in self.event_types:
# This can happen if we didn't find this event type in the script


@@ -44,7 +44,7 @@ from .entity import (
RpcEntityDescription,
ShellyBlockAttributeEntity,
ShellyRpcAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rpc,
)
from .utils import (
@@ -101,7 +101,7 @@ def _async_setup_block_entry(
coordinator = config_entry.runtime_data.block
assert coordinator
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass, config_entry, async_add_entities, BLOCK_LIGHTS, BlockShellyLight
)


@@ -42,7 +42,7 @@ from .entity import (
RpcEntityDescription,
ShellyRpcAttributeEntity,
ShellySleepingBlockAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rpc,
rpc_call,
)
@@ -353,7 +353,7 @@ def _async_setup_block_entry(
) -> None:
"""Set up entities for BLOCK device."""
if config_entry.data[CONF_SLEEP_PERIOD]:
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,


@@ -53,7 +53,7 @@ from .entity import (
ShellyRpcAttributeEntity,
ShellySleepingBlockAttributeEntity,
ShellySleepingRpcAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rest,
async_setup_entry_rpc,
get_entity_rpc_device_info,
@@ -1736,7 +1736,7 @@ def _async_setup_block_entry(
) -> None:
"""Set up entities for BLOCK device."""
if config_entry.data[CONF_SLEEP_PERIOD]:
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,
@@ -1744,7 +1744,7 @@ def _async_setup_block_entry(
BlockSleepingSensor,
)
else:
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,


@@ -557,6 +557,9 @@
"child_lock": {
"name": "Child lock"
},
"cury_away_mode": {
"name": "Away mode"
},
"frost_protection": {
"name": "[%key:component::shelly::entity::climate::thermostat::state_attributes::preset_mode::state::frost_protection%]"
},
@@ -589,6 +592,11 @@
"beta_firmware": {
"name": "Beta firmware"
}
},
"valve": {
"gas_valve": {
"name": "Valve"
}
}
},
"exceptions": {


@@ -36,7 +36,7 @@ from .entity import (
ShellyBlockAttributeEntity,
ShellyRpcAttributeEntity,
ShellySleepingBlockAttributeEntity,
async_setup_entry_attribute_entities,
async_setup_entry_block,
async_setup_entry_rpc,
rpc_call,
)
@@ -306,7 +306,6 @@ RPC_SWITCHES = {
"cury_away_mode": RpcSwitchDescription(
key="cury",
sub_key="away_mode",
name="Away mode",
translation_key="cury_away_mode",
is_on=lambda status: status["away_mode"],
method_on="cury_set_away_mode",
@@ -338,11 +337,11 @@ def _async_setup_block_entry(
coordinator = config_entry.runtime_data.block
assert coordinator
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass, config_entry, async_add_entities, BLOCK_RELAY_SWITCHES, BlockRelaySwitch
)
async_setup_entry_attribute_entities(
async_setup_entry_block(
hass,
config_entry,
async_add_entities,


@@ -110,8 +110,6 @@ def get_block_number_of_channels(device: BlockDevice, block: Block) -> int:
channels = device.shelly.get("num_emeters")
elif block.type in ["relay", "light"]:
channels = device.shelly.get("num_outputs")
elif block.type in ["roller", "device"]:
channels = 1
return channels or 1
@@ -134,21 +132,6 @@ def get_block_channel(block: Block | None, base: str = "1") -> str:
return chr(int(block.channel) + ord(base))
def get_block_channel_name(device: BlockDevice, block: Block | None) -> str | None:
"""Get name based on device and channel name."""
if (
not block
or block.type in ("device", "light", "relay", "emeter")
or get_block_number_of_channels(device, block) == 1
):
return None
if custom_name := get_block_custom_name(device, block):
return custom_name
return f"Channel {get_block_channel(block)}"
def get_block_sub_device_name(device: BlockDevice, block: Block) -> str:
"""Get name of block sub-device."""
if TYPE_CHECKING:
@@ -664,10 +647,7 @@ def async_remove_shelly_rpc_entities(
def get_virtual_component_ids(config: dict[str, Any], platform: str) -> list[str]:
"""Return a list of virtual component IDs for a platform."""
component = VIRTUAL_COMPONENTS_MAP.get(platform)
if not component:
return []
component = VIRTUAL_COMPONENTS_MAP[platform]
ids: list[str] = []
@@ -975,10 +955,10 @@ def async_migrate_rpc_virtual_components_unique_ids(
The new unique_id format is: {mac}-{key}-{component}_{role}
"""
for component in VIRTUAL_COMPONENTS:
if entity_entry.unique_id.endswith(f"-{component!s}"):
key = entity_entry.unique_id.split("-")[-2]
if key not in config:
continue
if (
entity_entry.unique_id.endswith(f"-{component!s}")
and (key := entity_entry.unique_id.split("-")[-2]) in config
):
role = get_rpc_role_by_key(config, key)
new_unique_id = f"{entity_entry.unique_id}_{role}"
LOGGER.debug(
@@ -994,3 +974,11 @@ def async_migrate_rpc_virtual_components_unique_ids(
}
return None
def is_rpc_ble_scanner_supported(entry: ConfigEntry) -> bool:
"""Return true if BLE scanner is supported."""
return (
entry.runtime_data.rpc_supports_scripts
and not entry.runtime_data.rpc_zigbee_firmware
)
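The new `is_rpc_ble_scanner_supported` helper centralizes a predicate that was previously inlined in both the connect and shutdown paths. A minimal stand-alone equivalent, with a simplified stand-in for the entry's runtime data (the real integration reads these flags from `ConfigEntry.runtime_data`):

```python
from dataclasses import dataclass


@dataclass
class RuntimeData:
    """Simplified stand-in for the Shelly config entry's runtime data."""

    rpc_supports_scripts: bool
    rpc_zigbee_firmware: bool


def is_rpc_ble_scanner_supported(data: RuntimeData) -> bool:
    """BLE scanning needs script support and is unavailable on Zigbee firmware."""
    return data.rpc_supports_scripts and not data.rpc_zigbee_firmware
```

Naming the condition also lets the shutdown path skip `async_stop_scanner` entirely on unsupported devices, replacing the old 404-swallowing error handler.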


@@ -49,7 +49,7 @@ class RpcValveDescription(RpcEntityDescription, ValveEntityDescription):
BLOCK_VALVES: dict[tuple[str, str], BlockValveDescription] = {
("valve", "valve"): BlockValveDescription(
key="valve|valve",
name="Valve",
translation_key="gas_valve",
available=lambda block: block.valve not in ("failure", "checking"),
removal_condition=lambda _, block: block.valve in ("not_connected", "unknown"),
models={MODEL_GAS},


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/stiebel_eltron",
"iot_class": "local_polling",
"loggers": ["pymodbus", "pystiebeleltron"],
"requirements": ["pystiebeleltron==0.2.3"]
"requirements": ["pystiebeleltron==0.2.5"]
}


@@ -8,6 +8,11 @@ import logging
from typing import Any
from homeassistant import config as conf_util
from homeassistant.components.automation import (
DOMAIN as AUTOMATION_DOMAIN,
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
)
from homeassistant.components.labs import async_listen as async_labs_listen
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_DEVICE_ID,
@@ -16,7 +21,7 @@ from homeassistant.const import (
CONF_UNIQUE_ID,
SERVICE_RELOAD,
)
from homeassistant.core import Event, HomeAssistant, ServiceCall
from homeassistant.core import Event, HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import ConfigEntryError, HomeAssistantError
from homeassistant.helpers import discovery, issue_registry as ir
from homeassistant.helpers.device import (
@@ -90,6 +95,20 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async_register_admin_service(hass, DOMAIN, SERVICE_RELOAD, _reload_config)
@callback
def new_triggers_conditions_listener() -> None:
"""Handle new_triggers_conditions flag change."""
hass.async_create_task(
_reload_config(ServiceCall(hass, DOMAIN, SERVICE_RELOAD))
)
async_labs_listen(
hass,
AUTOMATION_DOMAIN,
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
new_triggers_conditions_listener,
)
return True
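
The hunk above registers a callback that reloads automations when the `new_triggers_conditions` Labs flag changes. The listener pattern can be sketched in plain Python (the names here are illustrative, not the Home Assistant Labs API):

```python
from collections.abc import Callable


class FlagRegistry:
    """Minimal sketch of a feature-flag registry with change listeners."""

    def __init__(self) -> None:
        self._flags: dict[str, bool] = {}
        self._listeners: dict[str, list[Callable[[], None]]] = {}

    def listen(self, flag: str, callback: Callable[[], None]) -> Callable[[], None]:
        """Register a callback for a flag; return an unsubscribe function."""
        self._listeners.setdefault(flag, []).append(callback)
        return lambda: self._listeners[flag].remove(callback)

    def set(self, flag: str, value: bool) -> None:
        """Update a flag and notify listeners only on an actual change."""
        if self._flags.get(flag) == value:
            return
        self._flags[flag] = value
        for callback in self._listeners.get(flag, []):
            callback()


registry = FlagRegistry()
reloads: list[str] = []
unsub = registry.listen("new_triggers_conditions", lambda: reloads.append("reload"))
registry.set("new_triggers_conditions", True)  # fires the listener
registry.set("new_triggers_conditions", True)  # no-op: value unchanged
unsub()
```

Notifying only on an actual value change, as above, avoids spurious reloads when the flag is re-written with the same value.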


@@ -48,6 +48,7 @@ from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import dt as dt_util
from . import TriggerUpdateCoordinator
from .entity import AbstractTemplateEntity
from .helpers import (
async_setup_template_entry,
async_setup_template_platform,
@@ -168,11 +169,27 @@ def async_create_preview_binary_sensor(
)
class StateBinarySensorEntity(TemplateEntity, BinarySensorEntity, RestoreEntity):
class AbstractTemplateBinarySensor(
AbstractTemplateEntity, BinarySensorEntity, RestoreEntity
):
"""Representation of template binary sensor features."""
_entity_id_format = ENTITY_ID_FORMAT
# The super init is not called because TemplateEntity and TriggerEntity will call AbstractTemplateEntity.__init__.
# This ensures that the __init__ on AbstractTemplateEntity is not called twice.
def __init__(self, config: dict[str, Any]) -> None: # pylint: disable=super-init-not-called
"""Initialize the features."""
self._attr_device_class = config.get(CONF_DEVICE_CLASS)
self._template: template.Template = config[CONF_STATE]
self._delay_cancel: CALLBACK_TYPE | None = None
class StateBinarySensorEntity(TemplateEntity, AbstractTemplateBinarySensor):
"""A virtual binary sensor that triggers from another sensor."""
_attr_should_poll = False
_entity_id_format = ENTITY_ID_FORMAT
def __init__(
self,
@@ -182,19 +199,19 @@ class StateBinarySensorEntity(TemplateEntity, BinarySensorEntity, RestoreEntity)
) -> None:
"""Initialize the Template binary sensor."""
TemplateEntity.__init__(self, hass, config, unique_id)
self._attr_device_class = config.get(CONF_DEVICE_CLASS)
self._template: template.Template = config[CONF_STATE]
self._delay_cancel = None
AbstractTemplateBinarySensor.__init__(self, config)
self._delay_on = None
self._delay_on_raw = config.get(CONF_DELAY_ON)
self._delay_on_template = config.get(CONF_DELAY_ON)
self._delay_off = None
self._delay_off_raw = config.get(CONF_DELAY_OFF)
self._delay_off_template = config.get(CONF_DELAY_OFF)
async def async_added_to_hass(self) -> None:
"""Restore state."""
if (
(self._delay_on_raw is not None or self._delay_off_raw is not None)
(
self._delay_on_template is not None
or self._delay_off_template is not None
)
and (last_state := await self.async_get_last_state()) is not None
and last_state.state not in (STATE_UNKNOWN, STATE_UNAVAILABLE)
):
@@ -206,20 +223,20 @@ class StateBinarySensorEntity(TemplateEntity, BinarySensorEntity, RestoreEntity)
"""Set up templates."""
self.add_template_attribute("_state", self._template, None, self._update_state)
if self._delay_on_raw is not None:
if self._delay_on_template is not None:
try:
self._delay_on = cv.positive_time_period(self._delay_on_raw)
self._delay_on = cv.positive_time_period(self._delay_on_template)
except vol.Invalid:
self.add_template_attribute(
"_delay_on", self._delay_on_raw, cv.positive_time_period
"_delay_on", self._delay_on_template, cv.positive_time_period
)
if self._delay_off_raw is not None:
if self._delay_off_template is not None:
try:
self._delay_off = cv.positive_time_period(self._delay_off_raw)
self._delay_off = cv.positive_time_period(self._delay_off_template)
except vol.Invalid:
self.add_template_attribute(
"_delay_off", self._delay_off_raw, cv.positive_time_period
"_delay_off", self._delay_off_template, cv.positive_time_period
)
super()._async_setup_templates()
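
The try/except pattern in this hunk validates a static delay value eagerly and, when validation fails, defers the value as a template attribute to be re-validated after each render. A minimal sketch of that pattern, with a stand-in validator in place of `cv.positive_time_period`:

```python
from datetime import timedelta


def positive_time_period(value):
    """Stand-in validator: accept a non-negative number of seconds."""
    if isinstance(value, (int, float)) and value >= 0:
        return timedelta(seconds=value)
    raise ValueError(f"not a static time period: {value!r}")


dynamic_attributes: list = []


def resolve_delay(raw):
    """Try to validate a static delay now; defer templates to render time."""
    if raw is None:
        return None
    try:
        return positive_time_period(raw)
    except ValueError:
        dynamic_attributes.append(raw)  # re-validated after each render
        return None


static = resolve_delay(5)            # resolved immediately
deferred = resolve_delay("{{ x }}")  # queued as a template attribute
```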
@@ -259,12 +276,10 @@ class StateBinarySensorEntity(TemplateEntity, BinarySensorEntity, RestoreEntity)
self._delay_cancel = async_call_later(self.hass, delay, _set_state)
class TriggerBinarySensorEntity(TriggerEntity, BinarySensorEntity, RestoreEntity):
class TriggerBinarySensorEntity(TriggerEntity, AbstractTemplateBinarySensor):
"""Sensor entity based on trigger data."""
_entity_id_format = ENTITY_ID_FORMAT
domain = BINARY_SENSOR_DOMAIN
extra_template_keys = (CONF_STATE,)
def __init__(
self,
@@ -273,7 +288,8 @@ class TriggerBinarySensorEntity(TriggerEntity, BinarySensorEntity, RestoreEntity
config: dict,
) -> None:
"""Initialize the entity."""
super().__init__(hass, coordinator, config)
TriggerEntity.__init__(self, hass, coordinator, config)
AbstractTemplateBinarySensor.__init__(self, config)
for key in (CONF_STATE, CONF_DELAY_ON, CONF_DELAY_OFF, CONF_AUTO_OFF):
if isinstance(config.get(key), template.Template):
@@ -282,7 +298,6 @@ class TriggerBinarySensorEntity(TriggerEntity, BinarySensorEntity, RestoreEntity
self._last_delay_from: bool | None = None
self._last_delay_to: bool | None = None
self._delay_cancel: CALLBACK_TYPE | None = None
self._auto_off_cancel: CALLBACK_TYPE | None = None
self._auto_off_time: datetime | None = None
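
The refactor above relies on a cooperative-init pattern: the shared abstract base deliberately skips `super().__init__()`, and each concrete class calls both parents' `__init__` explicitly so the shared state is initialized exactly once. A self-contained sketch of that pattern (class names here are invented, not the template integration's):

```python
class AbstractFeatures:
    """Shared feature state for several concrete entity classes."""

    # super().__init__() is intentionally not called: the concrete class
    # invokes each base __init__ explicitly, exactly once.
    def __init__(self, config: dict) -> None:  # pylint: disable=super-init-not-called
        self.device_class = config.get("device_class")


class BaseEntity:
    """Stand-in for a framework base class with its own constructor."""

    def __init__(self, name: str) -> None:
        self.name = name


class ConcreteEntity(BaseEntity, AbstractFeatures):
    """Concrete class wiring both bases together explicitly."""

    def __init__(self, name: str, config: dict) -> None:
        BaseEntity.__init__(self, name)
        AbstractFeatures.__init__(self, config)


entity = ConcreteEntity("demo", {"device_class": "motion"})
```

Calling each base explicitly trades the usual MRO-driven `super()` chain for predictable, single initialization when the bases were not designed for cooperative `super()` calls.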


@@ -46,6 +46,7 @@ from .const import (
CONF_DEFAULT_ENTITY_ID,
CONF_PICTURE,
DOMAIN,
PLATFORMS,
)
from .entity import AbstractTemplateEntity
from .template_entity import TemplateEntity
@@ -234,6 +235,8 @@ def create_legacy_template_issue(
hass: HomeAssistant, config: ConfigType, domain: str
) -> None:
"""Create a repair for legacy template entities."""
if domain not in PLATFORMS:
return
breadcrumb = "Template Entity"
# Default entity id should be in most legacy configuration because


@@ -26,7 +26,12 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
)
from homeassistant.helpers.device_registry import DeviceEntryType
from homeassistant.helpers.typing import ConfigType
from .const import DEFAULT_PATH, DEFAULT_SSL, DOMAIN
@@ -93,6 +98,19 @@ async def async_setup_entry(
except (AuthenticationError, UnknownError) as error:
raise ConfigEntryAuthFailed from error
protocol: Final = "https" if config_entry.data[CONF_SSL] else "http"
device_registry = dr.async_get(hass)
device_registry.async_get_or_create(
config_entry_id=config_entry.entry_id,
identifiers={(DOMAIN, config_entry.entry_id)},
manufacturer="Transmission",
entry_type=DeviceEntryType.SERVICE,
sw_version=api.server_version,
configuration_url=(
f"{protocol}://{config_entry.data[CONF_HOST]}:{config_entry.data[CONF_PORT]}"
),
)
coordinator = TransmissionDataUpdateCoordinator(hass, config_entry, api)
await hass.async_add_executor_job(coordinator.init_torrent_list)


@@ -26,5 +26,4 @@ class TransmissionEntity(CoordinatorEntity[TransmissionDataUpdateCoordinator]):
)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, coordinator.config_entry.entry_id)},
manufacturer="Transmission",
)


@@ -1,4 +1,43 @@
{
"entity": {
"sensor": {
"active_torrents": {
"default": "mdi:counter"
},
"completed_torrents": {
"default": "mdi:counter"
},
"download_speed": {
"default": "mdi:cloud-download"
},
"paused_torrents": {
"default": "mdi:counter"
},
"started_torrents": {
"default": "mdi:counter"
},
"total_torrents": {
"default": "mdi:counter"
},
"transmission_status": {
"default": "mdi:information-outline"
},
"upload_speed": {
"default": "mdi:cloud-upload"
}
},
"switch": {
"on_off": {
"default": "mdi:cloud",
"state": {
"off": "mdi:cloud-off"
}
},
"turtle_mode": {
"default": "mdi:tortoise"
}
}
},
"services": {
"add_torrent": {
"service": "mdi:download"


@@ -30,18 +30,12 @@ rules:
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: todo
parallel-updates: done
reauthentication-flow: done
test-coverage:
status: todo
comment: |
Change to mock_setup_entry to avoid repetition when expanding tests.
test-coverage: done
# Gold
devices:
status: todo
comment: |
Add additional device detail including link to ui.
devices: done
diagnostics: todo
discovery-update-info: todo
discovery: todo
@@ -61,10 +55,7 @@ rules:
Speed sensors change so frequently that disabling by default may be appropriate.
entity-translations: done
exception-translations: done
icon-translations:
status: todo
comment: |
Add icons for sensors & switches.
icon-translations: done
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo


@@ -29,6 +29,8 @@ from .const import (
from .coordinator import TransmissionConfigEntry, TransmissionDataUpdateCoordinator
from .entity import TransmissionEntity
PARALLEL_UPDATES = 0
MODES: dict[str, list[str] | None] = {
"started_torrents": ["downloading"],
"completed_torrents": ["seeding"],


@@ -11,6 +11,8 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import TransmissionConfigEntry, TransmissionDataUpdateCoordinator
from .entity import TransmissionEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class TransmissionSwitchEntityDescription(SwitchEntityDescription):


@@ -44,10 +44,9 @@ class HomeAssistantTuyaData(NamedTuple):
listener: SharingDeviceListener
async def async_setup_entry(hass: HomeAssistant, entry: TuyaConfigEntry) -> bool:
"""Async setup hass config entry."""
token_listener = TokenListener(hass, entry)
manager = Manager(
def _create_manager(entry: TuyaConfigEntry, token_listener: TokenListener) -> Manager:
"""Create a Tuya Manager instance."""
return Manager(
TUYA_CLIENT_ID,
entry.data[CONF_USER_CODE],
entry.data[CONF_TERMINAL_ID],
@@ -56,6 +55,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: TuyaConfigEntry) -> bool
token_listener,
)
async def async_setup_entry(hass: HomeAssistant, entry: TuyaConfigEntry) -> bool:
"""Async setup hass config entry."""
token_listener = TokenListener(hass, entry)
# Move to executor as it makes blocking call to import_module
# with args ('.system', 'urllib3.contrib.resolver')
manager = await hass.async_add_executor_job(_create_manager, entry, token_listener)
listener = DeviceListener(hass, manager)
manager.add_device_listener(listener)
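
The comment in this hunk explains why the `Manager` constructor is moved off the event loop: it performs a blocking `import_module` call. The same offload can be sketched with stdlib asyncio, which is what `hass.async_add_executor_job` wraps (the constructor here is a stand-in, not the Tuya SDK):

```python
import asyncio
import time


def create_manager(user_code: str) -> dict:
    """Stand-in for a constructor that does blocking work at import time."""
    time.sleep(0.01)  # simulate the blocking call
    return {"user_code": user_code}


async def setup_entry() -> dict:
    loop = asyncio.get_running_loop()
    # Run the blocking constructor in the default thread pool so the
    # event loop stays responsive while it completes.
    return await loop.run_in_executor(None, create_manager, "abc123")


manager = asyncio.run(setup_entry())
```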


@@ -0,0 +1,60 @@
"""Parsers for RAW (base64-encoded bytes) values."""
from dataclasses import dataclass
import struct
from typing import Self
@dataclass(kw_only=True)
class ElectricityData:
"""Electricity RAW value."""
current: float
power: float
voltage: float
@classmethod
def from_bytes(cls, raw: bytes) -> Self | None:
"""Parse bytes and return an ElectricityData object."""
# Format:
# - legacy: 8 bytes
# - v01: [ver=0x01][len=0x0F][data(15 bytes)]
# - v02: [ver=0x02][len=0x0F][data(15 bytes)][sign_bitmap(1 byte)]
# Data layout (big-endian):
# - voltage: 2B, unit 0.1 V
# - current: 3B, unit 0.001 A (i.e., mA)
# - active power: 3B, unit 0.001 kW (i.e., W)
# - reactive power: 3B, unit 0.001 kVar
# - apparent power: 3B, unit 0.001 kVA
# - power factor: 1B, unit 0.01
# Sign bitmap (v02 only, 1 bit means negative):
# - bit0 current
# - bit1 active power
# - bit2 reactive
# - bit3 power factor
is_v1 = len(raw) == 17 and raw[0:2] == b"\x01\x0f"
is_v2 = len(raw) == 18 and raw[0:2] == b"\x02\x0f"
if is_v1 or is_v2:
data = raw[2:17]
voltage = struct.unpack(">H", data[0:2])[0] / 10.0
current = struct.unpack(">L", b"\x00" + data[2:5])[0]
power = struct.unpack(">L", b"\x00" + data[5:8])[0]
if is_v2:
sign_bitmap = raw[17]
if sign_bitmap & 0x01:
current = -current
if sign_bitmap & 0x02:
power = -power
return cls(current=current, power=power, voltage=voltage)
if len(raw) >= 8:
voltage = struct.unpack(">H", raw[0:2])[0] / 10.0
current = struct.unpack(">L", b"\x00" + raw[2:5])[0]
power = struct.unpack(">L", b"\x00" + raw[5:8])[0]
return cls(current=current, power=power, voltage=voltage)
return None
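
The v02 layout above can be checked by hand. This sketch builds a synthetic 18-byte frame (all values invented for illustration) and decodes it with the same `struct` operations the parser uses, including the zero-padded 3-byte big-endian reads and the sign-bitmap step:

```python
import struct

# Synthetic v02 frame: [ver=0x02][len=0x0F][15 data bytes][sign bitmap].
raw = (
    b"\x02\x0f"        # version + length header
    + b"\x08\xfc"      # voltage: 2300 * 0.1 V -> 230.0 V
    + b"\x00\x05\xdc"  # current: 1500 (unit 0.001 A)
    + b"\x00\x01\x59"  # active power: 345 (unit 0.001 kW)
    + b"\x00\x00\x00"  # reactive power (unused here)
    + b"\x00\x00\x00"  # apparent power (unused here)
    + b"\x64"          # power factor: 100 * 0.01
    + b"\x02"          # sign bitmap: bit1 set -> active power is negative
)

assert len(raw) == 18 and raw[0:2] == b"\x02\x0f"
data = raw[2:17]
voltage = struct.unpack(">H", data[0:2])[0] / 10.0     # 2-byte field
current = struct.unpack(">L", b"\x00" + data[2:5])[0]  # 3 bytes, zero-padded to 4
power = struct.unpack(">L", b"\x00" + data[5:8])[0]
if raw[17] & 0x02:
    power = -power
```

Prepending `b"\x00"` lets `struct.unpack(">L", ...)` read the 3-byte fields as unsigned 32-bit integers, since `struct` has no native 24-bit format.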
