Compare commits


13 Commits

Author SHA1 Message Date
abmantis 75413dfc11 Allow targeting non-primary entities in conditions 2026-04-27 14:43:36 +01:00
Michael 0633400725 Fix feedreader tests broken by Python 3.14.3 asyncio changes (#169080) 2026-04-27 14:35:28 +01:00
Erik Montnemery 758a851b0d Add tests asserting condition features (#168881) 2026-04-27 14:35:28 +01:00
Stefan Agner ead2ff214f Keep add-on update entity in progress across post-install refresh (#168756)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 14:35:28 +01:00
mnaggatz db91c0eaee Return None for Velux cover position when unknown (#168566) 2026-04-27 14:35:28 +01:00
shbatm 7203f61e7a Fix Flume sensor units and device classes (#169013) 2026-04-27 14:35:28 +01:00
Simone Chemelli ed99a9c7d9 Add uptime device class to the sensor platform (#164266)
Co-authored-by: Copilot <copilot@github.com>
2026-04-27 14:35:28 +01:00
Paulus Schoutsen d8a389afe0 Add radio_frequency platform to ESPHome (#168448)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-27 14:35:28 +01:00
Paulus Schoutsen 6cbbc2185a Add Honeywell String Lights integration (#168450) 2026-04-27 14:35:28 +01:00
abmantis f660ddddea Add target selector tests for primary_entities_only field 2026-04-27 12:53:31 +01:00
abmantis 47579a9ac7 Merge branch 'dev' of github.com:home-assistant/core into non_primary_entity_trigger 2026-04-24 15:28:38 +01:00
abmantis c65c502e2f Allow targeting non-primary entities in triggers 2026-04-23 17:59:56 +01:00
abmantis 13e28210aa Allow extracting non-primary entities in websocket command 2026-04-22 23:19:29 +01:00
613 changed files with 6474 additions and 24913 deletions
+7 -2
@@ -5,7 +5,7 @@
# Copilot code review instructions
- Start review comments with a short, one-sentence summary of the suggested fix.
- Do not comment on code style, formatting or linting issues.
- Do not add comments about code style, formatting or linting issues.
# GitHub Copilot & Claude Code Instructions
@@ -21,7 +21,7 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
## Python Syntax Notes
- Python 3.14 explicitly allows `except TypeA, TypeB:` without parentheses. Never flag this as an issue since Home Assistant officially supports Python 3.14.
- Python 3.14 explicitly allows `except TypeA, TypeB:` without parentheses.
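The syntax note above refers to PEP 758, which makes the parentheses around multiple exception types optional from Python 3.14 onward. A minimal sketch (function name and behavior invented for illustration) — written with the parenthesized form so it also runs on older interpreters:

```python
def parse(value):
    """Coerce a value to int, tolerating bad input."""
    try:
        return int(value)
    # On Python 3.14+ (PEP 758) this clause may also be written without
    # parentheses:  except ValueError, TypeError:
    except (ValueError, TypeError):
        return 0
```

Both spellings compile to the same handler on 3.14; only the unparenthesized form should never be flagged in review.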
## Testing
@@ -34,3 +34,8 @@ Integrations with Platinum or Gold level in the Integration Quality Scale reflec
When reviewing entity actions, do not suggest extra defensive checks for input fields that are already validated by Home Assistant's service/action schemas and entity selection filters. Suggest additional guards only when data bypasses those validators or is transformed into a less-safe form.
When validation guarantees a dict key exists, prefer direct key access (`data["key"]`) instead of `.get("key")` so contract violations are surfaced instead of silently masked.
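The guideline above can be sketched as follows (the payload shape and field name are hypothetical, standing in for data already validated by a service/action schema):

```python
def apply_brightness(data: dict) -> int:
    """Handle a service call whose schema guarantees "brightness" exists.

    "brightness" is a hypothetical field for illustration. Because the
    schema already validated it, index directly: a missing key raises
    KeyError and surfaces the contract violation, whereas
    data.get("brightness") would silently hand back None.
    """
    return data["brightness"]
```

If the key can legitimately be absent (optional field, data transformed after validation), `.get()` with an explicit default remains the right tool.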
# Skills
- ha-integration-knowledge: .claude/skills/ha-integration-knowledge/SKILL.md
@@ -1,47 +0,0 @@
---
applyTo: "homeassistant/components/**, tests/components/**"
excludeAgent: "cloud-agent"
---
<!-- Automatically generated by gen_copilot_instructions.py, do not edit -->
## File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## General guidelines
- When looking for examples, prefer integrations with the platinum or gold quality scale level first.
- Polling intervals are NOT user-configurable. Never add scan_interval, update_interval, or polling frequency options to config flows or config entries.
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in UI. Exception: helper integrations may allow custom names.
- For entity actions and entity services, avoid requesting redundant defensive checks for fields already enforced by Home Assistant validation schemas and entity filters; only request extra guards when values bypass validation or are transformed unsafely.
- When validation guarantees a key is present, prefer direct dictionary indexing (`data["key"]`) over `.get("key")` so invalid assumptions fail fast.
- Integrations should be thin wrappers. Protocol parsing, device state machines, or other domain logic belong in a separate PyPI library, not in the integration itself. If unsure, ask before inlining.
- "potato" is a forbidden word for an integration and should never be used.
The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
## Integration Quality Scale
- When validating the quality scale rules, check them at https://developers.home-assistant.io/docs/core/integration-quality-scale/rules
- When implementing or reviewing an integration, always consider the quality scale rules, since they promote best practices.
Template scale file: `./script/scaffold/templates/integration/integration/quality_scale.yaml`
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
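The status check described above can be sketched as a small helper (the input shape is assumed from the conventions listed: each rule maps either to a bare status string or to a mapping with a `status` and explanatory `comment`):

```python
def pending_rules(rules: dict) -> list:
    """Return the names of quality-scale rules whose status is "todo".

    Sketch only: assumes the parsed contents of a quality_scale.yaml
    "rules" mapping, where each value is either a status string
    ("done", "todo", "exempt") or a dict like
    {"status": "exempt", "comment": "..."}.
    """
    pending = []
    for name, value in rules.items():
        status = value["status"] if isinstance(value, dict) else value
        if status == "todo":
            pending.append(name)
    return pending
```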
## Testing Requirements
- Tests should avoid interacting or mocking internal integration details. For more info, see https://developers.home-assistant.io/docs/development_testing/#writing-tests-for-integrations
+1 -1
@@ -14,7 +14,7 @@ env:
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
# Base image version from https://github.com/home-assistant/docker
BASE_IMAGE_VERSION: "2026.02.0"
BASE_IMAGE_VERSION: "2026.01.0"
ARCHITECTURES: '["amd64", "aarch64"]'
permissions: {}
+1 -1
@@ -1 +1 @@
3.14.3
3.14.2
+1 -1
@@ -12,7 +12,7 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
## Python Syntax Notes
- Python 3.14 explicitly allows `except TypeA, TypeB:` without parentheses. Never flag this as an issue since Home Assistant officially supports Python 3.14.
- Python 3.14 explicitly allows `except TypeA, TypeB:` without parentheses.
## Testing
+2 -4
@@ -1203,8 +1203,6 @@ CLAUDE.md @home-assistant/core
/tests/components/notify_events/ @matrozov @papajojo
/homeassistant/components/notion/ @bachya
/tests/components/notion/ @bachya
/homeassistant/components/novy_cooker_hood/ @piitaya
/tests/components/novy_cooker_hood/ @piitaya
/homeassistant/components/nrgkick/ @andijakl
/tests/components/nrgkick/ @andijakl
/homeassistant/components/nsw_fuel_station/ @nickw444
@@ -1989,8 +1987,8 @@ CLAUDE.md @home-assistant/core
/tests/components/wled/ @frenck @mik-laj
/homeassistant/components/wmspro/ @mback2k
/tests/components/wmspro/ @mback2k
/homeassistant/components/wolflink/ @adamkrol93 @EnjoyingM
/tests/components/wolflink/ @adamkrol93 @EnjoyingM
/homeassistant/components/wolflink/ @adamkrol93 @mtielen
/tests/components/wolflink/ @adamkrol93 @mtielen
/homeassistant/components/workday/ @fabaff @gjohansson-ST
/tests/components/workday/ @fabaff @gjohansson-ST
/homeassistant/components/worldclock/ @fabaff
@@ -4,7 +4,7 @@ from __future__ import annotations
from asyncio import timeout
from collections.abc import Mapping
from typing import TYPE_CHECKING, Any
from typing import Any
from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExceededError
from aiohttp import ClientError
@@ -12,7 +12,7 @@ from aiohttp.client_exceptions import ClientConnectorError
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -55,11 +55,8 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
)
self._abort_if_unique_id_configured()
if TYPE_CHECKING:
assert accuweather.location_name is not None
return self.async_create_entry(
title=accuweather.location_name, data=user_input
title=user_input[CONF_NAME], data=user_input
)
return self.async_show_form(
@@ -73,6 +70,9 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
vol.Optional(
CONF_LONGITUDE, default=self.hass.config.longitude
): cv.longitude,
vol.Optional(
CONF_NAME, default=self.hass.config.location_name
): str,
}
),
errors=errors,
@@ -64,7 +64,7 @@ class AccuWeatherObservationDataUpdateCoordinator(
"""Initialize."""
self.accuweather = accuweather
self.location_key = accuweather.location_key
name = config_entry.data.get(CONF_NAME) or config_entry.title
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -122,7 +122,7 @@ class AccuWeatherForecastDataUpdateCoordinator(
self.accuweather = accuweather
self.location_key = accuweather.location_key
self._fetch_method = fetch_method
name = config_entry.data.get(CONF_NAME) or config_entry.title
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -25,7 +25,8 @@
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]"
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "[%key:common::config_flow::data::name%]"
},
"data_description": {
"api_key": "API key generated in the AccuWeather APIs portal."
+1 -24
@@ -38,7 +38,6 @@ HVAC_MODE_MAPPING_ACTRONAIR_TO_HA = {
"HEAT": HVACMode.HEAT,
"FAN": HVACMode.FAN_ONLY,
"AUTO": HVACMode.AUTO,
"DRY": HVACMode.DRY,
"OFF": HVACMode.OFF,
}
HVAC_MODE_MAPPING_HA_TO_ACTRONAIR = {
@@ -80,6 +79,7 @@ class ActronAirClimateEntity(ClimateEntity):
)
_attr_name = None
_attr_fan_modes = list(FAN_MODE_MAPPING_ACTRONAIR_TO_HA.values())
_attr_hvac_modes = list(HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.values())
class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
@@ -93,17 +93,6 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
super().__init__(coordinator)
self._attr_unique_id = self._serial_number
@property
def hvac_modes(self) -> list[HVACMode]:
"""Return the list of supported HVAC modes."""
modes = [
HVAC_MODE_MAPPING_ACTRONAIR_TO_HA[mode]
for mode in self._status.user_aircon_settings.supported_modes
if mode in HVAC_MODE_MAPPING_ACTRONAIR_TO_HA
]
modes.append(HVACMode.OFF)
return modes
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
@@ -190,18 +179,6 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
super().__init__(coordinator, zone)
self._attr_unique_id: str = self._zone_identifier
@property
def hvac_modes(self) -> list[HVACMode]:
"""Return the list of supported HVAC modes."""
status = self.coordinator.data
modes = [
HVAC_MODE_MAPPING_ACTRONAIR_TO_HA[mode]
for mode in status.user_aircon_settings.supported_modes
if mode in HVAC_MODE_MAPPING_ACTRONAIR_TO_HA
]
modes.append(HVACMode.OFF)
return modes
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
@@ -13,5 +13,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "silver",
"requirements": ["actron-neo-api==0.5.6"]
"requirements": ["actron-neo-api==0.5.3"]
}
+12 -9
@@ -12,11 +12,11 @@ from airly.exceptions import AirlyError
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_USE_NEAREST, DEFAULT_NAME, DOMAIN, NO_AIRLY_SENSORS
from .const import CONF_USE_NEAREST, DOMAIN, NO_AIRLY_SENSORS
DESCRIPTION_PLACEHOLDERS = {
"developer_registration_url": "https://developer.airly.eu/register",
@@ -45,16 +45,16 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
try:
location_point_valid = await check_location(
websession,
user_input[CONF_API_KEY],
user_input[CONF_LATITUDE],
user_input[CONF_LONGITUDE],
user_input["api_key"],
user_input["latitude"],
user_input["longitude"],
)
if not location_point_valid:
location_nearest_valid = await check_location(
websession,
user_input[CONF_API_KEY],
user_input[CONF_LATITUDE],
user_input[CONF_LONGITUDE],
user_input["api_key"],
user_input["latitude"],
user_input["longitude"],
use_nearest=True,
)
except AirlyError as err:
@@ -68,7 +68,7 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
return self.async_abort(reason="wrong_location")
use_nearest = True
return self.async_create_entry(
title=DEFAULT_NAME,
title=user_input[CONF_NAME],
data={**user_input, CONF_USE_NEAREST: use_nearest},
)
@@ -83,6 +83,9 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
vol.Optional(
CONF_LONGITUDE, default=self.hass.config.longitude
): cv.longitude,
vol.Optional(
CONF_NAME, default=self.hass.config.location_name
): str,
}
),
errors=errors,
-2
@@ -37,5 +37,3 @@ MAX_UPDATE_INTERVAL: Final = 90
MIN_UPDATE_INTERVAL: Final = 5
NO_AIRLY_SENSORS: Final = "There are no Airly sensors in this area yet."
URL = "https://airly.org/map/#{latitude},{longitude}"
DEFAULT_NAME: Final = "Airly"
+2 -2
@@ -127,7 +127,7 @@ SENSOR_TYPES: tuple[AirlySensorEntityDescription, ...] = (
),
AirlySensorEntityDescription(
key=ATTR_API_CO,
device_class=SensorDeviceClass.CO,
translation_key="co",
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
@@ -178,7 +178,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Airly sensor entities based on a config entry."""
name = entry.data.get(CONF_NAME) or entry.title
name = entry.data[CONF_NAME]
coordinator = entry.runtime_data
+5 -1
@@ -13,7 +13,8 @@
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]"
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "[%key:common::config_flow::data::name%]"
},
"description": "To generate API key go to {developer_registration_url}"
}
@@ -23,6 +24,9 @@
"sensor": {
"caqi": {
"name": "Common air quality index"
},
"co": {
"name": "[%key:component::sensor::entity_component::carbon_monoxide::name%]"
}
}
},
+192 -378
@@ -1,8 +1,7 @@
"""Base entity for Anthropic."""
import base64
from collections import deque
from collections.abc import AsyncIterator, Callable, Iterable
from collections.abc import AsyncGenerator, Callable, Iterable
from dataclasses import dataclass, field
from datetime import UTC, datetime
import json
@@ -21,22 +20,18 @@ from anthropic.types import (
CitationWebSearchResultLocationParam,
CodeExecutionTool20250825Param,
CodeExecutionToolResultBlock,
CodeExecutionToolResultBlockContent,
CodeExecutionToolResultBlockParamContentParam,
Container,
ContentBlock,
ContentBlockParam,
DocumentBlockParam,
ImageBlockParam,
InputJSONDelta,
JSONOutputFormatParam,
Message,
MessageDeltaUsage,
MessageParam,
MessageStreamEvent,
ModelInfo,
OutputConfigParam,
RawContentBlockDelta,
RawContentBlockDeltaEvent,
RawContentBlockStartEvent,
RawContentBlockStopEvent,
@@ -73,30 +68,18 @@ from anthropic.types import (
WebSearchTool20250305Param,
WebSearchTool20260209Param,
WebSearchToolResultBlock,
WebSearchToolResultBlockContent,
WebSearchToolResultBlockParamContentParam,
)
from anthropic.types.bash_code_execution_tool_result_block import (
Content as BashCodeExecutionToolResultBlockContent,
)
from anthropic.types.bash_code_execution_tool_result_block_param import (
Content as BashCodeExecutionToolResultBlockParamContentParam,
)
from anthropic.types.message_create_params import MessageCreateParamsStreaming
from anthropic.types.raw_message_delta_event import Delta
from anthropic.types.text_editor_code_execution_tool_result_block import (
Content as TextEditorCodeExecutionToolResultBlockContent,
)
from anthropic.types.text_editor_code_execution_tool_result_block_param import (
Content as TextEditorCodeExecutionToolResultBlockParamContentParam,
)
from anthropic.types.tool_search_tool_result_block import (
Content as ToolSearchToolResultBlockContent,
)
from anthropic.types.tool_search_tool_result_block_param import (
Content as ToolSearchToolResultBlockParamContentParam,
)
from anthropic.types.tool_use_block import Caller
import voluptuous as vol
from voluptuous_openapi import convert
@@ -108,7 +91,7 @@ from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.json import json_dumps
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import slugify
from homeassistant.util.json import JsonArrayType, JsonObjectType
from homeassistant.util.json import JsonObjectType
from .const import (
CONF_CHAT_MODEL,
@@ -462,7 +445,13 @@ def _convert_content( # noqa: C901
return messages, container_id
class AnthropicDeltaStream:
async def _transform_stream( # noqa: C901 - This is complex, but better to have it in one place
chat_log: conversation.ChatLog,
stream: AsyncStream[MessageStreamEvent],
output_tool: str | None = None,
) -> AsyncGenerator[
conversation.AssistantContentDeltaDict | conversation.ToolResultContentDeltaDict
]:
"""Transform the response stream into HA format.
A typical stream of responses might look something like the following:
@@ -492,376 +481,201 @@ class AnthropicDeltaStream:
Each message could contain multiple blocks of the same type.
"""
def __init__(
self,
chat_log: conversation.ChatLog,
stream: AsyncStream[MessageStreamEvent],
output_tool: str | None = None,
) -> None:
"""Initialize the delta stream."""
self._chat_log: conversation.ChatLog = chat_log
self._stream: AsyncStream[MessageStreamEvent] = stream
self._output_tool: str | None = output_tool
self._buffer: deque[
conversation.AssistantContentDeltaDict
| conversation.ToolResultContentDeltaDict
] = deque()
self._stream_iterator: AsyncIterator[MessageStreamEvent] | None = None
self._current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = (
None
if stream is None or not hasattr(stream, "__aiter__"):
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="unexpected_stream_object"
)
self._current_tool_args: str = ""
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
self._input_usage: Usage | None = None
self._first_block: bool = True
def __aiter__(
self,
) -> AsyncIterator[
conversation.AssistantContentDeltaDict | conversation.ToolResultContentDeltaDict
]:
"""Initialize the stream and return the async iterator."""
if self._stream is None or not hasattr(self._stream, "__aiter__"):
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="unexpected_stream_object"
)
if self._stream_iterator is None:
self._stream_iterator = self._stream.__aiter__()
return self
current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = None
current_tool_args: str
content_details = ContentDetails()
content_details.add_citation_detail()
input_usage: Usage | None = None
first_block: bool = True
async def __anext__(
self,
) -> (
conversation.AssistantContentDeltaDict | conversation.ToolResultContentDeltaDict
):
"""Get the next item from the stream."""
while True:
if self._buffer:
return self._buffer.popleft()
async for response in stream:
LOGGER.debug("Received response: %s", response)
response = await self._stream_iterator.__anext__() # type: ignore[union-attr]
LOGGER.debug("Received response: %s", response)
self.on_message_stream_event(response)
def on_message_stream_event(self, event: MessageStreamEvent) -> None:
"""Handle MessageStreamEvent."""
if isinstance(event, RawMessageStartEvent):
self.on_message_start_event(event.message)
return
if isinstance(event, RawContentBlockStartEvent):
self.on_content_block_start_event(event.content_block, event.index)
return
if isinstance(event, RawContentBlockDeltaEvent):
self.on_content_block_delta_event(event.delta)
return
if isinstance(event, RawContentBlockStopEvent):
self.on_content_block_stop_event(event.index)
return
if isinstance(event, RawMessageDeltaEvent):
self.on_message_delta_event(event.delta, event.usage)
return
if isinstance(event, RawMessageStopEvent):
self.on_message_stop_event()
return
LOGGER.debug("Unhandled event type: %s", event.type) # type: ignore[unreachable] # pragma: no cover - All types are handled but we want to verify that
def on_message_start_event(self, message: Message) -> None:
"""Handle RawMessageStartEvent."""
self._input_usage = message.usage
self._first_block = True
def on_content_block_start_event(
self, content_block: ContentBlock, index: int
) -> None:
"""Handle RawContentBlockStartEvent."""
if isinstance(content_block, ToolUseBlock):
self.on_tool_use_block(
content_block.id,
content_block.input,
content_block.name,
content_block.caller,
)
return
if isinstance(content_block, TextBlock):
self.on_text_block(content_block.text, content_block.citations)
return
if isinstance(content_block, ThinkingBlock):
self.on_thinking_block(content_block.thinking, content_block.signature)
return
if isinstance(content_block, RedactedThinkingBlock):
self.on_redacted_thinking_block(content_block.data)
return
if isinstance(content_block, ServerToolUseBlock):
self.on_server_tool_use_block(
content_block.id,
content_block.name,
content_block.input,
content_block.caller,
)
return
if isinstance(
content_block,
(
WebSearchToolResultBlock,
CodeExecutionToolResultBlock,
BashCodeExecutionToolResultBlock,
TextEditorCodeExecutionToolResultBlock,
ToolSearchToolResultBlock,
),
):
self.on_server_tool_result_block(
content_block.tool_use_id,
content_block.type,
content_block.content,
content_block.caller if hasattr(content_block, "caller") else None,
)
return
LOGGER.debug("Unhandled content block type: %s", content_block.type)
def on_tool_use_block(
self, id: str, input: dict[str, Any], name: str, caller: Caller | None
) -> None:
"""Handle ToolUseBlock."""
self._current_tool_block = ToolUseBlockParam(
type="tool_use",
id=id,
name=name,
input=input,
)
self._current_tool_args = ""
if name == self._output_tool:
if self._first_block or self._content_details.has_content():
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
self._buffer.append({"role": "assistant"})
self._first_block = False
def on_text_block(self, text: str, citations: list[TextCitation] | None) -> None:
"""Handle TextBlock."""
if ( # Do not start a new assistant content just for citations, concatenate consecutive blocks with citations instead.
self._first_block
or (
not self._content_details.has_citations()
and citations is None
and self._content_details.has_content()
)
):
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._buffer.append({"role": "assistant"})
self._first_block = False
self._content_details.add_citation_detail()
if text:
self._content_details.citation_details[-1].length += len(text)
self._buffer.append({"content": text})
def on_thinking_block(self, thinking: str, signature: str) -> None:
"""Handle ThinkingBlock."""
if self._first_block or self._content_details.thinking_signature:
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
self._buffer.append({"role": "assistant"})
self._first_block = False
def on_redacted_thinking_block(self, data: str) -> None:
"""Handle RedactedThinkingBlock."""
LOGGER.debug(
"Some of Claude's internal reasoning has been automatically "
"encrypted for safety reasons. This doesn't affect the quality of "
"responses"
)
if self._first_block or self._content_details.redacted_thinking:
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
self._buffer.append({"role": "assistant"})
self._first_block = False
self._content_details.redacted_thinking = data
def on_server_tool_use_block(
self,
id: str,
name: Literal[
"web_search",
"web_fetch",
"code_execution",
"bash_code_execution",
"text_editor_code_execution",
"tool_search_tool_regex",
"tool_search_tool_bm25",
],
input: dict[str, Any],
caller: Caller | None,
) -> None:
"""Handle ServerToolUseBlock."""
self._current_tool_block = ServerToolUseBlockParam(
type="server_tool_use",
id=id,
name=name,
input=input,
)
self._current_tool_args = ""
def on_server_tool_result_block(
self,
tool_use_id: str,
tool_name: Literal[
"web_search_tool_result",
"code_execution_tool_result",
"bash_code_execution_tool_result",
"text_editor_code_execution_tool_result",
"tool_search_tool_result",
],
content: WebSearchToolResultBlockContent
| CodeExecutionToolResultBlockContent
| BashCodeExecutionToolResultBlockContent
| TextEditorCodeExecutionToolResultBlockContent
| ToolSearchToolResultBlockContent,
caller: Caller | None,
) -> None:
"""Handle various server tool result blocks."""
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
self._buffer.append(
{
"role": "tool_result",
"tool_call_id": tool_use_id,
"tool_name": tool_name.removesuffix("_tool_result"),
"tool_result": {
"content": cast(JsonArrayType, [x.to_dict() for x in content])
if isinstance(response, RawMessageStartEvent):
input_usage = response.message.usage
first_block = True
elif isinstance(response, RawContentBlockStartEvent):
if isinstance(response.content_block, ToolUseBlock):
current_tool_block = ToolUseBlockParam(
type="tool_use",
id=response.content_block.id,
name=response.content_block.name,
input=response.content_block.input or {},
)
current_tool_args = ""
if response.content_block.name == output_tool:
if first_block or content_details.has_content():
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
first_block = False
elif isinstance(response.content_block, TextBlock):
if ( # Do not start a new assistant content just for citations, concatenate consecutive blocks with citations instead.
first_block
or (
not content_details.has_citations()
and response.content_block.citations is None
and content_details.has_content()
)
):
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
yield {"role": "assistant"}
first_block = False
content_details.add_citation_detail()
if response.content_block.text:
content_details.citation_details[-1].length += len(
response.content_block.text
)
yield {"content": response.content_block.text}
elif isinstance(response.content_block, ThinkingBlock):
if first_block or content_details.thinking_signature:
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
first_block = False
elif isinstance(response.content_block, RedactedThinkingBlock):
LOGGER.debug(
"Some of Claude's internal reasoning has been automatically "
"encrypted for safety reasons. This doesn't affect the quality of "
"responses"
)
if first_block or content_details.redacted_thinking:
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
first_block = False
content_details.redacted_thinking = response.content_block.data
elif isinstance(response.content_block, ServerToolUseBlock):
current_tool_block = ServerToolUseBlockParam(
type="server_tool_use",
id=response.content_block.id,
name=response.content_block.name,
input=response.content_block.input or {},
)
current_tool_args = ""
elif isinstance(
response.content_block,
(
WebSearchToolResultBlock,
CodeExecutionToolResultBlock,
BashCodeExecutionToolResultBlock,
TextEditorCodeExecutionToolResultBlock,
ToolSearchToolResultBlock,
),
):
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {
"role": "tool_result",
"tool_call_id": response.content_block.tool_use_id,
"tool_name": response.content_block.type.removesuffix(
"_tool_result"
),
"tool_result": {
"content": cast(
JsonObjectType, response.content_block.to_dict()["content"]
)
}
if isinstance(response.content_block.content, list)
else cast(JsonObjectType, response.content_block.content.to_dict()),
}
if isinstance(content, list)
else cast(JsonObjectType, content.to_dict()),
}
)
self._first_block = True
def on_content_block_delta_event(self, delta: RawContentBlockDelta) -> None:
"""Handle RawContentBlockDeltaEvent."""
if isinstance(delta, InputJSONDelta):
self.on_input_json_delta(delta.partial_json)
return
if isinstance(delta, TextDelta):
self.on_text_delta(delta.text)
return
if isinstance(delta, ThinkingDelta):
self.on_thinking_delta(delta.thinking)
return
if isinstance(delta, SignatureDelta):
self.on_signature_delta(delta.signature)
return
if isinstance(delta, CitationsDelta):
self.on_citations_delta(delta.citation)
return
LOGGER.debug("Unhandled content delta type: %s", delta.type) # type: ignore[unreachable] # pragma: no cover - All types are handled but we want to verify that
def on_input_json_delta(self, partial_json: str) -> None:
"""Handle InputJSONDelta."""
if (
self._current_tool_block is not None
and self._current_tool_block["name"] == self._output_tool
):
self._content_details.citation_details[-1].length += len(partial_json)
self._buffer.append({"content": partial_json})
else:
self._current_tool_args += partial_json
def on_text_delta(self, text: str) -> None:
"""Handle TextDelta."""
if text:
self._content_details.citation_details[-1].length += len(text)
self._buffer.append({"content": text})
def on_thinking_delta(self, thinking: str) -> None:
"""Handle ThinkingDelta."""
if thinking:
self._buffer.append({"thinking_content": thinking})
def on_signature_delta(self, signature: str) -> None:
"""Handle SignatureDelta."""
self._content_details.thinking_signature = signature
def on_citations_delta(self, citation: TextCitation) -> None:
"""Handle CitationsDelta."""
self._content_details.add_citation(citation)
def on_content_block_stop_event(self, index: int) -> None:
"""Handle RawContentBlockStopEvent."""
if self._current_tool_block is not None:
if self._current_tool_block["name"] == self._output_tool:
self._current_tool_block = None
return
tool_args = (
json.loads(self._current_tool_args) if self._current_tool_args else {}
)
self._current_tool_block["input"] |= tool_args
self._buffer.append(
{
first_block = True
elif isinstance(response, RawContentBlockDeltaEvent):
if isinstance(response.delta, InputJSONDelta):
if (
current_tool_block is not None
and current_tool_block["name"] == output_tool
):
content_details.citation_details[-1].length += len(
response.delta.partial_json
)
yield {"content": response.delta.partial_json}
else:
current_tool_args += response.delta.partial_json
elif isinstance(response.delta, TextDelta):
if response.delta.text:
content_details.citation_details[-1].length += len(
response.delta.text
)
yield {"content": response.delta.text}
elif isinstance(response.delta, ThinkingDelta):
if response.delta.thinking:
yield {"thinking_content": response.delta.thinking}
elif isinstance(response.delta, SignatureDelta):
content_details.thinking_signature = response.delta.signature
elif isinstance(response.delta, CitationsDelta):
content_details.add_citation(response.delta.citation)
elif isinstance(response, RawContentBlockStopEvent):
if current_tool_block is not None:
if current_tool_block["name"] == output_tool:
current_tool_block = None
continue
tool_args = json.loads(current_tool_args) if current_tool_args else {}
current_tool_block["input"] |= tool_args
yield {
"tool_calls": [
llm.ToolInput(
id=self._current_tool_block["id"],
tool_name=self._current_tool_block["name"],
tool_args=self._current_tool_block["input"],
external=self._current_tool_block["type"]
== "server_tool_use",
id=current_tool_block["id"],
tool_name=current_tool_block["name"],
tool_args=current_tool_block["input"],
external=current_tool_block["type"] == "server_tool_use",
)
]
}
)
self._current_tool_block = None
current_tool_block = None
elif isinstance(response, RawMessageDeltaEvent):
if (usage := response.usage) is not None:
chat_log.async_trace(_create_token_stats(input_usage, usage))
content_details.container = response.delta.container
if response.delta.stop_reason == "refusal":
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="api_refusal"
)
elif isinstance(response, RawMessageStopEvent):
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
def on_message_delta_event(self, delta: Delta, usage: MessageDeltaUsage) -> None:
"""Handle RawMessageDeltaEvent."""
self._chat_log.async_trace(self._create_token_stats(self._input_usage, usage))
self._content_details.container = delta.container
if delta.stop_reason == "refusal":
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="api_refusal"
)
def on_message_stop_event(self) -> None:
"""Handle RawMessageStopEvent."""
if self._content_details:
self._content_details.delete_empty()
self._buffer.append({"native": self._content_details})
self._content_details = ContentDetails()
self._content_details.add_citation_detail()
def _create_token_stats(
self, input_usage: Usage | None, response_usage: MessageDeltaUsage
) -> dict[str, Any]:
"""Create token stats for conversation agent tracing."""
input_tokens = 0
cached_input_tokens = 0
if input_usage:
input_tokens = input_usage.input_tokens
cached_input_tokens = input_usage.cache_creation_input_tokens or 0
output_tokens = response_usage.output_tokens
return {
"stats": {
"input_tokens": input_tokens,
"cached_input_tokens": cached_input_tokens,
"output_tokens": output_tokens,
}
def _create_token_stats(
input_usage: Usage | None, response_usage: MessageDeltaUsage
) -> dict[str, Any]:
"""Create token stats for conversation agent tracing."""
input_tokens = 0
cached_input_tokens = 0
if input_usage:
input_tokens = input_usage.input_tokens
cached_input_tokens = input_usage.cache_creation_input_tokens or 0
output_tokens = response_usage.output_tokens
return {
"stats": {
"input_tokens": input_tokens,
"cached_input_tokens": cached_input_tokens,
"output_tokens": output_tokens,
}
}
class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
@@ -1149,7 +963,7 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
AnthropicDeltaStream(
_transform_stream(
chat_log,
stream,
output_tool=structure_name or None,
@@ -155,12 +155,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass.data[DATA_COMPONENT] = storage_collection
collection.DictStorageCollectionWebsocket(
storage_collection,
DOMAIN,
DOMAIN,
CREATE_FIELDS,
UPDATE_FIELDS,
admin_only=True,
storage_collection, DOMAIN, DOMAIN, CREATE_FIELDS, UPDATE_FIELDS
).async_setup(hass)
websocket_api.async_register_command(hass, handle_integration_list)
@@ -346,7 +341,6 @@ async def handle_integration_list(
vol.Required("config_entry_id"): str,
}
)
@websocket_api.require_admin
@websocket_api.async_response
async def handle_config_entry(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
@@ -28,7 +28,7 @@ class AquacellEntity(CoordinatorEntity[AquacellCoordinator]):
self._attr_unique_id = f"{softener_key}-{entity_key}"
self._attr_device_info = DeviceInfo(
name=self.softener.name,
hw_version=self.softener.diagnostics.fw_version,
hw_version=self.softener.fwVersion,
identifiers={(DOMAIN, str(softener_key))},
manufacturer=self.softener.brand,
model=self.softener.ssn,
@@ -8,5 +8,5 @@
"integration_type": "device",
"iot_class": "cloud_polling",
"loggers": ["aioaquacell"],
"requirements": ["aioaquacell==1.0.0"]
"requirements": ["aioaquacell==0.2.0"]
}
@@ -38,39 +38,39 @@ SENSORS: tuple[SoftenerSensorEntityDescription, ...] = (
translation_key="salt_left_side_percentage",
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda softener: softener.salt.left_percent,
value_fn=lambda softener: softener.salt.leftPercent,
),
SoftenerSensorEntityDescription(
key="salt_right_side_percentage",
translation_key="salt_right_side_percentage",
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda softener: softener.salt.right_percent,
value_fn=lambda softener: softener.salt.rightPercent,
),
SoftenerSensorEntityDescription(
key="salt_left_side_time_remaining",
translation_key="salt_left_side_time_remaining",
device_class=SensorDeviceClass.DURATION,
native_unit_of_measurement=UnitOfTime.DAYS,
value_fn=lambda softener: softener.salt.left_days,
value_fn=lambda softener: softener.salt.leftDays,
),
SoftenerSensorEntityDescription(
key="salt_right_side_time_remaining",
translation_key="salt_right_side_time_remaining",
device_class=SensorDeviceClass.DURATION,
native_unit_of_measurement=UnitOfTime.DAYS,
value_fn=lambda softener: softener.salt.right_days,
value_fn=lambda softener: softener.salt.rightDays,
),
SoftenerSensorEntityDescription(
key="battery",
device_class=SensorDeviceClass.BATTERY,
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda softener: softener.diagnostics.battery,
value_fn=lambda softener: softener.battery,
),
SoftenerSensorEntityDescription(
key="wi_fi_strength",
translation_key="wi_fi_strength",
value_fn=lambda softener: softener.diagnostics.wifi_level,
value_fn=lambda softener: softener.wifiLevel,
device_class=SensorDeviceClass.ENUM,
options=[
"high",
@@ -82,7 +82,7 @@ SENSORS: tuple[SoftenerSensorEntityDescription, ...] = (
key="last_update",
translation_key="last_update",
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda softener: softener.diagnostics.last_update,
value_fn=lambda softener: softener.lastUpdate,
),
)
@@ -4,9 +4,8 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from arcam.fmj import IncomingVideoAspectRatio, IncomingVideoColorspace, IntOrTypeEnum
from arcam.fmj import IncomingVideoAspectRatio, IncomingVideoColorspace
from arcam.fmj.state import IncomingAudioConfig, IncomingAudioFormat, State
from homeassistant.components.sensor import (
@@ -22,25 +21,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ArcamFmjConfigEntry
from .entity import ArcamFmjEntity
_LOGGER = logging.getLogger(__name__)
def _enum_options(value: type[IntOrTypeEnum]) -> list[str]:
return [
member.name.lower() for member in value if not member.name.startswith("CODE_")
]
def _enum_value(value: IntOrTypeEnum | None) -> str | None:
if value is None:
return None
if value.name.startswith("CODE_"):
_LOGGER.debug("Undefined enum value %s ignored", value)
return None
return value.name.lower()
@dataclass(frozen=True, kw_only=True)
class ArcamFmjSensorEntityDescription(SensorEntityDescription):
@@ -95,9 +75,9 @@ SENSORS: tuple[ArcamFmjSensorEntityDescription, ...] = (
translation_key="incoming_video_aspect_ratio",
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.ENUM,
options=_enum_options(IncomingVideoAspectRatio),
options=[member.name.lower() for member in IncomingVideoAspectRatio],
value_fn=lambda state: (
_enum_value(vp.aspect_ratio)
vp.aspect_ratio.name.lower()
if (vp := state.get_incoming_video_parameters()) is not None
else None
),
@@ -107,10 +87,11 @@ SENSORS: tuple[ArcamFmjSensorEntityDescription, ...] = (
translation_key="incoming_video_colorspace",
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.ENUM,
options=_enum_options(IncomingVideoColorspace),
options=[member.name.lower() for member in IncomingVideoColorspace],
value_fn=lambda state: (
_enum_value(vp.colorspace)
vp.colorspace.name.lower()
if (vp := state.get_incoming_video_parameters()) is not None
and vp.colorspace is not None
else None
),
),
@@ -119,16 +100,24 @@ SENSORS: tuple[ArcamFmjSensorEntityDescription, ...] = (
translation_key="incoming_audio_format",
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.ENUM,
options=_enum_options(IncomingAudioFormat),
value_fn=lambda state: _enum_value(state.get_incoming_audio_format()[0]),
options=[member.name.lower() for member in IncomingAudioFormat],
value_fn=lambda state: (
result.name.lower()
if (result := state.get_incoming_audio_format()[0]) is not None
else None
),
),
ArcamFmjSensorEntityDescription(
key="incoming_audio_config",
translation_key="incoming_audio_config",
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.ENUM,
options=_enum_options(IncomingAudioConfig),
value_fn=lambda state: _enum_value(state.get_incoming_audio_format()[1]),
options=[member.name.lower() for member in IncomingAudioConfig],
value_fn=lambda state: (
result.name.lower()
if (result := state.get_incoming_audio_format()[1]) is not None
else None
),
),
ArcamFmjSensorEntityDescription(
key="incoming_audio_sample_rate",
@@ -13,12 +13,11 @@ from hassil.util import (
)
import voluptuous as vol
from homeassistant.auth.permissions.const import CAT_ENTITIES, POLICY_CONTROL
from homeassistant.components.http import StaticPathConfig
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_ENTITY_ID
from homeassistant.core import HomeAssistant, ServiceCall, SupportsResponse
from homeassistant.exceptions import HomeAssistantError, Unauthorized, UnknownUser
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.typing import ConfigType
@@ -104,22 +103,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def handle_ask_question(call: ServiceCall) -> dict[str, Any]:
"""Handle a Show View service call."""
satellite_entity_id: str = call.data[ATTR_ENTITY_ID]
if call.context.user_id:
user = await hass.auth.async_get_user(call.context.user_id)
if user is None:
raise UnknownUser(
context=call.context,
permission=POLICY_CONTROL,
user_id=call.context.user_id,
)
if not user.permissions.check_entity(satellite_entity_id, POLICY_CONTROL):
raise Unauthorized(
context=call.context,
permission=POLICY_CONTROL,
user_id=call.context.user_id,
perm_category=CAT_ENTITIES,
)
satellite_entity: AssistSatelliteEntity | None = component.get_entity(
satellite_entity_id
)
@@ -165,7 +165,6 @@ async def websocket_set_wake_words(
vol.Required("entity_id"): cv.entity_domain(DOMAIN),
}
)
@websocket_api.require_admin
@websocket_api.async_response
async def websocket_test_connection(
hass: HomeAssistant,
@@ -15,6 +15,24 @@ from homeassistant.data_entry_flow import FlowContext
from homeassistant.helpers import config_validation as cv
from homeassistant.util.hass_dict import HassKey
WS_TYPE_SETUP_MFA = "auth/setup_mfa"
SCHEMA_WS_SETUP_MFA = vol.All(
websocket_api.BASE_COMMAND_MESSAGE_SCHEMA.extend(
{
vol.Required("type"): WS_TYPE_SETUP_MFA,
vol.Exclusive("mfa_module_id", "module_or_flow_id"): str,
vol.Exclusive("flow_id", "module_or_flow_id"): str,
vol.Optional("user_input"): object,
}
),
cv.has_at_least_one_key("mfa_module_id", "flow_id"),
)
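The `SCHEMA_WS_SETUP_MFA` schema combines two constraints: `vol.Exclusive` puts `mfa_module_id` and `flow_id` in one mutual-exclusion group, and `cv.has_at_least_one_key` requires at least one of them. A dependency-free sketch of that validation logic (the function name and error messages are hypothetical, not the voluptuous API):

```python
def validate_mfa_msg(msg: dict) -> dict:
    """Check the setup_mfa key constraints without voluptuous.

    Mirrors vol.Exclusive("mfa_module_id", "module_or_flow_id") combined
    with cv.has_at_least_one_key("mfa_module_id", "flow_id") from the diff.
    """
    present = [k for k in ("mfa_module_id", "flow_id") if k in msg]
    if len(present) > 1:
        raise ValueError("mfa_module_id and flow_id are mutually exclusive")
    if not present:
        raise ValueError("must contain at least one of mfa_module_id, flow_id")
    return msg


print(validate_mfa_msg({"mfa_module_id": "totp"}))  # {'mfa_module_id': 'totp'}
```

Callers therefore start a new setup flow with `mfa_module_id`, or continue an existing one with `flow_id`, but never both in one message.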
WS_TYPE_DEPOSE_MFA = "auth/depose_mfa"
SCHEMA_WS_DEPOSE_MFA = websocket_api.BASE_COMMAND_MESSAGE_SCHEMA.extend(
{vol.Required("type"): WS_TYPE_DEPOSE_MFA, vol.Required("mfa_module_id"): str}
)
DATA_SETUP_FLOW_MGR: HassKey[MfaFlowManager] = HassKey("auth_mfa_setup_flow_manager")
_LOGGER = logging.getLogger(__name__)
@@ -55,24 +73,16 @@ def async_setup(hass: HomeAssistant) -> None:
"""Init mfa setup flow manager."""
hass.data[DATA_SETUP_FLOW_MGR] = MfaFlowManager(hass)
websocket_api.async_register_command(hass, websocket_setup_mfa)
websocket_api.async_register_command(hass, websocket_depose_mfa)
websocket_api.async_register_command(
hass, WS_TYPE_SETUP_MFA, websocket_setup_mfa, SCHEMA_WS_SETUP_MFA
)
websocket_api.async_register_command(
hass, WS_TYPE_DEPOSE_MFA, websocket_depose_mfa, SCHEMA_WS_DEPOSE_MFA
)
@callback
@websocket_api.websocket_command(
vol.All(
vol.Schema(
{
vol.Required("type"): "auth/setup_mfa",
vol.Exclusive("mfa_module_id", "module_or_flow_id"): str,
vol.Exclusive("flow_id", "module_or_flow_id"): str,
vol.Optional("user_input"): object,
}
),
cv.has_at_least_one_key("mfa_module_id", "flow_id"),
)
)
@websocket_api.ws_require_user(allow_system_user=False)
def websocket_setup_mfa(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict[str, Any]
@@ -111,9 +121,6 @@ def websocket_setup_mfa(
@callback
@websocket_api.websocket_command(
{vol.Required("type"): "auth/depose_mfa", vol.Required("mfa_module_id"): str}
)
@websocket_api.ws_require_user(allow_system_user=False)
def websocket_depose_mfa(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict[str, Any]
@@ -4,10 +4,10 @@ from __future__ import annotations
from abc import ABC, abstractmethod
import asyncio
from collections.abc import Callable
from collections.abc import Callable, Mapping
from dataclasses import dataclass
import logging
from typing import Any, cast
from typing import Any, Protocol, cast
from propcache.api import cached_property
import voluptuous as vol
@@ -229,11 +229,14 @@ def is_disabled_experimental_trigger(hass: HomeAssistant, platform: str) -> bool
)
class IfAction(condition_helper.ConditionsChecker):
class IfAction(Protocol):
"""Define the format of if_action."""
config: list[ConfigType]
def __call__(self, variables: Mapping[str, Any] | None = None) -> bool:
"""AND all conditions."""
def is_on(hass: HomeAssistant, entity_id: str) -> bool:
"""Return true if specified automation entity_id is on.
@@ -832,7 +835,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
if (
not skip_condition
and self._condition is not None
and not self._condition.async_check(variables=variables)
and not self._condition(variables)
):
self._logger.debug(
"Conditions not met, aborting automation. Condition summary: %s",
@@ -901,9 +904,6 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Remove listeners when removing automation from Home Assistant."""
await super().async_will_remove_from_hass()
await self._async_disable()
self.action_script.async_unload()
if self._condition is not None:
self._condition.async_unload()
async def _async_enable_automation(self, event: Event) -> None:
"""Start automation on startup."""
@@ -1276,7 +1276,6 @@ async def _async_process_if(
@websocket_api.websocket_command({"type": "automation/config", "entity_id": str})
@websocket_api.require_admin
def websocket_config(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
@@ -36,7 +36,6 @@ async def get_axis_api(
username=config[CONF_USERNAME],
password=config[CONF_PASSWORD],
web_proto=config.get(CONF_PROTOCOL, "http"),
websocket_enabled=True,
)
)
@@ -29,7 +29,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["axis"],
"requirements": ["axis==69"],
"requirements": ["axis==68"],
"ssdp": [
{
"manufacturer": "AXIS"
@@ -2,7 +2,6 @@
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.helpers.hassio import is_hassio
from homeassistant.helpers.service import async_register_admin_service
from .const import DATA_MANAGER, DOMAIN
@@ -31,9 +30,7 @@ async def _async_handle_create_automatic_service(call: ServiceCall) -> None:
def async_setup_services(hass: HomeAssistant) -> None:
"""Register services."""
if not is_hassio(hass):
async_register_admin_service(
hass, DOMAIN, "create", _async_handle_create_service
)
async_register_admin_service(
hass, DOMAIN, "create_automatic", _async_handle_create_automatic_service
hass.services.async_register(DOMAIN, "create", _async_handle_create_service)
hass.services.async_register(
DOMAIN, "create_automatic", _async_handle_create_automatic_service
)
@@ -30,19 +30,33 @@ BATTERY_PERCENTAGE_DOMAIN_SPECS = {
CONDITIONS: dict[str, type[Condition]] = {
"is_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS, STATE_ON, support_duration=True
BATTERY_DOMAIN_SPECS,
STATE_ON,
support_duration=True,
primary_entities_only=False,
),
"is_not_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS, STATE_OFF, support_duration=True
BATTERY_DOMAIN_SPECS,
STATE_OFF,
support_duration=True,
primary_entities_only=False,
),
"is_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON, support_duration=True
BATTERY_CHARGING_DOMAIN_SPECS,
STATE_ON,
support_duration=True,
primary_entities_only=False,
),
"is_not_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF, support_duration=True
BATTERY_CHARGING_DOMAIN_SPECS,
STATE_OFF,
support_duration=True,
primary_entities_only=False,
),
"is_level": make_entity_numerical_condition(
BATTERY_PERCENTAGE_DOMAIN_SPECS, PERCENTAGE
BATTERY_PERCENTAGE_DOMAIN_SPECS,
PERCENTAGE,
primary_entities_only=False,
),
}
@@ -3,6 +3,7 @@
entity:
- domain: binary_sensor
device_class: battery
primary_entities_only: false
fields:
behavior: &condition_behavior
required: true
@@ -42,6 +43,7 @@ is_charging:
entity:
- domain: binary_sensor
device_class: battery_charging
primary_entities_only: false
fields:
behavior: *condition_behavior
for: *condition_for
@@ -51,6 +53,7 @@ is_not_charging:
entity:
- domain: binary_sensor
device_class: battery_charging
primary_entities_only: false
fields:
behavior: *condition_behavior
for: *condition_for
@@ -60,6 +63,7 @@ is_level:
entity:
- domain: sensor
device_class: battery
primary_entities_only: false
fields:
behavior: *condition_behavior
threshold:
@@ -21,6 +21,9 @@
"save_video": {
"service": "mdi:file-video"
},
"send_pin": {
"service": "mdi:two-factor-authentication"
},
"trigger_camera": {
"service": "mdi:image-refresh"
}
@@ -5,9 +5,15 @@ from __future__ import annotations
import voluptuous as vol
from homeassistant.components.camera import DOMAIN as CAMERA_DOMAIN
from homeassistant.const import CONF_FILE_PATH, CONF_FILENAME
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv, service
from homeassistant.const import (
ATTR_CONFIG_ENTRY_ID,
CONF_FILE_PATH,
CONF_FILENAME,
CONF_PIN,
)
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, issue_registry as ir, service
from .const import DOMAIN
@@ -17,10 +23,50 @@ SERVICE_SAVE_VIDEO = "save_video"
SERVICE_SAVE_RECENT_CLIPS = "save_recent_clips"
# Deprecated
SERVICE_SEND_PIN = "send_pin"
SERVICE_SEND_PIN_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY_ID): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(CONF_PIN): cv.string,
}
)
async def _send_pin(call: ServiceCall) -> None:
"""Call blink to send new pin."""
# Create repair issue to inform user about service removal
ir.async_create_issue(
call.hass,
DOMAIN,
"service_send_pin_deprecation",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
breaks_in_ha_version="2026.5.0",
translation_key="service_send_pin_deprecation",
translation_placeholders={"service_name": f"{DOMAIN}.{SERVICE_SEND_PIN}"},
)
# Service has been removed - raise exception
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="service_removed",
translation_placeholders={"service_name": f"{DOMAIN}.{SERVICE_SEND_PIN}"},
)
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up the services for the Blink integration."""
hass.services.async_register(
DOMAIN,
SERVICE_SEND_PIN,
_send_pin,
schema=SERVICE_SEND_PIN_SCHEMA,
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
@@ -35,3 +35,15 @@ save_recent_clips:
example: "/tmp"
selector:
text:
send_pin:
fields:
config_entry_id:
required: true
selector:
config_entry:
integration: blink
pin:
example: "abc123"
selector:
text:
@@ -82,6 +82,9 @@
},
"not_loaded": {
"message": "{target} is not loaded."
},
"service_removed": {
"message": "The service {service_name} has been removed and is no longer needed. Home Assistant will automatically prompt for reauthentication when required."
}
},
"issues": {
@@ -95,6 +98,10 @@
}
},
"title": "Blink update service is being removed"
},
"service_send_pin_deprecation": {
"description": "The service {service_name} has been removed and is no longer needed. When a new two-factor authentication code is required, Home Assistant will automatically prompt you to reauthenticate through the integration configuration. Please remove any automations or scripts that call this service.",
"title": "Blink send PIN service has been removed"
}
},
"options": {
@@ -133,6 +140,20 @@
},
"name": "Save video"
},
"send_pin": {
"description": "Sends a new PIN to Blink for 2FA.",
"fields": {
"config_entry_id": {
"description": "The Blink integration ID.",
"name": "Integration ID"
},
"pin": {
"description": "PIN received from Blink. Leave empty if you only received a verification email.",
"name": "PIN"
}
},
"name": "Send PIN"
},
"trigger_camera": {
"description": "Requests camera to take new image.",
"name": "Trigger camera"
@@ -7,7 +7,6 @@ DOMAIN = "broadlink"
DOMAINS_AND_TYPES = {
Platform.CLIMATE: {"HYS"},
Platform.LIGHT: {"LB1", "LB2"},
Platform.RADIO_FREQUENCY: {"RM4PRO", "RMPRO"},
Platform.REMOTE: {"RM4MINI", "RM4PRO", "RMMINI", "RMMINIB", "RMPRO"},
Platform.SELECT: {"HYS"},
Platform.SENSOR: {
@@ -1,132 +0,0 @@
"""Radio Frequency platform for Broadlink."""
from __future__ import annotations
import logging
from broadlink.exceptions import BroadlinkException
from rf_protocols import RadioFrequencyCommand
from homeassistant.components.radio_frequency import RadioFrequencyTransmitterEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .device import BroadlinkDevice
from .entity import BroadlinkEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
_TICK_US = 32.84
_RF_433_TYPE_BYTE = 0xB2
_RF_315_TYPE_BYTE = 0xB4
_RF_433_RANGE = (433_050_000, 434_790_000)
_RF_315_RANGE = (314_950_000, 315_250_000)
SUPPORTED_FREQUENCY_RANGES: list[tuple[int, int]] = [_RF_433_RANGE, _RF_315_RANGE]
def _type_byte_for_frequency(frequency: int) -> int:
"""Return the Broadlink RF type byte for a given carrier frequency."""
if _RF_433_RANGE[0] <= frequency <= _RF_433_RANGE[1]:
return _RF_433_TYPE_BYTE
if _RF_315_RANGE[0] <= frequency <= _RF_315_RANGE[1]:
return _RF_315_TYPE_BYTE
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="frequency_not_supported",
translation_placeholders={"frequency": f"{frequency / 1_000_000:g}"},
)
def encode_rf_packet(
*,
type_byte: int,
repeat_count: int,
timings_us: list[int],
) -> bytes:
"""Encode raw OOK timings as a Broadlink RF pulse-length packet.
The layout is::
byte 0 type byte (0xB2 for 433 MHz, 0xB4 for 315 MHz)
byte 1 repeat count (additional transmissions after the first)
bytes 2..3 payload length (little-endian), counted from byte 4
bytes 4..N-1 pulses: 1 byte when ticks < 256, otherwise
0x00 followed by a 2-byte big-endian tick count
Each pulse is expressed as multiples of 32.84 µs ticks, which is the
timing resolution of the Broadlink RF front-end.
"""
buf = bytearray([type_byte, repeat_count, 0, 0])
for duration in timings_us:
ticks = round(abs(duration) / _TICK_US)
div, mod = divmod(ticks, 256)
if div:
buf.append(0x00)
buf.append(div)
buf.append(mod)
payload_len = len(buf) - 4
buf[2] = payload_len & 0xFF
buf[3] = (payload_len >> 8) & 0xFF
return bytes(buf)
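The packet layout described in the docstring can be exercised on its own; the sketch below is a standalone copy of the encoder, using the tick constant and type bytes from this file, with sample timings chosen for illustration:

```python
_TICK_US = 32.84  # timing resolution of the Broadlink RF front-end


def encode_rf_packet(*, type_byte: int, repeat_count: int, timings_us: list[int]) -> bytes:
    """Standalone copy of the encoder above, for illustration."""
    buf = bytearray([type_byte, repeat_count, 0, 0])
    for duration in timings_us:
        ticks = round(abs(duration) / _TICK_US)
        div, mod = divmod(ticks, 256)
        if div:
            # Long pulse: 0x00 marker plus 2-byte big-endian tick count.
            buf.append(0x00)
            buf.append(div)
        buf.append(mod)
    payload_len = len(buf) - 4
    buf[2] = payload_len & 0xFF
    buf[3] = (payload_len >> 8) & 0xFF
    return bytes(buf)


# Two short pulses: 1000 µs ≈ 30 ticks (0x1e), 500 µs ≈ 15 ticks (0x0f).
packet = encode_rf_packet(type_byte=0xB2, repeat_count=0, timings_us=[1000, 500])
print(packet.hex())  # b20002001e0f
```

A pulse of 10000 µs rounds to 305 ticks, which exceeds one byte and therefore takes the three-byte `0x00` big-endian form, growing the payload-length field accordingly.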
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up a Broadlink radio frequency transmitter."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device: BroadlinkDevice = hass.data[DOMAIN].devices[config_entry.entry_id]
async_add_entities([BroadlinkRadioFrequency(device)])
class BroadlinkRadioFrequency(BroadlinkEntity, RadioFrequencyTransmitterEntity):
"""Representation of a Broadlink RF transmitter."""
_attr_has_entity_name = True
_attr_name = None
def __init__(self, device: BroadlinkDevice) -> None:
"""Initialize the entity."""
super().__init__(device)
self._attr_unique_id = device.unique_id
@property
def supported_frequency_ranges(self) -> list[tuple[int, int]]:
"""Return the Broadlink-supported narrow RF bands."""
return SUPPORTED_FREQUENCY_RANGES
async def async_send_command(self, command: RadioFrequencyCommand) -> None:
"""Encode an OOK command and transmit it via the Broadlink device."""
type_byte = _type_byte_for_frequency(command.frequency)
packet = encode_rf_packet(
type_byte=type_byte,
repeat_count=command.repeat_count,
timings_us=command.get_raw_timings(),
)
_LOGGER.debug(
"Transmitting RF packet: %d bytes on %d Hz (repeat=%d)",
len(packet),
command.frequency,
command.repeat_count,
)
device = self._device
try:
await device.async_request(device.api.send_data, packet)
except (BroadlinkException, OSError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="transmit_failed",
translation_placeholders={"error": str(err)},
) from err
@@ -77,13 +77,5 @@
"name": "Total consumption"
}
}
},
"exceptions": {
"frequency_not_supported": {
"message": "Broadlink devices cannot transmit on {frequency} MHz"
},
"transmit_failed": {
"message": "Failed to transmit RF command: {error}"
}
}
}
@@ -15,10 +15,7 @@ from aiohttp import web
from dateutil.rrule import rrulestr
import voluptuous as vol
from homeassistant.auth.models import User
from homeassistant.auth.permissions.const import POLICY_CONTROL, POLICY_READ
from homeassistant.components import frontend, http, websocket_api
from homeassistant.components.http import KEY_HASS_USER
from homeassistant.components.websocket_api import (
ERR_INVALID_FORMAT,
ERR_NOT_FOUND,
@@ -35,7 +32,7 @@ from homeassistant.core import (
SupportsResponse,
callback,
)
from homeassistant.exceptions import HomeAssistantError, Unauthorized
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.debounce import Debouncer
from homeassistant.helpers.entity import Entity, EntityDescription
@@ -789,10 +786,6 @@ class CalendarEventView(http.HomeAssistantView):
async def get(self, request: web.Request, entity_id: str) -> web.Response:
"""Return calendar events."""
user: User = request[KEY_HASS_USER]
if not user.permissions.check_entity(entity_id, POLICY_READ):
raise Unauthorized(entity_id=entity_id)
if not (entity := self.component.get_entity(entity_id)) or not isinstance(
entity, CalendarEntity
):
@@ -844,14 +837,10 @@ class CalendarListView(http.HomeAssistantView):
async def get(self, request: web.Request) -> web.Response:
"""Retrieve calendar list."""
user: User = request[KEY_HASS_USER]
hass = request.app[http.KEY_HASS]
entity_perm = user.permissions.check_entity
calendar_list: list[dict[str, str]] = []
for entity in self.component.entities:
if not entity_perm(entity.entity_id, POLICY_READ):
continue
state = hass.states.get(entity.entity_id)
assert state
calendar_list.append({"name": state.name, "entity_id": entity.entity_id})
@@ -871,9 +860,6 @@ async def handle_calendar_event_create(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle creation of a calendar event."""
if not connection.user.permissions.check_entity(msg["entity_id"], POLICY_CONTROL):
raise Unauthorized(entity_id=msg["entity_id"])
if not (entity := hass.data[DATA_COMPONENT].get_entity(msg["entity_id"])):
connection.send_error(msg["id"], ERR_NOT_FOUND, "Entity not found")
return
@@ -913,8 +899,6 @@ async def handle_calendar_event_delete(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle delete of a calendar event."""
if not connection.user.permissions.check_entity(msg["entity_id"], POLICY_CONTROL):
raise Unauthorized(entity_id=msg["entity_id"])
if not (entity := hass.data[DATA_COMPONENT].get_entity(msg["entity_id"])):
connection.send_error(msg["id"], ERR_NOT_FOUND, "Entity not found")
@@ -960,10 +944,7 @@ async def handle_calendar_event_delete(
async def handle_calendar_event_update(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle update of a calendar event."""
if not connection.user.permissions.check_entity(msg["entity_id"], POLICY_CONTROL):
raise Unauthorized(entity_id=msg["entity_id"])
"""Handle creation of a calendar event."""
if not (entity := hass.data[DATA_COMPONENT].get_entity(msg["entity_id"])):
connection.send_error(msg["id"], ERR_NOT_FOUND, "Entity not found")
return
@@ -1008,9 +989,6 @@ async def handle_calendar_event_subscribe(
"""Subscribe to calendar event updates."""
entity_id: str = msg["entity_id"]
if not connection.user.permissions.check_entity(entity_id, POLICY_READ):
raise Unauthorized(entity_id=entity_id)
if not (entity := hass.data[DATA_COMPONENT].get_entity(entity_id)):
connection.send_error(
msg["id"],
@@ -926,7 +926,6 @@ async def websocket_get_prefs(
vol.Optional(PREF_ORIENTATION): vol.Coerce(Orientation),
}
)
@websocket_api.require_admin
@websocket_api.async_response
async def websocket_update_prefs(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
@@ -374,7 +374,6 @@ class CloudClient(Interface):
method=payload["method"],
query_string=payload["query"],
mock_source=DOMAIN,
remote=None,  # remote is used for the local_only check; cloud requests are deliberately marked non-local (None) to bypass IP parsing and remote checks
)
response = await webhook.async_handle_webhook(
@@ -615,7 +615,6 @@ class DownloadSupportPackageView(HomeAssistantView):
return markdown
@require_admin
async def get(self, request: web.Request) -> web.Response:
"""Download support package file."""
@@ -710,7 +709,6 @@ def _require_cloud_login(
return with_cloud_auth
@websocket_api.require_admin
@_require_cloud_login
@websocket_api.websocket_command({vol.Required("type"): "cloud/subscription"})
@websocket_api.async_response
@@ -752,7 +750,6 @@ def validate_language_voice(value: tuple[str, str]) -> tuple[str, str]:
return value
@websocket_api.require_admin
@_require_cloud_login
@websocket_api.websocket_command(
{
@@ -812,7 +809,6 @@ async def websocket_update_prefs(
connection.send_message(websocket_api.result_message(msg["id"]))
@websocket_api.require_admin
@_require_cloud_login
@websocket_api.websocket_command(
{
@@ -833,7 +829,6 @@ async def websocket_hook_create(
connection.send_message(websocket_api.result_message(msg["id"], hook))
@websocket_api.require_admin
@_require_cloud_login
@websocket_api.websocket_command(
{
@@ -8,5 +8,5 @@
"iot_class": "local_polling",
"loggers": ["aiocomelit"],
"quality_scale": "platinum",
"requirements": ["aiocomelit==2.0.3"]
"requirements": ["aiocomelit==2.0.2"]
}
+16 -6
@@ -10,19 +10,32 @@ from homeassistant.auth.models import User
from homeassistant.components import websocket_api
from homeassistant.core import HomeAssistant, callback
WS_TYPE_LIST = "config/auth/list"
SCHEMA_WS_LIST = websocket_api.BASE_COMMAND_MESSAGE_SCHEMA.extend(
{vol.Required("type"): WS_TYPE_LIST}
)
WS_TYPE_DELETE = "config/auth/delete"
SCHEMA_WS_DELETE = websocket_api.BASE_COMMAND_MESSAGE_SCHEMA.extend(
{vol.Required("type"): WS_TYPE_DELETE, vol.Required("user_id"): str}
)
@callback
def async_setup(hass: HomeAssistant) -> bool:
"""Enable the Home Assistant views."""
websocket_api.async_register_command(hass, websocket_list)
websocket_api.async_register_command(hass, websocket_delete)
websocket_api.async_register_command(
hass, WS_TYPE_LIST, websocket_list, SCHEMA_WS_LIST
)
websocket_api.async_register_command(
hass, WS_TYPE_DELETE, websocket_delete, SCHEMA_WS_DELETE
)
websocket_api.async_register_command(hass, websocket_create)
websocket_api.async_register_command(hass, websocket_update)
return True
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "config/auth/list"})
@websocket_api.async_response
async def websocket_list(
hass: HomeAssistant,
@@ -36,9 +49,6 @@ async def websocket_list(
@websocket_api.require_admin
@websocket_api.websocket_command(
{vol.Required("type"): "config/auth/delete", vol.Required("user_id"): str}
)
@websocket_api.async_response
async def websocket_delete(
hass: HomeAssistant,
@@ -14,8 +14,6 @@ from datetime import datetime
import functools as ft
from typing import Any
import voluptuous as vol
from homeassistant.const import ATTR_ENTITY_PICTURE, ATTR_FRIENDLY_NAME
from homeassistant.core import (
HassJob,
@@ -26,7 +24,6 @@ from homeassistant.core import (
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity import async_generate_entity_id
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.service import async_register_admin_service
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.async_ import run_callback_threadsafe
@@ -152,12 +149,8 @@ class Configurator:
self._requests: dict[
str, tuple[str, list[dict[str, str]], ConfiguratorCallback | None]
] = {}
async_register_admin_service(
hass,
DOMAIN,
SERVICE_CONFIGURE,
self.async_handle_service_call,
schema=vol.Schema({}, extra=vol.ALLOW_EXTRA),
hass.services.async_register(
DOMAIN, SERVICE_CONFIGURE, self.async_handle_service_call
)
@async_callback
+1 -6
@@ -4,11 +4,7 @@ from collections.abc import Mapping
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant, State
from homeassistant.helpers.condition import (
ENTITY_STATE_CONDITION_SCHEMA_ANY_ALL_FOR,
Condition,
EntityConditionBase,
)
from homeassistant.helpers.condition import Condition, EntityConditionBase
from .const import ATTR_IS_CLOSED, DOMAIN, CoverDeviceClass
from .models import CoverDomainSpec
@@ -18,7 +14,6 @@ class CoverConditionBase(EntityConditionBase):
"""Base condition for cover state checks."""
_domain_specs: Mapping[str, CoverDomainSpec]
_schema = ENTITY_STATE_CONDITION_SCHEMA_ANY_ALL_FOR
def is_valid_state(self, entity_state: State) -> bool:
"""Check if the state matches the expected cover state."""
@@ -8,11 +8,6 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
awning_is_closed:
fields: *condition_common_fields
@@ -1,7 +1,6 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -11,9 +10,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Awning is closed"
@@ -23,9 +19,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Awning is open"
@@ -35,9 +28,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Blind is closed"
@@ -47,9 +37,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Blind is open"
@@ -59,9 +46,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Curtain is closed"
@@ -71,9 +55,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Curtain is open"
@@ -83,9 +64,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Shade is closed"
@@ -95,9 +73,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Shade is open"
@@ -107,9 +82,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Shutter is closed"
@@ -119,9 +91,6 @@
"fields": {
"behavior": {
"name": "[%key:component::cover::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::cover::common::condition_for_name%]"
}
},
"name": "Shutter is open"
+1 -3
@@ -11,7 +11,6 @@ from homeassistant.helpers import (
device_registry as dr,
entity_registry as er,
)
from homeassistant.helpers.service import async_register_admin_service
from homeassistant.util.read_only_dict import ReadOnlyDict
from .const import CONF_BRIDGE_ID, DOMAIN, LOGGER
@@ -99,8 +98,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
await async_remove_orphaned_entries_service(hub)
for service in SUPPORTED_SERVICES:
async_register_admin_service(
hass,
hass.services.async_register(
DOMAIN,
service,
async_call_deconz_service,
@@ -19,7 +19,7 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryError, ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
PLATFORMS = [Platform.LIGHT]
@@ -40,7 +40,7 @@ def _login_and_get_switches(email: str, password: str) -> DecoraWifiData:
success = session.login(email, password)
if success is None:
raise ConfigEntryError("Invalid credentials for myLeviton account")
raise ConfigEntryAuthFailed("Invalid credentials for myLeviton account")
perms = session.user.get_residential_permissions()
all_switches: list[IotSwitch] = []
@@ -187,7 +187,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
_attr_translation_key = "derivative"
_attr_should_poll = False
_attr_state_class = SensorStateClass.MEASUREMENT
def __init__(
self,
@@ -245,7 +245,6 @@ class DownloadDiagnosticsView(http.HomeAssistantView):
extra_urls = ["/api/diagnostics/{d_type}/{d_id}/{sub_type}/{sub_id}"]
name = "api:diagnostics"
@http.require_admin
async def get(
self,
request: web.Request,
+7 -14
@@ -30,10 +30,12 @@ async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) ->
return False
if config_entry.version < 2 and config_entry.minor_version < 2:
version = config_entry.version
minor_version = config_entry.minor_version
_LOGGER.debug(
"Migrating configuration from version %s.%s",
config_entry.version,
config_entry.minor_version,
version,
minor_version,
)
new_options = {**config_entry.options}
@@ -44,19 +46,10 @@ async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) ->
config_entry, options=new_options, minor_version=2
)
_LOGGER.debug("Migration to configuration version %s.%s successful", 1, 2)
if config_entry.version < 2 and config_entry.minor_version < 3:
_LOGGER.debug(
"Migrating configuration from version %s.%s",
config_entry.version,
config_entry.minor_version,
"Migration to configuration version %s.%s successful",
1,
2,
)
hass.config_entries.async_update_entry(
config_entry, unique_id=None, minor_version=3
)
_LOGGER.debug("Migration to configuration version %s.%s successful", 1, 3)
return True
@@ -93,7 +93,7 @@ class DnsIPConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for dnsip integration."""
VERSION = 1
MINOR_VERSION = 3
MINOR_VERSION = 2
@staticmethod
@callback
@@ -133,7 +133,10 @@ class DnsIPConfigFlow(ConfigFlow, domain=DOMAIN):
):
errors["base"] = "invalid_hostname"
else:
self._async_abort_entries_match({CONF_HOSTNAME: hostname})
# Uses hostname as unique ID, which is no longer allowed
# pylint: disable-next=hass-unique-id-ip-based
await self.async_set_unique_id(hostname)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=name,
@@ -8,11 +8,6 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
is_closed:
fields: *condition_common_fields
@@ -1,7 +1,6 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -11,9 +10,6 @@
"fields": {
"behavior": {
"name": "[%key:component::door::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::door::common::condition_for_name%]"
}
},
"name": "Door is closed"
@@ -23,9 +19,6 @@
"fields": {
"behavior": {
"name": "[%key:component::door::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::door::common::condition_for_name%]"
}
},
"name": "Door is open"
+1 -3
@@ -2,7 +2,7 @@
from __future__ import annotations
from duco import DucoClient, build_ssl_context
from duco import DucoClient
from homeassistant.const import CONF_HOST
from homeassistant.core import HomeAssistant
@@ -14,11 +14,9 @@ from .coordinator import DucoConfigEntry, DucoCoordinator
async def async_setup_entry(hass: HomeAssistant, entry: DucoConfigEntry) -> bool:
"""Set up Duco from a config entry."""
ssl_context = await hass.async_add_executor_job(build_ssl_context)
client = DucoClient(
session=async_get_clientsession(hass),
host=entry.data[CONF_HOST],
ssl_context=ssl_context,
)
coordinator = DucoCoordinator(hass, entry, client)
+1 -3
@@ -5,7 +5,7 @@ from __future__ import annotations
import logging
from typing import Any
from duco import DucoClient, build_ssl_context
from duco import DucoClient
from duco.exceptions import DucoConnectionError, DucoError
import voluptuous as vol
@@ -160,11 +160,9 @@ class DucoConfigFlow(ConfigFlow, domain=DOMAIN):
Returns a tuple of (box_name, mac_address).
"""
ssl_context = await self.hass.async_add_executor_job(build_ssl_context)
client = DucoClient(
session=async_get_clientsession(self.hass),
host=host,
ssl_context=ssl_context,
)
board_info = await client.async_get_board_info()
lan_info = await client.async_get_lan_info()
+6 -13
@@ -2,17 +2,14 @@
from __future__ import annotations
import asyncio
from dataclasses import asdict
from typing import Any
from duco.exceptions import DucoConnectionError
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_HOST
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from .const import DOMAIN
from .coordinator import DucoConfigEntry
TO_REDACT = {
@@ -35,15 +32,11 @@ async def async_get_config_entry_diagnostics(
board = asdict(coordinator.board_info)
board.pop("time")
try:
lan_info = await coordinator.client.async_get_lan_info()
duco_diags = await coordinator.client.async_get_diagnostics()
write_remaining = await coordinator.client.async_get_write_req_remaining()
except DucoConnectionError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="connection_error",
) from err
lan_info, duco_diags, write_remaining = await asyncio.gather(
coordinator.client.async_get_lan_info(),
coordinator.client.async_get_diagnostics(),
coordinator.client.async_get_write_req_remaining(),
)
return async_redact_data(
{
+1 -1
@@ -35,7 +35,7 @@ PRESET_AUTO = "auto"
# again always round-trips to the same Duco state.
_SPEED_LEVEL_PERCENTAGES: list[int] = [
(i + 1) * 100 // len(ORDERED_NAMED_FAN_SPEEDS)
for i, _ in enumerate(ORDERED_NAMED_FAN_SPEEDS)
for i in range(len(ORDERED_NAMED_FAN_SPEEDS))
]
# Maps every active Duco state (including timed MAN variants) to its
+1 -1
@@ -13,7 +13,7 @@
"iot_class": "local_polling",
"loggers": ["duco"],
"quality_scale": "platinum",
"requirements": ["python-duco-client==0.3.9"],
"requirements": ["python-duco-client==0.3.6"],
"zeroconf": [
{
"name": "duco [[][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][]].*",
-20
@@ -19,7 +19,6 @@ from homeassistant.const import (
PERCENTAGE,
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
EntityCategory,
UnitOfTemperature,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
@@ -60,25 +59,6 @@ SENSOR_DESCRIPTIONS: tuple[DucoSensorEntityDescription, ...] = (
),
node_types=(NodeType.BOX,),
),
DucoSensorEntityDescription(
key="temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
value_fn=lambda node: node.sensor.temp if node.sensor else None,
node_types=(NodeType.UCCO2, NodeType.BSRH, NodeType.UCRH),
),
DucoSensorEntityDescription(
key="box_temperature",
translation_key="box_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda node: node.sensor.temp if node.sensor else None,
node_types=(NodeType.BOX,),
),
DucoSensorEntityDescription(
key="co2",
device_class=SensorDeviceClass.CO2,
@@ -47,9 +47,6 @@
}
},
"sensor": {
"box_temperature": {
"name": "Box temperature"
},
"iaq_co2": {
"name": "CO2 air quality index"
},
@@ -87,9 +84,6 @@
"cannot_connect": {
"message": "An error occurred while trying to connect to the Duco instance: {error}"
},
"connection_error": {
"message": "Could not connect to the Duco device."
},
"failed_to_set_state": {
"message": "Failed to set ventilation state: {error}"
},
@@ -6,7 +6,6 @@ from datetime import timedelta
from typing import Any
from homeassistant.core import HomeAssistant
from homeassistant.util import dt as dt_util
from .coordinator import EasyEnergyConfigEntry, EasyEnergyData
@@ -24,7 +23,9 @@ def get_gas_price(data: EasyEnergyData, hours: int) -> float | None:
"""
if not data.gas_today:
return None
return data.gas_today.price_at_time(dt_util.utcnow() + timedelta(hours=hours))
return data.gas_today.price_at_time(
data.gas_today.utcnow() + timedelta(hours=hours)
)
async def async_get_config_entry_diagnostics(
@@ -39,21 +40,21 @@ async def async_get_config_entry_diagnostics(
"title": entry.title,
},
"energy_usage": {
"current_hour_price": energy_today.current_price,
"current_hour_price": energy_today.current_usage_price,
"next_hour_price": energy_today.price_at_time(
dt_util.utcnow() + timedelta(hours=1)
energy_today.utcnow() + timedelta(hours=1)
),
"average_price": energy_today.average_price,
"max_price": energy_today.extreme_prices[1],
"min_price": energy_today.extreme_prices[0],
"highest_price_time": energy_today.highest_price_time,
"lowest_price_time": energy_today.lowest_price_time,
"percentage_of_max": energy_today.pct_of_max,
"average_price": energy_today.average_usage_price,
"max_price": energy_today.extreme_usage_prices[1],
"min_price": energy_today.extreme_usage_prices[0],
"highest_price_time": energy_today.highest_usage_price_time,
"lowest_price_time": energy_today.lowest_usage_price_time,
"percentage_of_max": energy_today.pct_of_max_usage,
},
"energy_return": {
"current_hour_price": energy_today.current_return_price,
"next_hour_price": energy_today.return_price_at_time(
dt_util.utcnow() + timedelta(hours=1)
"next_hour_price": energy_today.price_at_time(
energy_today.utcnow() + timedelta(hours=1), "return"
),
"average_price": energy_today.average_return_price,
"max_price": energy_today.extreme_return_prices[1],
@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/easyenergy",
"integration_type": "service",
"iot_class": "cloud_polling",
"requirements": ["easyenergy==3.0.0"],
"requirements": ["easyenergy==2.2.0"],
"single_config_entry": true
}
+15 -14
@@ -24,7 +24,6 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import dt as dt_util
from .const import DOMAIN, SERVICE_TYPE_DEVICE_NAMES
from .coordinator import (
@@ -64,7 +63,7 @@ SENSORS: tuple[EasyEnergySensorEntityDescription, ...] = (
service_type="today_energy_usage",
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.current_price,
value_fn=lambda data: data.energy_today.current_usage_price,
),
EasyEnergySensorEntityDescription(
key="next_hour_price",
@@ -72,7 +71,7 @@ SENSORS: tuple[EasyEnergySensorEntityDescription, ...] = (
service_type="today_energy_usage",
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.price_at_time(
dt_util.utcnow() + timedelta(hours=1)
data.energy_today.utcnow() + timedelta(hours=1)
),
),
EasyEnergySensorEntityDescription(
@@ -80,42 +79,42 @@ SENSORS: tuple[EasyEnergySensorEntityDescription, ...] = (
translation_key="average_price",
service_type="today_energy_usage",
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.average_price,
value_fn=lambda data: data.energy_today.average_usage_price,
),
EasyEnergySensorEntityDescription(
key="max_price",
translation_key="max_price",
service_type="today_energy_usage",
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.extreme_prices[1],
value_fn=lambda data: data.energy_today.extreme_usage_prices[1],
),
EasyEnergySensorEntityDescription(
key="min_price",
translation_key="min_price",
service_type="today_energy_usage",
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.extreme_prices[0],
value_fn=lambda data: data.energy_today.extreme_usage_prices[0],
),
EasyEnergySensorEntityDescription(
key="highest_price_time",
translation_key="highest_price_time",
service_type="today_energy_usage",
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda data: data.energy_today.highest_price_time,
value_fn=lambda data: data.energy_today.highest_usage_price_time,
),
EasyEnergySensorEntityDescription(
key="lowest_price_time",
translation_key="lowest_price_time",
service_type="today_energy_usage",
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda data: data.energy_today.lowest_price_time,
value_fn=lambda data: data.energy_today.lowest_usage_price_time,
),
EasyEnergySensorEntityDescription(
key="percentage_of_max",
translation_key="percentage_of_max",
service_type="today_energy_usage",
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda data: data.energy_today.pct_of_max,
value_fn=lambda data: data.energy_today.pct_of_max_usage,
),
EasyEnergySensorEntityDescription(
key="current_hour_price",
@@ -130,8 +129,8 @@ SENSORS: tuple[EasyEnergySensorEntityDescription, ...] = (
translation_key="next_hour_price",
service_type="today_energy_return",
native_unit_of_measurement=f"{CURRENCY_EURO}/{UnitOfEnergy.KILO_WATT_HOUR}",
value_fn=lambda data: data.energy_today.return_price_at_time(
dt_util.utcnow() + timedelta(hours=1)
value_fn=lambda data: data.energy_today.price_at_time(
data.energy_today.utcnow() + timedelta(hours=1), "return"
),
),
EasyEnergySensorEntityDescription(
@@ -181,14 +180,14 @@ SENSORS: tuple[EasyEnergySensorEntityDescription, ...] = (
translation_key="hours_priced_equal_or_lower",
service_type="today_energy_usage",
native_unit_of_measurement=UnitOfTime.HOURS,
value_fn=lambda data: data.energy_today.periods_priced_equal_or_lower,
value_fn=lambda data: data.energy_today.hours_priced_equal_or_lower_usage,
),
EasyEnergySensorEntityDescription(
key="hours_priced_equal_or_higher",
translation_key="hours_priced_equal_or_higher",
service_type="today_energy_return",
native_unit_of_measurement=UnitOfTime.HOURS,
value_fn=lambda data: data.energy_today.return_periods_priced_equal_or_higher,
value_fn=lambda data: data.energy_today.hours_priced_equal_or_higher_return,
),
)
@@ -206,7 +205,9 @@ def get_gas_price(data: EasyEnergyData, hours: int) -> float | None:
"""
if data.gas_today is None:
return None
return data.gas_today.price_at_time(dt_util.utcnow() + timedelta(hours=hours))
return data.gas_today.price_at_time(
data.gas_today.utcnow() + timedelta(hours=hours)
)
async def async_setup_entry(
+36 -84
@@ -2,13 +2,12 @@
from __future__ import annotations
from datetime import date, datetime, timedelta
from datetime import date, datetime
from enum import StrEnum
from functools import partial
from typing import Final
from easyenergy import Electricity, Gas, PriceInterval, VatOption
from easyenergy.const import MARKET_TIMEZONE
from easyenergy import Electricity, Gas, VatOption
import voluptuous as vol
from homeassistant.core import (
@@ -33,22 +32,18 @@ ATTR_INCL_VAT: Final = "incl_vat"
GAS_SERVICE_NAME: Final = "get_gas_prices"
ENERGY_USAGE_SERVICE_NAME: Final = "get_energy_usage_prices"
ENERGY_RETURN_SERVICE_NAME: Final = "get_energy_return_prices"
BASE_SERVICE_SCHEMA: Final = {
vol.Required(ATTR_CONFIG_ENTRY): selector.ConfigEntrySelector(
{
"integration": DOMAIN,
}
),
vol.Optional(ATTR_START): str,
vol.Optional(ATTR_END): str,
}
SERVICE_SCHEMA: Final = vol.Schema(
{
**BASE_SERVICE_SCHEMA,
vol.Required(ATTR_CONFIG_ENTRY): selector.ConfigEntrySelector(
{
"integration": DOMAIN,
}
),
vol.Required(ATTR_INCL_VAT): bool,
vol.Optional(ATTR_START): str,
vol.Optional(ATTR_END): str,
}
)
RETURN_SERVICE_SCHEMA: Final = vol.Schema(BASE_SERVICE_SCHEMA)
class PriceType(StrEnum):
@@ -59,47 +54,22 @@ class PriceType(StrEnum):
GAS = "gas"
def __get_date(
date_input: str | None,
) -> tuple[date, datetime | None]:
"""Get date for the API and optional datetime for response filtering."""
def __get_date(date_input: str | None) -> date | datetime:
"""Get date."""
if not date_input:
return dt_util.now().date(), None
return dt_util.now().date()
if date_value := dt_util.parse_date(date_input):
return date_value, None
if value := dt_util.parse_datetime(date_input):
return value
if not (datetime_value := dt_util.parse_datetime(date_input)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="invalid_date",
translation_placeholders={
"date": date_input,
},
)
datetime_utc = dt_util.as_utc(datetime_value)
return datetime_utc.astimezone(MARKET_TIMEZONE).date(), datetime_utc
def __filter_prices(
prices: list[dict[str, float | datetime]],
intervals: tuple[PriceInterval, ...],
start: datetime,
end: datetime,
) -> list[dict[str, float | datetime]]:
"""Filter prices to the requested datetime range."""
included_timestamps = {
interval.starts_at
for interval in intervals
if interval.ends_at > start and interval.starts_at < end
}
return [
timestamp_price
for timestamp_price in prices
if timestamp_price["timestamp"] in included_timestamps
]
raise ServiceValidationError(
"Invalid datetime provided.",
translation_domain=DOMAIN,
translation_key="invalid_date",
translation_placeholders={
"date": date_input,
},
)
def __serialize_prices(prices: list[dict[str, float | datetime]]) -> ServiceResponse:
@@ -131,8 +101,8 @@ async def __get_prices(
"""Get prices from easyEnergy."""
coordinator = __get_coordinator(call)
start_date, start_datetime = __get_date(call.data.get(ATTR_START))
end_date, end_datetime = __get_date(call.data.get(ATTR_END))
start = __get_date(call.data.get(ATTR_START))
end = __get_date(call.data.get(ATTR_END))
vat = VatOption.INCLUDE
if call.data.get(ATTR_INCL_VAT) is False:
@@ -142,38 +112,20 @@ async def __get_prices(
if price_type == PriceType.GAS:
data = await coordinator.easyenergy.gas_prices(
start_date=start_date,
end_date=end_date,
vat=vat,
)
prices = data.timestamp_prices
else:
data = await coordinator.easyenergy.energy_prices(
start_date=start_date,
end_date=end_date,
start_date=start,
end_date=end,
vat=vat,
)
return __serialize_prices(data.timestamp_prices)
data = await coordinator.easyenergy.energy_prices(
start_date=start,
end_date=end,
vat=vat,
)
if price_type == PriceType.ENERGY_USAGE:
prices = data.timestamp_prices
else:
prices = data.timestamp_return_prices
if start_datetime or end_datetime:
filter_start = start_datetime or dt_util.as_utc(
dt_util.start_of_local_day(start_date)
)
filter_end = end_datetime or dt_util.as_utc(
dt_util.start_of_local_day(end_date + timedelta(days=1))
)
prices = __filter_prices(
prices,
data.intervals,
filter_start,
filter_end,
)
return __serialize_prices(prices)
if price_type == PriceType.ENERGY_USAGE:
return __serialize_prices(data.timestamp_usage_prices)
return __serialize_prices(data.timestamp_return_prices)
@callback
@@ -198,6 +150,6 @@ def async_setup_services(hass: HomeAssistant) -> None:
DOMAIN,
ENERGY_RETURN_SERVICE_NAME,
partial(__get_prices, price_type=PriceType.ENERGY_RETURN),
schema=RETURN_SERVICE_SCHEMA,
schema=SERVICE_SCHEMA,
supports_response=SupportsResponse.ONLY,
)
@@ -11,7 +11,6 @@
"documentation": "https://www.home-assistant.io/integrations/elgato",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "platinum",
"requirements": ["elgato==5.1.2"],
"zeroconf": ["_elg._tcp.local."]
}
@@ -10,7 +10,7 @@ rules:
docs-actions: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
docs-removal-instructions: todo
entity-event-setup:
status: exempt
comment: |
@@ -25,8 +25,8 @@ rules:
# Silver
action-exceptions: done
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
docs-configuration-parameters: todo
docs-installation-parameters: todo
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
@@ -41,13 +41,17 @@ rules:
diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices:
status: todo
comment: |
Devices are documented, but some are missing. For example, their Pro
strip is supported as well.
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: |
+1 -5
@@ -17,7 +17,6 @@ from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
CONF_PORT,
EVENT_HOMEASSISTANT_STOP,
__version__ as ha_version,
)
from homeassistant.core import HomeAssistant, callback
@@ -81,10 +80,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
if "usb" in hass.config.components:
async_register_serial_port_scanner(hass, _async_scan_serial_ports)
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STOP,
serial_proxy.register_serialx_transport(hass.loop),
)
serial_proxy.set_hass_loop(hass.loop)
return True
+1 -10
@@ -1,20 +1,11 @@
"""ESPHome constants."""
from __future__ import annotations
from typing import TYPE_CHECKING, Final
from typing import Final
from awesomeversion import AwesomeVersion
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from .domain_data import DomainData
DOMAIN = "esphome"
ESPHOME_DATA: HassKey[DomainData] = HassKey(DOMAIN)
CONF_ALLOW_SERVICE_CALLS = "allow_service_calls"
CONF_SUBSCRIBE_LOGS = "subscribe_logs"
CONF_DEVICE_NAME = "device_name"
@@ -4,11 +4,12 @@ from __future__ import annotations
from dataclasses import dataclass, field
from functools import cache
from typing import Self
from homeassistant.core import HomeAssistant
from homeassistant.helpers.json import JSONEncoder
from .const import ESPHOME_DATA
from .const import DOMAIN
from .entry_data import ESPHomeConfigEntry, ESPHomeStorage, RuntimeEntryData
STORAGE_VERSION = 1
@@ -35,9 +36,11 @@ class DomainData:
),
)
@staticmethod
@classmethod
@cache
def get(hass: HomeAssistant) -> DomainData:
def get(cls, hass: HomeAssistant) -> Self:
"""Get the global DomainData instance stored in hass.data."""
ret = hass.data[ESPHOME_DATA] = DomainData()
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
ret = hass.data[DOMAIN] = cls()
return ret
@@ -3,7 +3,6 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
from typing import cast
from aioesphomeapi import APIClient
@@ -16,17 +15,25 @@ from serialx.platforms.serial_esphome import (
from yarl import URL
from homeassistant.config_entries import ConfigEntryState
from homeassistant.core import Event, HomeAssistant, async_get_hass, callback
from homeassistant.core import HomeAssistant, async_get_hass
from .const import DOMAIN
from .entry_data import ESPHomeConfigEntry
SCHEME = "esphome-hass://"
# This is required so that serialx can safely query Core for an instance of an
# aioesphomeapi client. We cannot make any assumptions here: some packages run
# separate asyncio event loops in dedicated threads.
_HASS_LOOP: asyncio.AbstractEventLoop | None = None
def set_hass_loop(loop: asyncio.AbstractEventLoop) -> None:
"""Store a reference to the Core event loop."""
global _HASS_LOOP # noqa: PLW0603 # pylint: disable=global-statement
_HASS_LOOP = loop
def build_url(entry_id: str, port_name: str) -> URL:
"""Build a canonical `esphome-hass://` URL."""
return URL.build(
@@ -98,24 +105,9 @@ class HassESPHomeSerialTransport(ESPHomeSerialTransport):
_serial_cls = HassESPHomeSerial
def register_serialx_transport(
loop: asyncio.AbstractEventLoop,
) -> Callable[[Event], None]:
"""Register the ESPHome URI handler."""
global _HASS_LOOP # noqa: PLW0603 # pylint: disable=global-statement
_HASS_LOOP = loop
unregister = register_uri_handler(
scheme="esphome-hass://",
unique_scheme="esphome-hass-internal://", # The unique scheme must differ
sync_cls=HassESPHomeSerial,
async_transport_cls=HassESPHomeSerialTransport,
)
@callback
def _unregister(event: Event) -> None:
global _HASS_LOOP # noqa: PLW0603 # pylint: disable=global-statement
unregister()
_HASS_LOOP = None
return _unregister
register_uri_handler(
scheme=SCHEME,
unique_scheme=SCHEME,
sync_cls=HassESPHomeSerial,
async_transport_cls=HassESPHomeSerialTransport,
)
@@ -56,6 +56,7 @@ class CometBlueClimateEntity(CometBlueBluetoothEntity, ClimateEntity):
]
_attr_supported_features: ClimateEntityFeature = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.TARGET_TEMPERATURE_RANGE
| ClimateEntityFeature.PRESET_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
@@ -80,19 +81,13 @@ class CometBlueClimateEntity(CometBlueBluetoothEntity, ClimateEntity):
return self.coordinator.data.temperatures["manualTemp"]
@property
def _device_comfort_setpoint(self) -> float | None:
"""Return the comfort setpoint temperature.
Internally used for preset selection.
"""
def target_temperature_high(self) -> float | None:
"""Return the upper bound target temperature."""
return self.coordinator.data.temperatures["targetTempHigh"]
@property
def _device_eco_setpoint(self) -> float | None:
"""Return the eco setpoint temperature.
Internally used for preset selection.
"""
def target_temperature_low(self) -> float | None:
"""Return the lower bound target temperature."""
return self.coordinator.data.temperatures["targetTempLow"]
@property
@@ -118,9 +113,9 @@ class CometBlueClimateEntity(CometBlueBluetoothEntity, ClimateEntity):
return PRESET_AWAY
if self.target_temperature == MAX_TEMP:
return PRESET_BOOST
if self.target_temperature == self._device_comfort_setpoint:
if self.target_temperature == self.target_temperature_high:
return PRESET_COMFORT
if self.target_temperature == self._device_eco_setpoint:
if self.target_temperature == self.target_temperature_low:
return PRESET_ECO
return PRESET_NONE
@@ -158,11 +153,11 @@ class CometBlueClimateEntity(CometBlueBluetoothEntity, ClimateEntity):
)
if preset_mode == PRESET_ECO:
return await self.async_set_temperature(
temperature=self._device_eco_setpoint
temperature=self.target_temperature_low
)
if preset_mode == PRESET_COMFORT:
return await self.async_set_temperature(
temperature=self._device_comfort_setpoint
temperature=self.target_temperature_high
)
if preset_mode == PRESET_BOOST:
return await self.async_set_temperature(temperature=MAX_TEMP)
@@ -177,7 +172,7 @@ class CometBlueClimateEntity(CometBlueBluetoothEntity, ClimateEntity):
return await self.async_set_temperature(temperature=MAX_TEMP)
if hvac_mode == HVACMode.AUTO:
return await self.async_set_temperature(
temperature=self._device_eco_setpoint
temperature=self.target_temperature_low
)
raise ServiceValidationError(f"Unknown HVAC mode '{hvac_mode}'")
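The preset logic above works backwards from the device's single target temperature: whichever known setpoint it matches determines the reported preset. A self-contained sketch of that reverse mapping (the bounds and preset names are illustrative, not the integration's constants):

```python
MIN_TEMP = 7.5   # assumed "away" setpoint, illustrative only
MAX_TEMP = 28.5  # assumed "boost" setpoint, illustrative only


def preset_for(target: float, eco_low: float, comfort_high: float) -> str:
    """Derive the active preset by comparing the target against known setpoints."""
    if target == MIN_TEMP:
        return "away"
    if target == MAX_TEMP:
        return "boost"
    if target == comfort_high:
        return "comfort"
    if target == eco_low:
        return "eco"
    return "none"


print(preset_for(28.5, 16.0, 21.0))  # boost
```

The trade-off of this design is that a manual setpoint equal to, say, the eco temperature is indistinguishable from the eco preset.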
@@ -9,7 +9,6 @@ import yaml
from homeassistant.core import HomeAssistant, ServiceCall, SupportsResponse, callback
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.service import async_register_admin_service
from .const import ATTR_FILE_ENCODING, ATTR_FILE_NAME, DOMAIN, SERVICE_READ_FILE
@@ -18,8 +17,7 @@ from .const import ATTR_FILE_ENCODING, ATTR_FILE_NAME, DOMAIN, SERVICE_READ_FILE
def async_setup_services(hass: HomeAssistant) -> None:
"""Register services for File integration."""
async_register_admin_service(
hass,
hass.services.async_register(
DOMAIN,
SERVICE_READ_FILE,
read_file,
@@ -15,7 +15,7 @@ from homeassistant.config_entries import (
OptionsFlow,
SubentryFlowResult,
)
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
from homeassistant.core import callback
from homeassistant.helpers import config_validation as cv, selector
@@ -94,7 +94,7 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a flow initiated by the user."""
if user_input is not None:
return self.async_create_entry(
title="",
title=user_input[CONF_NAME],
data={
CONF_LATITUDE: user_input[CONF_LATITUDE],
CONF_LONGITUDE: user_input[CONF_LONGITUDE],
@@ -118,11 +118,13 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
data_schema=self.add_suggested_values_to_schema(
vol.Schema(
{
vol.Required(CONF_NAME): str,
vol.Required(CONF_LATITUDE): cv.latitude,
vol.Required(CONF_LONGITUDE): cv.longitude,
}
).extend(PLANE_SCHEMA.schema),
{
CONF_NAME: self.hass.config.location_name,
CONF_LATITUDE: self.hass.config.latitude,
CONF_LONGITUDE: self.hass.config.longitude,
CONF_DECLINATION: DEFAULT_DECLINATION,
@@ -27,8 +27,6 @@ from . import ForecastSolarConfigEntry
from .const import DOMAIN
from .coordinator import ForecastSolarDataUpdateCoordinator
PARALLEL_UPDATES = 0
@dataclass(frozen=True)
class ForecastSolarSensorEntityDescription(SensorEntityDescription):
@@ -7,7 +7,8 @@
"declination": "Declination (0 = Horizontal, 90 = Vertical)",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]",
"modules_power": "Total Watt peak power of your solar modules"
"modules_power": "Total Watt peak power of your solar modules",
"name": "[%key:common::config_flow::data::name%]"
},
"description": "Fill in the data of your solar panels. Please refer to the documentation if a field is unclear."
}
@@ -8,6 +8,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["freebox_api"],
"requirements": ["freebox-api==1.3.1"],
"requirements": ["freebox-api==1.3.0"],
"zeroconf": ["_fbx-api._tcp.local."]
}
@@ -45,7 +45,6 @@ BUTTONS: Final = [
device_class=ButtonDeviceClass.UPDATE,
entity_category=EntityCategory.CONFIG,
press_action=lambda avm_wrapper: avm_wrapper.async_trigger_firmware_update(),
entity_registry_enabled_default=False,
),
FritzButtonDescription(
key="reboot",
@@ -97,33 +96,6 @@ def repair_issue_cleanup(hass: HomeAssistant, avm_wrapper: AvmWrapper) -> None:
)
def repair_issue_firmware_update(hass: HomeAssistant, avm_wrapper: AvmWrapper) -> None:
"""Repair issue for firmware update button."""
entity_registry = er.async_get(hass)
if (
(
entity_button := entity_registry.async_get_entity_id(
"button", DOMAIN, f"{avm_wrapper.unique_id}-firmware_update"
)
)
and (entity_entry := entity_registry.async_get(entity_button))
and not entity_entry.disabled
):
# Deprecate the 'firmware update' button: create a Repairs issue for users
ir.async_create_issue(
hass,
domain=DOMAIN,
issue_id="deprecated_firmware_update_button",
is_fixable=False,
is_persistent=True,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_firmware_update_button",
translation_placeholders={"removal_version": "2026.11.0"},
breaks_in_ha_version="2026.11.0",
)
async def async_setup_entry(
hass: HomeAssistant,
entry: FritzConfigEntry,
@@ -140,7 +112,6 @@ async def async_setup_entry(
if avm_wrapper.mesh_role == MeshRoles.SLAVE:
async_add_entities(entities_list)
repair_issue_cleanup(hass, avm_wrapper)
repair_issue_firmware_update(hass, avm_wrapper)
return
data_fritz = hass.data[FRITZ_DATA_KEY]
@@ -160,7 +131,6 @@ async def async_setup_entry(
)
repair_issue_cleanup(hass, avm_wrapper)
repair_issue_firmware_update(hass, avm_wrapper)
class FritzButton(ButtonEntity):
@@ -194,12 +164,6 @@ class FritzButton(ButtonEntity):
"Please update your automations and dashboards to remove any usage of this button. "
"The action is now performed automatically at each data refresh",
)
elif self.entity_description.key == "firmware_update":
_LOGGER.warning(
"The 'firmware update' button is deprecated and will be removed in Home Assistant Core "
"2026.11.0. It has been superseded by an update entity. Please update your automations "
"and dashboards to remove any usage of this button",
)
await self.entity_description.press_action(self.avm_wrapper)
@@ -13,10 +13,7 @@ import voluptuous as vol
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers.service import (
async_extract_config_entry_ids,
async_register_admin_service,
)
from homeassistant.helpers.service import async_extract_config_entry_ids
from .const import DOMAIN
from .coordinator import FritzConfigEntry
@@ -121,8 +118,7 @@ async def _async_dial(service_call: ServiceCall) -> None:
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services for Fritz integration."""
async_register_admin_service(
hass,
hass.services.async_register(
DOMAIN,
SERVICE_SET_GUEST_WIFI_PW,
_async_set_guest_wifi_password,
@@ -211,10 +211,6 @@
"deprecated_cleanup_button": {
"description": "The 'Cleanup' button is deprecated and will be removed in Home Assistant Core {removal_version}. Please update your automations and dashboards to remove any usage of this button. The action is now performed automatically at each data refresh.",
"title": "'Cleanup' button is deprecated"
},
"deprecated_firmware_update_button": {
"description": "The 'Firmware update' button is deprecated and will be removed in Home Assistant Core {removal_version}. It has been superseded by an update entity. Please update your automations and dashboards to remove any usage of this button.",
"title": "'Firmware update' button is deprecated"
}
},
"options": {
@@ -21,5 +21,5 @@
"integration_type": "system",
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260325.8"]
"requirements": ["home-assistant-frontend==20260325.7"]
}
@@ -7,12 +7,7 @@ from homeassistant.core import HomeAssistant
from .coordinator import FumisConfigEntry, FumisDataUpdateCoordinator
PLATFORMS = [
Platform.BINARY_SENSOR,
Platform.BUTTON,
Platform.CLIMATE,
Platform.SENSOR,
]
PLATFORMS = [Platform.CLIMATE, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: FumisConfigEntry) -> bool:
@@ -1,76 +0,0 @@
"""Support for Fumis binary sensor entities."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from fumis import FumisInfo
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
BinarySensorEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import FumisConfigEntry, FumisDataUpdateCoordinator
from .entity import FumisEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class FumisBinarySensorEntityDescription(BinarySensorEntityDescription):
"""Describes a Fumis binary sensor entity."""
has_fn: Callable[[FumisInfo], bool] = lambda _: True
is_on_fn: Callable[[FumisInfo], bool | None]
BINARY_SENSORS: tuple[FumisBinarySensorEntityDescription, ...] = (
FumisBinarySensorEntityDescription(
key="door",
device_class=BinarySensorDeviceClass.DOOR,
entity_category=EntityCategory.DIAGNOSTIC,
has_fn=lambda data: data.controller.door_open is not None,
is_on_fn=lambda data: data.controller.door_open,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: FumisConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Fumis binary sensor entities based on a config entry."""
coordinator = entry.runtime_data
async_add_entities(
FumisBinarySensorEntity(coordinator=coordinator, description=description)
for description in BINARY_SENSORS
if description.has_fn(coordinator.data)
)
class FumisBinarySensorEntity(FumisEntity, BinarySensorEntity):
"""Defines a Fumis binary sensor entity."""
entity_description: FumisBinarySensorEntityDescription
def __init__(
self,
coordinator: FumisDataUpdateCoordinator,
description: FumisBinarySensorEntityDescription,
) -> None:
"""Initialize the Fumis binary sensor entity."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.config_entry.unique_id}_{description.key}"
@property
def is_on(self) -> bool | None:
"""Return the state of the binary sensor."""
return self.entity_description.is_on_fn(self.coordinator.data)
@@ -1,71 +0,0 @@
"""Support for Fumis button entities."""
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from typing import Any
from fumis import Fumis
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import FumisConfigEntry, FumisDataUpdateCoordinator
from .entity import FumisEntity
from .helpers import fumis_exception_handler
PARALLEL_UPDATES = 1
@dataclass(frozen=True, kw_only=True)
class FumisButtonEntityDescription(ButtonEntityDescription):
"""Describes a Fumis button entity."""
press_fn: Callable[[Fumis], Awaitable[Any]]
BUTTONS: tuple[FumisButtonEntityDescription, ...] = (
FumisButtonEntityDescription(
key="sync_clock",
translation_key="sync_clock",
entity_category=EntityCategory.DIAGNOSTIC,
press_fn=lambda client: client.set_clock(),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: FumisConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Fumis button entities based on a config entry."""
coordinator = entry.runtime_data
async_add_entities(
FumisButtonEntity(coordinator=coordinator, description=description)
for description in BUTTONS
)
class FumisButtonEntity(FumisEntity, ButtonEntity):
"""Defines a Fumis button entity."""
entity_description: FumisButtonEntityDescription
def __init__(
self,
coordinator: FumisDataUpdateCoordinator,
description: FumisButtonEntityDescription,
) -> None:
"""Initialize the Fumis button entity."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.config_entry.unique_id}_{description.key}"
@fumis_exception_handler
async def async_press(self) -> None:
"""Handle the button press."""
await self.entity_description.press_fn(self.coordinator.client)
@@ -1,10 +1,5 @@
{
"entity": {
"button": {
"sync_clock": {
"default": "mdi:clock-sync"
}
},
"sensor": {
"combustion_chamber_temperature": {
"default": "mdi:thermometer-high"
@@ -13,5 +13,5 @@
"iot_class": "cloud_polling",
"loggers": ["fumis"],
"quality_scale": "bronze",
"requirements": ["fumis==0.4.0"]
"requirements": ["fumis==0.3.0"]
}
@@ -53,11 +53,6 @@
}
},
"entity": {
"button": {
"sync_clock": {
"name": "Sync clock"
}
},
"sensor": {
"combustion_chamber_temperature": {
"name": "Combustion chamber"
@@ -8,11 +8,6 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
is_closed:
fields: *condition_common_fields
@@ -1,7 +1,6 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -11,9 +10,6 @@
"fields": {
"behavior": {
"name": "[%key:component::garage_door::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::garage_door::common::condition_for_name%]"
}
},
"name": "Garage door is closed"
@@ -23,9 +19,6 @@
"fields": {
"behavior": {
"name": "[%key:component::garage_door::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::garage_door::common::condition_for_name%]"
}
},
"name": "Garage door is open"
@@ -8,11 +8,6 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
is_closed:
fields: *condition_common_fields
@@ -1,7 +1,6 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -11,9 +10,6 @@
"fields": {
"behavior": {
"name": "[%key:component::gate::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::gate::common::condition_for_name%]"
}
},
"name": "Gate is closed"
@@ -23,9 +19,6 @@
"fields": {
"behavior": {
"name": "[%key:component::gate::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::gate::common::condition_for_name%]"
}
},
"name": "Gate is open"
@@ -67,7 +67,7 @@ from .const import (
RECOMMENDED_VERSION,
)
from .server import Server
from .util import get_camera_identifier, get_go2rtc_unix_socket_path
from .util import get_go2rtc_unix_socket_path
_LOGGER = logging.getLogger(__name__)
@@ -308,7 +308,7 @@ class WebRTCProvider(CameraWebRTCProvider):
return
self._sessions[session_id] = ws_client = Go2RtcWsClient(
self._session, self._url, source=get_camera_identifier(camera)
self._session, self._url, source=camera.entity_id
)
@callback
@@ -354,7 +354,7 @@ class WebRTCProvider(CameraWebRTCProvider):
"""Get an image from the camera."""
await self._update_stream_source(camera)
return await self._rest_client.get_jpeg_snapshot(
get_camera_identifier(camera), width, height
camera.entity_id, width, height
)
async def _update_stream_source(self, camera: Camera) -> None:
@@ -399,19 +399,18 @@ class WebRTCProvider(CameraWebRTCProvider):
stream_source += "#rotate=90"
streams = await self._rest_client.streams.list()
identifier = get_camera_identifier(camera)
if (stream := streams.get(identifier)) is None or not any(
if (stream := streams.get(camera.entity_id)) is None or not any(
stream_source == producer.url for producer in stream.producers
):
await self._rest_client.streams.add(
identifier,
camera.entity_id,
[
stream_source,
# We are setting any ffmpeg rtsp related logs to debug
# Connection problems to the camera will be logged by the first stream
# Therefore setting it to debug will not hide any important logs
f"ffmpeg:{identifier}#audio=opus#query=log_level=debug",
f"ffmpeg:{camera.entity_id}#audio=opus#query=log_level=debug",
],
)
@@ -1,15 +1,8 @@
"""Go2rtc utility functions."""
from pathlib import Path
import string
from urllib.parse import quote
from homeassistant.components.camera import Camera
_HA_MANAGED_UNIX_SOCKET_FILE = "go2rtc.sock"
# Go2rtc is not validating the camera identifier, but some characters (e.g. : or #)
# have special meaning in URLs and could cause issues.
_SAFE_CHARS = string.ascii_letters + string.digits + "._-"
def get_go2rtc_unix_socket_path(path: str | Path) -> str:
@@ -17,11 +10,3 @@ def get_go2rtc_unix_socket_path(path: str | Path) -> str:
if not isinstance(path, Path):
path = Path(path)
return str(path / _HA_MANAGED_UNIX_SOCKET_FILE)
def get_camera_identifier(camera: Camera) -> str:
"""Get the Go2rtc camera identifier."""
attr = camera.entity_id
if camera.unique_id is not None:
attr = f"{camera.platform.platform_name}_{camera.unique_id}"
return quote(attr, safe=_SAFE_CHARS)
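The removed `get_camera_identifier` helper percent-encoded identifiers because characters such as `:` and `#` carry special meaning in go2rtc stream URLs. The underlying technique is plain `urllib.parse.quote` with an explicit safe set:

```python
import string
from urllib.parse import quote

# Only these characters pass through unescaped; ':' and '#' get encoded.
_SAFE_CHARS = string.ascii_letters + string.digits + "._-"


def safe_identifier(raw: str) -> str:
    """Percent-encode anything outside the safe set."""
    return quote(raw, safe=_SAFE_CHARS)


print(safe_identifier("camera.front_door"))  # camera.front_door
print(safe_identifier("cam:1#main"))         # cam%3A1%23main
```

Note that `quote`'s default safe set is `"/"`; passing an explicit `safe` argument replaces it entirely, so `/` is also encoded here.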
@@ -3,28 +3,38 @@
from __future__ import annotations
from functools import partial
from pathlib import Path
from types import MappingProxyType
from google.genai import Client
from google.genai.errors import APIError, ClientError
from requests.exceptions import Timeout
import voluptuous as vol
from homeassistant.config_entries import ConfigEntry, ConfigSubentry
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.core import (
HomeAssistant,
ServiceCall,
ServiceResponse,
SupportsResponse,
)
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryError,
ConfigEntryNotReady,
HomeAssistantError,
)
from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
issue_registry as ir,
)
from homeassistant.helpers.typing import ConfigType
from .const import (
CONF_PROMPT,
DEFAULT_AI_TASK_NAME,
DEFAULT_STT_NAME,
DEFAULT_TITLE,
@@ -37,6 +47,11 @@ from .const import (
RECOMMENDED_TTS_OPTIONS,
TIMEOUT_MILLIS,
)
from .entity import async_prepare_files_for_prompt
SERVICE_GENERATE_CONTENT = "generate_content"
CONF_IMAGE_FILENAME = "image_filename"
CONF_FILENAMES = "filenames"
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
PLATFORMS = (
@@ -54,6 +69,88 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
await async_migrate_integration(hass)
async def generate_content(call: ServiceCall) -> ServiceResponse:
"""Generate content from text and optionally images."""
LOGGER.warning(
"Action '%s.%s' is deprecated and will be removed in the 2026.4.0 release. "
"Please use the 'ai_task.generate_data' action instead",
DOMAIN,
SERVICE_GENERATE_CONTENT,
)
ir.async_create_issue(
hass,
DOMAIN,
"deprecated_generate_content",
breaks_in_ha_version="2026.4.0",
is_fixable=False,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_generate_content",
)
prompt_parts = [call.data[CONF_PROMPT]]
config_entry: GoogleGenerativeAIConfigEntry = (
hass.config_entries.async_loaded_entries(DOMAIN)[0]
)
client = config_entry.runtime_data
files = call.data[CONF_FILENAMES]
if files:
for filename in files:
if not hass.config.is_allowed_path(filename):
raise HomeAssistantError(
f"Cannot read `{filename}`, no access to path; "
"`allowlist_external_dirs` may need to be adjusted in "
"`configuration.yaml`"
)
prompt_parts.extend(
await async_prepare_files_for_prompt(
hass, client, [(Path(filename), None) for filename in files]
)
)
try:
response = await client.aio.models.generate_content(
model=RECOMMENDED_CHAT_MODEL, contents=prompt_parts
)
except (
APIError,
ValueError,
) as err:
raise HomeAssistantError(f"Error generating content: {err}") from err
if response.prompt_feedback:
raise HomeAssistantError(
f"Error generating content due to content violations, reason: {response.prompt_feedback.block_reason_message}"
)
if (
not response.candidates
or not response.candidates[0].content
or not response.candidates[0].content.parts
):
raise HomeAssistantError("Unknown error generating content")
return {"text": response.text}
hass.services.async_register(
DOMAIN,
SERVICE_GENERATE_CONTENT,
generate_content,
schema=vol.Schema(
{
vol.Required(CONF_PROMPT): cv.string,
vol.Optional(CONF_FILENAMES, default=[]): vol.All(
cv.ensure_list, [cv.string]
),
}
),
supports_response=SupportsResponse.ONLY,
description_placeholders={"example_image_path": "/config/www/image.jpg"},
)
return True
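The service above rejects filenames outside `allowlist_external_dirs` via `hass.config.is_allowed_path` before reading them. A standalone sketch of that style of check, with an illustrative allowlist (the `resolve()` call also defuses `..` traversal):

```python
from pathlib import Path

ALLOWED_DIRS = [Path("/config/www")]  # illustrative, not Home Assistant's default


def is_allowed_path(path: str) -> bool:
    """Return True if the path resolves inside one of the allowed directories."""
    resolved = Path(path).resolve()
    return any(resolved.is_relative_to(allowed) for allowed in ALLOWED_DIRS)


print(is_allowed_path("/config/www/image.jpg"))        # True
print(is_allowed_path("/config/www/../secrets.yaml"))  # False
```

Comparing resolved paths, rather than raw strings, is what keeps a crafted relative path from escaping the allowlist.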
@@ -338,7 +338,6 @@ def _convert_content(
async def _transform_stream(
chat_log: conversation.ChatLog,
result: AsyncIterator[GenerateContentResponse],
) -> AsyncGenerator[conversation.AssistantContentDeltaDict]:
new_message = True
@@ -347,19 +346,6 @@ async def _transform_stream(
async for response in result:
LOGGER.debug("Received response chunk: %s", response)
if (usage := response.usage_metadata) is not None:
chat_log.async_trace(
{
"stats": {
"input_tokens": usage.prompt_token_count,
"cached_input_tokens": (
usage.cached_content_token_count or 0
),
"output_tokens": usage.candidates_token_count,
}
}
)
if new_message:
if part_details:
yield {"native": ContentDetails(part_details=part_details)}
@@ -637,7 +623,7 @@ class GoogleGenerativeAILLMBaseEntity(Entity):
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
_transform_stream(chat_log, chat_response_generator),
_transform_stream(chat_response_generator),
)
if isinstance(content, conversation.ToolResultContent)
]
@@ -0,0 +1,7 @@
{
"services": {
"generate_content": {
"service": "mdi:receipt-text"
}
}
}
@@ -0,0 +1,12 @@
generate_content:
fields:
prompt:
required: true
selector:
text:
multiline: true
filenames:
required: false
selector:
text:
multiple: true
@@ -149,5 +149,29 @@
}
}
}
},
"issues": {
"deprecated_generate_content": {
"description": "Action 'google_generative_ai_conversation.generate_content' is deprecated and will be removed in the 2026.4.0 release. Please use the 'ai_task.generate_data' action instead",
"title": "Deprecated 'generate_content' action"
}
},
"services": {
"generate_content": {
"description": "Generate content from a prompt consisting of text and optionally images (deprecated)",
"fields": {
"filenames": {
"description": "Attachments to add to the prompt (images, PDFs, etc)",
"example": "{example_image_path}",
"name": "Attachment filenames"
},
"prompt": {
"description": "The prompt",
"example": "Describe what you see in these images",
"name": "Prompt"
}
},
"name": "Generate content (deprecated)"
}
}
}
@@ -4,7 +4,7 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from datetime import datetime, timedelta
from datetime import datetime
import logging
from typing import Any
@@ -36,40 +36,6 @@ class GreenPlanetEnergySensorEntityDescription(SensorEntityDescription):
value_fn: Callable[[GreenPlanetEnergyAPI, dict[str, Any]], float | datetime | None]
def _get_lowest_price_day_time(
api: GreenPlanetEnergyAPI, data: dict[str, Any]
) -> datetime | None:
"""Return timestamp of the lowest-priced day hour (06:00-18:00)."""
now = dt_util.now()
now_h = now.hour
hour = api.get_lowest_price_day_with_hour(data, now_h)[1]
if hour is None:
return None
# After 18:00 the day period is over; use tomorrow's date
base = dt_util.start_of_local_day(now + timedelta(days=1) if now_h >= 18 else now)
return base.replace(hour=hour)
def _get_lowest_price_night_time(
api: GreenPlanetEnergyAPI, data: dict[str, Any]
) -> datetime | None:
"""Return timestamp of the lowest-priced night hour (18:00-06:00)."""
now = dt_util.now()
now_h = now.hour
hour = api.get_lowest_price_night_with_hour(data)[1]
if hour is None:
return None
if now_h < 6:
base = dt_util.start_of_local_day(
now - timedelta(days=1) if hour >= 18 else now
)
else:
base = dt_util.start_of_local_day(now + timedelta(days=1) if hour < 6 else now)
return base.replace(hour=hour)
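The removed `_get_lowest_price_night_time` helper above has to attach a bare hour to the correct calendar day, since the 18:00-06:00 window straddles midnight. A pure-`datetime` sketch of the same date arithmetic (naive datetimes stand in for the `dt_util` timezone-aware values):

```python
from datetime import datetime, timedelta


def night_hour_timestamp(now: datetime, hour: int) -> datetime:
    """Attach an hour from the 18:00-06:00 night window to the right date.

    Before 06:00 we are still inside the night that began yesterday, so
    hours >= 18 belong to the previous day; from 06:00 onward the next
    night starts today, and hours < 6 roll over to tomorrow.
    """
    day = now.replace(hour=0, minute=0, second=0, microsecond=0)
    if now.hour < 6:
        base = day - timedelta(days=1) if hour >= 18 else day
    else:
        base = day + timedelta(days=1) if hour < 6 else day
    return base.replace(hour=hour)


# At 23:00 on the 10th, a cheapest hour of 03:00 lands on the 11th.
print(night_hour_timestamp(datetime(2026, 4, 10, 23, 0), 3))  # 2026-04-11 03:00:00
```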
SENSOR_DESCRIPTIONS: list[GreenPlanetEnergySensorEntityDescription] = [
# Statistical sensors only - hourly prices available via service
GreenPlanetEnergySensorEntityDescription(
@@ -101,7 +67,7 @@ SENSOR_DESCRIPTIONS: list[GreenPlanetEnergySensorEntityDescription] = [
translation_placeholders={"time_range": "(06:00-18:00)"},
value_fn=lambda api, data: (
price / 100
if (price := api.get_lowest_price_day(data, dt_util.now().hour)) is not None
if (price := api.get_lowest_price_day(data)) is not None
else None
),
),
@@ -110,7 +76,11 @@ SENSOR_DESCRIPTIONS: list[GreenPlanetEnergySensorEntityDescription] = [
translation_key="lowest_price_day_time",
device_class=SensorDeviceClass.TIMESTAMP,
translation_placeholders={"time_range": "(06:00-18:00)"},
value_fn=_get_lowest_price_day_time,
value_fn=lambda api, data: (
dt_util.start_of_local_day().replace(hour=hour)
if (hour := api.get_lowest_price_day_with_hour(data)[1]) is not None
else None
),
),
GreenPlanetEnergySensorEntityDescription(
key="gpe_lowest_price_night",
@@ -129,7 +99,11 @@ SENSOR_DESCRIPTIONS: list[GreenPlanetEnergySensorEntityDescription] = [
translation_key="lowest_price_night_time",
device_class=SensorDeviceClass.TIMESTAMP,
translation_placeholders={"time_range": "(18:00-06:00)"},
value_fn=_get_lowest_price_night_time,
value_fn=lambda api, data: (
dt_util.start_of_local_day().replace(hour=hour)
if (hour := api.get_lowest_price_night_with_hour(data)[1]) is not None
else None
),
),
GreenPlanetEnergySensorEntityDescription(
key="gpe_current_price",
