Compare commits


34 Commits

All 34 commits are authored by Daniel Hjelseth Høyer.

SHA1 | Message | Date
5a2ecf5e39 | Merge branch 'dev' into tibber_data | 2025-12-02 21:02:45 +01:00
7d6ceb0975 | test (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-12-02 20:45:00 +01:00
5d03ce2b28 | Update homeassistant/components/tibber/sensor.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>) | 2025-12-02 10:06:56 +01:00
1a9b72a5d5 | tests (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-12-02 06:19:47 +01:00
0f460730c5 | adjust available state (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-27 20:33:01 +01:00
2b898d2eab | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-27 18:54:32 +01:00
45ddbb6ab8 | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-27 16:48:46 +01:00
7d8a89258f | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-27 16:36:51 +01:00
420e01ef26 | fix strings (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-26 07:13:34 +01:00
b1b3eb80b1 | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-25 20:25:58 +01:00
1d717a1957 | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-25 19:02:38 +01:00
a8e3f79941 | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-25 18:57:12 +01:00
946afd7d91 | config (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-25 07:17:34 +01:00
f6277d0ec2 | Merge branch 'dev' into tibber_data | 2025-11-24 12:21:03 +01:00
d7aa939f83 | Merge branch 'tibber_data' of github.com:home-assistant/core into tibber_data | 2025-11-19 06:53:01 +01:00
77b349d00f | test (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-19 06:52:06 +01:00
1c036128fa | Merge branch 'dev' into tibber_data | 2025-11-18 08:40:09 +01:00
16d898cc8e | test coverage (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-18 07:12:48 +01:00
a7225c7cd4 | Merge branch 'dev' into tibber_data | 2025-11-18 06:51:29 +01:00
433a429c5a | test coverage (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-18 06:37:33 +01:00
c4770ed423 | test coverage (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-17 20:57:03 +01:00
df329fd273 | test coverage (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-17 20:36:43 +01:00
6eb40574bc | tests (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 19:39:19 +01:00
4fd1ef5483 | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 17:49:18 +01:00
7ec5d5305d | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 16:38:01 +01:00
7f31d2538e | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 16:08:45 +01:00
e1943307cf | Merge branch 'dev' into tibber_data | 2025-11-16 16:08:21 +01:00
a06529d187 | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 15:59:18 +01:00
21554af6a1 | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-16 12:14:03 +01:00
b4aae93c45 | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-14 19:18:22 +01:00
1f9c244c5c | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-14 06:01:05 +01:00
9fa1b1b8df | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-13 22:11:18 +01:00
f3ac3ecf05 | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-13 21:07:27 +01:00
9477b2206b | Tibber data api (Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>) | 2025-11-13 20:07:57 +01:00
180 changed files with 2110 additions and 4989 deletions

View File

@@ -30,7 +30,7 @@ jobs:
architectures: ${{ env.ARCHITECTURES }}
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
@@ -96,7 +96,7 @@ jobs:
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
@@ -273,7 +273,7 @@ jobs:
- green
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set build additional args
run: |
@@ -311,7 +311,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master
@@ -416,19 +416,9 @@ jobs:
ARCHS=$(echo '${{ needs.init.outputs.architectures }}' | jq -r '.[]')
for arch in $ARCHS; do
echo "Copying ${arch} image to DockerHub..."
for attempt in 1 2 3; do
if docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${{ needs.init.outputs.version }}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"; then
break
fi
echo "Attempt ${attempt} failed, retrying in 10 seconds..."
sleep 10
if [ "${attempt}" -eq 3 ]; then
echo "Failed after 3 attempts"
exit 1
fi
done
docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${{ needs.init.outputs.version }}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"
cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"
done
@@ -474,7 +464,7 @@ jobs:
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
@@ -519,7 +509,7 @@ jobs:
HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0

View File

@@ -41,8 +41,8 @@ env:
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.1"
DEFAULT_PYTHON: "3.13.9"
ALL_PYTHON_VERSIONS: "['3.13.9', '3.14.0']"
DEFAULT_PYTHON: "3.13"
ALL_PYTHON_VERSIONS: "['3.13', '3.14']"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -99,7 +99,7 @@ jobs:
steps:
- &checkout
name: Check out code from GitHub
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Generate partial Python venv restore key
id: generate_python_cache_key
run: |

View File

@@ -21,7 +21,7 @@ jobs:
steps:
- name: Check out code from GitHub
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@fe4161a26a8629af62121b670040955b330f9af2 # v4.31.6

View File

@@ -17,7 +17,7 @@ jobs:
# - No PRs marked as no-stale
# - No issues (-1)
- name: 60 days stale PRs policy
uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10.1.1
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
# - No issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: 90 days stale issues
uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10.1.1
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
repo-token: ${{ steps.token.outputs.token }}
days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
# - No Issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: Needs more information stale issues policy
uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10.1.1
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
repo-token: ${{ steps.token.outputs.token }}
only-labels: "needs-more-information"

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0

View File

@@ -31,7 +31,7 @@ jobs:
steps:
- &checkout
name: Checkout the repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/assist_satellite",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0"]
"requirements": ["hassil==3.4.0"]
}

View File

@@ -8,8 +8,6 @@
"integration_type": "system",
"preview_features": {
"new_triggers_conditions": {
"feedback_url": "https://forms.gle/fWFZqf5MzuwWTsCH8",
"learn_more_url": "https://www.home-assistant.io/blog/2025/12/03/release-202512/#purpose-specific-triggers-and-conditions",
"report_issue_url": "https://github.com/home-assistant/core/issues/new?template=bug_report.yml&integration_link=https://www.home-assistant.io/integrations/automation&integration_name=Automation"
}
},

View File

@@ -21,29 +21,29 @@ from homeassistant.helpers import device_registry as dr
from homeassistant.util.ssl import get_default_context
from .const import DOMAIN
from .websocket import BeoWebsocket
from .websocket import BangOlufsenWebsocket
@dataclass
class BeoData:
class BangOlufsenData:
"""Dataclass for API client and WebSocket client."""
websocket: BeoWebsocket
websocket: BangOlufsenWebsocket
client: MozartClient
type BeoConfigEntry = ConfigEntry[BeoData]
type BangOlufsenConfigEntry = ConfigEntry[BangOlufsenData]
PLATFORMS = [Platform.EVENT, Platform.MEDIA_PLAYER]
async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: BangOlufsenConfigEntry) -> bool:
"""Set up from a config entry."""
# Remove casts to str
assert entry.unique_id
# Create device now as BeoWebsocket needs a device for debug logging, firing events etc.
# Create device now as BangOlufsenWebsocket needs a device for debug logging, firing events etc.
device_registry = dr.async_get(hass)
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
@@ -68,10 +68,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
await client.close_api_client()
raise ConfigEntryNotReady(f"Unable to connect to {entry.title}") from error
websocket = BeoWebsocket(hass, entry, client)
websocket = BangOlufsenWebsocket(hass, entry, client)
# Add the websocket and API client
entry.runtime_data = BeoData(websocket, client)
entry.runtime_data = BangOlufsenData(websocket, client)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -82,7 +82,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
return True
async def async_unload_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
async def async_unload_entry(
hass: HomeAssistant, entry: BangOlufsenConfigEntry
) -> bool:
"""Unload a config entry."""
# Close the API client and WebSocket notification listener
entry.runtime_data.client.disconnect_notifications()

View File

@@ -47,7 +47,7 @@ _exception_map = {
}
class BeoConfigFlowHandler(ConfigFlow, domain=DOMAIN):
class BangOlufsenConfigFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a config flow."""
_beolink_jid = ""

View File

@@ -14,7 +14,7 @@ from homeassistant.components.media_player import (
)
class BeoSource:
class BangOlufsenSource:
"""Class used for associating device source ids with friendly names. May not include all sources."""
DEEZER: Final[Source] = Source(name="Deezer", id="deezer")
@@ -26,7 +26,7 @@ class BeoSource:
URI_STREAMER: Final[Source] = Source(name="Audio Streamer", id="uriStreamer")
BEO_STATES: dict[str, MediaPlayerState] = {
BANG_OLUFSEN_STATES: dict[str, MediaPlayerState] = {
# Dict used for translating device states to Home Assistant states.
"started": MediaPlayerState.PLAYING,
"buffering": MediaPlayerState.PLAYING,
@@ -40,19 +40,19 @@ BEO_STATES: dict[str, MediaPlayerState] = {
}
# Dict used for translating Home Assistant settings to device repeat settings.
BEO_REPEAT_FROM_HA: dict[RepeatMode, str] = {
BANG_OLUFSEN_REPEAT_FROM_HA: dict[RepeatMode, str] = {
RepeatMode.ALL: "all",
RepeatMode.ONE: "track",
RepeatMode.OFF: "none",
}
# Dict used for translating device repeat settings to Home Assistant settings.
BEO_REPEAT_TO_HA: dict[str, RepeatMode] = {
value: key for key, value in BEO_REPEAT_FROM_HA.items()
BANG_OLUFSEN_REPEAT_TO_HA: dict[str, RepeatMode] = {
value: key for key, value in BANG_OLUFSEN_REPEAT_FROM_HA.items()
}
# Media types for play_media
class BeoMediaType(StrEnum):
class BangOlufsenMediaType(StrEnum):
"""Bang & Olufsen specific media types."""
FAVOURITE = "favourite"
@@ -63,7 +63,7 @@ class BeoMediaType(StrEnum):
OVERLAY_TTS = "overlay_tts"
class BeoModel(StrEnum):
class BangOlufsenModel(StrEnum):
"""Enum for compatible model names."""
# Mozart devices
@@ -82,7 +82,7 @@ class BeoModel(StrEnum):
BEOREMOTE_ONE = "Beoremote One"
class BeoAttribute(StrEnum):
class BangOlufsenAttribute(StrEnum):
"""Enum for extra_state_attribute keys."""
BEOLINK = "beolink"
@@ -93,7 +93,7 @@ class BeoAttribute(StrEnum):
# Physical "buttons" on devices
class BeoButtons(StrEnum):
class BangOlufsenButtons(StrEnum):
"""Enum for device buttons."""
BLUETOOTH = "Bluetooth"
@@ -140,7 +140,7 @@ class WebsocketNotification(StrEnum):
DOMAIN: Final[str] = "bang_olufsen"
# Default values for configuration.
DEFAULT_MODEL: Final[str] = BeoModel.BEOSOUND_BALANCE
DEFAULT_MODEL: Final[str] = BangOlufsenModel.BEOSOUND_BALANCE
# Configuration.
CONF_SERIAL_NUMBER: Final = "serial_number"
@@ -148,7 +148,7 @@ CONF_BEOLINK_JID: Final = "jid"
# Models to choose from in manual configuration.
SELECTABLE_MODELS: list[str] = [
model.value for model in BeoModel if model != BeoModel.BEOREMOTE_ONE
model.value for model in BangOlufsenModel if model != BangOlufsenModel.BEOREMOTE_ONE
]
MANUFACTURER: Final[str] = "Bang & Olufsen"
@@ -160,15 +160,15 @@ ATTR_ITEM_NUMBER: Final[str] = "in"
ATTR_FRIENDLY_NAME: Final[str] = "fn"
# Power states.
BEO_ON: Final[str] = "on"
BANG_OLUFSEN_ON: Final[str] = "on"
VALID_MEDIA_TYPES: Final[tuple] = (
BeoMediaType.FAVOURITE,
BeoMediaType.DEEZER,
BeoMediaType.RADIO,
BeoMediaType.TTS,
BeoMediaType.TIDAL,
BeoMediaType.OVERLAY_TTS,
BangOlufsenMediaType.FAVOURITE,
BangOlufsenMediaType.DEEZER,
BangOlufsenMediaType.RADIO,
BangOlufsenMediaType.TTS,
BangOlufsenMediaType.TIDAL,
BangOlufsenMediaType.OVERLAY_TTS,
MediaType.MUSIC,
MediaType.URL,
MediaType.CHANNEL,
@@ -246,7 +246,7 @@ FALLBACK_SOURCES: Final[SourceArray] = SourceArray(
)
# Device events
BEO_WEBSOCKET_EVENT: Final[str] = f"{DOMAIN}_websocket_event"
BANG_OLUFSEN_WEBSOCKET_EVENT: Final[str] = f"{DOMAIN}_websocket_event"
# Dict used to translate native Bang & Olufsen event names to string.json compatible ones
EVENT_TRANSLATION_MAP: dict[str, str] = {
@@ -263,7 +263,7 @@ EVENT_TRANSLATION_MAP: dict[str, str] = {
CONNECTION_STATUS: Final[str] = "CONNECTION_STATUS"
DEVICE_BUTTONS: Final[list[str]] = [x.value for x in BeoButtons]
DEVICE_BUTTONS: Final[list[str]] = [x.value for x in BangOlufsenButtons]
DEVICE_BUTTON_EVENTS: Final[list[str]] = [

View File

@@ -10,13 +10,13 @@ from homeassistant.const import CONF_MODEL
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from . import BeoConfigEntry
from . import BangOlufsenConfigEntry
from .const import DOMAIN
from .util import get_device_buttons
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: BeoConfigEntry
hass: HomeAssistant, config_entry: BangOlufsenConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""

View File

@@ -24,8 +24,8 @@ from homeassistant.helpers.entity import Entity
from .const import DOMAIN
class BeoBase:
"""Base class for Bang & Olufsen Home Assistant objects."""
class BangOlufsenBase:
"""Base class for BangOlufsen Home Assistant objects."""
def __init__(self, entry: ConfigEntry, client: MozartClient) -> None:
"""Initialize the object."""
@@ -51,8 +51,8 @@ class BeoBase:
)
class BeoEntity(Entity, BeoBase):
"""Base Entity for Bang & Olufsen entities."""
class BangOlufsenEntity(Entity, BangOlufsenBase):
"""Base Entity for BangOlufsen entities."""
_attr_has_entity_name = True
_attr_should_poll = False

View File

@@ -14,7 +14,7 @@ from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import BeoConfigEntry
from . import BangOlufsenConfigEntry
from .const import (
BEO_REMOTE_CONTROL_KEYS,
BEO_REMOTE_KEY_EVENTS,
@@ -25,10 +25,10 @@ from .const import (
DEVICE_BUTTON_EVENTS,
DOMAIN,
MANUFACTURER,
BeoModel,
BangOlufsenModel,
WebsocketNotification,
)
from .entity import BeoEntity
from .entity import BangOlufsenEntity
from .util import get_device_buttons, get_remotes
PARALLEL_UPDATES = 0
@@ -36,14 +36,14 @@ PARALLEL_UPDATES = 0
async def async_setup_entry(
hass: HomeAssistant,
config_entry: BeoConfigEntry,
config_entry: BangOlufsenConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Event entities from config entry."""
entities: list[BeoEvent] = []
entities: list[BangOlufsenEvent] = []
async_add_entities(
BeoButtonEvent(config_entry, button_type)
BangOlufsenButtonEvent(config_entry, button_type)
for button_type in get_device_buttons(config_entry.data[CONF_MODEL])
)
@@ -54,7 +54,7 @@ async def async_setup_entry(
# Add Light keys
entities.extend(
[
BeoRemoteKeyEvent(
BangOlufsenRemoteKeyEvent(
config_entry,
remote,
f"{BEO_REMOTE_SUBMENU_LIGHT}/{key_type}",
@@ -66,7 +66,7 @@ async def async_setup_entry(
# Add Control keys
entities.extend(
[
BeoRemoteKeyEvent(
BangOlufsenRemoteKeyEvent(
config_entry,
remote,
f"{BEO_REMOTE_SUBMENU_CONTROL}/{key_type}",
@@ -84,9 +84,10 @@ async def async_setup_entry(
config_entry.entry_id
)
for device in devices:
if device.model == BeoModel.BEOREMOTE_ONE and device.serial_number not in {
remote.serial_number for remote in remotes
}:
if (
device.model == BangOlufsenModel.BEOREMOTE_ONE
and device.serial_number not in {remote.serial_number for remote in remotes}
):
device_registry.async_update_device(
device.id, remove_config_entry_id=config_entry.entry_id
)
@@ -94,13 +95,13 @@ async def async_setup_entry(
async_add_entities(new_entities=entities)
class BeoEvent(BeoEntity, EventEntity):
class BangOlufsenEvent(BangOlufsenEntity, EventEntity):
"""Base Event class."""
_attr_device_class = EventDeviceClass.BUTTON
_attr_entity_registry_enabled_default = False
def __init__(self, config_entry: BeoConfigEntry) -> None:
def __init__(self, config_entry: BangOlufsenConfigEntry) -> None:
"""Initialize Event."""
super().__init__(config_entry, config_entry.runtime_data.client)
@@ -111,12 +112,12 @@ class BeoEvent(BeoEntity, EventEntity):
self.async_write_ha_state()
class BeoButtonEvent(BeoEvent):
class BangOlufsenButtonEvent(BangOlufsenEvent):
"""Event class for Button events."""
_attr_event_types = DEVICE_BUTTON_EVENTS
def __init__(self, config_entry: BeoConfigEntry, button_type: str) -> None:
def __init__(self, config_entry: BangOlufsenConfigEntry, button_type: str) -> None:
"""Initialize Button."""
super().__init__(config_entry)
@@ -145,14 +146,14 @@ class BeoButtonEvent(BeoEvent):
)
class BeoRemoteKeyEvent(BeoEvent):
class BangOlufsenRemoteKeyEvent(BangOlufsenEvent):
"""Event class for Beoremote One key events."""
_attr_event_types = BEO_REMOTE_KEY_EVENTS
def __init__(
self,
config_entry: BeoConfigEntry,
config_entry: BangOlufsenConfigEntry,
remote: PairedRemote,
key_type: str,
) -> None:
@@ -165,8 +166,8 @@ class BeoRemoteKeyEvent(BeoEvent):
self._attr_unique_id = f"{remote.serial_number}_{self._unique_id}_{key_type}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{remote.serial_number}_{self._unique_id}")},
name=f"{BeoModel.BEOREMOTE_ONE}-{remote.serial_number}-{self._unique_id}",
model=BeoModel.BEOREMOTE_ONE,
name=f"{BangOlufsenModel.BEOREMOTE_ONE}-{remote.serial_number}-{self._unique_id}",
model=BangOlufsenModel.BEOREMOTE_ONE,
serial_number=remote.serial_number,
sw_version=remote.app_version,
manufacturer=MANUFACTURER,

View File

@@ -69,11 +69,11 @@ from homeassistant.helpers.entity_platform import (
)
from homeassistant.util.dt import utcnow
from . import BeoConfigEntry
from . import BangOlufsenConfigEntry
from .const import (
BEO_REPEAT_FROM_HA,
BEO_REPEAT_TO_HA,
BEO_STATES,
BANG_OLUFSEN_REPEAT_FROM_HA,
BANG_OLUFSEN_REPEAT_TO_HA,
BANG_OLUFSEN_STATES,
BEOLINK_JOIN_SOURCES,
BEOLINK_JOIN_SOURCES_TO_UPPER,
CONF_BEOLINK_JID,
@@ -82,12 +82,12 @@ from .const import (
FALLBACK_SOURCES,
MANUFACTURER,
VALID_MEDIA_TYPES,
BeoAttribute,
BeoMediaType,
BeoSource,
BangOlufsenAttribute,
BangOlufsenMediaType,
BangOlufsenSource,
WebsocketNotification,
)
from .entity import BeoEntity
from .entity import BangOlufsenEntity
from .util import get_serial_number_from_jid
PARALLEL_UPDATES = 0
@@ -96,7 +96,7 @@ SCAN_INTERVAL = timedelta(seconds=30)
_LOGGER = logging.getLogger(__name__)
BEO_FEATURES = (
BANG_OLUFSEN_FEATURES = (
MediaPlayerEntityFeature.BROWSE_MEDIA
| MediaPlayerEntityFeature.CLEAR_PLAYLIST
| MediaPlayerEntityFeature.GROUPING
@@ -119,13 +119,15 @@ BEO_FEATURES = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: BeoConfigEntry,
config_entry: BangOlufsenConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up a Media Player entity from config entry."""
# Add MediaPlayer entity
async_add_entities(
new_entities=[BeoMediaPlayer(config_entry, config_entry.runtime_data.client)],
new_entities=[
BangOlufsenMediaPlayer(config_entry, config_entry.runtime_data.client)
],
update_before_add=True,
)
@@ -185,7 +187,7 @@ async def async_setup_entry(
)
class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
"""Representation of a media player."""
_attr_name = None
@@ -286,7 +288,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
queue_settings = await self._client.get_settings_queue(_request_timeout=5)
if queue_settings.repeat is not None:
self._attr_repeat = BEO_REPEAT_TO_HA[queue_settings.repeat]
self._attr_repeat = BANG_OLUFSEN_REPEAT_TO_HA[queue_settings.repeat]
if queue_settings.shuffle is not None:
self._attr_shuffle = queue_settings.shuffle
@@ -406,8 +408,8 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
# Check if source is line-in or optical and progress should be updated
if self._source_change.id in (
BeoSource.LINE_IN.id,
BeoSource.SPDIF.id,
BangOlufsenSource.LINE_IN.id,
BangOlufsenSource.SPDIF.id,
):
self._playback_progress = PlaybackProgress(progress=0)
@@ -448,8 +450,10 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
# Add Beolink self
self._beolink_attributes = {
BeoAttribute.BEOLINK: {
BeoAttribute.BEOLINK_SELF: {self.device_entry.name: self._beolink_jid}
BangOlufsenAttribute.BEOLINK: {
BangOlufsenAttribute.BEOLINK_SELF: {
self.device_entry.name: self._beolink_jid
}
}
}
@@ -457,12 +461,12 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
peers = await self._client.get_beolink_peers()
if len(peers) > 0:
self._beolink_attributes[BeoAttribute.BEOLINK][
BeoAttribute.BEOLINK_PEERS
self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
BangOlufsenAttribute.BEOLINK_PEERS
] = {}
for peer in peers:
self._beolink_attributes[BeoAttribute.BEOLINK][
BeoAttribute.BEOLINK_PEERS
self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
BangOlufsenAttribute.BEOLINK_PEERS
][peer.friendly_name] = peer.jid
# Add Beolink listeners / leader
@@ -484,8 +488,8 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
# Add self
group_members.append(self.entity_id)
self._beolink_attributes[BeoAttribute.BEOLINK][
BeoAttribute.BEOLINK_LEADER
self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
BangOlufsenAttribute.BEOLINK_LEADER
] = {
self._remote_leader.friendly_name: self._remote_leader.jid,
}
@@ -523,8 +527,8 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
beolink_listener.jid
)
break
self._beolink_attributes[BeoAttribute.BEOLINK][
BeoAttribute.BEOLINK_LISTENERS
self._beolink_attributes[BangOlufsenAttribute.BEOLINK][
BangOlufsenAttribute.BEOLINK_LISTENERS
] = beolink_listeners_attribute
self._attr_group_members = group_members
@@ -596,7 +600,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
@property
def supported_features(self) -> MediaPlayerEntityFeature:
"""Flag media player features that are supported."""
features = BEO_FEATURES
features = BANG_OLUFSEN_FEATURES
# Add seeking if supported by the current source
if self._source_change.is_seekable is True:
@@ -607,7 +611,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
@property
def state(self) -> MediaPlayerState:
"""Return the current state of the media player."""
return BEO_STATES[self._state]
return BANG_OLUFSEN_STATES[self._state]
@property
def volume_level(self) -> float | None:
@@ -627,10 +631,10 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
def media_content_type(self) -> MediaType | str | None:
"""Return the current media type."""
content_type = {
BeoSource.URI_STREAMER.id: MediaType.URL,
BeoSource.DEEZER.id: BeoMediaType.DEEZER,
BeoSource.TIDAL.id: BeoMediaType.TIDAL,
BeoSource.NET_RADIO.id: BeoMediaType.RADIO,
BangOlufsenSource.URI_STREAMER.id: MediaType.URL,
BangOlufsenSource.DEEZER.id: BangOlufsenMediaType.DEEZER,
BangOlufsenSource.TIDAL.id: BangOlufsenMediaType.TIDAL,
BangOlufsenSource.NET_RADIO.id: BangOlufsenMediaType.RADIO,
}
# Hard to determine content type.
if self._source_change.id in content_type:
@@ -761,7 +765,9 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
async def async_set_repeat(self, repeat: RepeatMode) -> None:
"""Set playback queues to repeat."""
await self._client.set_settings_queue(
play_queue_settings=PlayQueueSettings(repeat=BEO_REPEAT_FROM_HA[repeat])
play_queue_settings=PlayQueueSettings(
repeat=BANG_OLUFSEN_REPEAT_FROM_HA[repeat]
)
)
async def async_set_shuffle(self, shuffle: bool) -> None:
@@ -865,7 +871,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
self._volume.level.level + offset_volume, 100
)
if media_type == BeoMediaType.OVERLAY_TTS:
if media_type == BangOlufsenMediaType.OVERLAY_TTS:
# Bang & Olufsen cloud TTS
overlay_play_request.text_to_speech = (
OverlayPlayRequestTextToSpeechTextToSpeech(
@@ -882,14 +888,14 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
# The "provider" media_type may not be suitable for overlay all the time.
# Use it for now.
elif media_type == BeoMediaType.TTS:
elif media_type == BangOlufsenMediaType.TTS:
await self._client.post_overlay_play(
overlay_play_request=OverlayPlayRequest(
uri=Uri(location=media_id),
)
)
elif media_type == BeoMediaType.RADIO:
elif media_type == BangOlufsenMediaType.RADIO:
await self._client.run_provided_scene(
scene_properties=SceneProperties(
action_list=[
@@ -901,13 +907,13 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
)
)
elif media_type == BeoMediaType.FAVOURITE:
elif media_type == BangOlufsenMediaType.FAVOURITE:
await self._client.activate_preset(id=int(media_id))
elif media_type in (BeoMediaType.DEEZER, BeoMediaType.TIDAL):
elif media_type in (BangOlufsenMediaType.DEEZER, BangOlufsenMediaType.TIDAL):
try:
# Play Deezer flow.
if media_id == "flow" and media_type == BeoMediaType.DEEZER:
if media_id == "flow" and media_type == BangOlufsenMediaType.DEEZER:
deezer_id = None
if "id" in kwargs[ATTR_MEDIA_EXTRA]:

View File

@@ -11,7 +11,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.device_registry import DeviceEntry
from .const import DEVICE_BUTTONS, DOMAIN, BeoButtons, BeoModel
from .const import DEVICE_BUTTONS, DOMAIN, BangOlufsenButtons, BangOlufsenModel
def get_device(hass: HomeAssistant, unique_id: str) -> DeviceEntry:
@@ -40,16 +40,16 @@ async def get_remotes(client: MozartClient) -> list[PairedRemote]:
]
def get_device_buttons(model: BeoModel) -> list[str]:
def get_device_buttons(model: BangOlufsenModel) -> list[str]:
"""Get supported buttons for a given model."""
buttons = DEVICE_BUTTONS.copy()
# Beosound Premiere does not have a bluetooth button
if model == BeoModel.BEOSOUND_PREMIERE:
buttons.remove(BeoButtons.BLUETOOTH)
if model == BangOlufsenModel.BEOSOUND_PREMIERE:
buttons.remove(BangOlufsenButtons.BLUETOOTH)
# Beoconnect Core does not have any buttons
elif model == BeoModel.BEOCONNECT_CORE:
elif model == BangOlufsenModel.BEOCONNECT_CORE:
buttons = []
return buttons

View File

@@ -27,20 +27,20 @@ from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.util.enum import try_parse_enum
from .const import (
BEO_WEBSOCKET_EVENT,
BANG_OLUFSEN_WEBSOCKET_EVENT,
CONNECTION_STATUS,
DOMAIN,
EVENT_TRANSLATION_MAP,
BeoModel,
BangOlufsenModel,
WebsocketNotification,
)
from .entity import BeoBase
from .entity import BangOlufsenBase
from .util import get_device, get_remotes
_LOGGER = logging.getLogger(__name__)
class BeoWebsocket(BeoBase):
class BangOlufsenWebsocket(BangOlufsenBase):
"""The WebSocket listeners."""
def __init__(
@@ -48,7 +48,7 @@ class BeoWebsocket(BeoBase):
) -> None:
"""Initialize the WebSocket listeners."""
BeoBase.__init__(self, entry, client)
BangOlufsenBase.__init__(self, entry, client)
self.hass = hass
self._device = get_device(hass, self._unique_id)
@@ -178,7 +178,7 @@ class BeoWebsocket(BeoBase):
self.entry.entry_id
)
if device.serial_number is not None
and device.model == BeoModel.BEOREMOTE_ONE
and device.model == BangOlufsenModel.BEOREMOTE_ONE
]
# Get paired remotes from device
remote_serial_numbers = [
@@ -274,4 +274,4 @@ class BeoWebsocket(BeoBase):
}
_LOGGER.debug("%s", debug_notification)
self.hass.bus.async_fire(BEO_WEBSOCKET_EVENT, debug_notification)
self.hass.bus.async_fire(BANG_OLUFSEN_WEBSOCKET_EVENT, debug_notification)

View File

@@ -1,6 +1,6 @@
{
"common": {
"trigger_behavior_description_occupancy": "The behavior of the targeted occupancy sensors to trigger on.",
"trigger_behavior_description_presence": "The behavior of the targeted presence sensors to trigger on.",
"trigger_behavior_name": "Behavior"
},
"device_automation": {
@@ -336,17 +336,17 @@
"description": "Triggers after one or more occupancy sensors stop detecting occupancy.",
"fields": {
"behavior": {
"description": "[%key:component::binary_sensor::common::trigger_behavior_description_occupancy%]",
"description": "[%key:component::binary_sensor::common::trigger_behavior_description_presence%]",
"name": "[%key:component::binary_sensor::common::trigger_behavior_name%]"
}
},
"name": "Occupancy cleared"
},
"occupancy_detected": {
"description": "Triggers after one or more occupancy sensors start detecting occupancy.",
"description": "Triggers after one ore more occupancy sensors start detecting occupancy.",
"fields": {
"behavior": {
"description": "[%key:component::binary_sensor::common::trigger_behavior_description_occupancy%]",
"description": "[%key:component::binary_sensor::common::trigger_behavior_description_presence%]",
"name": "[%key:component::binary_sensor::common::trigger_behavior_name%]"
}
},

View File

@@ -15,11 +15,11 @@ occupancy_cleared:
target:
entity:
domain: binary_sensor
device_class: occupancy
device_class: presence
occupancy_detected:
fields: *trigger_common_fields
target:
entity:
domain: binary_sensor
device_class: occupancy
device_class: presence

View File

@@ -15,12 +15,12 @@
],
"quality_scale": "internal",
"requirements": [
"bleak==2.0.0",
"bleak==1.0.1",
"bleak-retry-connector==4.4.3",
"bluetooth-adapters==2.1.0",
"bluetooth-auto-recovery==1.5.3",
"bluetooth-data-tools==1.28.4",
"dbus-fast==3.1.2",
"habluetooth==5.8.0"
"habluetooth==5.7.0"
]
}

View File

@@ -407,8 +407,8 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return [
RTCIceServer(
urls=[
"stun:stun.home-assistant.io:3478",
"stun:stun.home-assistant.io:80",
"stun:stun.home-assistant.io:3478",
]
),
]

View File

@@ -71,7 +71,6 @@ class CloudClient(Interface):
self._google_config_init_lock = asyncio.Lock()
self._relayer_region: str | None = None
self._cloud_ice_servers_listener: Callable[[], None] | None = None
self._ice_servers: list[RTCIceServer] = []
@property
def base_path(self) -> Path:
@@ -118,11 +117,6 @@ class CloudClient(Interface):
"""Return the connected relayer region."""
return self._relayer_region
@property
def ice_servers(self) -> list[RTCIceServer]:
"""Return the current ICE servers."""
return self._ice_servers
async def get_alexa_config(self) -> alexa_config.CloudAlexaConfig:
"""Return Alexa config."""
if self._alexa_config is None:
@@ -209,8 +203,11 @@ class CloudClient(Interface):
ice_servers: list[RTCIceServer],
) -> Callable[[], None]:
"""Register cloud ice server."""
self._ice_servers = ice_servers
return async_register_ice_servers(self._hass, lambda: self._ice_servers)
def get_ice_servers() -> list[RTCIceServer]:
return ice_servers
return async_register_ice_servers(self._hass, get_ice_servers)
async def async_register_cloud_ice_servers_listener(
prefs: CloudPreferences,
@@ -271,7 +268,6 @@ class CloudClient(Interface):
async def logout_cleanups(self) -> None:
"""Cleanup some stuff after logout."""
self._ice_servers = []
await self.prefs.async_set_username(None)
if self._alexa_config:

View File

@@ -561,7 +561,7 @@ class BaseCloudLLMEntity(Entity):
"schema": _format_structured_output(
structure, chat_log.llm_api
),
"strict": False,
"strict": True,
},
}

View File

@@ -99,7 +99,6 @@ def async_setup(hass: HomeAssistant) -> None:
websocket_api.async_register_command(hass, websocket_hook_delete)
websocket_api.async_register_command(hass, websocket_remote_connect)
websocket_api.async_register_command(hass, websocket_remote_disconnect)
websocket_api.async_register_command(hass, websocket_webrtc_ice_servers)
websocket_api.async_register_command(hass, google_assistant_get)
websocket_api.async_register_command(hass, google_assistant_list)
@@ -1108,7 +1107,6 @@ async def alexa_sync(
@websocket_api.websocket_command({"type": "cloud/tts/info"})
@callback
def tts_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
@@ -1136,22 +1134,3 @@ def tts_info(
)
connection.send_result(msg["id"], {"languages": result})
@websocket_api.websocket_command(
{
vol.Required("type"): "cloud/webrtc/ice_servers",
}
)
@_require_cloud_login
@callback
def websocket_webrtc_ice_servers(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Handle get WebRTC ICE servers websocket command."""
connection.send_result(
msg["id"],
[server.to_dict() for server in hass.data[DATA_CLOUD].client.ice_servers],
)

View File

@@ -13,6 +13,6 @@
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["acme", "hass_nabucasa", "snitun"],
"requirements": ["hass-nabucasa==1.7.0"],
"requirements": ["hass-nabucasa==1.6.2"],
"single_config_entry": true
}

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0", "home-assistant-intents==2025.12.2"]
"requirements": ["hassil==3.4.0", "home-assistant-intents==2025.12.2"]
}

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.11", "deebot-client==17.0.0"]
"requirements": ["py-sucks==0.9.11", "deebot-client==16.4.0"]
}

View File

@@ -17,7 +17,7 @@ DEFAULT_TTS_MODEL = "eleven_multilingual_v2"
DEFAULT_STABILITY = 0.5
DEFAULT_SIMILARITY = 0.75
DEFAULT_STT_AUTO_LANGUAGE = False
DEFAULT_STT_MODEL = "scribe_v2"
DEFAULT_STT_MODEL = "scribe_v1"
DEFAULT_STYLE = 0
DEFAULT_USE_SPEAKER_BOOST = True
@@ -129,5 +129,4 @@ STT_LANGUAGES = [
STT_MODELS = {
"scribe_v1": "Scribe v1",
"scribe_v1_experimental": "Scribe v1 Experimental",
"scribe_v2": "Scribe v2 Realtime",
}

View File

@@ -34,6 +34,13 @@ BINARY_SENSOR_ENTITY_DESCRIPTIONS: tuple[
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.charging,
),
FressnapfTrackerBinarySensorDescription(
translation_key="deep_sleep",
key="deep_sleep_value",
device_class=BinarySensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: bool(data.deep_sleep_value),
),
)

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["fressnapftracker==0.2.0"]
"requirements": ["fressnapftracker==0.1.2"]
}

View File

@@ -45,5 +45,12 @@
}
}
}
},
"entity": {
"binary_sensor": {
"deep_sleep": {
"name": "Deep Sleep"
}
}
}
}

View File

@@ -23,5 +23,5 @@
"winter_mode": {}
},
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20251203.0"]
"requirements": ["home-assistant-frontend==20251202.0"]
}

View File

@@ -59,14 +59,9 @@
"user": "Add location"
},
"step": {
"location": {
"user": {
"data": {
"location": "[%key:common::config_flow::data::location%]",
"name": "[%key:common::config_flow::data::name%]"
},
"data_description": {
"location": "[%key:component::google_air_quality::config::step::user::data_description::location%]",
"name": "[%key:component::google_air_quality::config::step::user::data_description::name%]"
"location": "[%key:common::config_flow::data::location%]"
},
"description": "Select the coordinates for which you want to create an entry.",
"title": "Air quality data location"

View File

@@ -18,12 +18,10 @@ from homeassistant.components.notify import (
SERVICE_SEND_MESSAGE,
BaseNotificationService,
NotifyEntity,
NotifyEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
ATTR_ENTITY_ID,
ATTR_SUPPORTED_FEATURES,
CONF_ACTION,
CONF_ENTITIES,
CONF_SERVICE,
@@ -175,23 +173,14 @@ class NotifyGroup(GroupEntity, NotifyEntity):
async def async_send_message(self, message: str, title: str | None = None) -> None:
"""Send a message to all members of the group."""
data = {
ATTR_MESSAGE: message,
ATTR_ENTITY_ID: self._entity_ids,
}
# add title only if supported and provided
if (
title is not None
and self._attr_supported_features & NotifyEntityFeature.TITLE
):
data[ATTR_TITLE] = title
await self.hass.services.async_call(
NOTIFY_DOMAIN,
SERVICE_SEND_MESSAGE,
data,
{
ATTR_MESSAGE: message,
ATTR_TITLE: title,
ATTR_ENTITY_ID: self._entity_ids,
},
blocking=True,
context=self._context,
)
@@ -205,15 +194,3 @@ class NotifyGroup(GroupEntity, NotifyEntity):
for entity_id in self._entity_ids
if (state := self.hass.states.get(entity_id)) is not None
)
# Support title if all members support it
self._attr_supported_features |= NotifyEntityFeature.TITLE
for entity_id in self._entity_ids:
state = self.hass.states.get(entity_id)
if (
state is None
or not state.attributes.get(ATTR_SUPPORTED_FEATURES, 0)
& NotifyEntityFeature.TITLE
):
self._attr_supported_features &= ~NotifyEntityFeature.TITLE
break

View File

@@ -37,6 +37,7 @@ def get_device_list_classic(
login_response = api.login(config[CONF_USERNAME], config[CONF_PASSWORD])
# DEBUG: Log the actual response structure
except Exception as ex:
_LOGGER.error("DEBUG - Login response: %s", login_response)
raise ConfigEntryError(
f"Error communicating with Growatt API during login: {ex}"
) from ex

View File

@@ -113,6 +113,9 @@ class GrowattCoordinator(DataUpdateCoordinator[dict[str, Any]]):
min_settings = self.api.min_settings(self.device_id)
min_energy = self.api.min_energy(self.device_id)
except growattServer.GrowattV1ApiError as err:
_LOGGER.error(
"Error fetching min device data for %s: %s", self.device_id, err
)
raise UpdateFailed(f"Error fetching min device data: {err}") from err
min_info = {**min_details, **min_settings, **min_energy}
@@ -177,6 +180,7 @@ class GrowattCoordinator(DataUpdateCoordinator[dict[str, Any]]):
try:
return await self.hass.async_add_executor_job(self._sync_update_data)
except json.decoder.JSONDecodeError as err:
_LOGGER.error("Unable to fetch data from Growatt server: %s", err)
raise UpdateFailed(f"Error fetching data: {err}") from err
def get_currency(self):

View File

@@ -1,74 +0,0 @@
rules:
# Bronze
action-setup: done
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow:
status: todo
comment: data-descriptions missing
dependency-transparency: done
docs-actions: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: done
config-entry-unloading: done
docs-configuration-parameters:
status: todo
comment: Update server URL dropdown to show regional descriptions (e.g., 'China', 'United States') instead of raw URLs.
docs-installation-parameters: todo
entity-unavailable:
status: todo
comment: Replace bare Exception catches in __init__.py with specific growattServer exceptions.
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: todo
test-coverage: todo
# Gold
devices:
status: todo
comment: Add serial_number field to DeviceInfo in sensor, number, and switch platforms using device_id/serial_id.
diagnostics: todo
discovery-update-info: todo
discovery: todo
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: todo
entity-category:
status: todo
comment: Add EntityCategory.DIAGNOSTIC to temperature and other diagnostic sensors. Merge GrowattRequiredKeysMixin into GrowattSensorEntityDescription using kw_only=True.
entity-device-class:
status: todo
comment: Replace custom precision field with suggested_display_precision to preserve full data granularity.
entity-disabled-by-default: todo
entity-translations: todo
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: Integration does not raise repairable issues.
stale-devices: todo
# Platinum
async-dependency: todo
inject-websession: todo
strict-typing: todo

View File

@@ -161,7 +161,6 @@ EXTRA_PLACEHOLDERS = {
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED: HELP_URLS,
ISSUE_KEY_SYSTEM_FREE_SPACE: {
"more_info_free_space": "https://www.home-assistant.io/more-info/free-space",
"storage_url": "/config/storage",
},
ISSUE_KEY_ADDON_PWNED: {
"more_info_pwned": "https://www.home-assistant.io/more-info/pwned-passwords",

View File

@@ -130,7 +130,7 @@
"title": "Restart(s) required"
},
"issue_system_free_space": {
"description": "The data disk has only {free_space}GB free space left. This may cause issues with system stability and interfere with functionality such as backups and updates. Go to [storage]({storage_url}) to see what is taking up space or see [clear up storage]({more_info_free_space}) for tips on how to free up space.",
"description": "The data disk has only {free_space}GB free space left. This may cause issues with system stability and interfere with functionality such as backups and updates. See [clear up storage]({more_info_free_space}) for tips on how to free up space.",
"title": "Data disk is running low on free space"
},
"issue_system_multiple_data_disks": {

View File

@@ -7,6 +7,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["iometer==0.3.0"],
"requirements": ["iometer==0.2.0"],
"zeroconf": ["_iometer._tcp.local."]
}

View File

@@ -162,11 +162,8 @@ SUPPORTED_PLATFORMS_UI: Final = {
Platform.BINARY_SENSOR,
Platform.CLIMATE,
Platform.COVER,
Platform.DATE,
Platform.DATETIME,
Platform.LIGHT,
Platform.SWITCH,
Platform.TIME,
}
# Map KNX controller modes to HA modes. This list might not be complete.

View File

@@ -3,8 +3,8 @@
from __future__ import annotations
from datetime import date as dt_date
from typing import Any
from xknx import XKNX
from xknx.devices import DateDevice as XknxDateDevice
from xknx.dpt.dpt_11 import KNXDate as XKNXDate
@@ -18,10 +18,7 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
async_get_current_platform,
)
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.helpers.typing import ConfigType
@@ -29,14 +26,11 @@ from .const import (
CONF_RESPOND_TO_READ,
CONF_STATE_ADDRESS,
CONF_SYNC_STATE,
DOMAIN,
KNX_ADDRESS,
KNX_MODULE_KEY,
)
from .entity import KnxUiEntity, KnxUiEntityPlatformController, KnxYamlEntity
from .entity import KnxYamlEntity
from .knx_module import KNXModule
from .storage.const import CONF_ENTITY, CONF_GA_DATE
from .storage.util import ConfigExtractor
async def async_setup_entry(
@@ -46,36 +40,40 @@ async def async_setup_entry(
) -> None:
"""Set up entities for KNX platform."""
knx_module = hass.data[KNX_MODULE_KEY]
platform = async_get_current_platform()
knx_module.config_store.add_platform(
platform=Platform.DATE,
controller=KnxUiEntityPlatformController(
knx_module=knx_module,
entity_platform=platform,
entity_class=KnxUiDate,
),
config: list[ConfigType] = knx_module.config_yaml[Platform.DATE]
async_add_entities(
KNXDateEntity(knx_module, entity_config) for entity_config in config
)
entities: list[KnxYamlEntity | KnxUiEntity] = []
if yaml_platform_config := knx_module.config_yaml.get(Platform.DATE):
entities.extend(
KnxYamlDate(knx_module, entity_config)
for entity_config in yaml_platform_config
)
if ui_config := knx_module.config_store.data["entities"].get(Platform.DATE):
entities.extend(
KnxUiDate(knx_module, unique_id, config)
for unique_id, config in ui_config.items()
)
if entities:
async_add_entities(entities)
def _create_xknx_device(xknx: XKNX, config: ConfigType) -> XknxDateDevice:
"""Return a XKNX DateTime object to be used within XKNX."""
return XknxDateDevice(
xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
)
class _KNXDate(DateEntity, RestoreEntity):
class KNXDateEntity(KnxYamlEntity, DateEntity, RestoreEntity):
"""Representation of a KNX date."""
_device: XknxDateDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX time."""
super().__init__(
knx_module=knx_module,
device=_create_xknx_device(knx_module.xknx, config),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
async def async_added_to_hass(self) -> None:
"""Restore last state."""
await super().async_added_to_hass()
@@ -96,52 +94,3 @@ class _KNXDate(DateEntity, RestoreEntity):
async def async_set_value(self, value: dt_date) -> None:
"""Change the value."""
await self._device.set(value)
class KnxYamlDate(_KNXDate, KnxYamlEntity):
"""Representation of a KNX date configured from YAML."""
_device: XknxDateDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX date."""
super().__init__(
knx_module=knx_module,
device=XknxDateDevice(
knx_module.xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
class KnxUiDate(_KNXDate, KnxUiEntity):
"""Representation of a KNX date configured from the UI."""
_device: XknxDateDevice
def __init__(
self, knx_module: KNXModule, unique_id: str, config: dict[str, Any]
) -> None:
"""Initialize KNX date."""
super().__init__(
knx_module=knx_module,
unique_id=unique_id,
entity_config=config[CONF_ENTITY],
)
knx_conf = ConfigExtractor(config[DOMAIN])
self._device = XknxDateDevice(
knx_module.xknx,
name=config[CONF_ENTITY][CONF_NAME],
localtime=False,
group_address=knx_conf.get_write(CONF_GA_DATE),
group_address_state=knx_conf.get_state_and_passive(CONF_GA_DATE),
respond_to_read=knx_conf.get(CONF_RESPOND_TO_READ),
sync_state=knx_conf.get(CONF_SYNC_STATE),
)

View File

@@ -3,8 +3,8 @@
from __future__ import annotations
from datetime import datetime
from typing import Any
from xknx import XKNX
from xknx.devices import DateTimeDevice as XknxDateTimeDevice
from xknx.dpt.dpt_19 import KNXDateTime as XKNXDateTime
@@ -18,10 +18,7 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
async_get_current_platform,
)
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import dt as dt_util
@@ -30,14 +27,11 @@ from .const import (
CONF_RESPOND_TO_READ,
CONF_STATE_ADDRESS,
CONF_SYNC_STATE,
DOMAIN,
KNX_ADDRESS,
KNX_MODULE_KEY,
)
from .entity import KnxUiEntity, KnxUiEntityPlatformController, KnxYamlEntity
from .entity import KnxYamlEntity
from .knx_module import KNXModule
from .storage.const import CONF_ENTITY, CONF_GA_DATETIME
from .storage.util import ConfigExtractor
async def async_setup_entry(
@@ -47,36 +41,40 @@ async def async_setup_entry(
) -> None:
"""Set up entities for KNX platform."""
knx_module = hass.data[KNX_MODULE_KEY]
platform = async_get_current_platform()
knx_module.config_store.add_platform(
platform=Platform.DATETIME,
controller=KnxUiEntityPlatformController(
knx_module=knx_module,
entity_platform=platform,
entity_class=KnxUiDateTime,
),
config: list[ConfigType] = knx_module.config_yaml[Platform.DATETIME]
async_add_entities(
KNXDateTimeEntity(knx_module, entity_config) for entity_config in config
)
entities: list[KnxYamlEntity | KnxUiEntity] = []
if yaml_platform_config := knx_module.config_yaml.get(Platform.DATETIME):
entities.extend(
KnxYamlDateTime(knx_module, entity_config)
for entity_config in yaml_platform_config
)
if ui_config := knx_module.config_store.data["entities"].get(Platform.DATETIME):
entities.extend(
KnxUiDateTime(knx_module, unique_id, config)
for unique_id, config in ui_config.items()
)
if entities:
async_add_entities(entities)
def _create_xknx_device(xknx: XKNX, config: ConfigType) -> XknxDateTimeDevice:
"""Return a XKNX DateTime object to be used within XKNX."""
return XknxDateTimeDevice(
xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
)
class _KNXDateTime(DateTimeEntity, RestoreEntity):
class KNXDateTimeEntity(KnxYamlEntity, DateTimeEntity, RestoreEntity):
"""Representation of a KNX datetime."""
_device: XknxDateTimeDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX time."""
super().__init__(
knx_module=knx_module,
device=_create_xknx_device(knx_module.xknx, config),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
async def async_added_to_hass(self) -> None:
"""Restore last state."""
await super().async_added_to_hass()
@@ -101,52 +99,3 @@ class _KNXDateTime(DateTimeEntity, RestoreEntity):
async def async_set_value(self, value: datetime) -> None:
"""Change the value."""
await self._device.set(value.astimezone(dt_util.get_default_time_zone()))
class KnxYamlDateTime(_KNXDateTime, KnxYamlEntity):
"""Representation of a KNX datetime configured from YAML."""
_device: XknxDateTimeDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX datetime."""
super().__init__(
knx_module=knx_module,
device=XknxDateTimeDevice(
knx_module.xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
class KnxUiDateTime(_KNXDateTime, KnxUiEntity):
"""Representation of a KNX datetime configured from the UI."""
_device: XknxDateTimeDevice
def __init__(
self, knx_module: KNXModule, unique_id: str, config: dict[str, Any]
) -> None:
"""Initialize KNX datetime."""
super().__init__(
knx_module=knx_module,
unique_id=unique_id,
entity_config=config[CONF_ENTITY],
)
knx_conf = ConfigExtractor(config[DOMAIN])
self._device = XknxDateTimeDevice(
knx_module.xknx,
name=config[CONF_ENTITY][CONF_NAME],
localtime=False,
group_address=knx_conf.get_write(CONF_GA_DATETIME),
group_address_state=knx_conf.get_state_and_passive(CONF_GA_DATETIME),
respond_to_read=knx_conf.get(CONF_RESPOND_TO_READ),
sync_state=knx_conf.get(CONF_SYNC_STATE),
)

View File

@@ -11,7 +11,7 @@
"loggers": ["xknx", "xknxproject"],
"quality_scale": "silver",
"requirements": [
"xknx==3.12.0",
"xknx==3.11.0",
"xknxproject==3.8.2",
"knx-frontend==2025.10.31.195356"
],

View File

@@ -13,9 +13,6 @@ CONF_DPT: Final = "dpt"
CONF_GA_SENSOR: Final = "ga_sensor"
CONF_GA_SWITCH: Final = "ga_switch"
CONF_GA_DATE: Final = "ga_date"
CONF_GA_DATETIME: Final = "ga_datetime"
CONF_GA_TIME: Final = "ga_time"
# Climate
CONF_GA_TEMPERATURE_CURRENT: Final = "ga_temperature_current"

View File

@@ -46,8 +46,6 @@ from .const import (
CONF_GA_COLOR_TEMP,
CONF_GA_CONTROLLER_MODE,
CONF_GA_CONTROLLER_STATUS,
CONF_GA_DATE,
CONF_GA_DATETIME,
CONF_GA_FAN_SPEED,
CONF_GA_FAN_SWING,
CONF_GA_FAN_SWING_HORIZONTAL,
@@ -74,7 +72,6 @@ from .const import (
CONF_GA_SWITCH,
CONF_GA_TEMPERATURE_CURRENT,
CONF_GA_TEMPERATURE_TARGET,
CONF_GA_TIME,
CONF_GA_UP_DOWN,
CONF_GA_VALVE,
CONF_GA_WHITE_BRIGHTNESS,
@@ -202,24 +199,6 @@ COVER_KNX_SCHEMA = AllSerializeFirst(
),
)
DATE_KNX_SCHEMA = vol.Schema(
{
vol.Required(CONF_GA_DATE): GASelector(write_required=True, valid_dpt="11.001"),
vol.Optional(CONF_RESPOND_TO_READ, default=False): selector.BooleanSelector(),
vol.Optional(CONF_SYNC_STATE, default=True): SyncStateSelector(),
}
)
DATETIME_KNX_SCHEMA = vol.Schema(
{
vol.Required(CONF_GA_DATETIME): GASelector(
write_required=True, valid_dpt="19.001"
),
vol.Optional(CONF_RESPOND_TO_READ, default=False): selector.BooleanSelector(),
vol.Optional(CONF_SYNC_STATE, default=True): SyncStateSelector(),
}
)
@unique
class LightColorMode(StrEnum):
@@ -357,14 +336,6 @@ SWITCH_KNX_SCHEMA = vol.Schema(
},
)
TIME_KNX_SCHEMA = vol.Schema(
{
vol.Required(CONF_GA_TIME): GASelector(write_required=True, valid_dpt="10.001"),
vol.Optional(CONF_RESPOND_TO_READ, default=False): selector.BooleanSelector(),
vol.Optional(CONF_SYNC_STATE, default=True): SyncStateSelector(),
}
)
@unique
class ConfSetpointShiftMode(StrEnum):
@@ -511,11 +482,8 @@ KNX_SCHEMA_FOR_PLATFORM = {
Platform.BINARY_SENSOR: BINARY_SENSOR_KNX_SCHEMA,
Platform.CLIMATE: CLIMATE_KNX_SCHEMA,
Platform.COVER: COVER_KNX_SCHEMA,
Platform.DATE: DATE_KNX_SCHEMA,
Platform.DATETIME: DATETIME_KNX_SCHEMA,
Platform.LIGHT: LIGHT_KNX_SCHEMA,
Platform.SWITCH: SWITCH_KNX_SCHEMA,
Platform.TIME: TIME_KNX_SCHEMA,
}
ENTITY_STORE_DATA_SCHEMA: VolSchemaType = vol.All(

View File

@@ -176,10 +176,6 @@
"state_address": "State address",
"valid_dpts": "Valid DPTs"
},
"respond_to_read": {
"description": "Respond to GroupValueRead telegrams received to the configured send address.",
"label": "Respond to read"
},
"sync_state": {
"description": "Actively request state updates from KNX bus for state addresses.",
"options": {
@@ -442,24 +438,6 @@
}
}
},
"date": {
"description": "The KNX date platform is used as an interface to date objects.",
"knx": {
"ga_date": {
"description": "The group address of the date object.",
"label": "Date"
}
}
},
"datetime": {
"description": "The KNX datetime platform is used as an interface to date and time objects.",
"knx": {
"ga_datetime": {
"description": "The group address of the date and time object.",
"label": "Date and time"
}
}
},
"header": "Create new entity",
"light": {
"description": "The KNX light platform is used as an interface to dimming actuators, LED controllers, DALI gateways and similar.",
@@ -568,15 +546,10 @@
"invert": {
"description": "Invert payloads before processing or sending.",
"label": "Invert"
}
}
},
"time": {
"description": "The KNX time platform is used as an interface to time objects.",
"knx": {
"ga_time": {
"description": "The group address of the time object.",
"label": "Time"
},
"respond_to_read": {
"description": "Respond to GroupValueRead telegrams received to the configured send address.",
"label": "Respond to read"
}
}
},

View File

@@ -3,8 +3,8 @@
from __future__ import annotations
from datetime import time as dt_time
from typing import Any
from xknx import XKNX
from xknx.devices import TimeDevice as XknxTimeDevice
from xknx.dpt.dpt_10 import KNXTime as XknxTime
@@ -18,10 +18,7 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
async_get_current_platform,
)
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.helpers.typing import ConfigType
@@ -29,14 +26,11 @@ from .const import (
CONF_RESPOND_TO_READ,
CONF_STATE_ADDRESS,
CONF_SYNC_STATE,
DOMAIN,
KNX_ADDRESS,
KNX_MODULE_KEY,
)
from .entity import KnxUiEntity, KnxUiEntityPlatformController, KnxYamlEntity
from .entity import KnxYamlEntity
from .knx_module import KNXModule
from .storage.const import CONF_ENTITY, CONF_GA_TIME
from .storage.util import ConfigExtractor
async def async_setup_entry(
@@ -46,36 +40,40 @@ async def async_setup_entry(
) -> None:
"""Set up entities for KNX platform."""
knx_module = hass.data[KNX_MODULE_KEY]
platform = async_get_current_platform()
knx_module.config_store.add_platform(
platform=Platform.TIME,
controller=KnxUiEntityPlatformController(
knx_module=knx_module,
entity_platform=platform,
entity_class=KnxUiTime,
),
config: list[ConfigType] = knx_module.config_yaml[Platform.TIME]
async_add_entities(
KNXTimeEntity(knx_module, entity_config) for entity_config in config
)
entities: list[KnxYamlEntity | KnxUiEntity] = []
if yaml_platform_config := knx_module.config_yaml.get(Platform.TIME):
entities.extend(
KnxYamlTime(knx_module, entity_config)
for entity_config in yaml_platform_config
)
if ui_config := knx_module.config_store.data["entities"].get(Platform.TIME):
entities.extend(
KnxUiTime(knx_module, unique_id, config)
for unique_id, config in ui_config.items()
)
if entities:
async_add_entities(entities)
def _create_xknx_device(xknx: XKNX, config: ConfigType) -> XknxTimeDevice:
"""Return a XKNX DateTime object to be used within XKNX."""
return XknxTimeDevice(
xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
)
class _KNXTime(TimeEntity, RestoreEntity):
class KNXTimeEntity(KnxYamlEntity, TimeEntity, RestoreEntity):
"""Representation of a KNX time."""
_device: XknxTimeDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX time."""
super().__init__(
knx_module=knx_module,
device=_create_xknx_device(knx_module.xknx, config),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
async def async_added_to_hass(self) -> None:
"""Restore last state."""
await super().async_added_to_hass()
@@ -96,52 +94,3 @@ class _KNXTime(TimeEntity, RestoreEntity):
async def async_set_value(self, value: dt_time) -> None:
"""Change the value."""
await self._device.set(value)
class KnxYamlTime(_KNXTime, KnxYamlEntity):
"""Representation of a KNX time configured from YAML."""
_device: XknxTimeDevice
def __init__(self, knx_module: KNXModule, config: ConfigType) -> None:
"""Initialize a KNX time."""
super().__init__(
knx_module=knx_module,
device=XknxTimeDevice(
knx_module.xknx,
name=config[CONF_NAME],
localtime=False,
group_address=config[KNX_ADDRESS],
group_address_state=config.get(CONF_STATE_ADDRESS),
respond_to_read=config[CONF_RESPOND_TO_READ],
sync_state=config[CONF_SYNC_STATE],
),
)
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.remote_value.group_address)
class KnxUiTime(_KNXTime, KnxUiEntity):
"""Representation of a KNX time configured from the UI."""
_device: XknxTimeDevice
def __init__(
self, knx_module: KNXModule, unique_id: str, config: dict[str, Any]
) -> None:
"""Initialize KNX time."""
super().__init__(
knx_module=knx_module,
unique_id=unique_id,
entity_config=config[CONF_ENTITY],
)
knx_conf = ConfigExtractor(config[DOMAIN])
self._device = XknxTimeDevice(
knx_module.xknx,
name=config[CONF_ENTITY][CONF_NAME],
localtime=False,
group_address=knx_conf.get_write(CONF_GA_TIME),
group_address_state=knx_conf.get_state_and_passive(CONF_GA_TIME),
respond_to_read=knx_conf.get(CONF_RESPOND_TO_READ),
sync_state=knx_conf.get(CONF_SYNC_STATE),
)

View File

@@ -35,7 +35,6 @@ from homeassistant.helpers.aiohttp_client import async_create_clientsession
from .const import CONF_INSTALLATION_KEY, CONF_USE_BLUETOOTH, DOMAIN
from .coordinator import (
LaMarzoccoBluetoothUpdateCoordinator,
LaMarzoccoConfigEntry,
LaMarzoccoConfigUpdateCoordinator,
LaMarzoccoRuntimeData,
@@ -73,10 +72,38 @@ async def async_setup_entry(hass: HomeAssistant, entry: LaMarzoccoConfigEntry) -
client=create_client_session(hass),
)
try:
settings = await cloud_client.get_thing_settings(serial)
except AuthFail as ex:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="authentication_failed"
) from ex
except (RequestNotSuccessful, TimeoutError) as ex:
_LOGGER.debug(ex, exc_info=True)
raise ConfigEntryNotReady(
translation_domain=DOMAIN, translation_key="api_error"
) from ex
gateway_version = version.parse(
settings.firmwares[FirmwareType.GATEWAY].build_version
)
if gateway_version < version.parse("v5.0.9"):
# incompatible gateway firmware, create an issue
ir.async_create_issue(
hass,
DOMAIN,
"unsupported_gateway_firmware",
is_fixable=False,
severity=ir.IssueSeverity.ERROR,
translation_key="unsupported_gateway_firmware",
translation_placeholders={"gateway_version": str(gateway_version)},
)
# initialize Bluetooth
bluetooth_client: LaMarzoccoBluetoothClient | None = None
if entry.options.get(CONF_USE_BLUETOOTH, True) and (
token := entry.data.get(CONF_TOKEN)
token := (entry.data.get(CONF_TOKEN) or settings.ble_auth_token)
):
if CONF_MAC not in entry.data:
for discovery_info in async_discovered_service_info(hass):
@@ -118,44 +145,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: LaMarzoccoConfigEntry) -
_LOGGER.info(
"Bluetooth device not found during lamarzocco setup, continuing with cloud only"
)
try:
settings = await cloud_client.get_thing_settings(serial)
except AuthFail as ex:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="authentication_failed"
) from ex
except (RequestNotSuccessful, TimeoutError) as ex:
_LOGGER.debug(ex, exc_info=True)
if not bluetooth_client:
raise ConfigEntryNotReady(
translation_domain=DOMAIN, translation_key="api_error"
) from ex
_LOGGER.debug("Cloud failed, continuing with Bluetooth only", exc_info=True)
else:
gateway_version = version.parse(
settings.firmwares[FirmwareType.GATEWAY].build_version
)
if gateway_version < version.parse("v5.0.9"):
# incompatible gateway firmware, create an issue
ir.async_create_issue(
hass,
DOMAIN,
"unsupported_gateway_firmware",
is_fixable=False,
severity=ir.IssueSeverity.ERROR,
translation_key="unsupported_gateway_firmware",
translation_placeholders={"gateway_version": str(gateway_version)},
)
# Update BLE Token if exists
if settings.ble_auth_token:
hass.config_entries.async_update_entry(
entry,
data={
**entry.data,
CONF_TOKEN: settings.ble_auth_token,
},
)
device = LaMarzoccoMachine(
serial_number=entry.unique_id,
@@ -164,7 +153,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: LaMarzoccoConfigEntry) -
)
coordinators = LaMarzoccoRuntimeData(
LaMarzoccoConfigUpdateCoordinator(hass, entry, device),
LaMarzoccoConfigUpdateCoordinator(hass, entry, device, cloud_client),
LaMarzoccoSettingsUpdateCoordinator(hass, entry, device),
LaMarzoccoScheduleUpdateCoordinator(hass, entry, device),
LaMarzoccoStatisticsUpdateCoordinator(hass, entry, device),
@@ -177,16 +166,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: LaMarzoccoConfigEntry) -
coordinators.statistics_coordinator.async_config_entry_first_refresh(),
)
# bt coordinator only if bluetooth client is available
# and after the initial refresh of the config coordinator
# to fetch only if the others failed
if bluetooth_client:
bluetooth_coordinator = LaMarzoccoBluetoothUpdateCoordinator(
hass, entry, device
)
await bluetooth_coordinator.async_config_entry_first_refresh()
coordinators.bluetooth_coordinator = bluetooth_coordinator
entry.runtime_data = coordinators
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -6,7 +6,7 @@ from typing import cast
from pylamarzocco import LaMarzoccoMachine
from pylamarzocco.const import BackFlushStatus, MachineState, ModelName, WidgetType
from pylamarzocco.models import BackFlush, MachineStatus, NoWater
from pylamarzocco.models import BackFlush, MachineStatus
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -39,15 +39,8 @@ ENTITIES: tuple[LaMarzoccoBinarySensorEntityDescription, ...] = (
key="water_tank",
translation_key="water_tank",
device_class=BinarySensorDeviceClass.PROBLEM,
is_on_fn=(
lambda machine: cast(
NoWater, machine.dashboard.config[WidgetType.CM_NO_WATER]
).allarm
if WidgetType.CM_NO_WATER in machine.dashboard.config
else False
),
is_on_fn=lambda machine: WidgetType.CM_NO_WATER in machine.dashboard.config,
entity_category=EntityCategory.DIAGNOSTIC,
bt_offline_mode=True,
),
LaMarzoccoBinarySensorEntityDescription(
key="brew_active",
@@ -100,9 +93,7 @@ async def async_setup_entry(
coordinator = entry.runtime_data.config_coordinator
async_add_entities(
LaMarzoccoBinarySensorEntity(
coordinator, description, entry.runtime_data.bluetooth_coordinator
)
LaMarzoccoBinarySensorEntity(coordinator, description)
for description in ENTITIES
if description.supported_fn(coordinator)
)

View File

@@ -10,12 +10,8 @@ from datetime import timedelta
import logging
from typing import Any
from pylamarzocco import LaMarzoccoMachine
from pylamarzocco.exceptions import (
AuthFail,
BluetoothConnectionFailed,
RequestNotSuccessful,
)
from pylamarzocco import LaMarzoccoCloudClient, LaMarzoccoMachine
from pylamarzocco.exceptions import AuthFail, RequestNotSuccessful
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EVENT_HOMEASSISTANT_STOP
@@ -40,7 +36,6 @@ class LaMarzoccoRuntimeData:
settings_coordinator: LaMarzoccoSettingsUpdateCoordinator
schedule_coordinator: LaMarzoccoScheduleUpdateCoordinator
statistics_coordinator: LaMarzoccoStatisticsUpdateCoordinator
bluetooth_coordinator: LaMarzoccoBluetoothUpdateCoordinator | None = None
type LaMarzoccoConfigEntry = ConfigEntry[LaMarzoccoRuntimeData]
@@ -51,13 +46,14 @@ class LaMarzoccoUpdateCoordinator(DataUpdateCoordinator[None]):
_default_update_interval = SCAN_INTERVAL
config_entry: LaMarzoccoConfigEntry
update_success = False
_websocket_task: Task | None = None
def __init__(
self,
hass: HomeAssistant,
entry: LaMarzoccoConfigEntry,
device: LaMarzoccoMachine,
cloud_client: LaMarzoccoCloudClient | None = None,
) -> None:
"""Initialize coordinator."""
super().__init__(
@@ -68,7 +64,7 @@ class LaMarzoccoUpdateCoordinator(DataUpdateCoordinator[None]):
update_interval=self._default_update_interval,
)
self.device = device
self._websocket_task: Task | None = None
self.cloud_client = cloud_client
@property
def websocket_terminated(self) -> bool:
@@ -85,28 +81,14 @@ class LaMarzoccoUpdateCoordinator(DataUpdateCoordinator[None]):
await func()
except AuthFail as ex:
_LOGGER.debug("Authentication failed", exc_info=True)
self.update_success = False
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="authentication_failed"
) from ex
except RequestNotSuccessful as ex:
_LOGGER.debug(ex, exc_info=True)
self.update_success = False
# if no bluetooth coordinator, this is a fatal error
# otherwise, bluetooth may still work
if not self.device.bluetooth_client_available:
raise UpdateFailed(
translation_domain=DOMAIN, translation_key="api_error"
) from ex
except BluetoothConnectionFailed as err:
self.update_success = False
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="bluetooth_connection_failed",
) from err
else:
self.update_success = True
_LOGGER.debug("Current status: %s", self.device.dashboard.to_dict())
raise UpdateFailed(
translation_domain=DOMAIN, translation_key="api_error"
) from ex
async def _async_setup(self) -> None:
"""Set up coordinator."""
@@ -127,9 +109,11 @@ class LaMarzoccoUpdateCoordinator(DataUpdateCoordinator[None]):
class LaMarzoccoConfigUpdateCoordinator(LaMarzoccoUpdateCoordinator):
"""Class to handle fetching data from the La Marzocco API centrally."""
cloud_client: LaMarzoccoCloudClient
async def _internal_async_setup(self) -> None:
"""Set up the coordinator."""
await self.device.ensure_token_valid()
await self.cloud_client.async_get_access_token()
await self.device.get_dashboard()
_LOGGER.debug("Current status: %s", self.device.dashboard.to_dict())
@@ -137,7 +121,7 @@ class LaMarzoccoConfigUpdateCoordinator(LaMarzoccoUpdateCoordinator):
"""Fetch data from API endpoint."""
# ensure token stays valid; does nothing if token is still valid
await self.device.ensure_token_valid()
await self.cloud_client.async_get_access_token()
# Only skip websocket reconnection if it's currently connected and the task is still running
if self.device.websocket.connected and not self.websocket_terminated:
@@ -209,19 +193,3 @@ class LaMarzoccoStatisticsUpdateCoordinator(LaMarzoccoUpdateCoordinator):
"""Fetch data from API endpoint."""
await self.device.get_coffee_and_flush_counter()
_LOGGER.debug("Current statistics: %s", self.device.statistics.to_dict())
class LaMarzoccoBluetoothUpdateCoordinator(LaMarzoccoUpdateCoordinator):
"""Class to handle fetching data from the La Marzocco Bluetooth API centrally."""
async def _internal_async_setup(self) -> None:
"""Initial setup for Bluetooth coordinator."""
await self.device.get_model_info_from_bluetooth()
async def _internal_async_update_data(self) -> None:
"""Fetch data from Bluetooth endpoint."""
# if the websocket is connected and the machine is connected to the cloud
# skip bluetooth update, because we get push updates
if self.device.websocket.connected and self.device.dashboard.connected:
return
await self.device.get_dashboard_from_bluetooth()

View File

@@ -17,10 +17,7 @@ from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import (
LaMarzoccoBluetoothUpdateCoordinator,
LaMarzoccoUpdateCoordinator,
)
from .coordinator import LaMarzoccoUpdateCoordinator
@dataclass(frozen=True, kw_only=True)
@@ -29,7 +26,6 @@ class LaMarzoccoEntityDescription(EntityDescription):
available_fn: Callable[[LaMarzoccoUpdateCoordinator], bool] = lambda _: True
supported_fn: Callable[[LaMarzoccoUpdateCoordinator], bool] = lambda _: True
bt_offline_mode: bool = False
class LaMarzoccoBaseEntity(
@@ -49,19 +45,14 @@ class LaMarzoccoBaseEntity(
super().__init__(coordinator)
device = coordinator.device
self._attr_unique_id = f"{device.serial_number}_{key}"
sw_version = (
device.settings.firmwares[FirmwareType.MACHINE].build_version
if FirmwareType.MACHINE in device.settings.firmwares
else None
)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.serial_number)},
name=device.dashboard.name or self.coordinator.config_entry.title,
name=device.dashboard.name,
manufacturer="La Marzocco",
model=device.dashboard.model_name.value,
model_id=device.dashboard.model_code.value,
serial_number=device.serial_number,
sw_version=sw_version,
sw_version=device.settings.firmwares[FirmwareType.MACHINE].build_version,
)
connections: set[tuple[str, str]] = set()
if coordinator.config_entry.data.get(CONF_ADDRESS):
@@ -86,12 +77,8 @@ class LaMarzoccoBaseEntity(
if WidgetType.CM_MACHINE_STATUS in self.coordinator.device.dashboard.config
else MachineState.OFF
)
return (
super().available
and not (
self._unavailable_when_machine_off and machine_state is MachineState.OFF
)
and self.coordinator.update_success
return super().available and not (
self._unavailable_when_machine_off and machine_state is MachineState.OFF
)
@@ -103,11 +90,6 @@ class LaMarzoccoEntity(LaMarzoccoBaseEntity):
@property
def available(self) -> bool:
"""Return True if entity is available."""
if (
self.entity_description.bt_offline_mode
and self.bluetooth_coordinator is not None
):
return self.bluetooth_coordinator.last_update_success
if super().available:
return self.entity_description.available_fn(self.coordinator)
return False
@@ -116,17 +98,7 @@ class LaMarzoccoEntity(LaMarzoccoBaseEntity):
self,
coordinator: LaMarzoccoUpdateCoordinator,
entity_description: LaMarzoccoEntityDescription,
bluetooth_coordinator: LaMarzoccoBluetoothUpdateCoordinator | None = None,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator, entity_description.key)
self.entity_description = entity_description
self.bluetooth_coordinator = bluetooth_coordinator
async def async_added_to_hass(self) -> None:
"""Handle when entity is added to hass."""
await super().async_added_to_hass()
if self.bluetooth_coordinator is not None:
self.async_on_remove(
self.bluetooth_coordinator.async_add_listener(self.async_write_ha_state)
)

View File

@@ -58,7 +58,6 @@ ENTITIES: tuple[LaMarzoccoNumberEntityDescription, ...] = (
CoffeeBoiler, machine.dashboard.config[WidgetType.CM_COFFEE_BOILER]
).target_temperature
),
bt_offline_mode=True,
),
LaMarzoccoNumberEntityDescription(
key="steam_temp",
@@ -79,7 +78,6 @@ ENTITIES: tuple[LaMarzoccoNumberEntityDescription, ...] = (
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.GS3_AV, ModelName.GS3_MP)
),
bt_offline_mode=True,
),
LaMarzoccoNumberEntityDescription(
key="smart_standby_time",
@@ -98,7 +96,6 @@ ENTITIES: tuple[LaMarzoccoNumberEntityDescription, ...] = (
)
),
native_value_fn=lambda machine: machine.schedule.smart_wake_up_sleep.smart_stand_by_minutes,
bt_offline_mode=True,
),
LaMarzoccoNumberEntityDescription(
key="preinfusion_off",
@@ -229,14 +226,13 @@ async def async_setup_entry(
) -> None:
"""Set up number entities."""
coordinator = entry.runtime_data.config_coordinator
async_add_entities(
LaMarzoccoNumberEntity(
coordinator, description, entry.runtime_data.bluetooth_coordinator
)
entities: list[NumberEntity] = [
LaMarzoccoNumberEntity(coordinator, description)
for description in ENTITIES
if description.supported_fn(coordinator)
)
]
async_add_entities(entities)
class LaMarzoccoNumberEntity(LaMarzoccoEntity, NumberEntity):

View File

@@ -80,7 +80,6 @@ ENTITIES: tuple[LaMarzoccoSelectEntityDescription, ...] = (
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.LINEA_MINI_R, ModelName.LINEA_MICRA)
),
bt_offline_mode=True,
),
LaMarzoccoSelectEntityDescription(
key="prebrew_infusion_select",
@@ -129,9 +128,7 @@ async def async_setup_entry(
coordinator = entry.runtime_data.config_coordinator
async_add_entities(
LaMarzoccoSelectEntity(
coordinator, description, entry.runtime_data.bluetooth_coordinator
)
LaMarzoccoSelectEntity(coordinator, description)
for description in ENTITIES
if description.supported_fn(coordinator)
)

View File

@@ -183,9 +183,6 @@
"auto_on_off_error": {
"message": "Error while setting auto on/off to {state} for {id}"
},
"bluetooth_connection_failed": {
"message": "Error while connecting to machine via Bluetooth"
},
"button_error": {
"message": "Error while executing button {key}"
},

View File

@@ -50,7 +50,6 @@ ENTITIES: tuple[LaMarzoccoSwitchEntityDescription, ...] = (
).mode
is MachineMode.BREWING_MODE
),
bt_offline_mode=True,
),
LaMarzoccoSwitchEntityDescription(
key="steam_boiler_enable",
@@ -66,7 +65,6 @@ ENTITIES: tuple[LaMarzoccoSwitchEntityDescription, ...] = (
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.LINEA_MINI_R, ModelName.LINEA_MICRA)
),
bt_offline_mode=True,
),
LaMarzoccoSwitchEntityDescription(
key="steam_boiler_enable",
@@ -82,7 +80,6 @@ ENTITIES: tuple[LaMarzoccoSwitchEntityDescription, ...] = (
lambda coordinator: coordinator.device.dashboard.model_name
not in (ModelName.LINEA_MINI_R, ModelName.LINEA_MICRA)
),
bt_offline_mode=True,
),
LaMarzoccoSwitchEntityDescription(
key="smart_standby_enabled",
@@ -94,7 +91,6 @@ ENTITIES: tuple[LaMarzoccoSwitchEntityDescription, ...] = (
minutes=machine.schedule.smart_wake_up_sleep.smart_stand_by_minutes,
),
is_on_fn=lambda machine: machine.schedule.smart_wake_up_sleep.smart_stand_by_enabled,
bt_offline_mode=True,
),
)
@@ -110,9 +106,7 @@ async def async_setup_entry(
entities: list[SwitchEntity] = []
entities.extend(
LaMarzoccoSwitchEntity(
coordinator, description, entry.runtime_data.bluetooth_coordinator
)
LaMarzoccoSwitchEntity(coordinator, description)
for description in ENTITIES
if description.supported_fn(coordinator)
)

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_push",
"loggers": ["letpot"],
"quality_scale": "silver",
"requirements": ["letpot==0.6.4"]
"requirements": ["letpot==0.6.3"]
}

View File

@@ -1 +0,0 @@
"""Virtual integration: Levoit."""

View File

@@ -1,6 +0,0 @@
{
"domain": "levoit",
"name": "Levoit",
"integration_type": "virtual",
"supported_by": "vesync"
}

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from datetime import timedelta
import logging
from types import MappingProxyType
from librehardwaremonitor_api import (
LibreHardwareMonitorClient,
@@ -54,11 +55,15 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
device_entries: list[DeviceEntry] = dr.async_entries_for_config_entry(
registry=dr.async_get(self.hass), config_entry_id=config_entry.entry_id
)
self._previous_devices: dict[DeviceId, DeviceName] = {
DeviceId(next(iter(device.identifiers))[1]): DeviceName(device.name)
for device in device_entries
if device.identifiers and device.name
}
self._previous_devices: MappingProxyType[DeviceId, DeviceName] = (
MappingProxyType(
{
DeviceId(next(iter(device.identifiers))[1]): DeviceName(device.name)
for device in device_entries
if device.identifiers and device.name
}
)
)
async def _async_update_data(self) -> LibreHardwareMonitorData:
try:
@@ -70,9 +75,7 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
except LibreHardwareMonitorNoDevicesError as err:
raise UpdateFailed("No sensor data available, will retry") from err
await self._async_handle_changes_in_devices(
dict(lhm_data.main_device_ids_and_names)
)
await self._async_handle_changes_in_devices(lhm_data.main_device_ids_and_names)
return lhm_data
@@ -89,23 +92,20 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
)
async def _async_handle_changes_in_devices(
self, detected_devices: dict[DeviceId, DeviceName]
self, detected_devices: MappingProxyType[DeviceId, DeviceName]
) -> None:
"""Handle device changes by deleting devices from / adding devices to Home Assistant."""
detected_devices = {
DeviceId(f"{self.config_entry.entry_id}_{detected_id}"): device_name
for detected_id, device_name in detected_devices.items()
}
previous_device_ids = set(self._previous_devices.keys())
detected_device_ids = set(detected_devices.keys())
_LOGGER.debug("Previous device_ids: %s", previous_device_ids)
_LOGGER.debug("Detected device_ids: %s", detected_device_ids)
if previous_device_ids == detected_device_ids:
return
if self.data is None:
# initial update during integration startup
self._previous_devices = detected_devices # type: ignore[unreachable]
return
if orphaned_devices := previous_device_ids - detected_device_ids:
_LOGGER.warning(
"Device(s) no longer available, will be removed: %s",
@@ -114,21 +114,13 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
device_registry = dr.async_get(self.hass)
for device_id in orphaned_devices:
if device := device_registry.async_get_device(
identifiers={(DOMAIN, device_id)}
identifiers={(DOMAIN, f"{self.config_entry.entry_id}_{device_id}")}
):
_LOGGER.debug(
"Removing device: %s", self._previous_devices[device_id]
)
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
if self.data is None:
# initial update during integration startup
self._previous_devices = detected_devices # type: ignore[unreachable]
return
if new_devices := detected_device_ids - previous_device_ids:
_LOGGER.warning(
"New Device(s) detected, reload integration to add them to Home Assistant: %s",

View File

@@ -86,12 +86,6 @@
"current_phase": {
"default": "mdi:state-machine"
},
"door_closed_events": {
"default": "mdi:door-closed"
},
"door_open_events": {
"default": "mdi:door-open"
},
"esa_opt_out_state": {
"default": "mdi:home-lightning-bolt"
},

View File

@@ -1488,30 +1488,4 @@ DISCOVERY_SCHEMAS = [
entity_class=MatterSensor,
required_attributes=(clusters.ServiceArea.Attributes.EstimatedEndTime,),
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=MatterSensorEntityDescription(
key="DoorLockDoorOpenEvents",
translation_key="door_open_events",
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL_INCREASING,
),
entity_class=MatterSensor,
required_attributes=(clusters.DoorLock.Attributes.DoorOpenEvents,),
featuremap_contains=clusters.DoorLock.Bitmaps.Feature.kDoorPositionSensor,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=MatterSensorEntityDescription(
key="DoorLockDoorClosedEvents",
translation_key="door_closed_events",
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL_INCREASING,
),
entity_class=MatterSensor,
required_attributes=(clusters.DoorLock.Attributes.DoorClosedEvents,),
featuremap_contains=clusters.DoorLock.Bitmaps.Feature.kDoorPositionSensor,
),
]

View File

@@ -375,12 +375,6 @@
"current_phase": {
"name": "Current phase"
},
"door_closed_events": {
"name": "Door closed events"
},
"door_open_events": {
"name": "Door open events"
},
"energy_exported": {
"name": "Energy exported"
},

View File

@@ -19,5 +19,5 @@
"documentation": "https://www.home-assistant.io/integrations/nest",
"iot_class": "cloud_push",
"loggers": ["google_nest_sdm"],
"requirements": ["google-nest-sdm==9.1.2"]
"requirements": ["google-nest-sdm==9.1.1"]
}

View File

@@ -4,7 +4,7 @@
"codeowners": ["@gjohansson-ST"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/nordpool",
"integration_type": "service",
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["pynordpool"],
"quality_scale": "platinum",

View File

@@ -9,7 +9,7 @@
"iot_class": "local_push",
"loggers": ["aioonkyo"],
"quality_scale": "bronze",
"requirements": ["aioonkyo==0.4.0"],
"requirements": ["aioonkyo==0.3.0"],
"ssdp": [
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",

View File

@@ -10,7 +10,6 @@
"config_flow": true,
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/oralb",
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["oralb_ble"],
"requirements": ["oralb-ble==0.17.6"]

View File

@@ -162,7 +162,7 @@ class PingDataSubProcess(PingData):
if pinger:
with suppress(TypeError, ProcessLookupError):
pinger.kill()
await pinger.kill() # type: ignore[func-returns-value]
del pinger
return None

View File

@@ -251,7 +251,13 @@ class PlaystationNetworkFriendDataCoordinator(
def _update_data(self) -> PlaystationNetworkData:
"""Update friend status data."""
try:
presence = self.user.get_presence()
return PlaystationNetworkData(
username=self.user.online_id,
account_id=self.user.account_id,
presence=self.user.get_presence(),
profile=self.profile,
trophy_summary=self.user.trophy_summary(),
)
except PSNAWPForbiddenError as error:
raise UpdateFailed(
translation_domain=DOMAIN,
@@ -261,19 +267,6 @@ class PlaystationNetworkFriendDataCoordinator(
except PSNAWPError:
raise
try:
trophy_summary = self.user.trophy_summary()
except PSNAWPForbiddenError:
trophy_summary = None
return PlaystationNetworkData(
username=self.user.online_id,
account_id=self.user.account_id,
profile=self.profile,
presence=presence,
trophy_summary=trophy_summary,
)
async def update_data(self) -> PlaystationNetworkData:
"""Update friend status data."""
return await self.hass.async_add_executor_job(self._update_data)

View File

@@ -17,12 +17,7 @@ from .coordinator import PooldoseConfigEntry, PooldoseCoordinator
_LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,
]
PLATFORMS: list[Platform] = [Platform.BINARY_SENSOR, Platform.SENSOR, Platform.SWITCH]
async def async_migrate_entry(hass: HomeAssistant, entry: PooldoseConfigEntry) -> bool:

View File

@@ -68,35 +68,6 @@
}
}
},
"number": {
"cl_target": {
"default": "mdi:pool"
},
"ofa_cl_lower": {
"default": "mdi:arrow-down-bold"
},
"ofa_cl_upper": {
"default": "mdi:arrow-up-bold"
},
"ofa_orp_lower": {
"default": "mdi:arrow-down-bold"
},
"ofa_orp_upper": {
"default": "mdi:arrow-up-bold"
},
"ofa_ph_lower": {
"default": "mdi:arrow-down-bold"
},
"ofa_ph_upper": {
"default": "mdi:arrow-up-bold"
},
"orp_target": {
"default": "mdi:water-check"
},
"ph_target": {
"default": "mdi:ph"
}
},
"sensor": {
"cl": {
"default": "mdi:pool"

View File

@@ -1,142 +0,0 @@
"""Number entities for the Seko PoolDose integration."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Any, cast
from homeassistant.components.number import (
NumberDeviceClass,
NumberEntity,
NumberEntityDescription,
)
from homeassistant.const import (
CONCENTRATION_PARTS_PER_MILLION,
EntityCategory,
UnitOfElectricPotential,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PooldoseConfigEntry
from .entity import PooldoseEntity
if TYPE_CHECKING:
from .coordinator import PooldoseCoordinator
_LOGGER = logging.getLogger(__name__)
NUMBER_DESCRIPTIONS: tuple[NumberEntityDescription, ...] = (
NumberEntityDescription(
key="ph_target",
translation_key="ph_target",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.PH,
),
NumberEntityDescription(
key="orp_target",
translation_key="orp_target",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.MILLIVOLT,
),
NumberEntityDescription(
key="cl_target",
translation_key="cl_target",
entity_category=EntityCategory.CONFIG,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
),
NumberEntityDescription(
key="ofa_ph_lower",
translation_key="ofa_ph_lower",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.PH,
),
NumberEntityDescription(
key="ofa_ph_upper",
translation_key="ofa_ph_upper",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.PH,
),
NumberEntityDescription(
key="ofa_orp_lower",
translation_key="ofa_orp_lower",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.MILLIVOLT,
),
NumberEntityDescription(
key="ofa_orp_upper",
translation_key="ofa_orp_upper",
entity_category=EntityCategory.CONFIG,
device_class=NumberDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.MILLIVOLT,
),
NumberEntityDescription(
key="ofa_cl_lower",
translation_key="ofa_cl_lower",
entity_category=EntityCategory.CONFIG,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
),
NumberEntityDescription(
key="ofa_cl_upper",
translation_key="ofa_cl_upper",
entity_category=EntityCategory.CONFIG,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
),
)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: PooldoseConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PoolDose number entities from a config entry."""
if TYPE_CHECKING:
assert config_entry.unique_id is not None
coordinator = config_entry.runtime_data
number_data = coordinator.data.get("number", {})
serial_number = config_entry.unique_id
async_add_entities(
PooldoseNumber(coordinator, serial_number, coordinator.device_info, description)
for description in NUMBER_DESCRIPTIONS
if description.key in number_data
)
class PooldoseNumber(PooldoseEntity, NumberEntity):
"""Number entity for the Seko PoolDose Python API."""
def __init__(
self,
coordinator: PooldoseCoordinator,
serial_number: str,
device_info: Any,
description: NumberEntityDescription,
) -> None:
"""Initialize the number."""
super().__init__(coordinator, serial_number, device_info, description, "number")
self._async_update_attrs()
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self._async_update_attrs()
super()._handle_coordinator_update()
def _async_update_attrs(self) -> None:
"""Update number attributes."""
data = cast(dict, self.get_data())
self._attr_native_value = data["value"]
self._attr_native_min_value = data["min"]
self._attr_native_max_value = data["max"]
self._attr_native_step = data["step"]
async def async_set_native_value(self, value: float) -> None:
"""Set new value."""
await self.coordinator.client.set_number(self.entity_description.key, value)
self._attr_native_value = value
self.async_write_ha_state()

View File

@@ -68,35 +68,6 @@
"name": "Auxiliary relay 3 status"
}
},
"number": {
"cl_target": {
"name": "Chlorine target"
},
"ofa_cl_lower": {
"name": "Chlorine overfeed alarm lower limit"
},
"ofa_cl_upper": {
"name": "Chlorine overfeed alarm upper limit"
},
"ofa_orp_lower": {
"name": "ORP overfeed alarm lower limit"
},
"ofa_orp_upper": {
"name": "ORP overfeed alarm upper limit"
},
"ofa_ph_lower": {
"name": "pH overfeed alarm lower limit"
},
"ofa_ph_upper": {
"name": "pH overfeed alarm upper limit"
},
"orp_target": {
"name": "ORP target"
},
"ph_target": {
"name": "pH target"
}
},
"sensor": {
"cl": {
"name": "Chlorine"

View File

@@ -20,5 +20,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.17.0"]
"requirements": ["reolink-aio==0.16.6"]
}

View File

@@ -20,7 +20,7 @@
"loggers": ["roborock"],
"quality_scale": "silver",
"requirements": [
"python-roborock==3.9.3",
"python-roborock==3.8.4",
"vacuum-map-parser-roborock==0.1.4"
]
}

View File

@@ -1,5 +1,6 @@
"""Roborock storage."""
import dataclasses
import logging
from pathlib import Path
import shutil
@@ -16,7 +17,7 @@ _LOGGER = logging.getLogger(__name__)
STORAGE_PATH = f".storage/{DOMAIN}"
MAPS_PATH = "maps"
CACHE_VERSION = 2
CACHE_VERSION = 1
def _storage_path_prefix(hass: HomeAssistant, entry_id: str) -> Path:
@@ -43,31 +44,6 @@ async def async_cleanup_map_storage(hass: HomeAssistant, entry_id: str) -> None:
await hass.async_add_executor_job(remove, path_prefix)
class StoreImpl(Store[dict[str, Any]]):
"""Store implementation for Roborock cache."""
def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
"""Initialize StoreImpl."""
super().__init__(
hass,
version=CACHE_VERSION,
key=f"{DOMAIN}/{entry_id}",
private=True,
)
async def _async_migrate_func(
self,
old_major_version: int,
old_minor_version: int,
old_data: dict[str, Any],
) -> dict[str, Any]:
"""Wipe out old caches with the old format."""
if old_major_version == 1:
# No need for migration as version 1 was never in any stable releases
return {}
return old_data
class CacheStore(Cache):
"""Store and retrieve cache for a Roborock device.
@@ -79,14 +55,19 @@ class CacheStore(Cache):
def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
"""Initialize CacheStore."""
self._cache_store = StoreImpl(hass, entry_id)
self._cache_store = Store[dict[str, Any]](
hass,
version=CACHE_VERSION,
key=f"{DOMAIN}/{entry_id}",
private=True,
)
self._cache_data: CacheData | None = None
async def get(self) -> CacheData:
"""Retrieve cached metadata."""
if self._cache_data is None:
if data := await self._cache_store.async_load():
self._cache_data = CacheData.from_dict(data)
self._cache_data = CacheData(**data)
else:
self._cache_data = CacheData()
@@ -99,7 +80,7 @@ class CacheStore(Cache):
async def flush(self) -> None:
"""Flush cached metadata to disk."""
if self._cache_data is not None:
await self._cache_store.async_save(self._cache_data.as_dict())
await self._cache_store.async_save(dataclasses.asdict(self._cache_data))
async def async_remove(self) -> None:
"""Remove cached metadata from disk."""

View File

@@ -60,7 +60,7 @@
},
"dsl_connect_count": {
"name": "DSL connect count",
"unit_of_measurement": "attempts"
"unit_of_measurement": "connects"
},
"dsl_crc_error_count": {
"name": "DSL CRC error count",

View File

@@ -7,7 +7,6 @@ from dataclasses import dataclass
from pysmartthings import Capability, Command, SmartThings
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -34,13 +33,6 @@ CAPABILITIES_TO_BUTTONS: dict[Capability | str, SmartThingsButtonDescription] =
key=Capability.CUSTOM_WATER_FILTER,
translation_key="reset_water_filter",
command=Command.RESET_WATER_FILTER,
entity_category=EntityCategory.DIAGNOSTIC,
),
Capability.SAMSUNG_CE_HOOD_FILTER: SmartThingsButtonDescription(
key=Capability.SAMSUNG_CE_HOOD_FILTER,
translation_key="reset_hood_filter",
command=Command.RESET_HOOD_FILTER,
entity_category=EntityCategory.DIAGNOSTIC,
),
}

View File

@@ -1157,17 +1157,6 @@ CAPABILITY_TO_SENSORS: dict[
)
]
},
Capability.SAMSUNG_CE_HOOD_FILTER: {
Attribute.HOOD_FILTER_USAGE: [
SmartThingsSensorEntityDescription(
key=Attribute.HOOD_FILTER_USAGE,
translation_key="hood_filter_usage",
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
)
]
},
}

View File

@@ -74,9 +74,6 @@
}
},
"button": {
"reset_hood_filter": {
"name": "Reset filter"
},
"reset_water_filter": {
"name": "Reset water filter"
},
@@ -414,9 +411,6 @@
"simmer": "Simmer"
}
},
"hood_filter_usage": {
"name": "Filter usage"
},
"infrared_level": {
"name": "Infrared level"
},

View File

@@ -5,7 +5,6 @@
"config_flow": true,
"dependencies": ["http", "webhook"],
"documentation": "https://www.home-assistant.io/integrations/tedee",
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["aiotedee"],
"quality_scale": "platinum",

View File

@@ -28,9 +28,6 @@ from homeassistant.helpers import discovery, issue_registry as ir
from homeassistant.helpers.device import (
async_remove_stale_devices_links_keep_current_device,
)
from homeassistant.helpers.helper_integration import (
async_remove_helper_config_entry_from_source_device,
)
from homeassistant.helpers.reload import async_reload_integration_platforms
from homeassistant.helpers.service import async_register_admin_service
from homeassistant.helpers.typing import ConfigType
@@ -119,7 +116,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up a config entry."""
# This can be removed in HA Core 2026.7
async_remove_stale_devices_links_keep_current_device(
hass,
entry.entry_id,
@@ -158,41 +154,6 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
"""Migrate old entry."""
_LOGGER.debug(
"Migrating configuration from version %s.%s",
config_entry.version,
config_entry.minor_version,
)
if config_entry.version > 1:
# This means the user has downgraded from a future version
return False
if config_entry.version == 1:
if config_entry.minor_version < 2:
# Remove the template config entry from the source device
if source_device_id := config_entry.options.get(CONF_DEVICE_ID):
async_remove_helper_config_entry_from_source_device(
hass,
helper_config_entry_id=config_entry.entry_id,
source_device_id=source_device_id,
)
hass.config_entries.async_update_entry(
config_entry, version=1, minor_version=2
)
_LOGGER.debug(
"Migration to configuration version %s.%s successful",
config_entry.version,
config_entry.minor_version,
)
return True
async def _process_config(hass: HomeAssistant, hass_config: ConfigType) -> None:
"""Process config."""
coordinators = hass.data.pop(DATA_COORDINATORS, None)

View File

@@ -697,9 +697,6 @@ class TemplateConfigFlowHandler(SchemaConfigFlowHandler, domain=DOMAIN):
options_flow = OPTIONS_FLOW
options_flow_reloads = True
MINOR_VERSION = 2
VERSION = 1
@callback
def async_config_entry_title(self, options: Mapping[str, Any]) -> str:
"""Return config entry title."""

View File

@@ -1,20 +1,30 @@
"""Support for Tibber."""
from __future__ import annotations
from dataclasses import dataclass
import logging
import aiohttp
from aiohttp.client_exceptions import ClientError, ClientResponseError
import tibber
from tibber import data_api as tibber_data_api
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ACCESS_TOKEN, EVENT_HOMEASSISTANT_STOP, Platform
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.config_entry_oauth2_flow import (
ImplementationUnavailableError,
OAuth2Session,
async_get_config_entry_implementation,
)
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import dt as dt_util, ssl as ssl_util
from .const import DATA_HASS_CONFIG, DOMAIN
from .const import AUTH_IMPLEMENTATION, DATA_HASS_CONFIG, DOMAIN
from .services import async_setup_services
PLATFORMS = [Platform.NOTIFY, Platform.SENSOR]
@@ -24,6 +34,34 @@ CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
_LOGGER = logging.getLogger(__name__)
@dataclass(slots=True)
class TibberRuntimeData:
"""Runtime data for Tibber API entries."""
tibber_connection: tibber.Tibber
session: OAuth2Session | None = None
_client: tibber_data_api.TibberDataAPI | None = None
async def async_get_client(
self, hass: HomeAssistant
) -> tibber_data_api.TibberDataAPI:
"""Return an authenticated Tibber Data API client."""
if self.session is None:
raise ConfigEntryAuthFailed("OAuth session not available")
await self.session.async_ensure_token_valid()
token = self.session.token
access_token = token.get(CONF_ACCESS_TOKEN)
if not access_token:
raise ConfigEntryAuthFailed("Access token missing from OAuth session")
if self._client is None:
self._client = tibber_data_api.TibberDataAPI(
access_token,
websession=async_get_clientsession(hass),
)
self._client.set_access_token(access_token)
return self._client
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Tibber component."""
@@ -37,13 +75,19 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up a config entry."""
if AUTH_IMPLEMENTATION not in entry.data:
entry.async_start_reauth(hass)
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="data_api_reauth_required",
)
tibber_connection = tibber.Tibber(
access_token=entry.data[CONF_ACCESS_TOKEN],
websession=async_get_clientsession(hass),
time_zone=dt_util.get_default_time_zone(),
ssl=ssl_util.get_default_context(),
)
hass.data[DOMAIN] = tibber_connection
async def _close(event: Event) -> None:
await tibber_connection.rt_disconnect()
@@ -52,7 +96,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
try:
await tibber_connection.update_info()
except (
TimeoutError,
aiohttp.ClientError,
@@ -65,8 +108,32 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
except tibber.FatalHttpExceptionError:
return False
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
try:
implementation = await async_get_config_entry_implementation(hass, entry)
except ImplementationUnavailableError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="oauth2_implementation_unavailable",
) from err
session = OAuth2Session(hass, entry, implementation)
try:
await session.async_ensure_token_valid()
except ClientResponseError as err:
if 400 <= err.status < 500:
raise ConfigEntryAuthFailed(
"OAuth session is not valid, reauthentication required"
) from err
raise ConfigEntryNotReady from err
except ClientError as err:
raise ConfigEntryNotReady from err
hass.data[DOMAIN] = TibberRuntimeData(
tibber_connection=tibber_connection,
session=session,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
@@ -76,6 +143,6 @@ async def async_unload_entry(hass: HomeAssistant, config_entry: ConfigEntry) ->
config_entry, PLATFORMS
)
if unload_ok:
tibber_connection = hass.data[DOMAIN]
await tibber_connection.rt_disconnect()
if runtime := hass.data.pop(DOMAIN, None):
await runtime.tibber_connection.rt_disconnect()
return unload_ok

View File

@@ -0,0 +1,15 @@
"""Application credentials platform for Tibber."""
from homeassistant.components.application_credentials import AuthorizationServer
from homeassistant.core import HomeAssistant
AUTHORIZE_URL = "https://thewall.tibber.com/connect/authorize"
TOKEN_URL = "https://thewall.tibber.com/connect/token"
async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:
"""Return authorization server for Tibber Data API."""
return AuthorizationServer(
authorize_url=AUTHORIZE_URL,
token_url=TOKEN_URL,
)

View File

@@ -2,17 +2,21 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
import aiohttp
import tibber
from tibber import data_api as tibber_data_api
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.config_entries import SOURCE_REAUTH, SOURCE_USER, ConfigFlowResult
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_TOKEN
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.config_entry_oauth2_flow import AbstractOAuth2FlowHandler
from .const import DOMAIN
from .const import AUTH_IMPLEMENTATION, DATA_API_DEFAULT_SCOPES, DOMAIN
DATA_SCHEMA = vol.Schema({vol.Required(CONF_ACCESS_TOKEN): str})
ERR_TIMEOUT = "timeout"
@@ -20,62 +24,145 @@ ERR_CLIENT = "cannot_connect"
ERR_TOKEN = "invalid_access_token"
TOKEN_URL = "https://developer.tibber.com/settings/access-token"
_LOGGER = logging.getLogger(__name__)
class TibberConfigFlow(ConfigFlow, domain=DOMAIN):
class TibberConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
"""Handle a config flow for Tibber integration."""
VERSION = 1
DOMAIN = DOMAIN
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self._access_token: str | None = None
self._title = ""
@property
def logger(self) -> logging.Logger:
"""Return the logger."""
return _LOGGER
@property
def extra_authorize_data(self) -> dict[str, Any]:
"""Extra data appended to the authorize URL."""
return {
**super().extra_authorize_data,
"scope": " ".join(DATA_API_DEFAULT_SCOPES),
}
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
self._async_abort_entries_match()
if user_input is not None:
access_token = user_input[CONF_ACCESS_TOKEN].replace(" ", "")
tibber_connection = tibber.Tibber(
access_token=access_token,
websession=async_get_clientsession(self.hass),
if user_input is None:
data_schema = self.add_suggested_values_to_schema(
DATA_SCHEMA, {CONF_ACCESS_TOKEN: self._access_token or ""}
)
errors = {}
try:
await tibber_connection.update_info()
except TimeoutError:
errors[CONF_ACCESS_TOKEN] = ERR_TIMEOUT
except tibber.InvalidLoginError:
errors[CONF_ACCESS_TOKEN] = ERR_TOKEN
except (
aiohttp.ClientError,
tibber.RetryableHttpExceptionError,
tibber.FatalHttpExceptionError,
):
errors[CONF_ACCESS_TOKEN] = ERR_CLIENT
if errors:
return self.async_show_form(
step_id="user",
data_schema=DATA_SCHEMA,
description_placeholders={"url": TOKEN_URL},
errors=errors,
)
unique_id = tibber_connection.user_id
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=tibber_connection.name,
data={CONF_ACCESS_TOKEN: access_token},
return self.async_show_form(
step_id=SOURCE_USER,
data_schema=data_schema,
description_placeholders={"url": TOKEN_URL},
errors={},
)
return self.async_show_form(
step_id="user",
data_schema=DATA_SCHEMA,
description_placeholders={"url": TOKEN_URL},
errors={},
self._access_token = user_input[CONF_ACCESS_TOKEN].replace(" ", "")
tibber_connection = tibber.Tibber(
access_token=self._access_token,
websession=async_get_clientsession(self.hass),
)
self._title = tibber_connection.name or "Tibber"
errors: dict[str, str] = {}
try:
await tibber_connection.update_info()
except TimeoutError:
errors[CONF_ACCESS_TOKEN] = ERR_TIMEOUT
except tibber.InvalidLoginError:
errors[CONF_ACCESS_TOKEN] = ERR_TOKEN
except (
aiohttp.ClientError,
tibber.RetryableHttpExceptionError,
tibber.FatalHttpExceptionError,
):
errors[CONF_ACCESS_TOKEN] = ERR_CLIENT
if errors:
data_schema = self.add_suggested_values_to_schema(
DATA_SCHEMA, {CONF_ACCESS_TOKEN: self._access_token or ""}
)
return self.async_show_form(
step_id=SOURCE_USER,
data_schema=data_schema,
description_placeholders={"url": TOKEN_URL},
errors=errors,
)
await self.async_set_unique_id(tibber_connection.user_id)
if self.source == SOURCE_REAUTH:
reauth_entry = self._get_reauth_entry()
self._abort_if_unique_id_mismatch(
reason="wrong_account",
description_placeholders={"title": reauth_entry.title},
)
else:
self._abort_if_unique_id_configured()
self._async_abort_entries_match({AUTH_IMPLEMENTATION: DOMAIN})
return await self.async_step_pick_implementation()
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle a reauth flow."""
reauth_entry = self._get_reauth_entry()
self._access_token = reauth_entry.data.get(CONF_ACCESS_TOKEN)
self._title = reauth_entry.title
if reauth_entry.unique_id is not None:
await self.async_set_unique_id(reauth_entry.unique_id)
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm reauthentication by reusing the user step."""
reauth_entry = self._get_reauth_entry()
self._access_token = reauth_entry.data.get(CONF_ACCESS_TOKEN)
self._title = reauth_entry.title
if user_input is None:
return self.async_show_form(
step_id="reauth_confirm",
)
return await self.async_step_user()
async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
"""Finalize the OAuth flow and create the config entry."""
if self._access_token is None:
return self.async_abort(reason="missing_configuration")
data[CONF_ACCESS_TOKEN] = self._access_token
access_token = data[CONF_TOKEN][CONF_ACCESS_TOKEN]
data_api_client = tibber_data_api.TibberDataAPI(
access_token,
websession=async_get_clientsession(self.hass),
)
try:
await data_api_client.get_userinfo()
except (aiohttp.ClientError, TimeoutError):
return self.async_abort(reason="cannot_connect")
if self.source == SOURCE_REAUTH:
reauth_entry = self._get_reauth_entry()
return self.async_update_reload_and_abort(
reauth_entry,
data=data,
title=self._title,
)
return self.async_create_entry(title=self._title, data=data)

View File

@@ -1,5 +1,19 @@
"""Constants for Tibber integration."""
AUTH_IMPLEMENTATION = "auth_implementation"
DATA_HASS_CONFIG = "tibber_hass_config"
DOMAIN = "tibber"
MANUFACTURER = "Tibber"
DATA_API_DEFAULT_SCOPES = [
"openid",
"profile",
"email",
"offline_access",
"data-api-user-read",
"data-api-chargers-read",
"data-api-energy-systems-read",
"data-api-homes-read",
"data-api-thermostats-read",
"data-api-vehicles-read",
"data-api-inverters-read",
]

View File

@@ -4,9 +4,14 @@ from __future__ import annotations
from datetime import timedelta
import logging
from typing import cast
from typing import TYPE_CHECKING, cast
from aiohttp.client_exceptions import ClientError
import tibber
from tibber.data_api import TibberDataAPI, TibberDevice
if TYPE_CHECKING:
from . import TibberRuntimeData
from homeassistant.components.recorder import get_instance
from homeassistant.components.recorder.models import (
@@ -22,6 +27,7 @@ from homeassistant.components.recorder.statistics import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import UnitOfEnergy
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter
@@ -187,3 +193,45 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
unit_of_measurement=unit,
)
async_add_external_statistics(self.hass, metadata, statistics)
class TibberDataAPICoordinator(DataUpdateCoordinator[dict[str, TibberDevice]]):
"""Fetch and cache Tibber Data API device capabilities."""
def __init__(
self,
hass: HomeAssistant,
entry: ConfigEntry,
runtime_data: TibberRuntimeData,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
name=f"{DOMAIN} Data API",
update_interval=timedelta(minutes=1),
config_entry=entry,
)
self._runtime_data = runtime_data
async def _async_get_client(self) -> TibberDataAPI:
"""Get the Tibber Data API client with error handling."""
try:
return await self._runtime_data.async_get_client(self.hass)
except ConfigEntryAuthFailed:
raise
except (ClientError, TimeoutError, tibber.UserAgentMissingError) as err:
raise UpdateFailed(
f"Unable to create Tibber Data API client: {err}"
) from err
async def _async_setup(self) -> None:
"""Initial load of Tibber Data API devices."""
client = await self._async_get_client()
self.data = await client.get_all_devices()
async def _async_update_data(self) -> dict[str, TibberDevice]:
"""Fetch the latest device capabilities from the Tibber Data API."""
client = await self._async_get_client()
devices: dict[str, TibberDevice] = await client.update_devices()
return devices

View File

@@ -2,23 +2,30 @@
from __future__ import annotations
from typing import Any
from typing import TYPE_CHECKING, Any, cast
import aiohttp
import tibber
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from .const import DOMAIN
if TYPE_CHECKING:
from . import TibberRuntimeData
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: ConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
tibber_connection: tibber.Tibber = hass.data[DOMAIN]
return {
runtime = cast("TibberRuntimeData | None", hass.data.get(DOMAIN))
if runtime is None:
return {"homes": []}
result: dict[str, Any] = {
"homes": [
{
"last_data_timestamp": home.last_data_timestamp,
@@ -27,6 +34,38 @@ async def async_get_config_entry_diagnostics(
"last_cons_data_timestamp": home.last_cons_data_timestamp,
"country": home.country,
}
for home in tibber_connection.get_homes(only_active=False)
for home in runtime.tibber_connection.get_homes(only_active=False)
]
}
if runtime.session:
devices: dict[str, Any] = {}
error: str | None = None
try:
client = await runtime.async_get_client(hass)
devices = await client.get_all_devices()
except ConfigEntryAuthFailed:
error = "Authentication failed"
except TimeoutError:
error = "Timeout error"
except aiohttp.ClientError:
error = "Client error"
except tibber.InvalidLoginError:
error = "Invalid login"
except tibber.RetryableHttpExceptionError as err:
error = f"Retryable HTTP error ({err.status})"
except tibber.FatalHttpExceptionError as err:
error = f"Fatal HTTP error ({err.status})"
result["error"] = error
result["devices"] = [
{
"id": device.id,
"name": device.name,
"brand": device.brand,
"model": device.model,
}
for device in devices.values()
]
return result
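For orientation, a successful run then yields a payload shaped roughly like the following; all values are invented placeholders, and the per-home fields elided by the hunk are not reproduced here:

EXAMPLE_DIAGNOSTICS: dict[str, object] = {
    "homes": [
        {
            "last_data_timestamp": "2025-11-27T20:00:00+01:00",
            # ...other per-home fields from the function above...
            "last_cons_data_timestamp": "2025-11-27T19:00:00+01:00",
            "country": "NO",
        }
    ],
    # Added only when runtime.session is set:
    "error": None,
    "devices": [
        {
            "id": "example-device-id",
            "name": "Garage charger",
            "brand": "ExampleBrand",
            "model": "ExampleModel",
        }
    ],
}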

View File

@@ -3,9 +3,9 @@
"name": "Tibber",
"codeowners": ["@danielhiversen"],
"config_flow": true,
"dependencies": ["recorder"],
"dependencies": ["application_credentials", "recorder"],
"documentation": "https://www.home-assistant.io/integrations/tibber",
"iot_class": "cloud_polling",
"loggers": ["tibber"],
"requirements": ["pyTibber==0.32.2"]
"requirements": ["pyTibber==0.33.1"]
}

View File

@@ -2,6 +2,8 @@
from __future__ import annotations
from typing import TYPE_CHECKING, cast
from tibber import Tibber
from homeassistant.components.notify import (
@@ -14,7 +16,10 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import DOMAIN
from .const import DOMAIN
if TYPE_CHECKING:
from . import TibberRuntimeData
async def async_setup_entry(
@@ -39,7 +44,10 @@ class TibberNotificationEntity(NotifyEntity):
async def async_send_message(self, message: str, title: str | None = None) -> None:
"""Send a message to Tibber devices."""
tibber_connection: Tibber = self.hass.data[DOMAIN]
runtime = cast("TibberRuntimeData | None", self.hass.data.get(DOMAIN))
if runtime is None:
raise HomeAssistantError("Tibber integration is not initialized")
tibber_connection: Tibber = runtime.tibber_connection
try:
await tibber_connection.send_notification(
title or ATTR_TITLE_DEFAULT, message

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
import datetime
from datetime import timedelta
@@ -10,7 +11,8 @@ from random import randrange
from typing import Any
import aiohttp
import tibber
from tibber import FatalHttpExceptionError, RetryableHttpExceptionError, TibberHome
from tibber.data_api import TibberDevice
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -27,6 +29,7 @@ from homeassistant.const import (
UnitOfElectricCurrent,
UnitOfElectricPotential,
UnitOfEnergy,
UnitOfLength,
UnitOfPower,
)
from homeassistant.core import Event, HomeAssistant, callback
@@ -42,7 +45,7 @@ from homeassistant.helpers.update_coordinator import (
from homeassistant.util import Throttle, dt as dt_util
from .const import DOMAIN, MANUFACTURER
from .coordinator import TibberDataCoordinator
from .coordinator import TibberDataAPICoordinator, TibberDataCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -260,6 +263,58 @@ SENSORS: tuple[SensorEntityDescription, ...] = (
)
DATA_API_SENSORS: tuple[SensorEntityDescription, ...] = (
SensorEntityDescription(
key="storage.stateOfCharge",
translation_key="storage_state_of_charge",
device_class=SensorDeviceClass.BATTERY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key="storage.targetStateOfCharge",
translation_key="storage_target_state_of_charge",
device_class=SensorDeviceClass.BATTERY,
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key="connector.status",
translation_key="connector_status",
device_class=SensorDeviceClass.ENUM,
options=["connected", "disconnected", "unknown"],
),
SensorEntityDescription(
key="charging.status",
translation_key="charging_status",
device_class=SensorDeviceClass.ENUM,
options=["charging", "idle", "unknown"],
),
SensorEntityDescription(
key="range.remaining",
translation_key="range_remaining",
device_class=SensorDeviceClass.DISTANCE,
native_unit_of_measurement=UnitOfLength.KILOMETERS,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=1,
),
SensorEntityDescription(
key="charging.current.max",
translation_key="charging_current_max",
device_class=SensorDeviceClass.CURRENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key="charging.current.offlineFallback",
translation_key="charging_current_offline_fallback",
device_class=SensorDeviceClass.CURRENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
@@ -267,7 +322,23 @@ async def async_setup_entry(
) -> None:
"""Set up the Tibber sensor."""
tibber_connection = hass.data[DOMAIN]
await asyncio.gather(
_async_setup_data_api_sensors(hass, entry, async_add_entities),
_async_setup_graphql_sensors(hass, entry, async_add_entities),
)
async def _async_setup_graphql_sensors(
hass: HomeAssistant,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Tibber sensor."""
runtime = hass.data.get(DOMAIN)
if runtime is None:
raise PlatformNotReady("Tibber runtime is not ready")
tibber_connection = runtime.tibber_connection
entity_registry = er.async_get(hass)
device_registry = dr.async_get(hass)
@@ -280,7 +351,11 @@ async def async_setup_entry(
except TimeoutError as err:
_LOGGER.error("Timeout connecting to Tibber home: %s ", err)
raise PlatformNotReady from err
except (tibber.RetryableHttpExceptionError, aiohttp.ClientError) as err:
except (
RetryableHttpExceptionError,
FatalHttpExceptionError,
aiohttp.ClientError,
) as err:
_LOGGER.error("Error connecting to Tibber home: %s ", err)
raise PlatformNotReady from err
@@ -328,14 +403,93 @@ async def async_setup_entry(
async_add_entities(entities, True)
async def _async_setup_data_api_sensors(
hass: HomeAssistant,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up sensors backed by the Tibber Data API."""
runtime = hass.data.get(DOMAIN)
if runtime is None:
raise PlatformNotReady("Tibber runtime is not ready")
coordinator = TibberDataAPICoordinator(hass, entry, runtime)
await coordinator.async_config_entry_first_refresh()
entities: list[TibberDataAPISensor] = []
api_sensors = {sensor.key: sensor for sensor in DATA_API_SENSORS}
for device in coordinator.data.values():
for sensor in device.sensors:
description: SensorEntityDescription | None = api_sensors.get(sensor.id)
if description is None:
_LOGGER.debug(
"Sensor %s not found in DATA_API_SENSORS, skipping", sensor
)
continue
entities.append(
TibberDataAPISensor(
coordinator, device, description, sensor.description
)
)
async_add_entities(entities)
class TibberDataAPISensor(CoordinatorEntity[TibberDataAPICoordinator], SensorEntity):
"""Representation of a Tibber Data API capability sensor."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: TibberDataAPICoordinator,
device: TibberDevice,
entity_description: SensorEntityDescription,
name: str,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator)
self._device_id: str = device.id
self.entity_description = entity_description
self._attr_name = name
self._attr_unique_id = f"{device.external_id}_{self.entity_description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.external_id)},
name=device.name,
manufacturer=device.brand,
model=device.model,
)
@property
def native_value(
self,
) -> StateType:
"""Return the value reported by the device."""
device = self.coordinator.data.get(self._device_id)
if device is None:
return None
for sensor in device.sensors:
if sensor.id == self.entity_description.key:
return sensor.value
return None
@property
def available(self) -> bool:
"""Return whether the sensor is available."""
return super().available and self.native_value is not None
class TibberSensor(SensorEntity):
"""Representation of a generic Tibber sensor."""
_attr_has_entity_name = True
def __init__(
self, *args: Any, tibber_home: tibber.TibberHome, **kwargs: Any
) -> None:
def __init__(self, *args: Any, tibber_home: TibberHome, **kwargs: Any) -> None:
"""Initialize the sensor."""
super().__init__(*args, **kwargs)
self._tibber_home = tibber_home
@@ -366,7 +520,7 @@ class TibberSensorElPrice(TibberSensor):
_attr_state_class = SensorStateClass.MEASUREMENT
_attr_translation_key = "electricity_price"
def __init__(self, tibber_home: tibber.TibberHome) -> None:
def __init__(self, tibber_home: TibberHome) -> None:
"""Initialize the sensor."""
super().__init__(tibber_home=tibber_home)
self._last_updated: datetime.datetime | None = None
@@ -443,7 +597,7 @@ class TibberDataSensor(TibberSensor, CoordinatorEntity[TibberDataCoordinator]):
def __init__(
self,
tibber_home: tibber.TibberHome,
tibber_home: TibberHome,
coordinator: TibberDataCoordinator,
entity_description: SensorEntityDescription,
) -> None:
@@ -470,7 +624,7 @@ class TibberSensorRT(TibberSensor, CoordinatorEntity["TibberRtDataCoordinator"])
def __init__(
self,
tibber_home: tibber.TibberHome,
tibber_home: TibberHome,
description: SensorEntityDescription,
initial_state: float,
coordinator: TibberRtDataCoordinator,
@@ -532,7 +686,7 @@ class TibberRtEntityCreator:
def __init__(
self,
async_add_entities: AddConfigEntryEntitiesCallback,
tibber_home: tibber.TibberHome,
tibber_home: TibberHome,
entity_registry: er.EntityRegistry,
) -> None:
"""Initialize the data handler."""
@@ -618,7 +772,7 @@ class TibberRtDataCoordinator(DataUpdateCoordinator): # pylint: disable=hass-en
hass: HomeAssistant,
config_entry: ConfigEntry,
add_sensor_callback: Callable[[TibberRtDataCoordinator, Any], None],
tibber_home: tibber.TibberHome,
tibber_home: TibberHome,
) -> None:
"""Initialize the data handler."""
self._add_sensor_callback = add_sensor_callback

View File

@@ -33,7 +33,8 @@ SERVICE_SCHEMA: Final = vol.Schema(
async def __get_prices(call: ServiceCall) -> ServiceResponse:
tibber_connection = call.hass.data[DOMAIN]
runtime = call.hass.data[DOMAIN]
tibber_connection = runtime.tibber_connection
start = __get_date(call.data.get(ATTR_START), "start")
end = __get_date(call.data.get(ATTR_END), "end")

View File

@@ -1,7 +1,11 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]"
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"missing_configuration": "[%key:common::config_flow::abort::oauth2_missing_configuration%]",
"missing_credentials": "[%key:common::config_flow::abort::oauth2_missing_credentials%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"wrong_account": "The connected account does not match {title}. Sign in with the same Tibber account and try again."
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -9,6 +13,10 @@
"timeout": "[%key:common::config_flow::error::timeout_connect%]"
},
"step": {
"reauth_confirm": {
"description": "Reconnect your Tibber account to refresh access.",
"title": "[%key:common::config_flow::title::reauth%]"
},
"user": {
"data": {
"access_token": "[%key:common::config_flow::data::access_token%]"
@@ -40,6 +48,28 @@
"average_power": {
"name": "Average power"
},
"charging_current_max": {
"name": "Maximum charging current"
},
"charging_current_offline_fallback": {
"name": "Offline fallback charging current"
},
"charging_status": {
"name": "Charging status",
"state": {
"charging": "Charging",
"idle": "Idle",
"unknown": "Unknown"
}
},
"connector_status": {
"name": "Connector status",
"state": {
"connected": "Connected",
"disconnected": "Disconnected",
"unknown": "Unknown"
}
},
"current_l1": {
"name": "Current L1"
},
@@ -88,9 +118,18 @@
"power_production": {
"name": "Power production"
},
"range_remaining": {
"name": "Remaining range"
},
"signal_strength": {
"name": "Signal strength"
},
"storage_state_of_charge": {
"name": "Storage state of charge"
},
"storage_target_state_of_charge": {
"name": "Storage target state of charge"
},
"voltage_phase1": {
"name": "Voltage phase1"
},
@@ -103,9 +142,15 @@
}
},
"exceptions": {
"data_api_reauth_required": {
"message": "Reconnect Tibber so Home Assistant can enable the new Tibber Data API features."
},
"invalid_date": {
"message": "Invalid datetime provided {date}"
},
"oauth2_implementation_unavailable": {
"message": "[%key:common::exceptions::oauth2_implementation_unavailable::message%]"
},
"send_message_timeout": {
"message": "Timeout sending message with Tibber"
}

View File

@@ -4,7 +4,6 @@
"codeowners": ["@ntilley905"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/wake_on_lan",
"integration_type": "service",
"iot_class": "local_push",
"requirements": ["wakeonlan==3.1.0"]
}

View File

@@ -5,7 +5,7 @@ from __future__ import annotations
from typing import Any
import voluptuous as vol
from wled import WLED, Device, WLEDConnectionError, WLEDUnsupportedVersionError
from wled import WLED, Device, WLEDConnectionError
from homeassistant.components import onboarding
from homeassistant.config_entries import (
@@ -48,8 +48,6 @@ class WLEDFlowHandler(ConfigFlow, domain=DOMAIN):
if user_input is not None:
try:
device = await self._async_get_device(user_input[CONF_HOST])
except WLEDUnsupportedVersionError:
errors["base"] = "unsupported_version"
except WLEDConnectionError:
errors["base"] = "cannot_connect"
else:
@@ -112,8 +110,6 @@ class WLEDFlowHandler(ConfigFlow, domain=DOMAIN):
self.discovered_host = discovery_info.host
try:
self.discovered_device = await self._async_get_device(discovery_info.host)
except WLEDUnsupportedVersionError:
return self.async_abort(reason="unsupported_version")
except WLEDConnectionError:
return self.async_abort(reason="cannot_connect")

View File

@@ -9,7 +9,6 @@ from wled import (
WLEDConnectionClosedError,
WLEDError,
WLEDReleases,
WLEDUnsupportedVersionError,
)
from homeassistant.config_entries import ConfigEntry
@@ -116,14 +115,6 @@ class WLEDDataUpdateCoordinator(DataUpdateCoordinator[WLEDDevice]):
"""Fetch data from WLED."""
try:
device = await self.wled.update()
except WLEDUnsupportedVersionError as error:
# Error message from WLED library contains version info
# better to show that to user, but it is not translatable.
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="unsupported_version",
translation_placeholders={"error": str(error)},
) from error
except WLEDError as error:
raise UpdateFailed(
translation_domain=DOMAIN,

View File

@@ -150,9 +150,12 @@ class WLEDSegmentLight(WLEDEntity, LightEntity):
@property
def available(self) -> bool:
"""Return True if entity is available."""
return (
super().available and self._segment in self.coordinator.data.state.segments
)
try:
self.coordinator.data.state.segments[self._segment]
except KeyError:
return False
return super().available
@property
def rgb_color(self) -> tuple[int, int, int] | None:

View File

@@ -97,9 +97,12 @@ class WLEDNumber(WLEDEntity, NumberEntity):
@property
def available(self) -> bool:
"""Return True if entity is available."""
return (
super().available and self._segment in self.coordinator.data.state.segments
)
try:
self.coordinator.data.state.segments[self._segment]
except KeyError:
return False
return super().available
@property
def native_value(self) -> float | None:

View File

@@ -1,88 +0,0 @@
rules:
# Bronze
action-setup:
status: exempt
comment: This integration does not have custom service actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage:
status: todo
comment: |
test_connection_error and test_unsupported_version_error should end in CREATE_ENTRY
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: This integration does not have custom service actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: todo
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow:
status: exempt
comment: |
This integration does not require authentication.
test-coverage:
status: todo
comment: |
The test_setting_unique_id test is redundant.
The test_websocket_already_connected test can use the freezer.
The snapshot tests should be used more widely.
We should use pytest.mark.freeze_time instead of mock.
# Gold
devices: done
diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: todo
docs-examples: done
docs-known-limitations:
status: todo
comment: |
Analog RGBCCT strips are poorly supported by HA.
See: https://github.com/home-assistant/core/issues/123614
docs-supported-devices: todo
docs-supported-functions: done
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: |
This integration has a fixed single device.
entity-category: done
entity-device-class:
status: todo
comment: Led count could receive unit of measurement
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
repair-issues:
status: exempt
comment: This integration does not have any known issues that require repair.
stale-devices:
status: exempt
comment: |
This integration has a fixed single device.
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done

Some files were not shown because too many files have changed in this diff.