Compare commits


174 Commits

Author SHA1 Message Date
Petar Petrov
90e7653367 Update tests 2026-01-13 15:59:11 +02:00
Petar Petrov
55fc77a09a PR comments 2026-01-13 15:09:15 +02:00
Petar Petrov
0fbcb7b8f7 Apply suggestions from code review
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-01-13 13:55:45 +02:00
Petar Petrov
5f4ffd6f8a Add availability checks 2026-01-08 18:07:00 +02:00
Petar Petrov
294c93e3ed PR comments 2026-01-08 17:53:20 +02:00
Petar Petrov
51faa35f1b Update homeassistant/components/energy/data.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-08 17:35:05 +02:00
Petar Petrov
303a4091a7 Handle different units in combined sensors 2026-01-08 15:41:08 +02:00
Petar Petrov
fc9a86b919 naming 2026-01-08 15:15:18 +02:00
Petar Petrov
2be7b57e48 validation tweak 2026-01-08 14:09:37 +02:00
Petar Petrov
27ecfd1319 type fix 2026-01-08 10:59:25 +02:00
Petar Petrov
ade50c93cf typing 2026-01-08 09:28:33 +02:00
Petar Petrov
b029a48ed4 add tests 2026-01-08 09:09:03 +02:00
Petar Petrov
b05a6dadf6 Add non standard power sensor support 2026-01-07 16:51:59 +02:00
Simone Chemelli
6e32a2aa18 Bump aiovodafone to 3.1.1 (#160429) 2026-01-07 15:34:46 +01:00
Abílio Costa
3b575fe3e3 Support target triggers in automation relation extraction (#160369) 2026-01-07 15:15:44 +01:00
Joost Lekkerkerker
229400de98 Make Watts depend on the cloud integration (#160424) 2026-01-07 15:07:24 +01:00
Norbert Rittel
e963adfdf0 Fix capitalization in openevse data_description string (#160423) 2026-01-07 14:53:19 +01:00
Simone Chemelli
fd7bbc68c6 Bump aioshelly to 13.23.1 (#160420) 2026-01-07 14:49:18 +01:00
Robert Resch
9281ab018c Constraint aiomqtt>=2.5.0 to fix blocking call (#160410)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-07 14:21:49 +01:00
Andres Ruiz
80baf86e23 Add codeowners and integration_type for waterfurnace (#160397) 2026-01-07 13:12:58 +01:00
Simone Chemelli
db497b23fe Small cleanup for Vodafone Station tests (#160415) 2026-01-07 12:50:12 +01:00
cdnninja
a2fb8f5a72 Add Vesync Air Fryer Sensors (#160170)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-07 12:41:34 +01:00
hanwg
6953bd4599 Fix schema validation error in Telegram (#160367) 2026-01-07 12:27:17 +01:00
Xiangxuan Qu
225be65f71 Fix IndexError in Israel Rail sensor when no departures available (#160351)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-07 12:22:39 +01:00
momala454
7b0463f763 Add additional lens modes 4 to 10 to JVC projector remote (#159657)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-07 12:22:19 +01:00
Luke Lashley
4d305b657a Bump python-roborock to 4.2.1 (#160398) 2026-01-07 11:23:40 +01:00
Paul Tarjan
d5a553c8c7 Fix Ring integration log flooding for accounts without subscription (#158012)
Co-authored-by: Robert Resch <robert@resch.dev>
2026-01-07 11:14:05 +01:00
Ivan Dlugos
9169b68254 Bump sentry-sdk to 2.48.0 (#159415)
Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-07 11:05:38 +01:00
Colin
fde9bd95d5 Replace openevse backend library (#160325) 2026-01-07 10:25:15 +01:00
Marc Mueller
e4db8ff86e Update guppy3 to 3.1.6 (#160356) 2026-01-07 10:11:01 +01:00
Erik Montnemery
a084e51345 Add test helpers for numerical state triggers (#160308)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-01-07 08:53:35 +01:00
Luke Lashley
00381e6dfd Remove q7 total cleaning time for Roborock (#160399) 2026-01-06 20:27:09 -08:00
Michael Hansen
b6d493696a Bump intents to 2026.1.6 (#160389) 2026-01-06 17:11:54 -06:00
Artem Draft
5f0500c3cd Add SSL support in Bravia TV (#160373)
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
2026-01-06 23:59:47 +01:00
dontinelli
c61a63cc6f Bump solarlog_cli to 0.7.0 (#160382) 2026-01-06 23:59:16 +01:00
Raphael Hehl
5445a4f40f Bump uiprotect to 8.0.0 (#160384)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-01-06 23:57:19 +01:00
Daniel Hjelseth Høyer
2888cacc3f Bump pyTibber to 0.34.1 (#160380)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-01-06 23:56:26 +01:00
TheJulianJES
16f3e6d2c9 Bump ZHA to 0.0.83 (#160342) 2026-01-06 12:11:40 -05:00
Bram Kragten
7a872970fa Update frontend to 20251229.1 (#160372) 2026-01-06 17:53:56 +01:00
Bram Kragten
4f5ca986ce Fix number or entity choose schema (#160358) 2026-01-06 17:23:24 +01:00
Artem Draft
b58e058da5 Bump pybravia to 0.4.1 (#160368) 2026-01-06 16:42:58 +01:00
epenet
badebe0c7f Refactor Tuya event platform to use DeviceWrapper (#160366) 2026-01-06 16:09:13 +01:00
mettolen
7817ec1a52 Update Saunum integration to gold quality tier (#159783) 2026-01-06 16:07:28 +01:00
epenet
c773998946 Remove default in Tuya DeviceWrapper options (#160303) 2026-01-06 13:06:53 +01:00
Mika
2bc9397103 Fix missing state class to solaredge (#160336) 2026-01-06 12:36:49 +01:00
Daniel Hjelseth Høyer
685534b17c Add more Tibber sensors (#160354) 2026-01-06 11:18:35 +01:00
tronikos
c740f44bfa Bump opower to 0.16.0 (#160348) 2026-01-06 10:25:43 +01:00
mettolen
ce471d0222 Add button entity to Airobot integration (#160169) 2026-01-06 08:57:24 +01:00
Manu
53ed344fe0 Add action exceptions to Duck DNS (#160331) 2026-01-06 08:38:22 +01:00
Manu
5f8f3c961a Remove stale devices in Xbox integration (#160337) 2026-01-06 08:27:42 +01:00
Aidan Timson
9d0c5530f2 Bump systembridgeconnector to 5.3.1 (#160326) 2026-01-06 08:09:25 +01:00
abelyliu
d114fe4fbd Add report_type to Tuya diagnostic (#160311) 2026-01-06 07:37:48 +01:00
Daniel Hjelseth Høyer
f03d44d5b5 Bump pyTibber to 0.34.0 (#160333) 2026-01-06 00:55:28 +01:00
Frédéric
35f4464d4a Add Resideo X2S Smart Thermostat to Matter fan-only mode list (#160260) 2026-01-06 00:24:20 +01:00
DeerMaximum
fc2530e979 Use a fixture in NINA to mock async_setup_entry (#160323) 2026-01-05 21:51:10 +01:00
Manu
354fafda1a Refactor Xbox coordinators (#160174) 2026-01-05 21:31:57 +01:00
Sid
5b0dab479d Bump eheimdigital to 1.5.0 (#160312) 2026-01-05 21:30:05 +01:00
Daniel Hjelseth Høyer
1e1f414849 Fix unit for Tibber sensor (#160319) 2026-01-05 21:27:00 +01:00
Xidorn Quan
7c81df6c5c Fix rain count sensors' state class of Ecowitt (#158204) 2026-01-05 21:02:21 +01:00
J. Nick Koston
95d7c42e6a Require service_uuid and service_data_uuid to match hue ble (#160321) 2026-01-05 19:47:31 +01:00
Joakim Sørensen
19fd80035e Add connection check before registering cloudhook URL (#160284) 2026-01-05 09:35:49 -05:00
Martin Hjelmare
8e30787ae6 Test hassfest translations gen_strings_schema (#159464) 2026-01-05 15:27:03 +01:00
epenet
7133da928f Add max_value/min_value/value_step to Tuya DeviceWrapper (#160300) 2026-01-05 14:56:17 +01:00
Colin
3f9a41d393 Add openevse config flow (#158968)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-05 14:49:36 +01:00
epenet
f4caf36204 Use generic DeviceWrapper in Tuya cover (#160301) 2026-01-05 14:45:13 +01:00
Bram Kragten
079866e384 Fix humidifier trigger turned on icon (#160297) 2026-01-05 13:45:31 +01:00
epenet
dce0db78aa Use generic DeviceWrapper in Tuya sensor (#160299) 2026-01-05 13:45:08 +01:00
epenet
fffc18d28b Use generic DeviceWrapper in more Tuya platforms (#160298) 2026-01-05 13:44:45 +01:00
cdnninja
c7cbcbc32d Refactor entity unavailable handling in VeSync (#160274) 2026-01-05 13:17:35 +01:00
Samuel Xiao
aebcdd6e7a Switchbot Cloud: Add new supported light (#160282) 2026-01-05 13:12:02 +01:00
Paul Tarjan
3be92510f8 Fix Tesla update showing scheduled updates as installing (#158681) 2026-01-05 12:42:58 +01:00
Matrix
4d8448e82a Bump yolink api to 0.6.1 (#160293) 2026-01-05 12:37:26 +01:00
Daniel Hjelseth Høyer
625bc467d4 Move Tibber to OAuth (#156690)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-05 12:36:41 +01:00
Steven Looman
47573b7f6a Bump async-upnp-client to 0.46.2 (#160188) 2026-01-05 12:33:40 +01:00
epenet
7c95c92525 Make Tuya DeviceWrapper a generic class (#159349) 2026-01-05 12:22:48 +01:00
dotlambda
1aed46e39e Bump google-genai to 1.56.0 (#160210) 2026-01-05 12:21:02 +01:00
epenet
6659166df0 Move Tuya vacuum entity logic to wrapper class (#159255) 2026-01-05 12:20:25 +01:00
Erik Montnemery
1e6b0ba9ec Allow passing trigger options to parametrize_trigger_states (#160119) 2026-01-05 10:44:31 +01:00
abelyliu
1f23098638 Bump tuya-device-sharing-sdk to 0.2.8 (#160288) 2026-01-05 10:28:13 +01:00
epenet
98ee0421b7 Fix Tuya light color data wrapper (#160280) 2026-01-05 10:27:57 +01:00
cdnninja
6aaa57f660 Set PARALLEL_UPDATES in VeSync (#160272) 2026-01-05 09:18:16 +01:00
Andres Ruiz
fad817853f Add state_class to waterfurnace sensors (#160277) 2026-01-05 09:17:41 +01:00
Michael
7ef7d3f570 Add entered and left home person triggers (#159320)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2026-01-05 09:09:59 +01:00
Thomas55555
14bca5a052 Make verify_ssl configurable in remote calendar (#160216) 2026-01-04 15:32:38 -08:00
Vincent Courcelle
18769730f0 Bump python-roborock to 4.2.0 (#160184) 2026-01-05 00:26:50 +01:00
Sab44
de6d117d9a Bump librehardwaremonitor-api to version 1.8.4 (#160249) 2026-01-04 21:35:28 +01:00
J. Nick Koston
d2deef968a Ensure Brotli >= 1.2.0 (#160229) 2026-01-04 08:08:42 -10:00
Mick Vleeshouwer
6cae1821fb Fix execution history matching to ignore subsystem suffix in diagnostics in Overkiz (#160218) 2026-01-04 11:38:30 +01:00
Jan-Philipp Benecke
8d8046d233 Bump aiowebdav2 to 0.5.0 (#160233) 2026-01-04 11:37:41 +01:00
Samuel Xiao
d7a9a980d0 Switchbot Cloud: Fixed Robot Vacuum Cleaner S20 had two device_model name (#160230) 2026-01-04 11:36:24 +01:00
J. Nick Koston
ff8ad0c9ba Bump aioesphomeapi to 43.10.1 (#160227) 2026-01-03 17:40:15 -10:00
J. Nick Koston
27728cdca8 Bump aiohttp 3.13.3 (#160206) 2026-01-03 16:46:27 -10:00
Erwin Douna
f1eaf78923 Portainer polish ephemeral container ID (#160186) 2026-01-03 21:32:07 +01:00
Willem-Jan van Rootselaar
667b1db594 Bump python-bsblan dependency to version 3.1.6 (#160202) 2026-01-03 21:30:26 +01:00
Josef Zweck
d6cad546e1 Remove referral link from fish_audio (#160193) 2026-01-03 17:12:53 +01:00
Tom
4c8ffa2158 Bump airOS to v0.6.1 adding LiteAP AC support (#160194) 2026-01-03 13:52:08 +01:00
Lukas
933fae9ade Pooldose document exempts (#160166) 2026-01-03 08:59:07 +01:00
Erwin Douna
b6dd9db76e Portainer add state sensor (#160156) 2026-01-03 08:51:58 +01:00
Manu
11487d6856 Set integration type service in Duck DNS (#160172) 2026-01-03 08:51:18 +01:00
Manu
920e938d84 Add discovery for default hostnames to PlayStation Network (#160173) 2026-01-03 08:44:17 +01:00
Kevin Stillhammer
afc256622a raise proper service exceptions in fressnapf_tracker (#159707)
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
2026-01-02 19:53:16 +01:00
Erwin Douna
bfef048a7c Bump pyportainer 1.0.22 (#160140) 2026-01-02 18:37:14 +01:00
Maikel Punie
bfc8111728 Bump velbus to silver integration scale (#160147) 2026-01-02 18:36:04 +01:00
Maikel Punie
ebd6ae7e80 Velbus mark entities unavailable when connection is terminated (#160143) 2026-01-02 17:43:33 +01:00
MarkGodwin
dd98a85300 Refactor TP-Link Omada config flow tests (#159950)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-02 17:41:04 +01:00
Brett Adams
6568a19ce6 Handle export options when enrolled to VPP in Teslemetry (#157665) 2026-01-02 16:52:03 +01:00
wollew
83c1e8d5b5 bump pyvlx version to 0.2.27 (#160139) 2026-01-02 16:49:09 +01:00
Simone Chemelli
c5a06657a3 Remove low level call for Shelly climate (#160065) 2026-01-02 16:47:39 +01:00
Maciej Bieniek
25e54990d2 Bump nextdns to version 5.0.0 (#160138) 2026-01-02 16:33:27 +01:00
Nikoheld
3b2a7ba561 bump nibe to 2.21.0 (#160135) 2026-01-02 16:06:43 +01:00
Åke Strandberg
8f8f896675 Add filling level sensors to miele (#157858) 2026-01-02 15:57:15 +01:00
Willem-Jan van Rootselaar
9539a612a6 Add time synchronization feature to BSB-Lan integration (#156600) 2026-01-02 15:54:37 +01:00
Erwin Douna
d6751eb63f Bump pyportainer 1.0.21 (#160130) 2026-01-02 15:06:46 +01:00
Pete Sage
b462038126 Use long service timeout for Sonos Unjoin (#160110) 2026-01-02 14:18:56 +01:00
cdnninja
ce06446376 Add pm1 and pm10 to vesync (#160072)
Co-authored-by: Josef Zweck <josef@zweck.dev>
2026-01-02 14:15:52 +01:00
Erik Montnemery
8de22e0134 Await writes in shopping_list action handlers (#157420) 2026-01-02 13:41:41 +01:00
mettolen
fbd08d4e42 Bump pyairobotrest to 0.2.0 (#160125) 2026-01-02 12:29:29 +01:00
Zoltán Farkasdi
32e0be4535 netatmo: test_camera webhook testing parametrize and light split (#159772) 2026-01-02 11:00:17 +01:00
Maikel Punie
0423639833 Bump velbusaio to 2026.1.1 (#160116) 2026-01-02 09:16:27 +01:00
Jan Bouwhuis
1244d8aa33 Fix reolink brightness scaling (#160106) 2026-01-01 21:56:35 +01:00
Pete Sage
38c37ab33c Improve Sonos wait to unjoin timeout (#160011) 2026-01-01 20:21:25 +01:00
Willem-Jan van Rootselaar
1636eab2e8 Add schema validation for set_hot_water_schedule service (#159990) 2026-01-01 20:16:54 +01:00
Miguel Camba
737a5811a9 Update voluptuous and voluptuous-openapi (#160073) 2026-01-01 20:07:06 +01:00
Austin Mroczek
5f2da20319 Bump total_connect_client to 2025.12.2 (#160075) 2026-01-01 20:02:56 +01:00
Michael Hansen
2aed4fb8e9 Bump intents to 2026.1.1 (#160099) 2026-01-01 19:58:37 +01:00
Lukas
2b10dc4545 Add reconfiguration flow to pooldose (#159978)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-01 17:20:33 +01:00
Maikel Punie
b5d22a63bb Velbus quality docs updates (#160092) 2026-01-01 17:02:30 +01:00
Maikel Punie
e8e19f47cd Velbus Exception translations (#159627)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-01 16:51:39 +01:00
Maikel Punie
97e6643cd7 Bump velbusaio to 2026.1.0 (#160087) 2026-01-01 16:50:28 +01:00
Ben Wolstencroft
ee4bb0eef5 Add support for health_overview API endpoint to Tractive integration (#157960)
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
2026-01-01 13:06:24 +01:00
Maikel Punie
f82bb8f0b8 Use brightness scale in velbus light (#160041) 2026-01-01 13:03:52 +01:00
cdnninja
79b368cfc3 add description to string vesync (#160003) 2025-12-31 22:20:50 +01:00
cdnninja
6da4a006f2 Add Auto Off Switch to VeSync (#160070) 2025-12-31 22:17:33 +01:00
Allen Porter
e5f3ccb38d Improve roborock test accuracy/robustness (#160021) 2025-12-31 16:32:53 +01:00
tronikos
560b91b93b Filter out duplicate voices without language code in Google Cloud (#160046) 2025-12-31 16:30:53 +01:00
Pete Sage
edd9f50562 bump soco to 0.30.14 for Sonos (#160050) 2025-12-31 16:25:55 +01:00
Paul Tarjan
a4b2e84b03 Fix Hikvision thread safety issue when calling async_write_ha_state (#160027) 2025-12-31 15:52:41 +01:00
rlippmann
9da07c2058 remove domain and service slots from Service object (#160039) 2025-12-31 13:34:02 +01:00
Simone Chemelli
8de6785182 Bump aioamazondevices to 11.0.2 (#160016)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-12-31 12:31:32 +01:00
Anders Melchiorsen
77f6fa8116 Bump eternalegypt to 0.0.18 (#160006) 2025-12-31 10:57:58 +01:00
Anders Melchiorsen
6b6f338e7e Fix netgear_lte unloading (#160008) 2025-12-31 10:53:24 +01:00
David Knowles
aa995fb590 Use WATER device_class for Hydrawise sensors (#160018) 2025-12-31 10:47:48 +01:00
Anders Melchiorsen
f0fee87b9e Move async_setup_services to async_setup for netgear_lte (#160007) 2025-12-31 10:43:59 +01:00
Erwin Douna
56ab3bf59b Bump pyfirefly 0.1.10 (#160028) 2025-12-31 09:04:40 +01:00
Luke Lashley
24e2720924 Don't prefer cache for Roborock device fetching (#160022) 2025-12-30 13:21:54 -08:00
Erwin Douna
bacc2f00af Bump portainer 1.0.19 (#160014) 2025-12-30 21:13:24 +01:00
Manu
6de2d6810b Convert store image URLs to https in Xbox media resolver (#160015) 2025-12-30 21:10:51 +01:00
Allen Porter
de07833d92 Update roborock binary sensor tests with snapshots (#159981) 2025-12-30 19:36:32 +01:00
Matthias Alphart
b4eff231c3 Update knx-frontend to 2025.12.30.151231 (#159999) 2025-12-30 18:49:02 +01:00
Luke Lashley
98fea46eea Add support for vacuum entity for Roborock Q7 (#159966) 2025-12-30 07:26:18 -08:00
divers33
18e8821891 Add podcast favorites support to Sonos media browser (#159961)
Co-authored-by: divers33 <divers33@users.noreply.github.com>
2025-12-30 15:14:53 +01:00
Sab44
cc2377d44d Bump librehardwaremonitor-api to version 1.7.2 (#159987) 2025-12-30 12:18:50 +01:00
doomsniper09
8370c6abfb Accept integer coordinates in has_location helper (#159835)
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
2025-12-30 12:06:23 +01:00
Panda-NZ
2d1a672de5 Add ambient temperature sensor to ToGrill (#159798) 2025-12-30 09:44:23 +01:00
Ernst Klamer
75ea42a834 bump xiaomi-ble to 1.4.1 (#159954) 2025-12-30 00:12:45 +01:00
Lukas
45491e17cd Pooldose Diagnostics (#159965) 2025-12-29 23:03:13 +01:00
Stefan H.
b994f03391 Migrate traccar_server to use entry.runtime_data (#156065)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-12-29 22:16:01 +01:00
Kamil Breguła
473cb59013 Add translation of exceptions in met (#155765)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-12-29 22:12:40 +01:00
J. Nick Koston
9302926d99 Bump aioesphomeapi to 43.9.1 (#159960) 2025-12-29 11:09:37 -10:00
Branden Cash
d92516b7c9 Implement reconfigure config flow in SRP energy (#151542)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-12-29 21:52:25 +01:00
Luke Lashley
5b561213d3 Bump Python-Roborock to 4.1.0 (#159963) 2025-12-29 21:52:13 +01:00
Erwin Douna
0a16bd4919 Portainer fix stopped container for stats (#159964) 2025-12-29 21:51:46 +01:00
Michael
f74a6e2625 Record current Feedreader integration quality scale and set to silver (#143179)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-12-29 21:36:23 +01:00
Joost Lekkerkerker
ecc271409a Small cleanup in Feedreader (#159962) 2025-12-29 21:31:25 +01:00
Michael
1f63bc3231 Record current Synology DSM integration quality scale (#141245)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-12-29 21:24:18 +01:00
Joost Lekkerkerker
78adeb837e Inject session in Switchbot cloud (#159942) 2025-12-29 21:18:34 +01:00
Joost Lekkerkerker
bfacf462bf Add integration_type service to nuheat (#159845) 2025-12-29 21:12:23 +01:00
Joost Lekkerkerker
771d40dbf6 Add integration_type hub to permobil (#159872) 2025-12-29 21:12:05 +01:00
Joost Lekkerkerker
8e441242ad Add integration_type hub to pooldose (#159880) 2025-12-29 21:11:46 +01:00
Joost Lekkerkerker
b8a4237ab1 Add integration_type hub to poolsense (#159881) 2025-12-29 21:11:17 +01:00
Joost Lekkerkerker
e92af1ee76 Add integration_type device to ps4 (#159892) 2025-12-29 21:10:52 +01:00
Matthias Alphart
e561c1cebb Fix KNX translation references (#159959) 2025-12-29 20:50:53 +01:00
Franck Nijhof
d77f82f8e8 Bump version to 2026.2.0dev0 (#159956) 2025-12-29 20:38:24 +01:00
Joost Lekkerkerker
fcc3598d7f Add integration_type device to netgear (#159816) 2025-12-29 21:14:58 +02:00
335 changed files with 13637 additions and 2593 deletions


@@ -40,7 +40,7 @@ env:
   CACHE_VERSION: 2
   UV_CACHE_VERSION: 1
   MYPY_CACHE_VERSION: 1
-  HA_SHORT_VERSION: "2026.1"
+  HA_SHORT_VERSION: "2026.2"
   DEFAULT_PYTHON: "3.13.11"
   ALL_PYTHON_VERSIONS: "['3.13.11', '3.14.2']"
   # 10.3 is the oldest supported version

CODEOWNERS (generated)

@@ -1170,6 +1170,8 @@ build.json @home-assistant/supervisor
 /tests/components/open_router/ @joostlek
 /homeassistant/components/openerz/ @misialq
 /tests/components/openerz/ @misialq
+/homeassistant/components/openevse/ @c00w
+/tests/components/openevse/ @c00w
 /homeassistant/components/openexchangerates/ @MartinHjelmare
 /tests/components/openexchangerates/ @MartinHjelmare
 /homeassistant/components/opengarage/ @danielhiversen
@@ -1801,6 +1803,7 @@ build.json @home-assistant/supervisor
 /tests/components/waqi/ @joostlek
 /homeassistant/components/water_heater/ @home-assistant/core
 /tests/components/water_heater/ @home-assistant/core
+/homeassistant/components/waterfurnace/ @sdague @masterkoppa
 /homeassistant/components/watergate/ @adam-the-hero
 /tests/components/watergate/ @adam-the-hero
 /homeassistant/components/watson_tts/ @rutkai


@@ -7,7 +7,12 @@ from homeassistant.core import HomeAssistant

 from .coordinator import AirobotConfigEntry, AirobotDataUpdateCoordinator

-PLATFORMS: list[Platform] = [Platform.CLIMATE, Platform.NUMBER, Platform.SENSOR]
+PLATFORMS: list[Platform] = [
+    Platform.BUTTON,
+    Platform.CLIMATE,
+    Platform.NUMBER,
+    Platform.SENSOR,
+]


 async def async_setup_entry(hass: HomeAssistant, entry: AirobotConfigEntry) -> bool:


@@ -0,0 +1,89 @@
+"""Button platform for Airobot integration."""
+
+from __future__ import annotations
+
+from collections.abc import Callable, Coroutine
+from dataclasses import dataclass
+from typing import Any
+
+from pyairobotrest.exceptions import (
+    AirobotConnectionError,
+    AirobotError,
+    AirobotTimeoutError,
+)
+
+from homeassistant.components.button import (
+    ButtonDeviceClass,
+    ButtonEntity,
+    ButtonEntityDescription,
+)
+from homeassistant.const import EntityCategory
+from homeassistant.core import HomeAssistant
+from homeassistant.exceptions import HomeAssistantError
+from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+
+from .const import DOMAIN
+from .coordinator import AirobotConfigEntry, AirobotDataUpdateCoordinator
+from .entity import AirobotEntity
+
+PARALLEL_UPDATES = 0
+
+
+@dataclass(frozen=True, kw_only=True)
+class AirobotButtonEntityDescription(ButtonEntityDescription):
+    """Describes Airobot button entity."""
+
+    press_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
+
+
+BUTTON_TYPES: tuple[AirobotButtonEntityDescription, ...] = (
+    AirobotButtonEntityDescription(
+        key="restart",
+        device_class=ButtonDeviceClass.RESTART,
+        entity_category=EntityCategory.CONFIG,
+        press_fn=lambda coordinator: coordinator.client.reboot_thermostat(),
+    ),
+)
+
+
+async def async_setup_entry(
+    hass: HomeAssistant,
+    entry: AirobotConfigEntry,
+    async_add_entities: AddConfigEntryEntitiesCallback,
+) -> None:
+    """Set up Airobot button entities."""
+    coordinator = entry.runtime_data
+
+    async_add_entities(
+        AirobotButton(coordinator, description) for description in BUTTON_TYPES
+    )
+
+
+class AirobotButton(AirobotEntity, ButtonEntity):
+    """Representation of an Airobot button."""
+
+    entity_description: AirobotButtonEntityDescription
+
+    def __init__(
+        self,
+        coordinator: AirobotDataUpdateCoordinator,
+        description: AirobotButtonEntityDescription,
+    ) -> None:
+        """Initialize the button."""
+        super().__init__(coordinator)
+        self.entity_description = description
+        self._attr_unique_id = f"{coordinator.data.status.device_id}_{description.key}"
+
+    async def async_press(self) -> None:
+        """Handle the button press."""
+        try:
+            await self.entity_description.press_fn(self.coordinator)
+        except (AirobotConnectionError, AirobotTimeoutError):
+            # Connection errors during reboot are expected as device restarts
+            pass
+        except AirobotError as err:
+            raise HomeAssistantError(
+                translation_domain=DOMAIN,
+                translation_key="button_press_failed",
+                translation_placeholders={"button": self.entity_description.key},
+            ) from err


@@ -86,6 +86,9 @@
     "authentication_failed": {
       "message": "Authentication failed, please reauthenticate."
     },
+    "button_press_failed": {
+      "message": "Failed to press {button} button."
+    },
     "connection_failed": {
       "message": "Failed to communicate with device."
     },


@@ -7,5 +7,5 @@
   "integration_type": "device",
   "iot_class": "local_polling",
   "quality_scale": "silver",
-  "requirements": ["airos==0.6.0"]
+  "requirements": ["airos==0.6.1"]
 }


@@ -69,7 +69,6 @@ from homeassistant.core import HomeAssistant
 from homeassistant.exceptions import HomeAssistantError
 from homeassistant.helpers import device_registry as dr, llm
 from homeassistant.helpers.entity import Entity
-from homeassistant.helpers.json import json_dumps
 from homeassistant.util import slugify

 from . import AnthropicConfigEntry
@@ -194,7 +193,7 @@ def _convert_content(
             tool_result_block = ToolResultBlockParam(
                 type="tool_result",
                 tool_use_id=content.tool_call_id,
-                content=json_dumps(content.tool_result),
+                content=json.dumps(content.tool_result),
             )
             external_tool = False
             if not messages or messages[-1]["role"] != (


@@ -8,5 +8,5 @@
   "integration_type": "system",
   "iot_class": "local_push",
   "quality_scale": "internal",
-  "requirements": ["pysilero-vad==3.2.0", "pyspeex-noise==1.0.2"]
+  "requirements": ["pysilero-vad==3.0.1", "pyspeex-noise==1.0.2"]
 }


@@ -140,6 +140,7 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
     "light",
     "lock",
     "media_player",
+    "person",
    "scene",
     "siren",
     "switch",


@@ -36,10 +36,6 @@ _LOGGER = logging.getLogger(__name__)
 # Cache TTL for backup list (in seconds)
 CACHE_TTL = 300

-# Timeout for upload operations (in seconds)
-# This prevents uploads from hanging indefinitely
-UPLOAD_TIMEOUT = 43200  # 12 hours (matches B2 HTTP timeout)
-

 def suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
     """Return the suggested filenames for the backup and metadata files."""
@@ -333,28 +329,13 @@ class BackblazeBackupAgent(BackupAgent):
         _LOGGER.debug("Uploading backup file %s with streaming", filename)

         try:
             content_type, _ = mimetypes.guess_type(filename)
-            file_version = await asyncio.wait_for(
-                self._hass.async_add_executor_job(
-                    self._upload_unbound_stream_sync,
-                    reader,
-                    filename,
-                    content_type or "application/x-tar",
-                    file_info,
-                ),
-                timeout=UPLOAD_TIMEOUT,
-            )
-        except TimeoutError:
-            _LOGGER.error(
-                "Upload of %s timed out after %s seconds", filename, UPLOAD_TIMEOUT
-            )
-            reader.abort()
-            raise BackupAgentError(
-                f"Upload timed out after {UPLOAD_TIMEOUT} seconds"
-            ) from None
+            file_version = await self._hass.async_add_executor_job(
+                self._upload_unbound_stream_sync,
+                reader,
+                filename,
+                content_type or "application/x-tar",
+                file_info,
+            )
         except asyncio.CancelledError:
             _LOGGER.warning("Upload of %s was cancelled", filename)
             reader.abort()
             raise
         finally:
             reader.close()


@@ -2,6 +2,9 @@
   "services": {
     "set_hot_water_schedule": {
       "service": "mdi:calendar-clock"
-    }
+    },
+    "sync_time": {
+      "service": "mdi:timer-sync-outline"
+    }
   }
 }


@@ -7,7 +7,7 @@
   "integration_type": "device",
   "iot_class": "local_polling",
   "loggers": ["bsblan"],
-  "requirements": ["python-bsblan==3.1.4"],
+  "requirements": ["python-bsblan==3.1.6"],
   "zeroconf": [
     {
       "name": "bsb-lan*",


@@ -13,6 +13,7 @@ from homeassistant.config_entries import ConfigEntryState
 from homeassistant.core import HomeAssistant, ServiceCall, callback
 from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
 from homeassistant.helpers import config_validation as cv, device_registry as dr
+from homeassistant.util import dt as dt_util

 from .const import DOMAIN
@@ -30,8 +31,9 @@ ATTR_FRIDAY_SLOTS = "friday_slots"
 ATTR_SATURDAY_SLOTS = "saturday_slots"
 ATTR_SUNDAY_SLOTS = "sunday_slots"

-# Service name
+# Service names
 SERVICE_SET_HOT_WATER_SCHEDULE = "set_hot_water_schedule"
+SERVICE_SYNC_TIME = "sync_time"

 # Schema for a single time slot
@@ -203,6 +205,74 @@ async def set_hot_water_schedule(service_call: ServiceCall) -> None:
     await entry.runtime_data.slow_coordinator.async_request_refresh()


+async def async_sync_time(service_call: ServiceCall) -> None:
+    """Synchronize BSB-LAN device time with Home Assistant."""
+    device_id: str = service_call.data[ATTR_DEVICE_ID]
+
+    # Get the device and config entry
+    device_registry = dr.async_get(service_call.hass)
+    device_entry = device_registry.async_get(device_id)
+    if device_entry is None:
+        raise ServiceValidationError(
+            translation_domain=DOMAIN,
+            translation_key="invalid_device_id",
+            translation_placeholders={"device_id": device_id},
+        )
+
+    # Find the config entry for this device
+    matching_entries: list[BSBLanConfigEntry] = [
+        entry
+        for entry in service_call.hass.config_entries.async_entries(DOMAIN)
+        if entry.entry_id in device_entry.config_entries
+    ]
+    if not matching_entries:
+        raise ServiceValidationError(
+            translation_domain=DOMAIN,
+            translation_key="no_config_entry_for_device",
+            translation_placeholders={"device_id": device_entry.name or device_id},
+        )
+
+    entry = matching_entries[0]
+
+    # Verify the config entry is loaded
+    if entry.state is not ConfigEntryState.LOADED:
+        raise ServiceValidationError(
+            translation_domain=DOMAIN,
+            translation_key="config_entry_not_loaded",
+            translation_placeholders={"device_name": device_entry.name or device_id},
+        )
+
+    client = entry.runtime_data.client
+
+    try:
+        # Get current device time
+        device_time = await client.time()
+        current_time = dt_util.now()
+        current_time_str = current_time.strftime("%d.%m.%Y %H:%M:%S")
+
+        # Only sync if device time differs from HA time
+        if device_time.time.value != current_time_str:
+            await client.set_time(current_time_str)
+    except BSBLANError as err:
+        raise HomeAssistantError(
+            translation_domain=DOMAIN,
+            translation_key="sync_time_failed",
+            translation_placeholders={
+                "device_name": device_entry.name or device_id,
+                "error": str(err),
+            },
+        ) from err
+
+
+SYNC_TIME_SCHEMA = vol.Schema(
+    {
+        vol.Required(ATTR_DEVICE_ID): cv.string,
+    }
+)


 @callback
 def async_setup_services(hass: HomeAssistant) -> None:
     """Register the BSB-Lan services."""
@@ -212,3 +282,10 @@ def async_setup_services(hass: HomeAssistant) -> None:
         set_hot_water_schedule,
         schema=SERVICE_SET_HOT_WATER_SCHEDULE_SCHEMA,
     )
+
+    hass.services.async_register(
+        DOMAIN,
+        SERVICE_SYNC_TIME,
+        async_sync_time,
+        schema=SYNC_TIME_SCHEMA,
+    )


@@ -1,3 +1,12 @@
+sync_time:
+  fields:
+    device_id:
+      required: true
+      example: "abc123device456"
+      selector:
+        device:
+          integration: bsblan
+
 set_hot_water_schedule:
   fields:
     device_id:
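
The services.yaml addition above registers the new `sync_time` action with a single required `device_id` field. A hedged sketch of what invoking it could look like from an automation — the device ID below is the placeholder example value from the diff, not a real device, and older installations would use the `service:` key instead of `action:`:

```yaml
# Hypothetical invocation of the bsblan.sync_time service added above.
# The device_id value is illustrative only.
action: bsblan.sync_time
data:
  device_id: "abc123device456"
```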


@@ -79,9 +79,6 @@
     "invalid_device_id": {
       "message": "Invalid device ID: {device_id}"
     },
-    "invalid_time_format": {
-      "message": "Invalid time format provided"
-    },
     "no_config_entry_for_device": {
       "message": "No configuration entry found for device: {device_id}"
     },
@@ -108,6 +105,9 @@
     },
     "setup_general_error": {
       "message": "An unknown error occurred while retrieving static device data"
+    },
+    "sync_time_failed": {
+      "message": "Failed to sync time for {device_name}: {error}"
     }
   },
   "services": {
@@ -148,6 +148,16 @@
       }
     },
     "name": "Set hot water schedule"
-    }
+    },
+    "sync_time": {
+      "description": "Synchronize Home Assistant time to the BSB-Lan device. Only updates if device time differs from Home Assistant time.",
+      "fields": {
+        "device_id": {
+          "description": "The BSB-LAN device to sync time for.",
+          "name": "Device"
+        }
+      },
+      "name": "Sync time"
+    }
   }
 }


@@ -20,5 +20,5 @@
   "dependencies": ["bluetooth_adapters"],
   "documentation": "https://www.home-assistant.io/integrations/bthome",
   "iot_class": "local_push",
-  "requirements": ["bthome-ble==3.16.0"]
+  "requirements": ["bthome-ble==3.17.0"]
 }


@@ -33,7 +33,7 @@ HVAC_MODE_CHANGED_TRIGGER_SCHEMA = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST.extend
     {
         vol.Required(CONF_OPTIONS): {
             vol.Required(CONF_HVAC_MODE): vol.All(
-                cv.ensure_list, vol.Length(min=1), [vol.Coerce(HVACMode)]
+                cv.ensure_list, vol.Length(min=1), [HVACMode]
             ),
         },
     }


@@ -19,10 +19,6 @@
       selector:
         choose:
           choices:
-            number:
-              selector:
-                number:
-                  mode: box
             entity:
               selector:
                 entity:
@@ -31,11 +27,14 @@
                     - input_number
                     - number
                     - sensor
+            number:
+              selector:
+                number:
+                  mode: box
           translation_key: number_or_entity
 .trigger_threshold_type: &trigger_threshold_type
   required: true
   default: above
   selector:
     select:
       options:


@@ -9,7 +9,7 @@
   "integration_type": "device",
   "iot_class": "local_push",
   "loggers": ["async_upnp_client"],
-  "requirements": ["async-upnp-client==0.46.1", "getmac==0.9.5"],
+  "requirements": ["async-upnp-client==0.46.2", "getmac==0.9.5"],
   "ssdp": [
     {
       "deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",


@@ -8,7 +8,7 @@
   "documentation": "https://www.home-assistant.io/integrations/dlna_dms",
   "integration_type": "service",
   "iot_class": "local_polling",
-  "requirements": ["async-upnp-client==0.46.1"],
+  "requirements": ["async-upnp-client==0.46.2"],
   "ssdp": [
     {
       "deviceType": "urn:schemas-upnp-org:device:MediaServer:1",


@@ -1,4 +1,4 @@
-"""Integrate with DuckDNS."""
+"""Duck DNS integration."""

 from __future__ import annotations


@@ -4,5 +4,6 @@
"codeowners": ["@tr4nt0r"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/duckdns",
"integration_type": "service",
"iot_class": "cloud_polling"
}


@@ -2,11 +2,12 @@
from __future__ import annotations
from aiohttp import ClientError
import voluptuous as vol
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_DOMAIN
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import ServiceValidationError
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import ConfigEntrySelector
@@ -62,9 +63,25 @@ async def update_domain_service(call: ServiceCall) -> None:
session = async_get_clientsession(call.hass)
await update_duckdns(
session,
entry.data[CONF_DOMAIN],
entry.data[CONF_ACCESS_TOKEN],
txt=call.data.get(ATTR_TXT),
)
try:
if not await update_duckdns(
session,
entry.data[CONF_DOMAIN],
entry.data[CONF_ACCESS_TOKEN],
txt=call.data.get(ATTR_TXT),
):
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_failed",
translation_placeholders={
CONF_DOMAIN: entry.data[CONF_DOMAIN],
},
)
except ClientError as e:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="connection_error",
translation_placeholders={
CONF_DOMAIN: entry.data[CONF_DOMAIN],
},
) from e


@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.11", "deebot-client==17.0.1"]
"requirements": ["py-sucks==0.9.11", "deebot-client==17.0.0"]
}


@@ -59,13 +59,38 @@ class FlowToGridSourceType(TypedDict):
number_energy_price: float | None # Price for energy ($/kWh)
class GridPowerSourceType(TypedDict):
class PowerConfig(TypedDict, total=False):
"""Dictionary holding power sensor configuration options.
Users can configure power sensors in three ways:
1. Standard: single sensor (positive=discharge/from_grid, negative=charge/to_grid)
2. Inverted: single sensor with opposite polarity (needs to be multiplied by -1)
3. Two sensors: separate positive sensors for each direction
"""
# Standard: single sensor (positive=discharge/from_grid, negative=charge/to_grid)
stat_rate: str
# Inverted: single sensor with opposite polarity (needs to be multiplied by -1)
stat_rate_inverted: str
# Two sensors: separate positive sensors for each direction
# Result = stat_rate_from - stat_rate_to (positive when net outflow)
stat_rate_from: str # Battery: discharge, Grid: consumption
stat_rate_to: str # Battery: charge, Grid: return
class GridPowerSourceType(TypedDict, total=False):
"""Dictionary holding the source of grid power consumption."""
# statistic_id of a power meter (kW)
# negative values indicate grid return
# This is either the original sensor or a generated template sensor
stat_rate: str
# User's original power sensor configuration
power_config: PowerConfig
class GridSourceType(TypedDict):
"""Dictionary holding the source of grid energy consumption."""
@@ -97,8 +122,12 @@ class BatterySourceType(TypedDict):
stat_energy_from: str
stat_energy_to: str
# positive when discharging, negative when charging
# This is either the original sensor or a generated template sensor
stat_rate: NotRequired[str]
# User's original power sensor configuration
power_config: NotRequired[PowerConfig]
class GasSourceType(TypedDict):
"""Dictionary holding the source of gas consumption."""
@@ -211,10 +240,53 @@ FLOW_TO_GRID_SOURCE_SCHEMA = vol.Schema(
}
)
GRID_POWER_SOURCE_SCHEMA = vol.Schema(
{
vol.Required("stat_rate"): str,
}
def _validate_power_config(val: dict[str, Any]) -> dict[str, Any]:
"""Validate power_config has exactly one configuration method."""
if not val:
raise vol.Invalid("power_config must have at least one option")
# Ensure only one configuration method is used
has_single = "stat_rate" in val
has_inverted = "stat_rate_inverted" in val
has_combined = "stat_rate_from" in val or "stat_rate_to" in val
methods_count = sum([has_single, has_inverted, has_combined])
if methods_count > 1:
raise vol.Invalid(
"power_config must use only one configuration method: "
"stat_rate, stat_rate_inverted, or stat_rate_from/stat_rate_to"
)
return val
POWER_CONFIG_SCHEMA = vol.All(
vol.Schema(
{
vol.Exclusive("stat_rate", "power_source"): str,
vol.Exclusive("stat_rate_inverted", "power_source"): str,
# stat_rate_from/stat_rate_to: two sensors for bidirectional power
# Battery: from=discharge (out), to=charge (in)
# Grid: from=consumption, to=return
vol.Inclusive("stat_rate_from", "two_sensors"): str,
vol.Inclusive("stat_rate_to", "two_sensors"): str,
}
),
_validate_power_config,
)
GRID_POWER_SOURCE_SCHEMA = vol.All(
vol.Schema(
{
# stat_rate and power_config are both optional schema keys, but the validator
# requires that at least one is provided; power_config takes precedence
vol.Optional("stat_rate"): str,
vol.Optional("power_config"): POWER_CONFIG_SCHEMA,
}
),
cv.has_at_least_one_key("stat_rate", "power_config"),
)
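For reference, the mutual-exclusivity rule that `_validate_power_config` enforces on top of the voluptuous schema can be sketched in plain Python (an illustrative approximation, not the schema object actually used by the integration):

```python
# Sketch of the power_config validation logic: at least one option must be
# present, and only one of the three configuration methods may be used.
def validate_power_config(val: dict) -> dict:
    """Validate that power_config uses exactly one configuration method."""
    if not val:
        raise ValueError("power_config must have at least one option")
    has_single = "stat_rate" in val
    has_inverted = "stat_rate_inverted" in val
    has_combined = "stat_rate_from" in val or "stat_rate_to" in val
    if sum([has_single, has_inverted, has_combined]) > 1:
        raise ValueError(
            "power_config must use only one configuration method: "
            "stat_rate, stat_rate_inverted, or stat_rate_from/stat_rate_to"
        )
    return val
```

In the real schema, `vol.Exclusive` already rejects mixing `stat_rate` with `stat_rate_inverted`, and `vol.Inclusive` requires `stat_rate_from` and `stat_rate_to` to appear together; the extra validator closes the remaining gap of combining a single-sensor key with the two-sensor pair.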
@@ -225,7 +297,7 @@ def _generate_unique_value_validator(key: str) -> Callable[[list[dict]], list[di
val: list[dict],
) -> list[dict]:
"""Ensure that the user doesn't add duplicate values."""
counts = Counter(flow_from[key] for flow_from in val)
counts = Counter(item.get(key) for item in val if item.get(key) is not None)
for value, count in counts.items():
if count > 1:
@@ -267,7 +339,10 @@ BATTERY_SOURCE_SCHEMA = vol.Schema(
vol.Required("type"): "battery",
vol.Required("stat_energy_from"): str,
vol.Required("stat_energy_to"): str,
# Both stat_rate and power_config are optional
# If power_config is provided, it takes precedence and stat_rate is overwritten
vol.Optional("stat_rate"): str,
vol.Optional("power_config"): POWER_CONFIG_SCHEMA,
}
)
GAS_SOURCE_SCHEMA = vol.Schema(
@@ -387,6 +462,12 @@ class EnergyManager:
if key in update:
data[key] = update[key]
# Process energy sources and set stat_rate for power configs
if "energy_sources" in update:
data["energy_sources"] = self._process_energy_sources(
data["energy_sources"]
)
self.data = data
self._store.async_delay_save(lambda: data, 60)
@@ -395,6 +476,74 @@ class EnergyManager:
await asyncio.gather(*(listener() for listener in self._update_listeners))
def _process_energy_sources(self, sources: list[SourceType]) -> list[SourceType]:
"""Process energy sources and set stat_rate for power configs."""
from .helpers import generate_power_sensor_entity_id # noqa: PLC0415
processed: list[SourceType] = []
for source in sources:
if source["type"] == "battery":
source = self._process_battery_power(
source, generate_power_sensor_entity_id
)
elif source["type"] == "grid":
source = self._process_grid_power(
source, generate_power_sensor_entity_id
)
processed.append(source)
return processed
def _process_battery_power(
self,
source: BatterySourceType,
generate_entity_id: Callable[[str, PowerConfig], str | None],
) -> BatterySourceType:
"""Set stat_rate for battery if power_config is specified."""
if "power_config" not in source:
return source
config = source["power_config"]
# If power_config has stat_rate (standard), just use it directly
if "stat_rate" in config:
return {**source, "stat_rate": config["stat_rate"]}
# For inverted or two-sensor config, set stat_rate to the generated entity_id
entity_id = generate_entity_id("battery", config)
if entity_id:
return {**source, "stat_rate": entity_id}
return source
def _process_grid_power(
self,
source: GridSourceType,
generate_entity_id: Callable[[str, PowerConfig], str | None],
) -> GridSourceType:
"""Set stat_rate for grid power sources if power_config is specified."""
if "power" not in source:
return source
processed_power: list[GridPowerSourceType] = []
for power in source["power"]:
if "power_config" in power:
config = power["power_config"]
# If power_config has stat_rate (standard), just use it directly
if "stat_rate" in config:
processed_power.append({**power, "stat_rate": config["stat_rate"]})
continue
# For inverted or two-sensor config, set stat_rate to generated entity_id
entity_id = generate_entity_id("grid", config)
if entity_id:
processed_power.append({**power, "stat_rate": entity_id})
continue
processed_power.append(power)
return {**source, "power": processed_power}
@callback
def async_listen_updates(self, update_listener: Callable[[], Awaitable]) -> None:
"""Listen for data updates."""


@@ -0,0 +1,42 @@
"""Helpers for the Energy integration."""
from __future__ import annotations
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .data import PowerConfig
def generate_power_sensor_unique_id(
source_type: str, config: PowerConfig
) -> str | None:
"""Generate a unique ID for a power transform sensor."""
if "stat_rate_inverted" in config:
sensor_id = config["stat_rate_inverted"].replace(".", "_")
return f"energy_power_{source_type}_inverted_{sensor_id}"
if "stat_rate_from" in config and "stat_rate_to" in config:
from_id = config["stat_rate_from"].replace(".", "_")
to_id = config["stat_rate_to"].replace(".", "_")
return f"energy_power_{source_type}_combined_{from_id}_{to_id}"
return None
def generate_power_sensor_entity_id(
source_type: str, config: PowerConfig
) -> str | None:
"""Generate an entity ID for a power transform sensor."""
if "stat_rate_inverted" in config:
# Use source sensor name with _inverted suffix
source = config["stat_rate_inverted"]
if source.startswith("sensor."):
return f"{source}_inverted"
return f"sensor.{source.replace('.', '_')}_inverted"
if "stat_rate_from" in config and "stat_rate_to" in config:
# Use both sensors in entity ID to ensure uniqueness when multiple
# combined configs exist. The entity represents net power (from - to),
# e.g., discharge - charge for battery.
from_sensor = config["stat_rate_from"].removeprefix("sensor.")
to_sensor = config["stat_rate_to"].removeprefix("sensor.")
return f"sensor.energy_{source_type}_{from_sensor}_{to_sensor}_net_power"
return None


@@ -19,7 +19,12 @@ from homeassistant.components.sensor import (
from homeassistant.components.sensor.recorder import ( # pylint: disable=hass-component-root-import
reset_detected,
)
from homeassistant.const import ATTR_UNIT_OF_MEASUREMENT, UnitOfEnergy, UnitOfVolume
from homeassistant.const import (
ATTR_UNIT_OF_MEASUREMENT,
UnitOfEnergy,
UnitOfPower,
UnitOfVolume,
)
from homeassistant.core import (
HomeAssistant,
State,
@@ -36,7 +41,8 @@ from homeassistant.util import dt as dt_util, unit_conversion
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import DOMAIN
from .data import EnergyManager, async_get_manager
from .data import EnergyManager, PowerConfig, async_get_manager
from .helpers import generate_power_sensor_entity_id, generate_power_sensor_unique_id
SUPPORTED_STATE_CLASSES = {
SensorStateClass.MEASUREMENT,
@@ -137,6 +143,7 @@ class SensorManager:
self.manager = manager
self.async_add_entities = async_add_entities
self.current_entities: dict[tuple[str, str | None, str], EnergyCostSensor] = {}
self.current_power_entities: dict[str, EnergyPowerSensor] = {}
async def async_start(self) -> None:
"""Start."""
@@ -147,8 +154,9 @@ class SensorManager:
async def _process_manager_data(self) -> None:
"""Process manager data."""
to_add: list[EnergyCostSensor] = []
to_add: list[EnergyCostSensor | EnergyPowerSensor] = []
to_remove = dict(self.current_entities)
power_to_remove = dict(self.current_power_entities)
async def finish() -> None:
if to_add:
@@ -159,6 +167,10 @@ class SensorManager:
self.current_entities.pop(key)
await entity.async_remove()
for power_key, power_entity in power_to_remove.items():
self.current_power_entities.pop(power_key)
await power_entity.async_remove()
if not self.manager.data:
await finish()
return
@@ -185,6 +197,13 @@ class SensorManager:
to_remove,
)
# Process power sensors for battery and grid sources
self._process_power_sensor_data(
energy_source,
to_add,
power_to_remove,
)
await finish()
@callback
@@ -192,7 +211,7 @@ class SensorManager:
self,
adapter: SourceAdapter,
config: Mapping[str, Any],
to_add: list[EnergyCostSensor],
to_add: list[EnergyCostSensor | EnergyPowerSensor],
to_remove: dict[tuple[str, str | None, str], EnergyCostSensor],
) -> None:
"""Process sensor data."""
@@ -220,6 +239,74 @@ class SensorManager:
)
to_add.append(self.current_entities[key])
@callback
def _process_power_sensor_data(
self,
energy_source: Mapping[str, Any],
to_add: list[EnergyCostSensor | EnergyPowerSensor],
to_remove: dict[str, EnergyPowerSensor],
) -> None:
"""Process power sensor data for battery and grid sources."""
source_type = energy_source.get("type")
if source_type == "battery":
power_config = energy_source.get("power_config")
if power_config and self._needs_power_sensor(power_config):
self._create_or_keep_power_sensor(
source_type, power_config, to_add, to_remove
)
elif source_type == "grid":
for power in energy_source.get("power", []):
power_config = power.get("power_config")
if power_config and self._needs_power_sensor(power_config):
self._create_or_keep_power_sensor(
source_type, power_config, to_add, to_remove
)
@staticmethod
def _needs_power_sensor(power_config: PowerConfig) -> bool:
"""Check if power_config needs a transform sensor."""
# Only create sensors for inverted or two-sensor configs
# Standard stat_rate configs don't need a transform sensor
return "stat_rate_inverted" in power_config or (
"stat_rate_from" in power_config and "stat_rate_to" in power_config
)
def _create_or_keep_power_sensor(
self,
source_type: str,
power_config: PowerConfig,
to_add: list[EnergyCostSensor | EnergyPowerSensor],
to_remove: dict[str, EnergyPowerSensor],
) -> None:
"""Create a power sensor or keep an existing one."""
unique_id = generate_power_sensor_unique_id(source_type, power_config)
if not unique_id:
return
# If entity already exists, keep it
if unique_id in to_remove:
to_remove.pop(unique_id)
return
# If we already have this entity, skip
if unique_id in self.current_power_entities:
return
entity_id = generate_power_sensor_entity_id(source_type, power_config)
if not entity_id:
return
sensor = EnergyPowerSensor(
source_type,
power_config,
unique_id,
entity_id,
)
self.current_power_entities[unique_id] = sensor
to_add.append(sensor)
def _set_result_unless_done(future: asyncio.Future[None]) -> None:
"""Set the result of a future unless it is done."""
@@ -495,3 +582,197 @@ class EnergyCostSensor(SensorEntity):
prefix = self._config[self._adapter.stat_energy_key]
return f"{prefix}_{self._adapter.source_type}_{self._adapter.entity_id_suffix}"
class EnergyPowerSensor(SensorEntity):
"""Transform power sensor values (invert or combine two sensors).
This sensor handles non-standard power sensor configurations for the energy
dashboard by either inverting polarity or combining two positive sensors.
"""
_attr_should_poll = False
_attr_device_class = SensorDeviceClass.POWER
_attr_state_class = SensorStateClass.MEASUREMENT
_attr_has_entity_name = True
def __init__(
self,
source_type: str,
config: PowerConfig,
unique_id: str,
entity_id: str,
) -> None:
"""Initialize the sensor."""
super().__init__()
self._source_type = source_type
self._config: PowerConfig = config
self._attr_unique_id = unique_id
self.entity_id = entity_id
self._source_sensors: list[str] = []
self._is_inverted = "stat_rate_inverted" in config
self._is_combined = "stat_rate_from" in config and "stat_rate_to" in config
# Determine source sensors
if self._is_inverted:
self._source_sensors = [config["stat_rate_inverted"]]
elif self._is_combined:
self._source_sensors = [
config["stat_rate_from"],
config["stat_rate_to"],
]
# add_finished is set when either async_added_to_hass or add_to_platform_abort
# is called
self.add_finished: asyncio.Future[None] = (
asyncio.get_running_loop().create_future()
)
@property
def available(self) -> bool:
"""Return if entity is available."""
if self._is_inverted:
source = self.hass.states.get(self._source_sensors[0])
return source is not None and source.state not in (
"unknown",
"unavailable",
)
if self._is_combined:
discharge = self.hass.states.get(self._source_sensors[0])
charge = self.hass.states.get(self._source_sensors[1])
return (
discharge is not None
and charge is not None
and discharge.state not in ("unknown", "unavailable")
and charge.state not in ("unknown", "unavailable")
)
return True
@callback
def _update_state(self) -> None:
"""Update the sensor state based on source sensors."""
if self._is_inverted:
source_state = self.hass.states.get(self._source_sensors[0])
if source_state is None or source_state.state in ("unknown", "unavailable"):
self._attr_native_value = None
return
try:
value = float(source_state.state)
except ValueError:
self._attr_native_value = None
return
self._attr_native_value = value * -1
elif self._is_combined:
discharge_state = self.hass.states.get(self._source_sensors[0])
charge_state = self.hass.states.get(self._source_sensors[1])
if (
discharge_state is None
or charge_state is None
or discharge_state.state in ("unknown", "unavailable")
or charge_state.state in ("unknown", "unavailable")
):
self._attr_native_value = None
return
try:
discharge = float(discharge_state.state)
charge = float(charge_state.state)
except ValueError:
self._attr_native_value = None
return
# Get units from state attributes
discharge_unit = discharge_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
charge_unit = charge_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
# Convert to Watts if units are present
if discharge_unit:
discharge = unit_conversion.PowerConverter.convert(
discharge, discharge_unit, UnitOfPower.WATT
)
if charge_unit:
charge = unit_conversion.PowerConverter.convert(
charge, charge_unit, UnitOfPower.WATT
)
self._attr_native_value = discharge - charge
async def async_added_to_hass(self) -> None:
"""Register callbacks."""
# Set name based on source sensor(s)
if self._source_sensors:
entity_reg = er.async_get(self.hass)
device_id = None
source_name = None
# Check first sensor
if source_entry := entity_reg.async_get(self._source_sensors[0]):
device_id = source_entry.device_id
# For combined mode, always use Watts because we may have different source units; for inverted mode, copy source unit
if self._is_combined:
self._attr_native_unit_of_measurement = UnitOfPower.WATT
else:
self._attr_native_unit_of_measurement = (
source_entry.unit_of_measurement
)
# Get source name from registry
source_name = source_entry.name or source_entry.original_name
# Assign power sensor to same device as source sensor(s)
# Note: We use manual entity registry update instead of _attr_device_info
# because device assignment depends on runtime information from the entity
# registry (which source sensor has a device). This information isn't
# available during __init__, and the entity is already registered before
# async_added_to_hass runs, making the standard _attr_device_info pattern
# incompatible with this use case.
# If first sensor has no device and we have a second sensor, check it
if not device_id and len(self._source_sensors) > 1:
if source_entry := entity_reg.async_get(self._source_sensors[1]):
device_id = source_entry.device_id
# Update entity registry entry with device_id
if device_id and (power_entry := entity_reg.async_get(self.entity_id)):
entity_reg.async_update_entity(
power_entry.entity_id, device_id=device_id
)
else:
self._attr_has_entity_name = False
# Set name for inverted mode
if self._is_inverted:
if source_name:
self._attr_name = f"{source_name} Inverted"
else:
# Fall back to entity_id if no name in registry
sensor_name = split_entity_id(self._source_sensors[0])[1].replace(
"_", " "
)
self._attr_name = f"{sensor_name.title()} Inverted"
# Set name for combined mode
if self._is_combined:
self._attr_name = f"{self._source_type.title()} Power"
self._update_state()
# Track state changes on all source sensors
self.async_on_remove(
async_track_state_change_event(
self.hass,
self._source_sensors,
self._async_state_changed_listener,
)
)
_set_result_unless_done(self.add_finished)
@callback
def _async_state_changed_listener(self, *_: Any) -> None:
"""Handle source sensor state changes."""
self._update_state()
self.async_write_ha_state()
@callback
def add_to_platform_abort(self) -> None:
"""Abort adding an entity to a platform."""
_set_result_unless_done(self.add_finished)
super().add_to_platform_abort()
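The combined-mode arithmetic in `_update_state` can be sketched as a pure function: both source values are normalized to watts before subtracting, since the two sensors may report in different units. The conversion table below is an illustration only; the real code delegates to `homeassistant.util.unit_conversion.PowerConverter`.

```python
# Sketch of the combined-mode net-power computation: normalize both
# readings to watts, then net power = discharge - charge.
_TO_WATTS = {"W": 1.0, "kW": 1000.0, "MW": 1_000_000.0}

def net_power_watts(
    discharge: float, discharge_unit: str, charge: float, charge_unit: str
) -> float:
    """Return net power in W, positive when net outflow (discharging)."""
    return discharge * _TO_WATTS[discharge_unit] - charge * _TO_WATTS[charge_unit]
```

This is why the combined entity always reports `UnitOfPower.WATT` as its native unit, while the inverted entity can simply copy the source sensor's unit.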


@@ -783,7 +783,7 @@ ENCHARGE_AGGREGATE_SENSORS = (
translation_key="available_energy",
native_unit_of_measurement=UnitOfEnergy.WATT_HOUR,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.ENERGY_STORAGE,
device_class=SensorDeviceClass.ENERGY,
value_fn=attrgetter("available_energy"),
),
EnvoyEnchargeAggregateSensorEntityDescription(
@@ -791,14 +791,14 @@ ENCHARGE_AGGREGATE_SENSORS = (
translation_key="reserve_energy",
native_unit_of_measurement=UnitOfEnergy.WATT_HOUR,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.ENERGY_STORAGE,
device_class=SensorDeviceClass.ENERGY,
value_fn=attrgetter("backup_reserve"),
),
EnvoyEnchargeAggregateSensorEntityDescription(
key="max_capacity",
translation_key="max_capacity",
native_unit_of_measurement=UnitOfEnergy.WATT_HOUR,
device_class=SensorDeviceClass.ENERGY_STORAGE,
device_class=SensorDeviceClass.ENERGY,
value_fn=attrgetter("max_available_capacity"),
),
)


@@ -17,7 +17,7 @@
"mqtt": ["esphome/discover/#"],
"quality_scale": "platinum",
"requirements": [
"aioesphomeapi==43.9.1",
"aioesphomeapi==43.10.1",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.4.0"
],


@@ -19,6 +19,9 @@ from .coordinator import FeedReaderCoordinator
LOGGER = logging.getLogger(__name__)
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
ATTR_CONTENT = "content"
ATTR_DESCRIPTION = "description"
ATTR_LINK = "link"


@@ -0,0 +1,94 @@
rules:
# Bronze
action-setup:
status: exempt
comment: No custom actions are defined.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage:
status: todo
comment: Missing test for uniqueness of feed URL.
config-flow:
status: todo
comment: Missing data descriptions.
dependency-transparency: done
docs-actions:
status: exempt
comment: No custom actions are defined.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: No custom actions are defined.
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow:
status: exempt
comment: No authentication support.
test-coverage:
status: done
comment: Can use freezer for skipping time instead
# Gold
devices: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: No discovery support.
discovery:
status: exempt
comment: No discovery support.
docs-data-update: done
docs-examples: done
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: done
dynamic-devices:
status: exempt
comment: Each config entry represents one service.
entity-category: done
entity-device-class:
status: exempt
comment: Matches no available event entity class.
entity-disabled-by-default:
status: exempt
comment: Only one entity per config entry.
entity-translations: todo
exception-translations: todo
icon-translations: done
reconfiguration-flow: done
repair-issues:
status: done
comment: Only one repair issue, for YAML import, is defined.
stale-devices:
status: exempt
comment: Each config entry represents one service.
# Platinum
async-dependency:
status: todo
comment: feedparser lib is not async.
inject-websession:
status: todo
comment: feedparser lib doesn't take a session as argument.
strict-typing:
status: todo
comment: feedparser lib is not fully typed.


@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyfirefly==0.1.8"]
"requirements": ["pyfirefly==0.1.10"]
}


@@ -461,7 +461,7 @@ FITBIT_RESOURCES_LIST: Final[tuple[FitbitSensorEntityDescription, ...]] = (
key="sleep/timeInBed",
translation_key="sleep_time_in_bed",
native_unit_of_measurement=UnitOfTime.MINUTES,
icon="mdi:bed",
icon="mdi:hotel",
device_class=SensorDeviceClass.DURATION,
scope=FitbitScope.SLEEP,
state_class=SensorStateClass.TOTAL_INCREASING,


@@ -2,6 +2,8 @@
from typing import TYPE_CHECKING, Any
from fressnapftracker import FressnapfTrackerError
from homeassistant.components.light import (
ATTR_BRIGHTNESS,
ColorMode,
@@ -16,6 +18,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import FressnapfTrackerConfigEntry
from .const import DOMAIN
from .entity import FressnapfTrackerEntity
from .services import handle_fressnapf_tracker_exception
PARALLEL_UPDATES = 1
@@ -61,12 +64,18 @@ class FressnapfTrackerLight(FressnapfTrackerEntity, LightEntity):
self.raise_if_not_activatable()
brightness = kwargs.get(ATTR_BRIGHTNESS, 255)
brightness = int((brightness / 255) * 100)
await self.coordinator.client.set_led_brightness(brightness)
try:
await self.coordinator.client.set_led_brightness(brightness)
except FressnapfTrackerError as e:
handle_fressnapf_tracker_exception(e)
await self.coordinator.async_request_refresh()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the device."""
await self.coordinator.client.set_led_brightness(0)
try:
await self.coordinator.client.set_led_brightness(0)
except FressnapfTrackerError as e:
handle_fressnapf_tracker_exception(e)
await self.coordinator.async_request_refresh()
def raise_if_not_activatable(self) -> None:


@@ -26,7 +26,7 @@ rules:
unique-config-entry: done
# Silver
action-exceptions: todo
action-exceptions: done
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done


@@ -0,0 +1,21 @@
"""Services and service helpers for fressnapf_tracker."""
from fressnapftracker import FressnapfTrackerError, FressnapfTrackerInvalidTokenError
from homeassistant.exceptions import ConfigEntryAuthFailed, HomeAssistantError
from .const import DOMAIN
def handle_fressnapf_tracker_exception(exception: FressnapfTrackerError):
"""Handle the different FressnapfTracker errors."""
if isinstance(exception, FressnapfTrackerInvalidTokenError):
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="invalid_auth",
) from exception
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"error_message": str(exception)},
) from exception


@@ -77,6 +77,9 @@
}
},
"exceptions": {
"api_error": {
"message": "An error occurred while communicating with the Fressnapf Tracker API: {error_message}"
},
"charging": {
"message": "The flashlight cannot be activated while charging."
},


@@ -2,6 +2,8 @@
from typing import TYPE_CHECKING, Any
from fressnapftracker import FressnapfTrackerError
from homeassistant.components.switch import (
SwitchDeviceClass,
SwitchEntity,
@@ -13,6 +15,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import FressnapfTrackerConfigEntry
from .entity import FressnapfTrackerEntity
from .services import handle_fressnapf_tracker_exception
PARALLEL_UPDATES = 1
@@ -43,12 +46,18 @@ class FressnapfTrackerSwitch(FressnapfTrackerEntity, SwitchEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn on the device."""
await self.coordinator.client.set_energy_saving(True)
try:
await self.coordinator.client.set_energy_saving(True)
except FressnapfTrackerError as e:
handle_fressnapf_tracker_exception(e)
await self.coordinator.async_request_refresh()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the device."""
await self.coordinator.client.set_energy_saving(False)
try:
await self.coordinator.client.set_energy_saving(False)
except FressnapfTrackerError as e:
handle_fressnapf_tracker_exception(e)
await self.coordinator.async_request_refresh()
@property


@@ -77,14 +77,9 @@ class FritzboxDataUpdateCoordinator(DataUpdateCoordinator[FritzboxCoordinatorDat
)
LOGGER.debug("enable smarthome templates: %s", self.has_templates)
try:
self.has_triggers = await self.hass.async_add_executor_job(
self.fritz.has_triggers
)
except HTTPError:
# Fritz!OS < 7.39 just don't have this api endpoint
# so we need to fetch the HTTPError here and assume no triggers
self.has_triggers = False
self.has_triggers = await self.hass.async_add_executor_job(
self.fritz.has_triggers
)
LOGGER.debug("enable smarthome triggers: %s", self.has_triggers)
self.configuration_url = self.fritz.get_prefixed_host()


@@ -23,5 +23,5 @@
"winter_mode": {}
},
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260107.1"]
"requirements": ["home-assistant-frontend==20251229.1"]
}


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["google_air_quality_api"],
"quality_scale": "bronze",
"requirements": ["google_air_quality_api==2.1.2"]
"requirements": ["google_air_quality_api==2.0.2"]
}


@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/google_generative_ai_conversation",
"integration_type": "service",
"iot_class": "cloud_polling",
"requirements": ["google-genai==1.38.0"]
"requirements": ["google-genai==1.56.0"]
}


@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/gree",
"iot_class": "local_polling",
"loggers": ["greeclimate"],
"requirements": ["greeclimate==2.1.1"]
"requirements": ["greeclimate==2.1.0"]
}


@@ -5,7 +5,6 @@ from __future__ import annotations
from dataclasses import dataclass
import logging
from pyhik.constants import SENSOR_MAP
from pyhik.hikvision import HikCamera
import requests
@@ -71,33 +70,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: HikvisionConfigEntry) ->
device_type=device_type,
)
_LOGGER.debug(
"Device %s (type=%s) initial event_states: %s",
device_name,
device_type,
camera.current_event_states,
)
# For NVRs or devices with no detected events, try to fetch events from ISAPI
# Use broader notification methods for NVRs since they often use 'record' etc.
if device_type == "NVR" or not camera.current_event_states:
nvr_notification_methods = {"center", "HTTP", "record", "email", "beep"}
def fetch_and_inject_nvr_events() -> None:
"""Fetch and inject NVR events in a single executor job."""
nvr_events = camera.get_event_triggers(nvr_notification_methods)
_LOGGER.debug("NVR events fetched with extended methods: %s", nvr_events)
if nvr_events:
# Map raw event type names to friendly names using SENSOR_MAP
mapped_events: dict[str, list[int]] = {}
for event_type, channels in nvr_events.items():
friendly_name = SENSOR_MAP.get(event_type.lower(), event_type)
if friendly_name in mapped_events:
mapped_events[friendly_name].extend(channels)
else:
mapped_events[friendly_name] = list(channels)
_LOGGER.debug("Mapped NVR events: %s", mapped_events)
camera.inject_events(mapped_events)
if nvr_events := camera.get_event_triggers():
camera.inject_events(nvr_events)
await hass.async_add_executor_job(fetch_and_inject_nvr_events)


@@ -8,5 +8,5 @@
"iot_class": "local_push",
"loggers": ["pyhik"],
"quality_scale": "legacy",
"requirements": ["pyHik==0.4.0"]
"requirements": ["pyHik==0.3.4"]
}


@@ -7,7 +7,7 @@
"documentation": "https://www.home-assistant.io/integrations/homeassistant_hardware",
"integration_type": "system",
"requirements": [
"serialx==0.6.2",
"serialx==0.5.0",
"universal-silabs-flasher==0.1.2",
"ha-silabs-firmware-client==0.3.0"
]


@@ -13,6 +13,6 @@
"iot_class": "local_polling",
"loggers": ["homewizard_energy"],
"quality_scale": "platinum",
"requirements": ["python-homewizard-energy==10.0.1"],
"requirements": ["python-homewizard-energy==10.0.0"],
"zeroconf": ["_hwenergy._tcp.local.", "_homewizard._tcp.local."]
}


@@ -19,10 +19,6 @@
selector:
choose:
choices:
number:
selector:
number:
mode: box
entity:
selector:
entity:
@@ -31,11 +27,14 @@
- input_number
- number
- sensor
number:
selector:
number:
mode: box
translation_key: number_or_entity
.trigger_threshold_type: &trigger_threshold_type
required: true
default: above
selector:
select:
options:


@@ -12,5 +12,5 @@
"iot_class": "local_polling",
"loggers": ["incomfortclient"],
"quality_scale": "platinum",
"requirements": ["incomfort-client==0.6.11"]
"requirements": ["incomfort-client==0.6.10"]
}


@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["jvcprojector"],
"requirements": ["pyjvcprojector==1.1.2"]
"requirements": ["pyjvcprojector==1.1.3"]
}


@@ -41,6 +41,13 @@ COMMANDS = {
"mode_1": const.REMOTE_MODE_1,
"mode_2": const.REMOTE_MODE_2,
"mode_3": const.REMOTE_MODE_3,
"mode_4": const.REMOTE_MODE_4,
"mode_5": const.REMOTE_MODE_5,
"mode_6": const.REMOTE_MODE_6,
"mode_7": const.REMOTE_MODE_7,
"mode_8": const.REMOTE_MODE_8,
"mode_9": const.REMOTE_MODE_9,
"mode_10": const.REMOTE_MODE_10,
"lens_ap": const.REMOTE_LENS_AP,
"gamma": const.REMOTE_GAMMA,
"color_temp": const.REMOTE_COLOR_TEMP,


@@ -256,8 +256,6 @@ ENTITIES: tuple[LaMarzoccoNumberEntityDescription, ...] = (
supported_fn=(
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.LINEA_MINI, ModelName.LINEA_MINI_R)
and WidgetType.CM_BREW_BY_WEIGHT_DOSES
in coordinator.device.dashboard.config
),
),
LaMarzoccoNumberEntityDescription(
@@ -291,8 +289,6 @@ ENTITIES: tuple[LaMarzoccoNumberEntityDescription, ...] = (
supported_fn=(
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.LINEA_MINI, ModelName.LINEA_MINI_R)
and WidgetType.CM_BREW_BY_WEIGHT_DOSES
in coordinator.device.dashboard.config
),
),
)


@@ -149,8 +149,6 @@ ENTITIES: tuple[LaMarzoccoSelectEntityDescription, ...] = (
supported_fn=(
lambda coordinator: coordinator.device.dashboard.model_name
in (ModelName.LINEA_MINI, ModelName.LINEA_MINI_R)
and WidgetType.CM_BREW_BY_WEIGHT_DOSES
in coordinator.device.dashboard.config
),
),
)


@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["librehardwaremonitor-api==1.6.0"]
"requirements": ["librehardwaremonitor-api==1.8.4"]
}


@@ -19,10 +19,6 @@
selector:
choose:
choices:
number:
selector:
number:
mode: box
entity:
selector:
entity:
@@ -31,6 +27,10 @@
- input_number
- number
- sensor
number:
selector:
number:
mode: box
translation_key: number_or_entity
turned_on: *trigger_common
@@ -48,7 +48,6 @@ brightness_crossed_threshold:
behavior: *trigger_behavior
threshold_type:
required: true
default: above
selector:
select:
options:


@@ -19,7 +19,12 @@ from homeassistant.helpers.typing import ConfigType
from .api import AsyncConfigEntryAuth
from .const import DOMAIN
from .coordinator import MieleConfigEntry, MieleDataUpdateCoordinator
from .coordinator import (
MieleAuxDataUpdateCoordinator,
MieleConfigEntry,
MieleDataUpdateCoordinator,
MieleRuntimeData,
)
from .services import async_setup_services
PLATFORMS: list[Platform] = [
@@ -75,19 +80,23 @@ async def async_setup_entry(hass: HomeAssistant, entry: MieleConfigEntry) -> boo
) from err
# Setup MieleAPI and coordinator for data fetch
api = MieleAPI(auth)
coordinator = MieleDataUpdateCoordinator(hass, entry, api)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
_api = MieleAPI(auth)
_coordinator = MieleDataUpdateCoordinator(hass, entry, _api)
await _coordinator.async_config_entry_first_refresh()
_aux_coordinator = MieleAuxDataUpdateCoordinator(hass, entry, _api)
await _aux_coordinator.async_config_entry_first_refresh()
entry.runtime_data = MieleRuntimeData(_api, _coordinator, _aux_coordinator)
entry.async_create_background_task(
hass,
coordinator.api.listen_events(
data_callback=coordinator.callback_update_data,
actions_callback=coordinator.callback_update_actions,
entry.runtime_data.api.listen_events(
data_callback=_coordinator.callback_update_data,
actions_callback=_coordinator.callback_update_actions,
),
"pymiele event listener",
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
@@ -107,5 +116,5 @@ async def async_remove_config_entry_device(
identifier
for identifier in device_entry.identifiers
if identifier[0] == DOMAIN
and identifier[1] in config_entry.runtime_data.data.devices
and identifier[1] in config_entry.runtime_data.coordinator.data.devices
)


@@ -264,7 +264,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the binary sensor platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -112,7 +112,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the button platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -138,7 +138,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the climate platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -9,7 +9,13 @@ from datetime import timedelta
import logging
from aiohttp import ClientResponseError
from pymiele import MieleAction, MieleAPI, MieleDevice
from pymiele import (
MieleAction,
MieleAPI,
MieleDevice,
MieleFillingLevel,
MieleFillingLevels,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -20,7 +26,16 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
type MieleConfigEntry = ConfigEntry[MieleDataUpdateCoordinator]
@dataclass
class MieleRuntimeData:
"""Runtime data for the Miele integration."""
api: MieleAPI
coordinator: MieleDataUpdateCoordinator
aux_coordinator: MieleAuxDataUpdateCoordinator
type MieleConfigEntry = ConfigEntry[MieleRuntimeData]
@dataclass
@@ -31,8 +46,15 @@ class MieleCoordinatorData:
actions: dict[str, MieleAction]
@dataclass
class MieleAuxCoordinatorData:
"""Data class for storing auxiliary coordinator data."""
filling_levels: dict[str, MieleFillingLevel]
class MieleDataUpdateCoordinator(DataUpdateCoordinator[MieleCoordinatorData]):
"""Coordinator for Miele data."""
"""Main coordinator for Miele data."""
config_entry: MieleConfigEntry
new_device_callbacks: list[Callable[[dict[str, MieleDevice]], None]] = []
@@ -66,6 +88,7 @@ class MieleDataUpdateCoordinator(DataUpdateCoordinator[MieleCoordinatorData]):
}
self.devices = devices
actions = {}
for device_id in devices:
try:
actions_json = await self.api.get_actions(device_id)
@@ -99,10 +122,7 @@ class MieleDataUpdateCoordinator(DataUpdateCoordinator[MieleCoordinatorData]):
device_id: MieleDevice(device) for device_id, device in devices_json.items()
}
self.async_set_updated_data(
MieleCoordinatorData(
devices=devices,
actions=self.data.actions,
)
MieleCoordinatorData(devices=devices, actions=self.data.actions)
)
async def callback_update_actions(self, actions_json: dict[str, dict]) -> None:
@@ -111,8 +131,34 @@ class MieleDataUpdateCoordinator(DataUpdateCoordinator[MieleCoordinatorData]):
device_id: MieleAction(action) for device_id, action in actions_json.items()
}
self.async_set_updated_data(
MieleCoordinatorData(
devices=self.data.devices,
actions=actions,
)
MieleCoordinatorData(devices=self.data.devices, actions=actions)
)
class MieleAuxDataUpdateCoordinator(DataUpdateCoordinator[MieleAuxCoordinatorData]):
"""Coordinator for Miele data for slowly polled endpoints."""
config_entry: MieleConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: MieleConfigEntry,
api: MieleAPI,
) -> None:
"""Initialize the Miele data coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=DOMAIN,
update_interval=timedelta(seconds=60),
)
self.api = api
async def _async_update_data(self) -> MieleAuxCoordinatorData:
"""Fetch data from the Miele API."""
filling_levels_json = await self.api.get_filling_levels()
return MieleAuxCoordinatorData(
filling_levels=MieleFillingLevels(filling_levels_json).filling_levels
)
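The split introduced here — a fast, event-driven main coordinator plus a slowly polled auxiliary one, bundled with the shared API client in a runtime-data dataclass — can be sketched outside Home Assistant like this. FakeAPI, Coordinator, and RuntimeData are illustrative stand-ins, not pymiele or Home Assistant classes:

```python
import asyncio
from collections.abc import Awaitable, Callable
from dataclasses import dataclass


class FakeAPI:
    """Stand-in for the Miele API client (illustrative only)."""

    async def get_devices(self) -> dict:
        return {"dev1": {"status": "running"}}

    async def get_filling_levels(self) -> dict:
        return {"dev1": {"salt": 80}}


class Coordinator:
    """Minimal polling coordinator: fetches on demand and caches the result."""

    def __init__(self, api: FakeAPI, fetch: Callable[[], Awaitable[dict]], interval: float) -> None:
        self.api = api
        self._fetch = fetch
        self.interval = interval  # seconds between polls in a real loop
        self.data: dict | None = None

    async def refresh(self) -> None:
        self.data = await self._fetch()


@dataclass
class RuntimeData:
    """Bundle the shared API client with both coordinators."""

    api: FakeAPI
    coordinator: Coordinator
    aux_coordinator: Coordinator


async def setup() -> RuntimeData:
    api = FakeAPI()
    main = Coordinator(api, api.get_devices, interval=5)  # fast/event-driven data
    aux = Coordinator(api, api.get_filling_levels, interval=60)  # slow polling
    await main.refresh()
    await aux.refresh()
    return RuntimeData(api, main, aux)


runtime = asyncio.run(setup())
```

Bundling both coordinators on `entry.runtime_data` is what forces the `runtime_data.coordinator` accessor change repeated across the platform files below.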


@@ -38,13 +38,19 @@ async def async_get_config_entry_diagnostics(
"devices": redact_identifiers(
{
device_id: device_data.raw
for device_id, device_data in config_entry.runtime_data.data.devices.items()
for device_id, device_data in config_entry.runtime_data.coordinator.data.devices.items()
}
),
"filling_levels": redact_identifiers(
{
device_id: filling_level_data.raw
for device_id, filling_level_data in config_entry.runtime_data.aux_coordinator.data.filling_levels.items()
}
),
"actions": redact_identifiers(
{
device_id: action_data.raw
for device_id, action_data in config_entry.runtime_data.data.actions.items()
for device_id, action_data in config_entry.runtime_data.coordinator.data.actions.items()
}
),
}
@@ -68,13 +74,19 @@ async def async_get_device_diagnostics(
"model_id": device.model_id,
}
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
aux_coordinator = config_entry.runtime_data.aux_coordinator
device_id = cast(str, device.serial_number)
miele_data: dict[str, Any] = {
"devices": {
hash_identifier(device_id): coordinator.data.devices[device_id].raw
},
"filling_levels": {
hash_identifier(device_id): aux_coordinator.data.filling_levels[
device_id
].raw
},
"actions": {
hash_identifier(device_id): coordinator.data.actions[device_id].raw
},


@@ -1,16 +1,18 @@
"""Entity base class for the Miele integration."""
from pymiele import MieleAction, MieleAPI, MieleDevice
from pymiele import MieleAction, MieleAPI, MieleDevice, MieleFillingLevel
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DEVICE_TYPE_TAGS, DOMAIN, MANUFACTURER, MieleAppliance, StateStatus
from .coordinator import MieleDataUpdateCoordinator
from .coordinator import MieleAuxDataUpdateCoordinator, MieleDataUpdateCoordinator
class MieleEntity(CoordinatorEntity[MieleDataUpdateCoordinator]):
class MieleBaseEntity[
_MieleCoordinatorT: MieleDataUpdateCoordinator | MieleAuxDataUpdateCoordinator
](CoordinatorEntity[_MieleCoordinatorT]):
"""Base class for Miele entities."""
_attr_has_entity_name = True
@@ -22,7 +24,7 @@ class MieleEntity(CoordinatorEntity[MieleDataUpdateCoordinator]):
def __init__(
self,
coordinator: MieleDataUpdateCoordinator,
coordinator: _MieleCoordinatorT,
device_id: str,
description: EntityDescription,
) -> None:
@@ -30,7 +32,26 @@ class MieleEntity(CoordinatorEntity[MieleDataUpdateCoordinator]):
super().__init__(coordinator)
self._device_id = device_id
self.entity_description = description
self._attr_unique_id = MieleEntity.get_unique_id(device_id, description)
self._attr_unique_id = MieleBaseEntity.get_unique_id(device_id, description)
self._attr_device_info = DeviceInfo(identifiers={(DOMAIN, device_id)})
@property
def api(self) -> MieleAPI:
"""Return the api object."""
return self.coordinator.api
class MieleEntity(MieleBaseEntity[MieleDataUpdateCoordinator]):
"""Base class for Miele entities that use the main data coordinator."""
def __init__(
self,
coordinator: MieleDataUpdateCoordinator,
device_id: str,
description: EntityDescription,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator, device_id, description)
device = self.device
appliance_type = DEVICE_TYPE_TAGS.get(MieleAppliance(device.device_type))
@@ -61,11 +82,6 @@ class MieleEntity(CoordinatorEntity[MieleDataUpdateCoordinator]):
"""Return the actions object."""
return self.coordinator.data.actions[self._device_id]
@property
def api(self) -> MieleAPI:
"""Return the api object."""
return self.coordinator.api
@property
def available(self) -> bool:
"""Return the availability of the entity."""
@@ -75,3 +91,12 @@ class MieleEntity(CoordinatorEntity[MieleDataUpdateCoordinator]):
and self._device_id in self.coordinator.data.devices
and (self.device.state_status is not StateStatus.not_connected)
)
class MieleAuxEntity(MieleBaseEntity[MieleAuxDataUpdateCoordinator]):
"""Base class for Miele entities that use the auxiliary data coordinator."""
@property
def levels(self) -> MieleFillingLevel:
"""Return the filling levels object."""
return self.coordinator.data.filling_levels[self._device_id]


@@ -66,7 +66,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the fan platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -71,6 +71,9 @@
"plate_step_warming": "mdi:alpha-w-circle-outline"
}
},
"power_disk_level": {
"default": "mdi:car-coolant-level"
},
"program_id": {
"default": "mdi:selection-ellipse-arrow-inside"
},
@@ -83,6 +86,12 @@
"remaining_time": {
"default": "mdi:clock-end"
},
"rinse_aid_level": {
"default": "mdi:water-opacity"
},
"salt_level": {
"default": "mdi:shaker-outline"
},
"spin_speed": {
"default": "mdi:sync"
},
@@ -95,6 +104,12 @@
"target_temperature": {
"default": "mdi:thermometer-check"
},
"twin_dos_1_level": {
"default": "mdi:car-coolant-level"
},
"twin_dos_2_level": {
"default": "mdi:car-coolant-level"
},
"water_forecast": {
"default": "mdi:water-outline"
}


@@ -86,7 +86,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the light platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -71,7 +71,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the select platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -8,7 +8,7 @@ from datetime import datetime, timedelta
import logging
from typing import Any, Final, cast
from pymiele import MieleDevice, MieleTemperature
from pymiele import MieleDevice, MieleFillingLevel, MieleTemperature
from homeassistant.components.sensor import (
RestoreSensor,
@@ -44,8 +44,12 @@ from .const import (
StateProgramType,
StateStatus,
)
from .coordinator import MieleConfigEntry, MieleDataUpdateCoordinator
from .entity import MieleEntity
from .coordinator import (
MieleAuxDataUpdateCoordinator,
MieleConfigEntry,
MieleDataUpdateCoordinator,
)
from .entity import MieleAuxEntity, MieleEntity
PARALLEL_UPDATES = 0
@@ -139,10 +143,13 @@ def _convert_finish_timestamp(
@dataclass(frozen=True, kw_only=True)
class MieleSensorDescription(SensorEntityDescription):
class MieleSensorDescription[T: (MieleDevice, MieleFillingLevel)](
SensorEntityDescription
):
"""Class describing Miele sensor entities."""
value_fn: Callable[[MieleDevice], StateType | datetime]
value_fn: Callable[[T], StateType | datetime]
end_value_fn: Callable[[StateType | datetime], StateType | datetime] | None = None
extra_attributes: dict[str, Callable[[MieleDevice], StateType]] | None = None
zone: int | None = None
@@ -150,14 +157,14 @@ class MieleSensorDescription(SensorEntityDescription):
@dataclass
class MieleSensorDefinition:
class MieleSensorDefinition[T: (MieleDevice, MieleFillingLevel)]:
"""Class for defining sensor entities."""
types: tuple[MieleAppliance, ...]
description: MieleSensorDescription
description: MieleSensorDescription[T]
SENSOR_TYPES: Final[tuple[MieleSensorDefinition, ...]] = (
SENSOR_TYPES: Final[tuple[MieleSensorDefinition[MieleDevice], ...]] = (
MieleSensorDefinition(
types=(
MieleAppliance.WASHING_MACHINE,
@@ -689,6 +696,59 @@ SENSOR_TYPES: Final[tuple[MieleSensorDefinition, ...]] = (
),
)
POLLED_SENSOR_TYPES: Final[tuple[MieleSensorDefinition[MieleFillingLevel], ...]] = (
MieleSensorDefinition(
types=(MieleAppliance.WASHING_MACHINE,),
description=MieleSensorDescription[MieleFillingLevel](
key="twin_dos_1_level",
translation_key="twin_dos_1_level",
value_fn=lambda value: value.twin_dos_container_1_filling_level,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(MieleAppliance.WASHING_MACHINE,),
description=MieleSensorDescription[MieleFillingLevel](
key="twin_dos_2_level",
translation_key="twin_dos_2_level",
value_fn=lambda value: value.twin_dos_container_2_filling_level,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(MieleAppliance.DISHWASHER,),
description=MieleSensorDescription[MieleFillingLevel](
key="power_disk_level",
translation_key="power_disk_level",
value_fn=lambda value: None,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(MieleAppliance.DISHWASHER,),
description=MieleSensorDescription[MieleFillingLevel](
key="salt_level",
translation_key="salt_level",
value_fn=lambda value: value.salt_filling_level,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
MieleSensorDefinition(
types=(MieleAppliance.DISHWASHER,),
description=MieleSensorDescription[MieleFillingLevel](
key="rinse_aid_level",
translation_key="rinse_aid_level",
value_fn=lambda value: value.rinse_aid_filling_level,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),
),
)
async def async_setup_entry(
hass: HomeAssistant,
@@ -696,11 +756,14 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the sensor platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
aux_coordinator = config_entry.runtime_data.aux_coordinator
added_devices: set[str] = set() # device_id
added_entities: set[str] = set() # unique_id
def _get_entity_class(definition: MieleSensorDefinition) -> type[MieleSensor]:
def _get_entity_class(
definition: MieleSensorDefinition[MieleDevice],
) -> type[MieleSensor]:
"""Get the entity class for the sensor."""
return {
"state_status": MieleStatusSensor,
@@ -725,7 +788,7 @@ async def async_setup_entry(
)
def _is_sensor_enabled(
definition: MieleSensorDefinition,
definition: MieleSensorDefinition[MieleDevice],
device: MieleDevice,
unique_id: str,
) -> bool:
@@ -748,6 +811,15 @@ async def async_setup_entry(
return False
return True
def _enabled_aux_sensor(
definition: MieleSensorDefinition[MieleFillingLevel], level: MieleFillingLevel
) -> bool:
"""Check if aux sensors are enabled."""
return not (
definition.description.value_fn is not None
and definition.description.value_fn(level) is None
)
def _async_add_devices() -> None:
nonlocal added_devices, added_entities
entities: list = []
@@ -775,7 +847,11 @@ async def async_setup_entry(
continue
# sensor is not enabled, skip
if not _is_sensor_enabled(definition, device, unique_id):
if not _is_sensor_enabled(
definition,
device,
unique_id,
):
continue
added_entities.add(unique_id)
@@ -787,6 +863,15 @@ async def async_setup_entry(
config_entry.async_on_unload(coordinator.async_add_listener(_async_add_devices))
_async_add_devices()
async_add_entities(
MieleAuxSensor(aux_coordinator, device_id, definition.description)
for device_id in aux_coordinator.data.filling_levels
for definition in POLLED_SENSOR_TYPES
if _enabled_aux_sensor(
definition, aux_coordinator.data.filling_levels[device_id]
)
)
APPLIANCE_ICONS = {
MieleAppliance.WASHING_MACHINE: "mdi:washing-machine",
@@ -885,6 +970,32 @@ class MieleRestorableSensor(MieleSensor, RestoreSensor):
super()._handle_coordinator_update()
class MieleAuxSensor(MieleAuxEntity, SensorEntity):
"""Representation of a filling level Sensor."""
entity_description: MieleSensorDescription
def __init__(
self,
coordinator: MieleAuxDataUpdateCoordinator,
device_id: str,
description: MieleSensorDescription,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator, device_id, description)
if description.unique_id_fn is not None:
self._attr_unique_id = description.unique_id_fn(device_id, description)
@property
def native_value(self) -> StateType | datetime:
"""Return the state of the level sensor."""
return (
self.entity_description.value_fn(self.levels)
if self.entity_description.value_fn is not None
else None
)
class MielePlateSensor(MieleSensor):
"""Representation of a Sensor."""


@@ -257,6 +257,9 @@
"plate_step_warm": "Warming"
}
},
"power_disk_level": {
"name": "PowerDisk level"
},
"program_id": {
"name": "Program",
"state": {
@@ -1038,6 +1041,12 @@
"remaining_time": {
"name": "Remaining time"
},
"rinse_aid_level": {
"name": "Rinse aid level"
},
"salt_level": {
"name": "Salt level"
},
"spin_speed": {
"name": "Spin speed"
},
@@ -1080,6 +1089,12 @@
"temperature_zone_3": {
"name": "Temperature zone 3"
},
"twin_dos_1_level": {
"name": "TwinDos 1 level"
},
"twin_dos_2_level": {
"name": "TwinDos 2 level"
},
"water_consumption": {
"name": "Water consumption"
},


@@ -117,7 +117,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the switch platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
added_devices: set[str] = set()
def _async_add_new_devices() -> None:


@@ -128,7 +128,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the vacuum platform."""
coordinator = config_entry.runtime_data
coordinator = config_entry.runtime_data.coordinator
async_add_entities(
MieleVacuum(coordinator, device_id, definition.description)


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["nextdns"],
"quality_scale": "platinum",
"requirements": ["nextdns==4.1.0"]
"requirements": ["nextdns==5.0.0"]
}


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/nibe_heatpump",
"integration_type": "device",
"iot_class": "local_polling",
"requirements": ["nibe==2.20.0"]
"requirements": ["nibe==2.21.0"]
}


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["pynintendoauth", "pynintendoparental"],
"quality_scale": "bronze",
"requirements": ["pynintendoauth==1.0.2", "pynintendoparental==2.3.2"]
"requirements": ["pynintendoauth==1.0.2", "pynintendoparental==2.3.0"]
}


@@ -1 +1,30 @@
"""The openevse component."""
"""The OpenEVSE integration."""
from __future__ import annotations
from openevsehttp.__main__ import OpenEVSE
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError
type OpenEVSEConfigEntry = ConfigEntry[OpenEVSE]
async def async_setup_entry(hass: HomeAssistant, entry: OpenEVSEConfigEntry) -> bool:
"""Set up openevse from a config entry."""
entry.runtime_data = OpenEVSE(entry.data[CONF_HOST])
try:
await entry.runtime_data.test_and_get()
except TimeoutError as ex:
raise ConfigEntryError("Unable to connect to charger") from ex
await hass.config_entries.async_forward_entry_setups(entry, [Platform.SENSOR])
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, [Platform.SENSOR])


@@ -0,0 +1,64 @@
"""Config flow for OpenEVSE integration."""
from typing import Any
from openevsehttp.__main__ import OpenEVSE
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST
from .const import DOMAIN
class OpenEVSEConfigFlow(ConfigFlow, domain=DOMAIN):
"""OpenEVSE config flow."""
VERSION = 1
MINOR_VERSION = 1
async def check_status(self, host: str) -> bool:
"""Check if we can connect to the OpenEVSE charger."""
charger = OpenEVSE(host)
try:
await charger.test_and_get()
except TimeoutError:
return False
else:
return True
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
errors = None
if user_input is not None:
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
if await self.check_status(user_input[CONF_HOST]):
return self.async_create_entry(
title=f"OpenEVSE {user_input[CONF_HOST]}",
data=user_input,
)
errors = {CONF_HOST: "cannot_connect"}
return self.async_show_form(
step_id="user",
data_schema=vol.Schema({vol.Required(CONF_HOST): str}),
errors=errors,
)
async def async_step_import(self, data: dict[str, str]) -> ConfigFlowResult:
"""Handle the initial step."""
self._async_abort_entries_match({CONF_HOST: data[CONF_HOST]})
if not await self.check_status(data[CONF_HOST]):
return self.async_abort(reason="unavailable_host")
return self.async_create_entry(
title=f"OpenEVSE {data[CONF_HOST]}",
data=data,
)


@@ -0,0 +1,4 @@
"""Constants for the OpenEVSE integration."""
DOMAIN = "openevse"
INTEGRATION_TITLE = "OpenEVSE"


@@ -1,10 +1,12 @@
{
"domain": "openevse",
"name": "OpenEVSE",
"codeowners": [],
"codeowners": ["@c00w"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/openevse",
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["openevsewifi"],
"loggers": ["openevsehttp"],
"quality_scale": "legacy",
"requirements": ["openevsewifi==1.1.2"]
"requirements": ["python-openevse-http==0.2.1"]
}


@@ -4,17 +4,18 @@ from __future__ import annotations
import logging
import openevsewifi
from requests import RequestException
from openevsehttp.__main__ import OpenEVSE
import voluptuous as vol
from homeassistant.components.sensor import (
DOMAIN as HOMEASSISTANT_DOMAIN,
PLATFORM_SCHEMA as SENSOR_PLATFORM_SCHEMA,
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import SOURCE_IMPORT
from homeassistant.const import (
CONF_HOST,
CONF_MONITORED_VARIABLES,
@@ -23,10 +24,17 @@ from homeassistant.const import (
UnitOfTime,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
AddEntitiesCallback,
)
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from . import ConfigEntry
from .const import DOMAIN, INTEGRATION_TITLE
_LOGGER = logging.getLogger(__name__)
SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
@@ -54,6 +62,7 @@ SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="rtc_temp",
@@ -61,6 +70,7 @@ SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="usage_session",
@@ -90,54 +100,110 @@ PLATFORM_SCHEMA = SENSOR_PLATFORM_SCHEMA.extend(
)
def setup_platform(
async def async_setup_platform(
hass: HomeAssistant,
config: ConfigType,
add_entities: AddEntitiesCallback,
async_add_entities: AddEntitiesCallback,
discovery_info: DiscoveryInfoType | None = None,
) -> None:
"""Set up the OpenEVSE sensor."""
host = config[CONF_HOST]
monitored_variables = config[CONF_MONITORED_VARIABLES]
"""Set up the openevse platform."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config,
)
charger = openevsewifi.Charger(host)
if (
result.get("type") is FlowResultType.ABORT
and result.get("reason") != "already_configured"
):
ir.async_create_issue(
hass,
DOMAIN,
f"deprecated_yaml_import_issue_{result.get('reason')}",
breaks_in_ha_version="2026.6.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key=f"deprecated_yaml_import_issue_{result.get('reason')}",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
return
entities = [
OpenEVSESensor(charger, description)
for description in SENSOR_TYPES
if description.key in monitored_variables
]
ir.async_create_issue(
hass,
HOMEASSISTANT_DOMAIN,
"deprecated_yaml",
breaks_in_ha_version="2026.7.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_yaml",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
add_entities(entities, True)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Add sensors for passed config_entry in HA."""
async_add_entities(
(
OpenEVSESensor(
config_entry.data[CONF_HOST],
config_entry.runtime_data,
description,
)
for description in SENSOR_TYPES
),
True,
)
class OpenEVSESensor(SensorEntity):
"""Implementation of an OpenEVSE sensor."""
def __init__(self, charger, description: SensorEntityDescription) -> None:
def __init__(
self,
host: str,
charger: OpenEVSE,
description: SensorEntityDescription,
) -> None:
"""Initialize the sensor."""
self.entity_description = description
self.host = host
self.charger = charger
def update(self) -> None:
async def async_update(self) -> None:
"""Get the monitored data from the charger."""
try:
sensor_type = self.entity_description.key
if sensor_type == "status":
self._attr_native_value = self.charger.getStatus()
elif sensor_type == "charge_time":
self._attr_native_value = self.charger.getChargeTimeElapsed() / 60
elif sensor_type == "ambient_temp":
self._attr_native_value = self.charger.getAmbientTemperature()
elif sensor_type == "ir_temp":
self._attr_native_value = self.charger.getIRTemperature()
elif sensor_type == "rtc_temp":
self._attr_native_value = self.charger.getRTCTemperature()
elif sensor_type == "usage_session":
self._attr_native_value = float(self.charger.getUsageSession()) / 1000
elif sensor_type == "usage_total":
self._attr_native_value = float(self.charger.getUsageTotal()) / 1000
else:
self._attr_native_value = "Unknown"
except (RequestException, ValueError, KeyError):
await self.charger.update()
except TimeoutError:
_LOGGER.warning("Could not update status for %s", self.name)
return
sensor_type = self.entity_description.key
if sensor_type == "status":
self._attr_native_value = self.charger.status
elif sensor_type == "charge_time":
self._attr_native_value = self.charger.charge_time_elapsed / 60
elif sensor_type == "ambient_temp":
self._attr_native_value = self.charger.ambient_temperature
elif sensor_type == "ir_temp":
self._attr_native_value = self.charger.ir_temperature
elif sensor_type == "rtc_temp":
self._attr_native_value = self.charger.rtc_temperature
elif sensor_type == "usage_session":
self._attr_native_value = float(self.charger.usage_session) / 1000
elif sensor_type == "usage_total":
self._attr_native_value = float(self.charger.usage_total) / 1000
else:
self._attr_native_value = "Unknown"


@@ -0,0 +1,27 @@
{
"config": {
"abort": {
"already_configured": "This charger is already configured",
"unavailable_host": "Unable to connect to host"
},
"error": {
"cannot_connect": "Unable to connect"
},
"step": {
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
},
"data_description": {
"host": "Enter the IP address of your OpenEVSE. Should match the address you used to set it up."
}
}
}
},
"issues": {
"yaml_deprecated": {
"description": "Configuring OpenEVSE using YAML is being removed. Your existing YAML configuration has been imported into the UI automatically. Remove the `openevse` configuration from your configuration.yaml file and restart Home Assistant to fix this issue.",
"title": "OpenEVSE YAML configuration is deprecated"
}
}
}

View File

@@ -9,5 +9,5 @@
"iot_class": "cloud_polling",
"loggers": ["opower"],
"quality_scale": "bronze",
"requirements": ["opower==0.16.1"]
"requirements": ["opower==0.16.0"]
}

View File

@@ -55,7 +55,7 @@ SENSOR_TYPES: dict[str, OSOEnergySensorEntityDescription] = {
key="optimization_mode",
translation_key="optimization_mode",
device_class=SensorDeviceClass.ENUM,
options=["off", "oso", "gridcompany", "smartcompany", "advanced", "nettleie"],
options=["off", "oso", "gridcompany", "smartcompany", "advanced"],
value_fn=lambda entity_data: entity_data.state.lower(),
),
"power_load": OSOEnergySensorEntityDescription(

View File

@@ -58,7 +58,6 @@
"state": {
"advanced": "Advanced",
"gridcompany": "Grid company",
"nettleie": "Nettleie",
"off": "[%key:common::state::off%]",
"oso": "OSO",
"smartcompany": "Smart company"

View File

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/otbr",
"integration_type": "service",
"iot_class": "local_polling",
"requirements": ["python-otbr-api==2.7.1"]
"requirements": ["python-otbr-api==2.7.0"]
}

View File

@@ -61,7 +61,10 @@ async def async_get_device_diagnostics(
data["execution_history"] = [
repr(execution)
for execution in await client.get_execution_history()
if any(command.device_url == device_url for command in execution.commands)
if any(
command.device_url.split("#", 1)[0] == device_url.split("#", 1)[0]
for command in execution.commands
)
]
return data
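The Overkiz diagnostics change compares only the part of the device URL before `#`, so executions recorded against a sub-device (the `#<index>` suffix) still match the base device. A standalone sketch of the comparison (the URL values are made up):

```python
def same_base_device(url_a: str, url_b: str) -> bool:
    """Compare Overkiz-style device URLs, ignoring any '#<index>' sub-device suffix."""
    return url_a.split("#", 1)[0] == url_b.split("#", 1)[0]


# Sub-devices of the same physical device now match:
print(same_base_device("io://1234-5678-9012/3276148#2", "io://1234-5678-9012/3276148#1"))  # True
# Different devices still do not:
print(same_base_device("io://1234-5678-9012/3276148", "io://1234-5678-9012/9999999"))  # False
```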

View File

@@ -13,7 +13,7 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["boto3", "botocore", "pyhumps", "pyoverkiz", "s3transfer"],
"requirements": ["pyoverkiz==1.19.4"],
"requirements": ["pyoverkiz==1.19.3"],
"zeroconf": [
{
"name": "gateway*",

View File

@@ -11,5 +11,13 @@
"reload": {
"service": "mdi:reload"
}
},
"triggers": {
"entered_home": {
"trigger": "mdi:account-arrow-left"
},
"left_home": {
"trigger": "mdi:account-arrow-right"
}
}
}

View File

@@ -1,4 +1,8 @@
{
"common": {
"trigger_behavior_description": "The behavior of the targeted persons to trigger on.",
"trigger_behavior_name": "Behavior"
},
"entity_component": {
"_": {
"name": "[%key:component::person::title%]",
@@ -25,11 +29,42 @@
}
}
},
"selector": {
"trigger_behavior": {
"options": {
"any": "Any",
"first": "First",
"last": "Last"
}
}
},
"services": {
"reload": {
"description": "Reloads persons from the YAML-configuration.",
"name": "[%key:common::action::reload%]"
}
},
"title": "Person"
"title": "Person",
"triggers": {
"entered_home": {
"description": "Triggers when one or more persons enter home.",
"fields": {
"behavior": {
"description": "[%key:component::person::common::trigger_behavior_description%]",
"name": "[%key:component::person::common::trigger_behavior_name%]"
}
},
"name": "Entered home"
},
"left_home": {
"description": "Triggers when one or more persons leave home.",
"fields": {
"behavior": {
"description": "[%key:component::person::common::trigger_behavior_description%]",
"name": "[%key:component::person::common::trigger_behavior_name%]"
}
},
"name": "Left home"
}
}
}

View File

@@ -0,0 +1,21 @@
"""Provides triggers for persons."""
from homeassistant.const import STATE_HOME
from homeassistant.core import HomeAssistant
from homeassistant.helpers.trigger import (
Trigger,
make_entity_origin_state_trigger,
make_entity_target_state_trigger,
)
from .const import DOMAIN
TRIGGERS: dict[str, type[Trigger]] = {
"entered_home": make_entity_target_state_trigger(DOMAIN, STATE_HOME),
"left_home": make_entity_origin_state_trigger(DOMAIN, from_state=STATE_HOME),
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for persons."""
return TRIGGERS
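The `behavior` selector above offers `first`, `last`, and `any`. The sketch below illustrates one plausible reading of those semantics for an "entered home" event over a set of targeted persons; it is an assumption for illustration, not Home Assistant's actual trigger helper logic:

```python
def should_fire(behavior: str, home_after: int, total: int) -> bool:
    """Illustrative semantics for the 'behavior' selector (assumption, not HA internals).

    home_after: number of targeted persons home after this state change.
    total: number of targeted persons in the trigger's target.
    """
    if behavior == "any":
        return True
    if behavior == "first":
        return home_after == 1  # this arrival took the home count from 0 to 1
    if behavior == "last":
        return home_after == total  # everyone targeted is now home
    raise ValueError(f"unknown behavior: {behavior}")


print(should_fire("first", home_after=1, total=3))  # True
print(should_fire("last", home_after=2, total=3))   # False
```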

View File

@@ -0,0 +1,18 @@
.trigger_common: &trigger_common
target:
entity:
domain: person
fields:
behavior:
required: true
default: any
selector:
select:
options:
- first
- last
- any
translation_key: trigger_behavior
entered_home: *trigger_common
left_home: *trigger_common

View File

@@ -75,6 +75,15 @@
},
{
"macaddress": "84E657*"
},
{
"hostname": "ps5-*"
},
{
"hostname": "ps4-*"
},
{
"hostname": "ps3"
}
],
"documentation": "https://www.home-assistant.io/integrations/playstation_network",

View File

@@ -114,32 +114,72 @@ class PooldoseConfigFlow(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
-        if not user_input:
-            return self.async_show_form(
-                step_id="user",
-                data_schema=SCHEMA_DEVICE,
-            )
-        host = user_input[CONF_HOST]
-        serial_number, api_versions, errors = await self._validate_host(host)
-        if errors:
-            return self.async_show_form(
-                step_id="user",
-                data_schema=SCHEMA_DEVICE,
-                errors=errors,
-                # Handle API version info for error display; pass version info when available
-                # or None when api_versions is None to avoid displaying version details
-                description_placeholders={
-                    "api_version_is": api_versions.get("api_version_is") or "",
-                    "api_version_should": api_versions.get("api_version_should") or "",
-                }
-                if api_versions
-                else None,
-            )
-        await self.async_set_unique_id(serial_number, raise_on_progress=False)
-        self._abort_if_unique_id_configured()
-        return self.async_create_entry(
-            title=f"PoolDose {serial_number}",
-            data={CONF_HOST: host},
-        )
+        if user_input is not None:
+            host = user_input[CONF_HOST]
+            serial_number, api_versions, errors = await self._validate_host(host)
+            if errors:
+                return self.async_show_form(
+                    step_id="user",
+                    data_schema=SCHEMA_DEVICE,
+                    errors=errors,
+                    # Handle API version info for error display; pass version info when available
+                    # or None when api_versions is None to avoid displaying version details
+                    description_placeholders={
+                        "api_version_is": api_versions.get("api_version_is") or "",
+                        "api_version_should": api_versions.get("api_version_should")
+                        or "",
+                    }
+                    if api_versions
+                    else None,
+                )
+            await self.async_set_unique_id(serial_number, raise_on_progress=False)
+            self._abort_if_unique_id_configured()
+            return self.async_create_entry(
+                title=f"PoolDose {serial_number}",
+                data={CONF_HOST: host},
+            )
+        return self.async_show_form(
+            step_id="user",
+            data_schema=SCHEMA_DEVICE,
+        )
+    async def async_step_reconfigure(
+        self, user_input: dict[str, Any] | None = None
+    ) -> ConfigFlowResult:
+        """Handle reconfigure to change the device host/IP for an existing entry."""
+        if user_input is not None:
+            host = user_input[CONF_HOST]
+            serial_number, api_versions, errors = await self._validate_host(host)
+            if errors:
+                return self.async_show_form(
+                    step_id="reconfigure",
+                    data_schema=SCHEMA_DEVICE,
+                    errors=errors,
+                    # Handle API version info for error display identical to other steps
+                    description_placeholders={
+                        "api_version_is": api_versions.get("api_version_is") or "",
+                        "api_version_should": api_versions.get("api_version_should")
+                        or "",
+                    }
+                    if api_versions
+                    else None,
+                )
+            # Ensure new serial number matches the existing entry unique_id (serial number)
+            if serial_number != self._get_reconfigure_entry().unique_id:
+                return self.async_abort(reason="wrong_device")
+            # Update the existing config entry with the new host and schedule reload
+            return self.async_update_reload_and_abort(
+                self._get_reconfigure_entry(), data_updates={CONF_HOST: host}
+            )
+        return self.async_show_form(
+            step_id="reconfigure",
+            # Pre-fill with current host from the entry being reconfigured
+            data_schema=self.add_suggested_values_to_schema(
+                SCHEMA_DEVICE, self._get_reconfigure_entry().data
+            ),
+        )

View File

@@ -0,0 +1,34 @@
"""Diagnostics support for Pooldose."""
from __future__ import annotations
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
from . import PooldoseConfigEntry
TO_REDACT = {
"IP",
"MAC",
"WIFI_SSID",
"AP_SSID",
"SERIAL_NUMBER",
"DEVICE_ID",
"OWNERID",
"NAME",
"GROUPNAME",
}
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: PooldoseConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator = entry.runtime_data
return {
"device_info": async_redact_data(coordinator.device_info, TO_REDACT),
"data": coordinator.data,
}
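The diagnostics module above relies on `async_redact_data` to scrub keys like `SERIAL_NUMBER` before the data leaves the device. A simplified, synchronous stand-in for what that helper does (the recursion and placeholder value mirror the idea; this is not HA's actual implementation):

```python
from typing import Any

TO_REDACT = {"SERIAL_NUMBER", "MAC", "WIFI_SSID"}


def redact(data: Any, to_redact: set[str]) -> Any:
    """Replace values of matching keys with a placeholder, recursing into containers."""
    if isinstance(data, dict):
        return {
            key: "**REDACTED**" if key in to_redact else redact(value, to_redact)
            for key, value in data.items()
        }
    if isinstance(data, list):
        return [redact(item, to_redact) for item in data]
    return data


device_info = {"NAME": "Pool", "SERIAL_NUMBER": "ABC123", "network": {"MAC": "aa:bb"}}
print(redact(device_info, TO_REDACT))
# {'NAME': 'Pool', 'SERIAL_NUMBER': '**REDACTED**', 'network': {'MAC': '**REDACTED**'}}
```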

View File

@@ -41,7 +41,7 @@ rules:
# Gold
devices: done
diagnostics: todo
diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: done
@@ -53,20 +53,20 @@ rules:
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: This integration does not support dynamic devices, as it is designed for a single PoolDose device.
comment: This integration does not support dynamic device discovery, as each config entry represents a single PoolDose device with all available entities.
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: todo
reconfiguration-flow: done
repair-issues:
status: exempt
comment: This integration does not provide repair issues, as it is designed for a single PoolDose device with a fixed configuration.
comment: This integration does not have any identified cases where repair issues would be needed.
stale-devices:
status: exempt
comment: This integration does not support stale devices, as it is designed for a single PoolDose device with a fixed configuration.
comment: This integration manages a single device per config entry, so stale device removal is not applicable.
# Platinum
async-dependency: done

View File

@@ -4,7 +4,9 @@
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"no_device_info": "Unable to retrieve device information",
"no_serial_number": "No serial number found on the device"
"no_serial_number": "No serial number found on the device",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"wrong_device": "The provided device does not match the configured device"
},
"error": {
"api_not_set": "API version not found in device response. Device firmware may not be compatible with this integration.",
@@ -20,6 +22,14 @@
"description": "A PoolDose device was found on your network at {ip} with MAC address {mac}.\n\nDo you want to add {name} to Home Assistant?",
"title": "Confirm DHCP discovered PoolDose device"
},
"reconfigure": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
},
"data_description": {
"host": "[%key:component::pooldose::config::step::user::data_description::host%]"
}
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"

View File

@@ -164,7 +164,11 @@ class PortainerContainerSensor(PortainerContainerEntity, BinarySensorEntity):
@property
def available(self) -> bool:
"""Return if the device is available."""
return super().available and self.endpoint_id in self.coordinator.data
return (
super().available
and self.endpoint_id in self.coordinator.data
and self.device_name in self.coordinator.data[self.endpoint_id].containers
)
@property
def is_on(self) -> bool | None:

View File

@@ -113,7 +113,9 @@ class PortainerButton(PortainerContainerEntity, ButtonEntity):
"""Trigger the Portainer button press service."""
try:
await self.entity_description.press_action(
self.coordinator.portainer, self.endpoint_id, self.device_id
self.coordinator.portainer,
self.endpoint_id,
self.container_data.container.id,
)
except PortainerConnectionError as err:
raise HomeAssistantError(

View File

@@ -50,7 +50,7 @@ class PortainerContainerData:
"""Container data held by the Portainer coordinator."""
container: DockerContainer
stats: DockerContainerStats
stats: DockerContainerStats | None
stats_pre: DockerContainerStats | None
@@ -147,47 +147,52 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
                docker_version = await self.portainer.docker_version(endpoint.id)
                docker_info = await self.portainer.docker_info(endpoint.id)
+                prev_endpoint = self.data.get(endpoint.id) if self.data else None
                container_map: dict[str, PortainerContainerData] = {}
-                container_stats_task = [
-                    (
-                        container,
-                        self.portainer.container_stats(
-                            endpoint_id=endpoint.id,
-                            container_id=container.id,
-                        ),
-                    )
-                    for container in containers
-                ]
-                container_stats_gather = await asyncio.gather(
-                    *[task for _, task in container_stats_task]
-                )
-                for (container, _), container_stats in zip(
-                    container_stats_task, container_stats_gather, strict=False
-                ):
-                    container_name = container.names[0].replace("/", " ").strip()
-                    # Store previous stats if available. This is used to calculate deltas for CPU and network usage
-                    # In the first call it will be None, since it has nothing to compare with
-                    # Added a walrus pattern to check if not None on prev_container, to keep mypy happy. :)
-                    container_map[container_name] = PortainerContainerData(
-                        container=container,
-                        stats=container_stats,
-                        stats_pre=(
-                            prev_container.stats
-                            if self.data
-                            and (prev_data := self.data.get(endpoint.id)) is not None
-                            and (
-                                prev_container := prev_data.containers.get(
-                                    container_name
-                                )
-                            )
-                            is not None
-                            else None
-                        ),
-                    )
+                # Map containers, started and stopped
+                for container in containers:
+                    container_name = self._get_container_name(container.names[0])
+                    prev_container = (
+                        prev_endpoint.containers[container_name]
+                        if prev_endpoint
+                        else None
+                    )
+                    container_map[container_name] = PortainerContainerData(
+                        container=container,
+                        stats=None,
+                        stats_pre=prev_container.stats if prev_container else None,
+                    )
+                # Separately fetch stats for running containers
+                running_containers = [
+                    container
+                    for container in containers
+                    if container.state == CONTAINER_STATE_RUNNING
+                ]
+                if running_containers:
+                    container_stats = dict(
+                        zip(
+                            (
+                                self._get_container_name(container.names[0])
+                                for container in running_containers
+                            ),
+                            await asyncio.gather(
+                                *(
+                                    self.portainer.container_stats(
+                                        endpoint_id=endpoint.id,
+                                        container_id=container.id,
+                                    )
+                                    for container in running_containers
+                                )
+                            ),
+                            strict=False,
+                        )
+                    )
+                    # Now assign stats to the containers
+                    for container_name, stats in container_stats.items():
+                        container_map[container_name].stats = stats
            except PortainerConnectionError as err:
                _LOGGER.exception("Connection error")
                raise UpdateFailed(
@@ -228,11 +233,15 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
# Surprise, we also handle containers here :)
current_containers = {
-(endpoint.id, container.container.id)
+(endpoint.id, container_name)
for endpoint in mapped_endpoints.values()
-for container in endpoint.containers.values()
+for container_name in endpoint.containers
}
new_containers = current_containers - self.known_containers
if new_containers:
_LOGGER.debug("New containers found: %s", new_containers)
self.known_containers.update(new_containers)
def _get_container_name(self, container_name: str) -> str:
"""Sanitize to get a proper container name."""
return container_name.replace("/", " ").strip()

View File

@@ -7,6 +7,9 @@
"architecture": {
"default": "mdi:cpu-64-bit"
},
"container_state": {
"default": "mdi:state-machine"
},
"containers_count": {
"default": "mdi:database"
},

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyportainer==1.0.19"]
"requirements": ["pyportainer==1.0.22"]
}

View File

@@ -49,10 +49,19 @@ CONTAINER_SENSORS: tuple[PortainerContainerSensorEntityDescription, ...] = (
translation_key="image",
value_fn=lambda data: data.container.image,
),
PortainerContainerSensorEntityDescription(
key="container_state",
translation_key="container_state",
value_fn=lambda data: data.container.state,
device_class=SensorDeviceClass.ENUM,
options=["running", "exited", "paused", "restarting", "created", "dead"],
),
PortainerContainerSensorEntityDescription(
key="memory_limit",
translation_key="memory_limit",
value_fn=lambda data: data.stats.memory_stats.limit,
value_fn=lambda data: (
data.stats.memory_stats.limit if data.stats is not None else 0
),
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.MEGABYTES,
@@ -63,7 +72,9 @@ CONTAINER_SENSORS: tuple[PortainerContainerSensorEntityDescription, ...] = (
PortainerContainerSensorEntityDescription(
key="memory_usage",
translation_key="memory_usage",
value_fn=lambda data: data.stats.memory_stats.usage,
value_fn=lambda data: (
data.stats.memory_stats.usage if data.stats is not None else 0
),
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.MEGABYTES,
@@ -76,7 +87,9 @@ CONTAINER_SENSORS: tuple[PortainerContainerSensorEntityDescription, ...] = (
translation_key="memory_usage_percentage",
value_fn=lambda data: (
(data.stats.memory_stats.usage / data.stats.memory_stats.limit) * 100.0
if data.stats.memory_stats.limit > 0 and data.stats.memory_stats.usage > 0
if data.stats is not None
and data.stats.memory_stats.limit > 0
and data.stats.memory_stats.usage > 0
else 0.0
),
native_unit_of_measurement=PERCENTAGE,
@@ -89,7 +102,8 @@ CONTAINER_SENSORS: tuple[PortainerContainerSensorEntityDescription, ...] = (
translation_key="cpu_usage_total",
value_fn=lambda data: (
(total_delta / system_delta) * data.stats.cpu_stats.online_cpus * 100.0
if (prev := data.stats_pre) is not None
if data.stats is not None
and (prev := data.stats_pre) is not None
and (
system_delta := (
data.stats.cpu_stats.system_cpu_usage
@@ -247,7 +261,6 @@ async def async_setup_entry(
)
for (endpoint, container) in containers
for entity_description in CONTAINER_SENSORS
if entity_description.value_fn(container) is not None
)
coordinator.new_endpoints_callbacks.append(_async_add_new_endpoints)
@@ -290,7 +303,11 @@ class PortainerContainerSensor(PortainerContainerEntity, SensorEntity):
@property
def available(self) -> bool:
"""Return if the device is available."""
return super().available and self.endpoint_id in self.coordinator.data
return (
super().available
and self.endpoint_id in self.coordinator.data
and self.device_name in self.coordinator.data[self.endpoint_id].containers
)
@property
def native_value(self) -> StateType:

Some files were not shown because too many files have changed in this diff.