Compare commits


142 Commits

Author SHA1 Message Date
Franck Nijhof
bb98ed6633 2025.10.3 (#154718) 2025-10-17 23:14:01 +02:00
Franck Nijhof
59dace572a Bump version to 2025.10.3 2025-10-17 20:35:30 +00:00
cdnninja
735cf36a5b Bump pyvesync version to 3.1.2 (#154650)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-10-17 20:34:48 +00:00
tstabrawa
90b0f50b8f Move URL out of Nuheat strings.json (#154580) 2025-10-17 20:34:47 +00:00
Simone Chemelli
e731c07b77 Bump aioamazondevices to 6.4.4 (#154538) 2025-10-17 20:34:46 +00:00
Whitney Young
2c75635e95 OpenUV: Fix update by skipping when protection window is null (#154487) 2025-10-17 20:34:45 +00:00
Anuj Soni
1f031695c2 Move translatable URLs out of strings.json for isy994 (#154464) 2025-10-17 20:34:43 +00:00
Michel van de Wetering
fb279212a9 Add missing long_press entry for trigger_type in strings.json for Hue (#154437) 2025-10-17 20:34:42 +00:00
DannyS95
45869523d0 Move igloohome API access URL into constant placeholders (#154430) 2025-10-17 20:34:41 +00:00
puddly
a753926f22 Use async_schedule_reload instead of async_reload for ZHA (#154397) 2025-10-17 20:34:40 +00:00
Simone Chemelli
dc874ff53a Bump aiocomelit to 1.1.2 (#154393) 2025-10-17 20:34:38 +00:00
Renat Sibgatulin
3ef6865708 Bump aioairq to 0.4.7 (#154386) 2025-10-17 20:34:37 +00:00
Anuj Soni
7f1989f9f2 Move translatable URLs out of strings.json for huawei lte (#154368)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-10-17 20:34:36 +00:00
epenet
97e338c760 Move URL out of sfr_box strings.json (#154364)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-10-17 20:34:35 +00:00
wollew
101679c17d update pysqueezebox lib to 0.13.0 (#154358) 2025-10-17 20:34:33 +00:00
tronikos
bc784c356e Bump opower to 0.15.7 (#154351) 2025-10-17 20:34:32 +00:00
J. Nick Koston
556cc57d8b Fix Bluetooth discovery for devices with alternating advertisement names (#154347) 2025-10-17 20:34:31 +00:00
Oliver Gründel
eef6e96a93 Move developer url out of strings.json for coinbase setup flow (#154339) 2025-10-17 20:34:30 +00:00
Shai Ungar
56d237af7f Move URLs out of SABnzbd strings.json (#154333)
Co-authored-by: Claude <noreply@anthropic.com>
2025-10-17 20:34:29 +00:00
Oliver Gründel
e5d1902d2a Move Ecobee authorization URL out of strings.json (#154332) 2025-10-17 20:34:27 +00:00
Yevhenii Vaskivskyi
a9a203678e AsusWRT: Pass only online clients to the device list from the API (#154322) 2025-10-17 20:34:26 +00:00
Mick Vleeshouwer
7f6237cc63 Move URL out of Overkiz Config Flow descriptions (#154315) 2025-10-17 20:34:25 +00:00
Simone Chemelli
5468e691ca Bump aioamazondevices to 6.4.3 (#154293) 2025-10-17 20:34:23 +00:00
Jan-Philipp Benecke
67cbbc3522 Move Electricity Maps url out of strings.json (#154284) 2025-10-17 20:33:17 +00:00
Dan Schafer
504da54c11 Update Snoo strings.json to include weaning_baseline (#154268) 2025-10-17 20:31:13 +00:00
Jordan Harvey
cdda2ef5c8 Bump pyprobeplus to 1.1.0 (#154265) 2025-10-17 20:31:12 +00:00
Jan Bouwhuis
f405f9eb4b Fix HomeWizard total increasing sensors returning 0 (#154264) 2025-10-17 20:31:10 +00:00
Manu
634f71835a Add description placeholders to pyLoad config flow (#154254) 2025-10-17 20:31:09 +00:00
Manu
49bfb01fac Add description placeholders in Uptime Kuma config flow (#154252)
Signed-off-by: tr4nt0r <4445816+tr4nt0r@users.noreply.github.com>
2025-10-17 20:31:08 +00:00
Joakim Plate
ad8f7fdcab Move url like strings to placeholders for nibe (#154249) 2025-10-17 20:31:07 +00:00
J. Nick Koston
f82ec81062 Fix Yale integration to handle unavailable OAuth implementation at startup (#154245) 2025-10-17 20:31:05 +00:00
J. Nick Koston
03b0842a01 Fix August integration to handle unavailable OAuth implementation at startup (#154244) 2025-10-17 20:31:04 +00:00
Christopher Fenner
13e5cb5cc8 Remove URL from ViCare strings.json (#154243) 2025-10-17 20:31:03 +00:00
Shay Levy
f18cdaf4d8 Move URL out of Switcher strings.json (#154240) 2025-10-17 20:31:02 +00:00
Andrew Jackson
5b3bca1426 Move URL out of Mastodon strings.json (#154231) 2025-10-17 20:31:01 +00:00
Andrew Jackson
d812e9d43c Move URL out of Mealie strings.json (#154230) 2025-10-17 20:30:59 +00:00
Simone Chemelli
fa1071b221 Bump aioamazondevices to 6.4.1 (#154228) 2025-10-17 20:30:58 +00:00
Paul Bottein
e48c2c6c0b Bump frontend 20251001.4 (#154218) 2025-10-17 20:23:01 +00:00
Yvan13120
bddd4100c0 Fix state class for Overkiz water consumption (#154164) 2025-10-17 20:23:00 +00:00
srirams
70d8df2e95 Remove redundant state write in Smart Meter Texas (#154126) 2025-10-17 20:22:58 +00:00
Lennart Coopmans
08b3dd0173 PushSafer: Handle empty data section properly (#154109) 2025-10-17 20:22:57 +00:00
Magnus
6723a7c4e1 Bump aioasuswrt to 1.5.1 (#153209) 2025-10-17 20:22:55 +00:00
Franck Nijhof
40d7f2a89e 2025.10.2 (#154181) 2025-10-10 23:19:19 +02:00
Shay Levy
13b717e2da Fix shelly remove orphaned entities (#154182)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-10-10 22:46:30 +02:00
Franck Nijhof
5fcfd3ad84 Bump version to 2025.10.2 2025-10-10 20:29:17 +00:00
Shay Levy
324a7b5443 Fix Shelly RPC cover update when the device is not initialized (#154159) 2025-10-10 20:27:30 +00:00
Robert Resch
491ae8f72c Bump deebot-client to 15.1.0 (#154154) 2025-10-10 20:23:10 +00:00
Justus
259247892f IOmeter bump version v0.2.0 (#154150) 2025-10-10 20:23:09 +00:00
Bram Kragten
caeda0ef64 Update frontend to 20251001.2 (#154143) 2025-10-10 20:23:08 +00:00
Paul Bottein
df35c535e4 Add missing entity category and icons for smlight integration (#154131) 2025-10-10 20:23:07 +00:00
Paulus Schoutsen
f93b9e0ed0 Z-Wave: ESPHome discovery to update all options (#154113) 2025-10-10 20:23:05 +00:00
peteS-UK
48a3372cf2 Fix for multiple Lyrion Music Server on a single Home Assistant server for Squeezebox (#154081) 2025-10-10 20:23:04 +00:00
Maciej Bieniek
d84fd72428 Bump brother to version 5.1.1 (#154080) 2025-10-10 20:23:03 +00:00
Simone Chemelli
e8cb386962 Bump aioamazondevices to 6.4.0 (#154071) 2025-10-10 20:23:02 +00:00
epenet
5ac726703c Filter out invalid Renault vehicles (#154070)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-10 20:23:00 +00:00
Joost Lekkerkerker
688649a799 Don't mark ZHA coordinator as via_device with itself (#154004) 2025-10-10 20:17:07 +00:00
Artur Pragacz
c5359ade3e Fix empty llm api list in chat log (#153996) 2025-10-10 20:17:05 +00:00
Michael Davie
4e60dedc1b Bump env-canada to 0.11.3 (#153967) 2025-10-10 20:17:04 +00:00
Maciej Bieniek
221d74f83a Fix update interval for AccuWeather hourly forecast (#153957) 2025-10-10 20:17:02 +00:00
G Johansson
fbbb3d6415 Bump holidays to 0.82 (#153952) 2025-10-10 20:17:01 +00:00
Josef Zweck
8297019011 Bump pylamarzocco to 2.1.2 (#153950) 2025-10-10 20:16:59 +00:00
TheJulianJES
61715dcff3 Adjust OTBR config entry name for ZBT-2 (#153940) 2025-10-10 20:16:58 +00:00
TheJulianJES
32b822ee99 Fix HA hardware configuration message for Thread without HAOS (#153933) 2025-10-10 20:16:56 +00:00
Fabien Kleinbourg
e6c2e0ad80 sharkiq dependency bump to 1.4.2 (#153931) 2025-10-10 20:16:55 +00:00
TheJulianJES
1314427dc5 Do not auto-set up ZHA zeroconf discoveries during onboarding (#153914) 2025-10-10 20:16:53 +00:00
Tom Matheussen
bf499a45f7 Add missing translation string for Satel Integra subentry type (#153905) 2025-10-10 20:16:52 +00:00
Christopher Fenner
b955e22628 fix typo in icon assignment of AccuWeather integration (#153890) 2025-10-10 20:16:50 +00:00
Simone Chemelli
1b222ff5fd Fix restore cover state for Comelit SimpleHome (#153887) 2025-10-10 20:16:49 +00:00
derytive
f0510e703f Add plate_count for Miele KM7575 (#153868) 2025-10-10 19:36:03 +00:00
G Johansson
cbe3956e15 Handle timeout errors gracefully in Nord Pool services (#153856) 2025-10-10 19:36:01 +00:00
Aaron Bach
4588e9da8d Limit SimpliSafe websocket connection attempts during startup (#153853)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-10-10 19:36:00 +00:00
Simone Chemelli
5445890fdf Bump aiocomelit to 1.1.1 (#153843) 2025-10-10 19:35:59 +00:00
Simone Chemelli
9b49f77f86 Fix PIN validation for Comelit SimpleHome (#153840) 2025-10-10 19:35:57 +00:00
Petro31
566c8fb786 Fix delay_on and auto_off with multiple triggers (#153839) 2025-10-10 19:35:56 +00:00
Joost Lekkerkerker
b36150c213 Add motion presets to SmartThings AC (#153830) 2025-10-10 19:35:54 +00:00
Joost Lekkerkerker
809070d2ad Catch update exception in AirGradient (#153828) 2025-10-10 19:35:53 +00:00
Joost Lekkerkerker
f4339dc031 Bump pySmartThings to 3.3.1 (#153826) 2025-10-10 19:35:51 +00:00
epenet
f3b37d24b0 Fix Tuya cover position when only control is available (#153803) 2025-10-10 19:35:50 +00:00
Paulus Schoutsen
4c8348caa7 Handle ESPHome discoveries with uninitialized Z-Wave antennas (#153790) 2025-10-10 19:35:49 +00:00
cdnninja
b9e7c102ea vesync correct fan set modes (#153761) 2025-10-10 19:35:47 +00:00
Simone Chemelli
69d9fa89b7 Remove stale entities from Alexa Devices (#153759) 2025-10-10 19:35:46 +00:00
Simone Chemelli
6f3f5a5ec1 Bump aioamazondevices to 6.2.9 (#153756) 2025-10-10 19:35:44 +00:00
Simone Chemelli
5ecfeca90a Fix sensors availability check for Alexa Devices (#153743) 2025-10-10 19:35:43 +00:00
Sander Jochems
00e0570fd4 Upgrade python-melcloud to 0.1.2 (#153742) 2025-10-10 19:35:41 +00:00
Øyvind Matheson Wergeland
5a5b94f3af Synology DSM: Don't reinitialize API during configuration (#153739) 2025-10-10 19:35:40 +00:00
Maciej Bieniek
34f00d9b33 Align Shelly presencezone entity to the new API/firmware (#153737) 2025-10-10 19:35:39 +00:00
Tom
4cabc5b368 Bump airOS to 0.5.5 using formdata for v6 firmware (#153736) 2025-10-10 19:35:37 +00:00
tronikos
4045125422 Fix missing google_assistant_sdk.send_text_command (#153735) 2025-10-10 19:35:36 +00:00
Fredrik Erlandsson
d7393af76f Version bump pydaikin to 2.17.1 (#153726) 2025-10-10 19:35:34 +00:00
Fredrik Erlandsson
ad41386b27 Version bump pydaikin to 2.17.0 (#153718) 2025-10-10 19:35:33 +00:00
tronikos
62d17ea20c Bump opower to 0.15.6 (#153714) 2025-10-10 19:35:31 +00:00
peetersch
c4954731d0 Modbus Fix message_wait_milliseconds is no longer applied (#153709) 2025-10-10 19:35:30 +00:00
cdnninja
647723d3f0 Bump pyvesync to 3.1.0 (#153693) 2025-10-10 19:35:28 +00:00
Christopher Fenner
51c500e22c Fix ViCare pressure sensors missing unit of measurement (#153691) 2025-10-10 19:35:26 +00:00
Denis Shulyaka
f6fc13c1f2 Gemini: Use default model instead of recommended where applicable (#153676) 2025-10-10 19:35:25 +00:00
Jan Bouwhuis
0009a7a042 Fix MQTT Lock state reset to unknown when a reset payload is received (#153647) 2025-10-10 19:35:24 +00:00
Luke Lashley
a3d1aa28e7 Switch Roborock to v4 of the code login api (#153593) 2025-10-10 19:35:22 +00:00
Simone Chemelli
9f53eb9b76 Bump aioamazondevices to 6.2.8 (#153592) 2025-10-10 19:35:21 +00:00
Luke Lashley
f53a205ff3 Bump python-roborock to 2.50.2 (#153561) 2025-10-10 19:35:19 +00:00
NANI
d08517c3df Updated VRM client and accounted for missing forecasts (#153464) 2025-10-10 19:35:18 +00:00
Kinachi249
d7398a44a1 Bump PyCync to 0.4.1 (#153401) 2025-10-10 19:35:17 +00:00
Aidan Timson
9acfc0cb88 Fix power device classes for system bridge (#153201) 2025-10-10 19:35:15 +00:00
Hessel
1b3d21523a Wallbox fix Rate Limit issue for multiple chargers (#153074) 2025-10-10 19:35:14 +00:00
puddly
1d407d1326 Prevent reloading the ZHA integration while adapter firmware is being updated (#152626) 2025-10-10 19:35:12 +00:00
Franck Nijhof
013346cead 2025.10.1 (#153582) 2025-10-03 20:08:44 +02:00
Franck Nijhof
5abaabc9da Bump version to 2025.10.1 2025-10-03 17:26:37 +00:00
Paulus Schoutsen
32481312c3 When discovering a Z-Wave adapter, always configure add-on in config flow (#153575) 2025-10-03 17:26:16 +00:00
Paulus Schoutsen
bdc9eb37d3 Z-Wave to support migrating from USB to socket with same home ID (#153522) 2025-10-03 17:26:15 +00:00
Abílio Costa
e0afcbc02b Debounce updates in Idasen Desk (#153503) 2025-10-03 17:26:13 +00:00
puddly
cd56a6a98d Bump universal-silabs-flasher to 0.0.35 (#153500) 2025-10-03 17:26:11 +00:00
cdnninja
9d85893bbb Fix VeSync zero fan speed handling (#153493)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-10-03 17:26:10 +00:00
starkillerOG
9e8a70225f Bump reolink-aio to 0.16.1 (#153489) 2025-10-03 17:26:08 +00:00
Daniel Hjelseth Høyer
96ec795d5e Bump pyTibber to 0.32.2 (#153484) 2025-10-03 17:26:07 +00:00
Josef Zweck
65b796070d Fix missing parameter pass in onedrive (#153478) 2025-10-03 17:26:05 +00:00
Aidan Timson
32994812e5 Update OVOEnergy to 3.0.1 (#153476) 2025-10-03 17:26:04 +00:00
G Johansson
66ff9d63a3 Fix next event in workday calendar (#153465) 2025-10-03 17:26:02 +00:00
Joost Lekkerkerker
b2a63d4996 Add translation for turbo fan mode in SmartThings (#153445)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-03 17:26:00 +00:00
puddly
f9f37b7f2a Disable baudrate bootloader reset for ZBT-2 (#153443) 2025-10-03 17:25:59 +00:00
Stefan Agner
7bdd9dd38a Update Home Assistant base image to 2025.10.0 (#153441) 2025-10-03 17:25:58 +00:00
Joost Lekkerkerker
1e8aae0a89 Fix missing powerconsumptionreport in Smartthings (#153438) 2025-10-03 17:25:56 +00:00
Aidan Timson
cf668e9dc2 Add missing translation for media browser default title (#153430)
Co-authored-by: Erwin Douna <e.douna@gmail.com>
Co-authored-by: Norbert Rittel <norbert@rittel.de>
2025-10-03 17:25:55 +00:00
Norbert Rittel
2e91c8700f Fix sentence-casing in user-facing strings of slack (#153427) 2025-10-03 17:25:53 +00:00
J. Nick Koston
9d14627daa Bump aiohomekit to 3.2.19 (#153423) 2025-10-03 17:25:52 +00:00
TheJulianJES
73b8283748 Fix Z-Wave RGB light turn on causing rare ZeroDivisionError (#153422) 2025-10-03 17:25:50 +00:00
Manu
edeaaa2e63 Update markdown field description in ntfy integration (#153421) 2025-10-03 17:25:49 +00:00
Tom Matheussen
d26dd8fc39 Fix Satel Integra creating new binary sensors on YAML import (#153419) 2025-10-03 17:25:47 +00:00
Denis Shulyaka
34640ea735 Disable thinking for unsupported gemini models (#153415) 2025-10-03 17:25:46 +00:00
Erwin Douna
46a2e21ef0 Bump pyportainer 1.0.3 (#153413) 2025-10-03 17:25:45 +00:00
Erwin Douna
508af53e72 Bump pyportainer 1.0.2 (#153326) 2025-10-03 17:25:43 +00:00
Josef Zweck
5f7440608c Increase onedrive upload chunk size (#153406) 2025-10-03 17:22:10 +00:00
Michael J. Kidd
0d1aa38a26 Pushover: Handle empty data section properly (#153397) 2025-10-03 17:22:08 +00:00
Luke Lashley
929f8c148a Bump python-roborock to 2.49.1 (#153396) 2025-10-03 17:22:07 +00:00
Joakim Plate
92db1f5a04 Correct blocking update in ToGrill with lack of notifications (#153387) 2025-10-03 17:22:05 +00:00
starkillerOG
e66b5ce0bf Add Roborock mop intensity translations (#153380) 2025-10-03 17:22:03 +00:00
Michael
1e17150e9f Explicitly pass in the config entry to coordinator in airtouch4 (#153361)
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Franck Nijhof <git@frenck.dev>
2025-10-03 17:22:02 +00:00
Michael
792902de3d Set config entry to None in ProxmoxVE (#153357) 2025-10-03 17:22:00 +00:00
Andre Lengwenus
04d78c3dd5 Explicitly check for None in raw value processing of modbus (#153352) 2025-10-03 17:21:59 +00:00
G Johansson
5c8d5bfb84 Fix Nord Pool 15 minute interval (#153350) 2025-10-03 17:21:57 +00:00
puddly
99bff31869 Do not reset the adapter twice during ZHA options flow migration (#153345) 2025-10-03 17:21:56 +00:00
Stefan Agner
d949119fb0 Bump aiohasupervisor to 0.3.3 (#153344) 2025-10-03 17:21:54 +00:00
Tom
e7b737ece5 Bump airOS module for alternative login url (#153317) 2025-10-03 17:21:52 +00:00
Tom
fb8ddac2e8 Bump airOS dependency (#153065) 2025-10-03 17:21:51 +00:00
1453 changed files with 19707 additions and 100523 deletions


@@ -190,7 +190,7 @@ jobs:
echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -257,7 +257,7 @@ jobs:
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -326,20 +326,20 @@ jobs:
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
with:
cosign-release: "v2.2.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: matrix.registry == 'ghcr.io/home-assistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -504,7 +504,7 @@ jobs:
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}

File diff suppressed because it is too large


@@ -24,11 +24,11 @@ jobs:
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v4.30.8
uses: github/codeql-action/init@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v4.30.8
uses: github/codeql-action/analyze@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
with:
category: "/language:python"


@@ -17,7 +17,7 @@ jobs:
# - No PRs marked as no-stale
# - No issues (-1)
- name: 60 days stale PRs policy
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
# - No issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: 90 days stale issues
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
# - No Issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: Needs more information stale issues policy
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
only-labels: "needs-more-information"

.gitignore (vendored): 1 line changed

@@ -79,6 +79,7 @@ junit.xml
.project
.pydevproject
.python-version
.tool-versions
# emacs auto backups


@@ -1 +0,0 @@
3.13


@@ -203,7 +203,6 @@ homeassistant.components.feedreader.*
homeassistant.components.file_upload.*
homeassistant.components.filesize.*
homeassistant.components.filter.*
homeassistant.components.firefly_iii.*
homeassistant.components.fitbit.*
homeassistant.components.flexit_bacnet.*
homeassistant.components.flux_led.*
@@ -221,7 +220,6 @@ homeassistant.components.generic_thermostat.*
homeassistant.components.geo_location.*
homeassistant.components.geocaching.*
homeassistant.components.gios.*
homeassistant.components.github.*
homeassistant.components.glances.*
homeassistant.components.go2rtc.*
homeassistant.components.goalzero.*
@@ -327,7 +325,6 @@ homeassistant.components.london_underground.*
homeassistant.components.lookin.*
homeassistant.components.lovelace.*
homeassistant.components.luftdaten.*
homeassistant.components.lunatone.*
homeassistant.components.madvr.*
homeassistant.components.manual.*
homeassistant.components.mastodon.*
@@ -556,7 +553,6 @@ homeassistant.components.vacuum.*
homeassistant.components.vallox.*
homeassistant.components.valve.*
homeassistant.components.velbus.*
homeassistant.components.vivotek.*
homeassistant.components.vlc_telnet.*
homeassistant.components.vodafone_station.*
homeassistant.components.volvo.*


@@ -1 +0,0 @@
.github/copilot-instructions.md

CODEOWNERS (generated): 24 lines changed

@@ -46,8 +46,6 @@ build.json @home-assistant/supervisor
/tests/components/accuweather/ @bieniu
/homeassistant/components/acmeda/ @atmurray
/tests/components/acmeda/ @atmurray
/homeassistant/components/actron_air/ @kclif9 @JagadishDhanamjayam
/tests/components/actron_air/ @kclif9 @JagadishDhanamjayam
/homeassistant/components/adax/ @danielhiversen @lazytarget
/tests/components/adax/ @danielhiversen @lazytarget
/homeassistant/components/adguard/ @frenck
@@ -494,8 +492,6 @@ build.json @home-assistant/supervisor
/tests/components/filesize/ @gjohansson-ST
/homeassistant/components/filter/ @dgomes
/tests/components/filter/ @dgomes
/homeassistant/components/firefly_iii/ @erwindouna
/tests/components/firefly_iii/ @erwindouna
/homeassistant/components/fireservicerota/ @cyberjunky
/tests/components/fireservicerota/ @cyberjunky
/homeassistant/components/firmata/ @DaAwesomeP
@@ -619,8 +615,6 @@ build.json @home-assistant/supervisor
/tests/components/greeneye_monitor/ @jkeljo
/homeassistant/components/group/ @home-assistant/core
/tests/components/group/ @home-assistant/core
/homeassistant/components/growatt_server/ @johanzander
/tests/components/growatt_server/ @johanzander
/homeassistant/components/guardian/ @bachya
/tests/components/guardian/ @bachya
/homeassistant/components/habitica/ @tr4nt0r
@@ -914,8 +908,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/luci/ @mzdrale
/homeassistant/components/luftdaten/ @fabaff @frenck
/tests/components/luftdaten/ @fabaff @frenck
/homeassistant/components/lunatone/ @MoonDevLT
/tests/components/lunatone/ @MoonDevLT
/homeassistant/components/lupusec/ @majuss @suaveolent
/tests/components/lupusec/ @majuss @suaveolent
/homeassistant/components/lutron/ @cdheiser @wilburCForce
@@ -961,8 +953,6 @@ build.json @home-assistant/supervisor
/tests/components/met_eireann/ @DylanGore
/homeassistant/components/meteo_france/ @hacf-fr @oncleben31 @Quentame
/tests/components/meteo_france/ @hacf-fr @oncleben31 @Quentame
/homeassistant/components/meteo_lt/ @xE1H
/tests/components/meteo_lt/ @xE1H
/homeassistant/components/meteoalarm/ @rolfberkenbosch
/homeassistant/components/meteoclimatic/ @adrianmo
/tests/components/meteoclimatic/ @adrianmo
@@ -1069,8 +1059,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/nilu/ @hfurubotten
/homeassistant/components/nina/ @DeerMaximum
/tests/components/nina/ @DeerMaximum
/homeassistant/components/nintendo_parental_controls/ @pantherale0
/tests/components/nintendo_parental_controls/ @pantherale0
/homeassistant/components/nissan_leaf/ @filcole
/homeassistant/components/noaa_tides/ @jdelaney72
/homeassistant/components/nobo_hub/ @echoromeo @oyvindwe
@@ -1139,8 +1127,6 @@ build.json @home-assistant/supervisor
/tests/components/opengarage/ @danielhiversen
/homeassistant/components/openhome/ @bazwilliams
/tests/components/openhome/ @bazwilliams
/homeassistant/components/openrgb/ @felipecrs
/tests/components/openrgb/ @felipecrs
/homeassistant/components/opensky/ @joostlek
/tests/components/opensky/ @joostlek
/homeassistant/components/opentherm_gw/ @mvn23
@@ -1204,6 +1190,8 @@ build.json @home-assistant/supervisor
/tests/components/plex/ @jjlawren
/homeassistant/components/plugwise/ @CoMPaTech @bouwew
/tests/components/plugwise/ @CoMPaTech @bouwew
/homeassistant/components/plum_lightpad/ @ColinHarrington @prystupa
/tests/components/plum_lightpad/ @ColinHarrington @prystupa
/homeassistant/components/point/ @fredrike
/tests/components/point/ @fredrike
/homeassistant/components/pooldose/ @lmaertin
@@ -1419,8 +1407,8 @@ build.json @home-assistant/supervisor
/tests/components/sfr_box/ @epenet
/homeassistant/components/sftp_storage/ @maretodoric
/tests/components/sftp_storage/ @maretodoric
/homeassistant/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre
/tests/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre
/homeassistant/components/sharkiq/ @JeffResc @funkybunch
/tests/components/sharkiq/ @JeffResc @funkybunch
/homeassistant/components/shell_command/ @home-assistant/core
/tests/components/shell_command/ @home-assistant/core
/homeassistant/components/shelly/ @bieniu @thecode @chemelli74 @bdraco
@@ -1485,8 +1473,8 @@ build.json @home-assistant/supervisor
/tests/components/snoo/ @Lash-L
/homeassistant/components/snooz/ @AustinBrunkhorst
/tests/components/snooz/ @AustinBrunkhorst
/homeassistant/components/solaredge/ @frenck @bdraco @tronikos
/tests/components/solaredge/ @frenck @bdraco @tronikos
/homeassistant/components/solaredge/ @frenck @bdraco
/tests/components/solaredge/ @frenck @bdraco
/homeassistant/components/solaredge_local/ @drobtravels @scheric
/homeassistant/components/solarlog/ @Ernst79 @dontinelli
/tests/components/solarlog/ @Ernst79 @dontinelli


@@ -34,11 +34,9 @@ WORKDIR /usr/src
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
RUN uv python install 3.13.2
USER vscode
COPY .python-version ./
RUN uv python install
ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv"
RUN uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"


@@ -1,10 +1,10 @@
image: ghcr.io/home-assistant/{arch}-homeassistant
build_from:
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.10.1
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.10.1
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.10.1
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.10.1
i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.10.1
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.10.0
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.10.0
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.10.0
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.10.0
i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.10.0
codenotary:
signer: notary@home-assistant.io
base_image: notary@home-assistant.io


@@ -616,34 +616,34 @@ async def async_enable_logging(
),
)
logger = logging.getLogger()
logger.setLevel(logging.INFO if verbose else logging.WARNING)
# Log errors to a file if we have write access to file or config dir
if log_file is None:
default_log_path = hass.config.path(ERROR_LOG_FILENAME)
if "SUPERVISOR" in os.environ:
_LOGGER.info("Running in Supervisor, not logging to file")
# Rename the default log file if it exists, since previous versions created
# it even on Supervisor
if os.path.isfile(default_log_path):
with contextlib.suppress(OSError):
os.rename(default_log_path, f"{default_log_path}.old")
err_log_path = None
else:
err_log_path = default_log_path
err_log_path = hass.config.path(ERROR_LOG_FILENAME)
else:
err_log_path = os.path.abspath(log_file)
if err_log_path:
err_path_exists = os.path.isfile(err_log_path)
err_dir = os.path.dirname(err_log_path)
# Check if we can write to the error log if it exists or that
# we can create files in the containing directory if not.
if (err_path_exists and os.access(err_log_path, os.W_OK)) or (
not err_path_exists and os.access(err_dir, os.W_OK)
):
err_handler = await hass.async_add_executor_job(
_create_log_file, err_log_path, log_rotate_days
)
err_handler.setFormatter(logging.Formatter(fmt, datefmt=FORMAT_DATETIME))
logger = logging.getLogger()
logger.addHandler(err_handler)
logger.setLevel(logging.INFO if verbose else logging.WARNING)
# Save the log file location for access by other components.
hass.data[DATA_LOGGING] = err_log_path
else:
_LOGGER.error("Unable to set up error log %s (access denied)", err_log_path)
async_activate_log_queue_handler(hass)


@@ -0,0 +1,5 @@
{
"domain": "ibm",
"name": "IBM",
"integrations": ["watson_iot", "watson_tts"]
}


@@ -12,13 +12,11 @@ from homeassistant.components.bluetooth import async_get_scanner
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ADDRESS
from homeassistant.core import HomeAssistant
from homeassistant.helpers.debounce import Debouncer
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_IS_NEW_STYLE_SCALE
SCAN_INTERVAL = timedelta(seconds=15)
UPDATE_DEBOUNCE_TIME = 0.2
_LOGGER = logging.getLogger(__name__)
@@ -40,19 +38,11 @@ class AcaiaCoordinator(DataUpdateCoordinator[None]):
config_entry=entry,
)
debouncer = Debouncer(
hass=hass,
logger=_LOGGER,
cooldown=UPDATE_DEBOUNCE_TIME,
immediate=True,
function=self.async_update_listeners,
)
self._scale = AcaiaScale(
address_or_ble_device=entry.data[CONF_ADDRESS],
name=entry.title,
is_new_style_scale=entry.data[CONF_IS_NEW_STYLE_SCALE],
notify_callback=debouncer.async_schedule_call,
notify_callback=self.async_update_listeners,
scanner=async_get_scanner(hass),
)


@@ -1,57 +0,0 @@
"""The Actron Air integration."""
from actron_neo_api import (
ActronAirNeoACSystem,
ActronNeoAPI,
ActronNeoAPIError,
ActronNeoAuthError,
)
from homeassistant.const import CONF_API_TOKEN, Platform
from homeassistant.core import HomeAssistant
from .const import _LOGGER
from .coordinator import (
ActronAirConfigEntry,
ActronAirRuntimeData,
ActronAirSystemCoordinator,
)
PLATFORM = [Platform.CLIMATE]
async def async_setup_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool:
"""Set up Actron Air integration from a config entry."""
api = ActronNeoAPI(refresh_token=entry.data[CONF_API_TOKEN])
systems: list[ActronAirNeoACSystem] = []
try:
systems = await api.get_ac_systems()
await api.update_status()
except ActronNeoAuthError:
_LOGGER.error("Authentication error while setting up Actron Air integration")
raise
except ActronNeoAPIError as err:
_LOGGER.error("API error while setting up Actron Air integration: %s", err)
raise
system_coordinators: dict[str, ActronAirSystemCoordinator] = {}
for system in systems:
coordinator = ActronAirSystemCoordinator(hass, entry, api, system)
_LOGGER.debug("Setting up coordinator for system: %s", system["serial"])
await coordinator.async_config_entry_first_refresh()
system_coordinators[system["serial"]] = coordinator
entry.runtime_data = ActronAirRuntimeData(
api=api,
system_coordinators=system_coordinators,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORM)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORM)


@@ -1,259 +0,0 @@
"""Climate platform for Actron Air integration."""
from typing import Any
from actron_neo_api import ActronAirNeoStatus, ActronAirNeoZone
from homeassistant.components.climate import (
FAN_AUTO,
FAN_HIGH,
FAN_LOW,
FAN_MEDIUM,
ClimateEntity,
ClimateEntityFeature,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
PARALLEL_UPDATES = 0
FAN_MODE_MAPPING_ACTRONAIR_TO_HA = {
"AUTO": FAN_AUTO,
"LOW": FAN_LOW,
"MED": FAN_MEDIUM,
"HIGH": FAN_HIGH,
}
FAN_MODE_MAPPING_HA_TO_ACTRONAIR = {
v: k for k, v in FAN_MODE_MAPPING_ACTRONAIR_TO_HA.items()
}
HVAC_MODE_MAPPING_ACTRONAIR_TO_HA = {
"COOL": HVACMode.COOL,
"HEAT": HVACMode.HEAT,
"FAN": HVACMode.FAN_ONLY,
"AUTO": HVACMode.AUTO,
"OFF": HVACMode.OFF,
}
HVAC_MODE_MAPPING_HA_TO_ACTRONAIR = {
v: k for k, v in HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.items()
}
async def async_setup_entry(
hass: HomeAssistant,
entry: ActronAirConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Actron Air climate entities."""
system_coordinators = entry.runtime_data.system_coordinators
entities: list[ClimateEntity] = []
for coordinator in system_coordinators.values():
status = coordinator.data
name = status.ac_system.system_name
entities.append(ActronSystemClimate(coordinator, name))
entities.extend(
ActronZoneClimate(coordinator, zone)
for zone in status.remote_zone_info
if zone.exists
)
async_add_entities(entities)
class BaseClimateEntity(CoordinatorEntity[ActronAirSystemCoordinator], ClimateEntity):
"""Base class for Actron Air climate entities."""
_attr_has_entity_name = True
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.FAN_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
_attr_name = None
_attr_fan_modes = list(FAN_MODE_MAPPING_ACTRONAIR_TO_HA.values())
_attr_hvac_modes = list(HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.values())
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator)
self._serial_number = coordinator.serial_number
class ActronSystemClimate(BaseClimateEntity):
"""Representation of the Actron Air system."""
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.FAN_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator, name)
serial_number = coordinator.serial_number
self._attr_unique_id = serial_number
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_number)},
name=self._status.ac_system.system_name,
manufacturer="Actron Air",
model_id=self._status.ac_system.master_wc_model,
sw_version=self._status.ac_system.master_wc_firmware_version,
serial_number=serial_number,
)
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
return self._status.min_temp
@property
def max_temp(self) -> float:
"""Return the maximum temperature that can be set."""
return self._status.max_temp
@property
def _status(self) -> ActronAirNeoStatus:
"""Get the current status from the coordinator."""
return self.coordinator.data
@property
def hvac_mode(self) -> HVACMode | None:
"""Return the current HVAC mode."""
if not self._status.user_aircon_settings.is_on:
return HVACMode.OFF
mode = self._status.user_aircon_settings.mode
return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode)
@property
def fan_mode(self) -> str | None:
"""Return the current fan mode."""
fan_mode = self._status.user_aircon_settings.fan_mode
return FAN_MODE_MAPPING_ACTRONAIR_TO_HA.get(fan_mode)
@property
def current_humidity(self) -> float:
"""Return the current humidity."""
return self._status.master_info.live_humidity_pc
@property
def current_temperature(self) -> float:
"""Return the current temperature."""
return self._status.master_info.live_temp_c
@property
def target_temperature(self) -> float:
"""Return the target temperature."""
return self._status.user_aircon_settings.temperature_setpoint_cool_c
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set a new fan mode."""
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode.lower())
await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode)
await self._status.ac_system.set_system_mode(ac_mode)
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
temp = kwargs.get(ATTR_TEMPERATURE)
await self._status.user_aircon_settings.set_temperature(temperature=temp)
class ActronZoneClimate(BaseClimateEntity):
"""Representation of a zone within the Actron Air system."""
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
zone: ActronAirNeoZone,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator, zone.title)
serial_number = coordinator.serial_number
self._zone_id: int = zone.zone_id
self._attr_unique_id: str = f"{serial_number}_zone_{zone.zone_id}"
self._attr_device_info: DeviceInfo = DeviceInfo(
identifiers={(DOMAIN, self._attr_unique_id)},
name=zone.title,
manufacturer="Actron Air",
model="Zone",
suggested_area=zone.title,
via_device=(DOMAIN, serial_number),
)
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
return self._zone.min_temp
@property
def max_temp(self) -> float:
"""Return the maximum temperature that can be set."""
return self._zone.max_temp
@property
def _zone(self) -> ActronAirNeoZone:
"""Get the current zone data from the coordinator."""
status = self.coordinator.data
return status.zones[self._zone_id]
@property
def hvac_mode(self) -> HVACMode | None:
"""Return the current HVAC mode."""
if self._zone.is_active:
mode = self._zone.hvac_mode
return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode)
return HVACMode.OFF
@property
def current_humidity(self) -> float | None:
"""Return the current humidity."""
return self._zone.humidity
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
return self._zone.live_temp_c
@property
def target_temperature(self) -> float | None:
"""Return the target temperature."""
return self._zone.temperature_setpoint_cool_c
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
is_enabled = hvac_mode != HVACMode.OFF
await self._zone.enable(is_enabled)
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
await self._zone.set_temperature(temperature=kwargs["temperature"])


@@ -1,132 +0,0 @@
"""Setup config flow for Actron Air integration."""
import asyncio
from typing import Any
from actron_neo_api import ActronNeoAPI, ActronNeoAuthError
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_TOKEN
from homeassistant.exceptions import HomeAssistantError
from .const import _LOGGER, DOMAIN
class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Actron Air."""
def __init__(self) -> None:
"""Initialize the config flow."""
self._api: ActronNeoAPI | None = None
self._device_code: str | None = None
self._user_code: str = ""
self._verification_uri: str = ""
self._expires_minutes: str = "30"
self.login_task: asyncio.Task | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
if self._api is None:
_LOGGER.debug("Initiating device authorization")
self._api = ActronNeoAPI()
try:
device_code_response = await self._api.request_device_code()
except ActronNeoAuthError as err:
_LOGGER.error("OAuth2 flow failed: %s", err)
return self.async_abort(reason="oauth2_error")
self._device_code = device_code_response["device_code"]
self._user_code = device_code_response["user_code"]
self._verification_uri = device_code_response["verification_uri_complete"]
self._expires_minutes = str(device_code_response["expires_in"] // 60)
async def _wait_for_authorization() -> None:
"""Wait for the user to authorize the device."""
assert self._api is not None
assert self._device_code is not None
_LOGGER.debug("Waiting for device authorization")
try:
await self._api.poll_for_token(self._device_code)
_LOGGER.debug("Authorization successful")
except ActronNeoAuthError as ex:
_LOGGER.exception("Error while waiting for device authorization")
raise CannotConnect from ex
_LOGGER.debug("Checking login task")
if self.login_task is None:
_LOGGER.debug("Creating task for device authorization")
self.login_task = self.hass.async_create_task(_wait_for_authorization())
if self.login_task.done():
_LOGGER.debug("Login task is done, checking results")
if exception := self.login_task.exception():
if isinstance(exception, CannotConnect):
return self.async_show_progress_done(
next_step_id="connection_error"
)
return self.async_show_progress_done(next_step_id="timeout")
return self.async_show_progress_done(next_step_id="finish_login")
return self.async_show_progress(
step_id="user",
progress_action="wait_for_authorization",
description_placeholders={
"user_code": self._user_code,
"verification_uri": self._verification_uri,
"expires_minutes": self._expires_minutes,
},
progress_task=self.login_task,
)
async def async_step_finish_login(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the finalization of login."""
_LOGGER.debug("Finalizing authorization")
assert self._api is not None
try:
user_data = await self._api.get_user_info()
except ActronNeoAuthError as err:
_LOGGER.error("Error getting user info: %s", err)
return self.async_abort(reason="oauth2_error")
unique_id = str(user_data["id"])
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=user_data["email"],
data={CONF_API_TOKEN: self._api.refresh_token_value},
)
async def async_step_timeout(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle issues that need transition await from progress step."""
if user_input is None:
return self.async_show_form(
step_id="timeout",
)
del self.login_task
return await self.async_step_user()
async def async_step_connection_error(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle connection error from progress step."""
if user_input is None:
return self.async_show_form(step_id="connection_error")
# Reset state and try again
self._api = None
self._device_code = None
self.login_task = None
return await self.async_step_user()
class CannotConnect(HomeAssistantError):
"""Error to indicate we cannot connect."""


@@ -1,6 +0,0 @@
"""Constants used by Actron Air integration."""
import logging
_LOGGER = logging.getLogger(__package__)
DOMAIN = "actron_air"


@@ -1,69 +0,0 @@
"""Coordinator for Actron Air integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
from actron_neo_api import ActronAirNeoACSystem, ActronAirNeoStatus, ActronNeoAPI
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from .const import _LOGGER
STALE_DEVICE_TIMEOUT = timedelta(hours=24)
ERROR_NO_SYSTEMS_FOUND = "no_systems_found"
ERROR_UNKNOWN = "unknown_error"
@dataclass
class ActronAirRuntimeData:
"""Runtime data for the Actron Air integration."""
api: ActronNeoAPI
system_coordinators: dict[str, ActronAirSystemCoordinator]
type ActronAirConfigEntry = ConfigEntry[ActronAirRuntimeData]
AUTH_ERROR_THRESHOLD = 3
SCAN_INTERVAL = timedelta(seconds=30)
class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirNeoACSystem]):
"""System coordinator for Actron Air integration."""
def __init__(
self,
hass: HomeAssistant,
entry: ActronAirConfigEntry,
api: ActronNeoAPI,
system: ActronAirNeoACSystem,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
name="Actron Air Status",
update_interval=SCAN_INTERVAL,
config_entry=entry,
)
self.system = system
self.serial_number = system["serial"]
self.api = api
self.status = self.api.state_manager.get_status(self.serial_number)
self.last_seen = dt_util.utcnow()
async def _async_update_data(self) -> ActronAirNeoStatus:
"""Fetch updates and merge incremental changes into the full state."""
await self.api.update_status()
self.status = self.api.state_manager.get_status(self.serial_number)
self.last_seen = dt_util.utcnow()
return self.status
def is_device_stale(self) -> bool:
"""Check if a device is stale (not seen for a while)."""
return (dt_util.utcnow() - self.last_seen) > STALE_DEVICE_TIMEOUT


@@ -1,16 +0,0 @@
{
"domain": "actron_air",
"name": "Actron Air",
"codeowners": ["@kclif9", "@JagadishDhanamjayam"],
"config_flow": true,
"dhcp": [
{
"hostname": "neo-*",
"macaddress": "FC0FE7*"
}
],
"documentation": "https://www.home-assistant.io/integrations/actron_air",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["actron-neo-api==0.1.84"]
}


@@ -1,78 +0,0 @@
rules:
# Bronze
action-setup:
status: exempt
comment: This integration does not have custom service actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: This integration does not have custom service actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: This integration does not subscribe to external events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: No options flow
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: todo
test-coverage: todo
# Gold
devices: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: This integration uses DHCP discovery, however is cloud polling. Therefore there is no information to update.
discovery: done
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
entity-category:
status: exempt
comment: This integration does not use entity categories.
entity-device-class:
status: exempt
comment: This integration does not use entity device classes.
entity-disabled-by-default:
status: exempt
comment: Not required for this integration at this stage.
entity-translations: todo
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: This integration does not have any known issues that require repair.
stale-devices: todo
# Platinum
async-dependency: done
inject-websession: todo
strict-typing: todo


@@ -1,29 +0,0 @@
{
"config": {
"step": {
"user": {
"title": "Actron Air OAuth2 Authorization"
},
"timeout": {
"title": "Authorization timeout",
"description": "The authorization process timed out. Please try again.",
"data": {}
},
"connection_error": {
"title": "Connection error",
"description": "Failed to connect to Actron Air. Please check your internet connection and try again.",
"data": {}
}
},
"progress": {
"wait_for_authorization": "To authenticate, open the following URL and login at Actron Air:\n{verification_uri}\nIf the code is not automatically copied, paste the following code to authorize the integration:\n\n```{user_code}```\n\n\nThe login attempt will time out after {expires_minutes} minutes."
},
"error": {
"oauth2_error": "Failed to start OAuth2 flow. Please try again later."
},
"abort": {
"oauth2_error": "Failed to start OAuth2 flow",
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
}
}
}


@@ -71,14 +71,7 @@ class AemetConfigFlow(ConfigFlow, domain=DOMAIN):
}
)
return self.async_show_form(
step_id="user",
data_schema=schema,
errors=errors,
description_placeholders={
"api_key_url": "https://opendata.aemet.es/centrodedescargas/altaUsuario"
},
)
return self.async_show_form(step_id="user", data_schema=schema, errors=errors)
@staticmethod
@callback


@@ -14,7 +14,7 @@
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "Name of the integration"
},
"description": "To generate API key go to {api_key_url}"
"description": "To generate API key go to https://opendata.aemet.es/centrodedescargas/altaUsuario"
}
}
},


@@ -18,10 +18,6 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_USE_NEAREST, DOMAIN, NO_AIRLY_SENSORS
DESCRIPTION_PLACEHOLDERS = {
"developer_registration_url": "https://developer.airly.eu/register",
}
class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
"""Config flow for Airly."""
@@ -89,7 +85,6 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
}
),
errors=errors,
description_placeholders=DESCRIPTION_PLACEHOLDERS,
)


@@ -2,7 +2,7 @@
"config": {
"step": {
"user": {
"description": "To generate API key go to {developer_registration_url}",
"description": "To generate API key go to https://developer.airly.eu/register",
"data": {
"name": "[%key:common::config_flow::data::name%]",
"api_key": "[%key:common::config_flow::data::api_key%]",

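Several commits in the list above (aemet, airly, isy994, huawei_lte, sfr_box, coinbase, ecobee, and others) apply the same pattern shown in the two hunks above: the URL is removed from the translatable description in strings.json and injected into the form as a description placeholder instead. Below is a minimal sketch of that pattern; the "example" domain, ExampleConfigFlow class, and signup URL are hypothetical and stand in for the real integrations touched here.

import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY

DOMAIN = "example"  # hypothetical domain, for illustration only
API_KEY_URL = "https://example.com/api-keys"  # hypothetical signup URL


class ExampleConfigFlow(ConfigFlow, domain=DOMAIN):
    """Config flow that keeps the signup URL out of strings.json."""

    VERSION = 1

    async def async_step_user(self, user_input: dict | None = None) -> ConfigFlowResult:
        """Show the user step; the URL is passed as a placeholder."""
        if user_input is not None:
            return self.async_create_entry(title="Example", data=user_input)
        return self.async_show_form(
            step_id="user",
            data_schema=vol.Schema({vol.Required(CONF_API_KEY): str}),
            # strings.json then references {api_key_url} in the step description,
            # e.g. "description": "To generate API key go to {api_key_url}"
            description_placeholders={"api_key_url": API_KEY_URL},
        )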

@@ -2,8 +2,6 @@
from __future__ import annotations
import logging
from airos.airos8 import AirOS8
from homeassistant.const import (
@@ -14,11 +12,10 @@ from homeassistant.const import (
CONF_VERIFY_SSL,
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, SECTION_ADVANCED_SETTINGS
from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator
_PLATFORMS: list[Platform] = [
@@ -26,8 +23,6 @@ _PLATFORMS: list[Platform] = [
Platform.SENSOR,
]
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
"""Set up Ubiquiti airOS from a config entry."""
@@ -59,13 +54,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
async def async_migrate_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
"""Migrate old config entry."""
# This means the user has downgraded from a future version
if entry.version > 2:
if entry.version > 1:
# This means the user has downgraded from a future version
return False
# 1.1 Migrate config_entry to add advanced ssl settings
if entry.version == 1 and entry.minor_version == 1:
new_minor_version = 2
new_data = {**entry.data}
advanced_data = {
CONF_SSL: DEFAULT_SSL,
@@ -76,52 +69,7 @@ async def async_migrate_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> b
hass.config_entries.async_update_entry(
entry,
data=new_data,
minor_version=new_minor_version,
)
# 2.1 Migrate binary_sensor entity unique_id from device_id to mac_address
# Step 1 - migrate binary_sensor entity unique_id
# Step 2 - migrate device entity identifier
if entry.version == 1:
new_version = 2
new_minor_version = 1
mac_adress = dr.format_mac(entry.unique_id)
device_registry = dr.async_get(hass)
if device_entry := device_registry.async_get_device(
connections={(dr.CONNECTION_NETWORK_MAC, mac_adress)}
):
old_device_id = next(
(
device_id
for domain, device_id in device_entry.identifiers
if domain == DOMAIN
),
)
@callback
def update_unique_id(
entity_entry: er.RegistryEntry,
) -> dict[str, str] | None:
"""Update unique id from device_id to mac address."""
if old_device_id and entity_entry.unique_id.startswith(old_device_id):
suffix = entity_entry.unique_id.removeprefix(old_device_id)
new_unique_id = f"{mac_adress}{suffix}"
return {"new_unique_id": new_unique_id}
return None
await er.async_migrate_entries(hass, entry.entry_id, update_unique_id)
new_identifiers = device_entry.identifiers.copy()
new_identifiers.discard((DOMAIN, old_device_id))
new_identifiers.add((DOMAIN, mac_adress))
device_registry.async_update_device(
device_entry.id, new_identifiers=new_identifiers
)
hass.config_entries.async_update_entry(
entry, version=new_version, minor_version=new_minor_version
minor_version=2,
)
return True


@@ -98,7 +98,7 @@ class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.derived.mac}_{description.key}"
self._attr_unique_id = f"{coordinator.data.host.device_id}_{description.key}"
@property
def is_on(self) -> bool:


@@ -2,7 +2,6 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
@@ -15,12 +14,7 @@ from airos.exceptions import (
)
import voluptuous as vol
from homeassistant.config_entries import (
SOURCE_REAUTH,
SOURCE_RECONFIGURE,
ConfigFlow,
ConfigFlowResult,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
@@ -30,11 +24,6 @@ from homeassistant.const import (
)
from homeassistant.data_entry_flow import section
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import (
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .coordinator import AirOS8
@@ -62,161 +51,53 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Ubiquiti airOS."""
VERSION = 2
MINOR_VERSION = 1
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOS8
self.errors: dict[str, str] = {}
VERSION = 1
MINOR_VERSION = 2
async def async_step_user(
self, user_input: dict[str, Any] | None = None
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle the manual input of host and credentials."""
self.errors = {}
"""Handle the initial step."""
errors: dict[str, str] = {}
if user_input is not None:
validated_info = await self._validate_and_get_device_info(user_input)
if validated_info:
return self.async_create_entry(
title=validated_info["title"],
data=validated_info["data"],
)
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=self.errors
)
# By default airOS 8 comes with self-signed SSL certificates,
# with no option in the web UI to change or upload a custom certificate.
session = async_get_clientsession(
self.hass,
verify_ssl=user_input[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
async def _validate_and_get_device_info(
self, config_data: dict[str, Any]
) -> dict[str, Any] | None:
"""Validate user input with the device API."""
# By default airOS 8 comes with self-signed SSL certificates,
# with no option in the web UI to change or upload a custom certificate.
session = async_get_clientsession(
self.hass,
verify_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
airos_device = AirOS8(
host=user_input[CONF_HOST],
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
session=session,
use_ssl=user_input[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
await airos_device.login()
airos_data = await airos_device.status()
airos_device = AirOS8(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
await airos_device.login()
airos_data = await airos_device.status()
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
):
self.errors["base"] = "cannot_connect"
except (AirOSConnectionAuthenticationError, AirOSDataMissingError):
self.errors["base"] = "invalid_auth"
except AirOSKeyDataMissingError:
self.errors["base"] = "key_data_missing"
except Exception:
_LOGGER.exception("Unexpected exception during credential validation")
self.errors["base"] = "unknown"
else:
await self.async_set_unique_id(airos_data.derived.mac)
if self.source in [SOURCE_REAUTH, SOURCE_RECONFIGURE]:
self._abort_if_unique_id_mismatch()
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
):
errors["base"] = "cannot_connect"
except (AirOSConnectionAuthenticationError, AirOSDataMissingError):
errors["base"] = "invalid_auth"
except AirOSKeyDataMissingError:
errors["base"] = "key_data_missing"
except Exception:
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(airos_data.derived.mac)
self._abort_if_unique_id_configured()
return {"title": airos_data.host.hostname, "data": config_data}
return None
async def async_step_reauth(
self,
user_input: Mapping[str, Any],
) -> ConfigFlowResult:
"""Perform reauthentication upon an API authentication error."""
return await self.async_step_reauth_confirm(user_input)
async def async_step_reauth_confirm(
self,
user_input: Mapping[str, Any],
) -> ConfigFlowResult:
"""Perform reauthentication upon an API authentication error."""
self.errors = {}
if user_input:
validate_data = {**self._get_reauth_entry().data, **user_input}
if await self._validate_and_get_device_info(config_data=validate_data):
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates=validate_data,
return self.async_create_entry(
title=airos_data.host.hostname, data=user_input
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema(
{
vol.Required(CONF_PASSWORD): TextSelector(
TextSelectorConfig(
type=TextSelectorType.PASSWORD,
autocomplete="current-password",
)
),
}
),
errors=self.errors,
)
async def async_step_reconfigure(
self,
user_input: Mapping[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle reconfiguration of airOS."""
self.errors = {}
entry = self._get_reconfigure_entry()
current_data = entry.data
if user_input is not None:
validate_data = {**current_data, **user_input}
if await self._validate_and_get_device_info(config_data=validate_data):
return self.async_update_reload_and_abort(
entry,
data_updates=validate_data,
)
return self.async_show_form(
step_id="reconfigure",
data_schema=vol.Schema(
{
vol.Required(CONF_PASSWORD): TextSelector(
TextSelectorConfig(
type=TextSelectorType.PASSWORD,
autocomplete="current-password",
)
),
vol.Required(SECTION_ADVANCED_SETTINGS): section(
vol.Schema(
{
vol.Required(
CONF_SSL,
default=current_data[SECTION_ADVANCED_SETTINGS][
CONF_SSL
],
): bool,
vol.Required(
CONF_VERIFY_SSL,
default=current_data[SECTION_ADVANCED_SETTINGS][
CONF_VERIFY_SSL
],
): bool,
}
),
{"collapsed": True},
),
}
),
errors=self.errors,
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)

View File
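
A minimal sketch of the collapsible advanced-settings section used in the config-flow hunks above, assuming only what those hunks show: the key names come from the diff, the rest of the schema is pared down. The point is that section() groups fields and the submitted values arrive nested under the section key, which is why the flow reads user_input[SECTION_ADVANCED_SETTINGS][CONF_SSL].

import voluptuous as vol

from homeassistant.data_entry_flow import section

# Pared-down illustration; "host", "ssl" and "verify_ssl" mirror the hunk above.
EXAMPLE_USER_SCHEMA = vol.Schema(
    {
        vol.Required("host"): str,
        vol.Required("advanced_settings"): section(
            vol.Schema(
                {
                    vol.Required("ssl"): bool,
                    vol.Required("verify_ssl"): bool,
                }
            ),
            {"collapsed": True},
        ),
    }
)

# Submitted data keeps the nesting:
# user_input["advanced_settings"]["verify_ssl"]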

@@ -14,7 +14,7 @@ from airos.exceptions import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.exceptions import ConfigEntryError
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, SCAN_INTERVAL
@@ -47,9 +47,9 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
try:
await self.airos_device.login()
return await self.airos_device.status()
except AirOSConnectionAuthenticationError as err:
except (AirOSConnectionAuthenticationError,) as err:
_LOGGER.exception("Error authenticating with airOS device")
raise ConfigEntryAuthFailed(
raise ConfigEntryError(
translation_domain=DOMAIN, translation_key="invalid_auth"
) from err
except (

View File
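
The coordinator hunk above swaps ConfigEntryAuthFailed for ConfigEntryError on authentication errors. As a point of reference, here is a minimal sketch of the conventional mapping inside a DataUpdateCoordinator, with a hypothetical client and stand-in exception types: ConfigEntryAuthFailed starts the integration's reauth flow, while UpdateFailed only marks entities unavailable until the next refresh.

from datetime import timedelta
import logging
from typing import Any

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

_LOGGER = logging.getLogger(__name__)


class ExampleCoordinator(DataUpdateCoordinator[dict]):
    """Hypothetical coordinator showing the usual exception mapping."""

    def __init__(self, hass: HomeAssistant, entry: ConfigEntry, client: Any) -> None:
        """Initialize with a hypothetical API client."""
        super().__init__(
            hass,
            _LOGGER,
            config_entry=entry,
            name="example",
            update_interval=timedelta(seconds=60),
        )
        self.client = client

    async def _async_update_data(self) -> dict:
        """Fetch data, translating failures for Home Assistant."""
        try:
            return await self.client.fetch_status()  # hypothetical call
        except PermissionError as err:  # stand-in for a library auth error
            # Starts the reauth flow for this config entry
            raise ConfigEntryAuthFailed("Credentials rejected") from err
        except OSError as err:  # stand-in for a connection error
            # Marks entities unavailable until the next successful refresh
            raise UpdateFailed(f"Update failed: {err}") from err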

@@ -33,14 +33,9 @@ class AirOSEntity(CoordinatorEntity[AirOSDataUpdateCoordinator]):
self._attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, airos_data.derived.mac)},
configuration_url=configuration_url,
identifiers={(DOMAIN, airos_data.derived.mac)},
identifiers={(DOMAIN, str(airos_data.host.device_id))},
manufacturer=MANUFACTURER,
model=airos_data.host.devmodel,
model_id=(
sku
if (sku := airos_data.derived.sku) not in ["UNKNOWN", "AMBIGUOUS"]
else None
),
name=airos_data.host.hostname,
sw_version=airos_data.host.fwversion,
)

View File

@@ -4,8 +4,7 @@
"codeowners": ["@CoMPaTech"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["airos==0.5.6"]
"quality_scale": "bronze",
"requirements": ["airos==0.5.5"]
}

View File

@@ -32,11 +32,11 @@ rules:
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
entity-unavailable: todo
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
log-when-unavailable: todo
parallel-updates: todo
reauthentication-flow: todo
test-coverage: done
# Gold
@@ -48,9 +48,9 @@ rules:
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-supported-functions: todo
docs-troubleshooting: done
docs-use-cases: done
docs-use-cases: todo
dynamic-devices: todo
entity-category: done
entity-device-class: done
@@ -60,7 +60,7 @@ rules:
icon-translations:
status: exempt
comment: no (custom) icons used or envisioned
reconfiguration-flow: done
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo

View File

@@ -2,35 +2,6 @@
"config": {
"flow_title": "Ubiquiti airOS device",
"step": {
"reauth_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
}
},
"reconfigure": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
},
"sections": {
"advanced_settings": {
"name": "[%key:component::airos::config::step::user::sections::advanced_settings::name%]",
"data": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data::ssl%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::ssl%]",
"verify_ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::verify_ssl%]"
}
}
}
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
@@ -44,7 +15,6 @@
},
"sections": {
"advanced_settings": {
"name": "Advanced settings",
"data": {
"ssl": "Use HTTPS",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
@@ -64,10 +34,7 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unique_id_mismatch": "Re-authentication should be used for the same device not a new one"
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
"entity": {

View File

@@ -29,7 +29,7 @@
},
"data_description": {
"return_average": "air-Q allows to poll both the noisy sensor readings as well as the values averaged on the device (default)",
"clip_negatives": "For baseline calibration purposes, certain sensor values may briefly become negative. The default behavior is to clip such values to 0"
"clip_negatives": "For baseline calibration purposes, certain sensor values may briefly become negative. The default behaviour is to clip such values to 0"
}
}
}

View File

@@ -23,10 +23,6 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
URL_API_INTEGRATION = {
"url": "https://dashboard.airthings.com/integrations/api-integration"
}
class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Airthings."""
@@ -41,7 +37,11 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_show_form(
step_id="user",
data_schema=STEP_USER_DATA_SCHEMA,
description_placeholders=URL_API_INTEGRATION,
description_placeholders={
"url": (
"https://dashboard.airthings.com/integrations/api-integration"
),
},
)
errors = {}
@@ -65,8 +65,5 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_create_entry(title="Airthings", data=user_input)
return self.async_show_form(
step_id="user",
data_schema=STEP_USER_DATA_SCHEMA,
errors=errors,
description_placeholders=URL_API_INTEGRATION,
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)

View File
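
The Airthings hunk above moves the dashboard URL between a module-level constant and an inline description_placeholders value; with the constant, both async_show_form calls (initial form and error retry) stay consistent. A small sketch of the wiring, with a hypothetical domain and pared-down schema; the URL is the one shown in the hunk.

from typing import Any

import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult

URL_API_INTEGRATION = {
    "url": "https://dashboard.airthings.com/integrations/api-integration"
}


class ExampleConfigFlow(ConfigFlow, domain="example"):
    """Hypothetical flow filling the {url} placeholder from strings.json."""

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        if user_input is None:
            return self.async_show_form(
                step_id="user",
                data_schema=vol.Schema(
                    {vol.Required("id"): str, vol.Required("secret"): str}
                ),
                description_placeholders=URL_API_INTEGRATION,
            )
        return self.async_create_entry(title="Example", data=user_input)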

@@ -4,9 +4,9 @@
"user": {
"data": {
"id": "ID",
"secret": "Secret"
},
"description": "Log in at {url} to find your credentials"
"secret": "Secret",
"description": "Login at {url} to find your credentials"
}
}
},
"error": {

View File

@@ -6,13 +6,8 @@ import dataclasses
import logging
from typing import Any
from airthings_ble import (
AirthingsBluetoothDeviceData,
AirthingsDevice,
UnsupportedDeviceError,
)
from airthings_ble import AirthingsBluetoothDeviceData, AirthingsDevice
from bleak import BleakError
from habluetooth import BluetoothServiceInfoBleak
import voluptuous as vol
from homeassistant.components import bluetooth
@@ -32,7 +27,6 @@ SERVICE_UUIDS = [
"b42e4a8e-ade7-11e4-89d3-123b93f75cba",
"b42e1c08-ade7-11e4-89d3-123b93f75cba",
"b42e3882-ade7-11e4-89d3-123b93f75cba",
"b42e90a2-ade7-11e4-89d3-123b93f75cba",
]
@@ -43,7 +37,6 @@ class Discovery:
name: str
discovery_info: BluetoothServiceInfo
device: AirthingsDevice
data: AirthingsBluetoothDeviceData
def get_name(device: AirthingsDevice) -> str:
@@ -51,7 +44,7 @@ def get_name(device: AirthingsDevice) -> str:
name = device.friendly_name()
if identifier := device.identifier:
name += f" ({device.model.value}{identifier})"
name += f" ({identifier})"
return name
@@ -69,8 +62,8 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
self._discovered_device: Discovery | None = None
self._discovered_devices: dict[str, Discovery] = {}
async def _get_device(
self, data: AirthingsBluetoothDeviceData, discovery_info: BluetoothServiceInfo
async def _get_device_data(
self, discovery_info: BluetoothServiceInfo
) -> AirthingsDevice:
ble_device = bluetooth.async_ble_device_from_address(
self.hass, discovery_info.address
@@ -79,8 +72,10 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.debug("no ble_device in _get_device_data")
raise AirthingsDeviceUpdateError("No ble_device")
airthings = AirthingsBluetoothDeviceData(_LOGGER)
try:
device = await data.update_device(ble_device)
data = await airthings.update_device(ble_device)
except BleakError as err:
_LOGGER.error(
"Error connecting to and getting data from %s: %s",
@@ -88,15 +83,12 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
err,
)
raise AirthingsDeviceUpdateError("Failed getting device data") from err
except UnsupportedDeviceError:
_LOGGER.debug("Skipping unsupported device: %s", discovery_info.name)
raise
except Exception as err:
_LOGGER.error(
"Unknown error occurred from %s: %s", discovery_info.address, err
)
raise
return device
return data
async def async_step_bluetooth(
self, discovery_info: BluetoothServiceInfo
@@ -106,21 +98,17 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(discovery_info.address)
self._abort_if_unique_id_configured()
data = AirthingsBluetoothDeviceData(logger=_LOGGER)
try:
device = await self._get_device(data=data, discovery_info=discovery_info)
device = await self._get_device_data(discovery_info)
except AirthingsDeviceUpdateError:
return self.async_abort(reason="cannot_connect")
except UnsupportedDeviceError:
return self.async_abort(reason="unsupported_device")
except Exception:
_LOGGER.exception("Unknown error occurred")
return self.async_abort(reason="unknown")
name = get_name(device)
self.context["title_placeholders"] = {"name": name}
self._discovered_device = Discovery(name, discovery_info, device, data=data)
self._discovered_device = Discovery(name, discovery_info, device)
return await self.async_step_bluetooth_confirm()
@@ -129,12 +117,6 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
) -> ConfigFlowResult:
"""Confirm discovery."""
if user_input is not None:
if (
self._discovered_device is not None
and self._discovered_device.device.firmware.need_firmware_upgrade
):
return self.async_abort(reason="firmware_upgrade_required")
return self.async_create_entry(
title=self.context["title_placeholders"]["name"], data={}
)
@@ -155,9 +137,6 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
self._abort_if_unique_id_configured()
discovery = self._discovered_devices[address]
if discovery.device.firmware.need_firmware_upgrade:
return self.async_abort(reason="firmware_upgrade_required")
self.context["title_placeholders"] = {
"name": discovery.name,
}
@@ -167,53 +146,32 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_create_entry(title=discovery.name, data={})
current_addresses = self._async_current_ids(include_ignore=False)
devices: list[BluetoothServiceInfoBleak] = []
for discovery_info in async_discovered_service_info(self.hass):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
continue
if MFCT_ID not in discovery_info.manufacturer_data:
continue
if not any(uuid in SERVICE_UUIDS for uuid in discovery_info.service_uuids):
_LOGGER.debug(
"Skipping unsupported device: %s (%s)", discovery_info.name, address
)
continue
devices.append(discovery_info)
for discovery_info in devices:
address = discovery_info.address
data = AirthingsBluetoothDeviceData(logger=_LOGGER)
if not any(uuid in SERVICE_UUIDS for uuid in discovery_info.service_uuids):
continue
try:
device = await self._get_device(data, discovery_info)
device = await self._get_device_data(discovery_info)
except AirthingsDeviceUpdateError:
_LOGGER.error(
"Error connecting to and getting data from %s (%s)",
discovery_info.name,
discovery_info.address,
)
continue
except UnsupportedDeviceError:
_LOGGER.debug(
"Skipping unsupported device: %s (%s)",
discovery_info.name,
discovery_info.address,
)
continue
return self.async_abort(reason="cannot_connect")
except Exception:
_LOGGER.exception("Unknown error occurred")
return self.async_abort(reason="unknown")
name = get_name(device)
_LOGGER.debug("Discovered Airthings device: %s (%s)", name, address)
self._discovered_devices[address] = Discovery(
name, discovery_info, device, data
)
self._discovered_devices[address] = Discovery(name, discovery_info, device)
if not self._discovered_devices:
return self.async_abort(reason="no_devices_found")
titles = {
address: get_name(discovery.device)
address: discovery.device.name
for (address, discovery) in self._discovered_devices.items()
}
return self.async_show_form(

View File
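
The airthings_ble flow above keys discovered devices by Bluetooth address and filters on manufacturer data and service UUIDs. A hedged sketch of just the unique-ID bookkeeping from a discovery step; the flow class and domain are hypothetical, and the service UUID is one listed in the hunk.

from homeassistant.components.bluetooth import BluetoothServiceInfo
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult

EXPECTED_SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba"  # taken from the hunk


class ExampleBluetoothFlow(ConfigFlow, domain="example_ble"):
    """Hypothetical Bluetooth discovery flow."""

    async def async_step_bluetooth(
        self, discovery_info: BluetoothServiceInfo
    ) -> ConfigFlowResult:
        """Handle a discovered device."""
        # Pin the entry's unique ID to the address, then bail out if already set up
        await self.async_set_unique_id(discovery_info.address)
        self._abort_if_unique_id_configured()
        if EXPECTED_SERVICE_UUID not in discovery_info.service_uuids:
            return self.async_abort(reason="not_supported")
        return self.async_create_entry(title=discovery_info.name, data={})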

@@ -17,10 +17,6 @@
{
"manufacturer_id": 820,
"service_uuid": "b42e3882-ade7-11e4-89d3-123b93f75cba"
},
{
"manufacturer_id": 820,
"service_uuid": "b42e90a2-ade7-11e4-89d3-123b93f75cba"
}
],
"codeowners": ["@vincegio", "@LaStrada"],
@@ -28,5 +24,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/airthings_ble",
"iot_class": "local_polling",
"requirements": ["airthings-ble==1.1.1"]
"requirements": ["airthings-ble==0.9.2"]
}

View File

@@ -16,12 +16,10 @@ from homeassistant.components.sensor import (
from homeassistant.const import (
CONCENTRATION_PARTS_PER_BILLION,
CONCENTRATION_PARTS_PER_MILLION,
LIGHT_LUX,
PERCENTAGE,
EntityCategory,
Platform,
UnitOfPressure,
UnitOfSoundPressure,
UnitOfTemperature,
)
from homeassistant.core import HomeAssistant, callback
@@ -114,25 +112,8 @@ SENSORS_MAPPING_TEMPLATE: dict[str, SensorEntityDescription] = {
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
"lux": SensorEntityDescription(
key="lux",
device_class=SensorDeviceClass.ILLUMINANCE,
native_unit_of_measurement=LIGHT_LUX,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
"noise": SensorEntityDescription(
key="noise",
translation_key="ambient_noise",
device_class=SensorDeviceClass.SOUND_PRESSURE,
native_unit_of_measurement=UnitOfSoundPressure.WEIGHTED_DECIBEL_A,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
}
PARALLEL_UPDATES = 0
@callback
def async_migrate(hass: HomeAssistant, address: str, sensor_name: str) -> None:

View File

@@ -6,9 +6,6 @@
"description": "[%key:component::bluetooth::config::step::user::description%]",
"data": {
"address": "[%key:common::config_flow::data::device%]"
},
"data_description": {
"address": "The Airthings devices discovered via Bluetooth."
}
},
"bluetooth_confirm": {
@@ -20,8 +17,6 @@
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"firmware_upgrade_required": "Your device requires a firmware upgrade. Please use the Airthings app (Android/iOS) to upgrade it.",
"unsupported_device": "Unsupported device",
"unknown": "[%key:common::config_flow::error::unknown%]"
}
},
@@ -41,9 +36,6 @@
},
"illuminance": {
"name": "[%key:component::sensor::entity_component::illuminance::name%]"
},
"ambient_noise": {
"name": "Ambient noise"
}
}
}

View File

@@ -2,9 +2,10 @@
from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
from typing import Any, Final, final
from typing import TYPE_CHECKING, Any, Final, final
from propcache.api import cached_property
import voluptuous as vol
@@ -27,6 +28,8 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.config_validation import make_entity_service_schema
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.entity_platform import EntityPlatform
from homeassistant.helpers.frame import ReportBehavior, report_usage
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.hass_dict import HassKey
@@ -146,11 +149,68 @@ class AlarmControlPanelEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_A
)
_alarm_control_panel_option_default_code: str | None = None
__alarm_legacy_state: bool = False
def __init_subclass__(cls, **kwargs: Any) -> None:
"""Post initialisation processing."""
super().__init_subclass__(**kwargs)
if any(method in cls.__dict__ for method in ("_attr_state", "state")):
# Integrations should use the 'alarm_state' property instead of
# setting the state directly.
cls.__alarm_legacy_state = True
def __setattr__(self, name: str, value: Any, /) -> None:
"""Set attribute.
Deprecation warning if setting '_attr_state' directly
unless already reported.
"""
if name == "_attr_state":
self._report_deprecated_alarm_state_handling()
return super().__setattr__(name, value)
@callback
def add_to_platform_start(
self,
hass: HomeAssistant,
platform: EntityPlatform,
parallel_updates: asyncio.Semaphore | None,
) -> None:
"""Start adding an entity to a platform."""
super().add_to_platform_start(hass, platform, parallel_updates)
if self.__alarm_legacy_state:
self._report_deprecated_alarm_state_handling()
@callback
def _report_deprecated_alarm_state_handling(self) -> None:
"""Report on deprecated handling of alarm state.
Integrations should implement alarm_state instead of using state directly.
"""
report_usage(
"is setting state directly."
f" Entity {self.entity_id} ({type(self)}) should implement the 'alarm_state'"
" property and return its state using the AlarmControlPanelState enum",
core_integration_behavior=ReportBehavior.ERROR,
custom_integration_behavior=ReportBehavior.LOG,
breaks_in_ha_version="2025.11",
integration_domain=self.platform.platform_name if self.platform else None,
exclude_integrations={DOMAIN},
)
@final
@property
def state(self) -> str | None:
"""Return the current state."""
return self.alarm_state
if (alarm_state := self.alarm_state) is not None:
return alarm_state
if self._attr_state is not None:
# Backwards compatibility for integrations that set state directly
# Should be removed in 2025.11
if TYPE_CHECKING:
assert isinstance(self._attr_state, str)
return self._attr_state
return None
@cached_property
def alarm_state(self) -> AlarmControlPanelState | None:

View File
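
The alarm_control_panel hunks above add (on one side) deprecation reporting for entities that still set state or _attr_state directly. For contrast, a minimal sketch of the non-deprecated pattern the report points at: return the state through the alarm_state property using the AlarmControlPanelState enum. The entity and its backing device are hypothetical.

from homeassistant.components.alarm_control_panel import (
    AlarmControlPanelEntity,
    AlarmControlPanelState,
)


class ExampleAlarm(AlarmControlPanelEntity):
    """Hypothetical alarm entity reporting state the non-deprecated way."""

    def __init__(self, device) -> None:
        """Store the hypothetical backing device."""
        self._device = device

    @property
    def alarm_state(self) -> AlarmControlPanelState | None:
        """Map the device's raw state onto the shared enum."""
        mapping = {
            "armed": AlarmControlPanelState.ARMED_AWAY,
            "disarmed": AlarmControlPanelState.DISARMED,
            "alarm": AlarmControlPanelState.TRIGGERED,
        }
        return mapping.get(self._device.raw_state)  # raw_state is hypothetical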

@@ -1472,10 +1472,10 @@ class AlexaModeController(AlexaCapability):
# Return state instead of position when using ModeController.
mode = self.entity.state
if mode in (
cover.CoverState.OPEN,
cover.CoverState.OPENING,
cover.CoverState.CLOSED,
cover.CoverState.CLOSING,
cover.STATE_OPEN,
cover.STATE_OPENING,
cover.STATE_CLOSED,
cover.STATE_CLOSING,
STATE_UNKNOWN,
):
return f"{cover.ATTR_POSITION}.{mode}"
@@ -1594,11 +1594,11 @@ class AlexaModeController(AlexaCapability):
["Position", AlexaGlobalCatalog.SETTING_OPENING], False
)
self._resource.add_mode(
f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}",
f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}",
[AlexaGlobalCatalog.VALUE_OPEN],
)
self._resource.add_mode(
f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}",
f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}",
[AlexaGlobalCatalog.VALUE_CLOSE],
)
self._resource.add_mode(
@@ -1651,22 +1651,22 @@ class AlexaModeController(AlexaCapability):
raise_labels.append(AlexaSemantics.ACTION_OPEN)
self._semantics.add_states_to_value(
[AlexaSemantics.STATES_CLOSED],
f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}",
f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}",
)
self._semantics.add_states_to_value(
[AlexaSemantics.STATES_OPEN],
f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}",
f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}",
)
self._semantics.add_action_to_directive(
lower_labels,
"SetMode",
{"mode": f"{cover.ATTR_POSITION}.{cover.CoverState.CLOSED}"},
{"mode": f"{cover.ATTR_POSITION}.{cover.STATE_CLOSED}"},
)
self._semantics.add_action_to_directive(
raise_labels,
"SetMode",
{"mode": f"{cover.ATTR_POSITION}.{cover.CoverState.OPEN}"},
{"mode": f"{cover.ATTR_POSITION}.{cover.STATE_OPEN}"},
)
return self._semantics.serialize_semantics()

View File
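
The Alexa hunks above switch between cover.CoverState members and the legacy cover.STATE_* string constants. Because CoverState is a StrEnum, both spellings interpolate to the same "position.open"-style strings; a stdlib-only illustration using a local stand-in enum (not the real homeassistant.components.cover import):

from enum import StrEnum


class CoverState(StrEnum):
    """Local stand-in mirroring the relevant members."""

    OPEN = "open"
    CLOSED = "closed"


STATE_OPEN = "open"  # legacy plain-string constant

# StrEnum members render and compare as their string value.
assert f"position.{CoverState.OPEN}" == f"position.{STATE_OPEN}" == "position.open"
assert CoverState.OPEN == STATE_OPEN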

@@ -1261,9 +1261,9 @@ async def async_api_set_mode(
elif instance == f"{cover.DOMAIN}.{cover.ATTR_POSITION}":
position = mode.split(".")[1]
if position == cover.CoverState.CLOSED:
if position == cover.STATE_CLOSED:
service = cover.SERVICE_CLOSE_COVER
elif position == cover.CoverState.OPEN:
elif position == cover.STATE_OPEN:
service = cover.SERVICE_OPEN_COVER
elif position == "custom":
service = cover.SERVICE_STOP_COVER

View File

@@ -65,31 +65,6 @@ SENSOR_DESCRIPTIONS = [
suggested_display_precision=2,
translation_placeholders={"sensor_name": "BME280"},
),
AltruistSensorEntityDescription(
device_class=SensorDeviceClass.HUMIDITY,
key="BME680_humidity",
translation_key="humidity",
native_unit_of_measurement=PERCENTAGE,
suggested_display_precision=2,
translation_placeholders={"sensor_name": "BME680"},
),
AltruistSensorEntityDescription(
device_class=SensorDeviceClass.PRESSURE,
key="BME680_pressure",
translation_key="pressure",
native_unit_of_measurement=UnitOfPressure.PA,
suggested_unit_of_measurement=UnitOfPressure.MMHG,
suggested_display_precision=0,
translation_placeholders={"sensor_name": "BME680"},
),
AltruistSensorEntityDescription(
device_class=SensorDeviceClass.TEMPERATURE,
key="BME680_temperature",
translation_key="temperature",
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
suggested_display_precision=2,
translation_placeholders={"sensor_name": "BME680"},
),
AltruistSensorEntityDescription(
device_class=SensorDeviceClass.PRESSURE,
key="BMP_pressure",

View File

@@ -629,6 +629,7 @@ async def async_devices_payload(hass: HomeAssistant) -> dict: # noqa: C901
devices_info.append(
{
"entities": [],
"entry_type": device_entry.entry_type,
"has_configuration_url": device_entry.configuration_url is not None,
"hw_version": device_entry.hw_version,
@@ -637,7 +638,6 @@ async def async_devices_payload(hass: HomeAssistant) -> dict: # noqa: C901
"model_id": device_entry.model_id,
"sw_version": device_entry.sw_version,
"via_device": device_entry.via_device_id,
"entities": [],
}
)

View File

@@ -4,15 +4,12 @@ from __future__ import annotations
from collections.abc import Mapping
from functools import partial
import json
import logging
from typing import Any, cast
import anthropic
import voluptuous as vol
from voluptuous_openapi import convert
from homeassistant.components.zone import ENTITY_ID_HOME
from homeassistant.config_entries import (
ConfigEntry,
ConfigEntryState,
@@ -21,13 +18,7 @@ from homeassistant.config_entries import (
ConfigSubentryFlow,
SubentryFlowResult,
)
from homeassistant.const import (
ATTR_LATITUDE,
ATTR_LONGITUDE,
CONF_API_KEY,
CONF_LLM_HASS_API,
CONF_NAME,
)
from homeassistant.const import CONF_API_KEY, CONF_LLM_HASS_API, CONF_NAME
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import llm
from homeassistant.helpers.selector import (
@@ -46,23 +37,12 @@ from .const import (
CONF_RECOMMENDED,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_WEB_SEARCH,
CONF_WEB_SEARCH_CITY,
CONF_WEB_SEARCH_COUNTRY,
CONF_WEB_SEARCH_MAX_USES,
CONF_WEB_SEARCH_REGION,
CONF_WEB_SEARCH_TIMEZONE,
CONF_WEB_SEARCH_USER_LOCATION,
DEFAULT_CONVERSATION_NAME,
DOMAIN,
RECOMMENDED_CHAT_MODEL,
RECOMMENDED_MAX_TOKENS,
RECOMMENDED_TEMPERATURE,
RECOMMENDED_THINKING_BUDGET,
RECOMMENDED_WEB_SEARCH,
RECOMMENDED_WEB_SEARCH_MAX_USES,
RECOMMENDED_WEB_SEARCH_USER_LOCATION,
WEB_SEARCH_UNSUPPORTED_MODELS,
)
_LOGGER = logging.getLogger(__name__)
@@ -188,14 +168,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET
) >= user_input.get(CONF_MAX_TOKENS, RECOMMENDED_MAX_TOKENS):
errors[CONF_THINKING_BUDGET] = "thinking_budget_too_large"
if user_input.get(CONF_WEB_SEARCH, RECOMMENDED_WEB_SEARCH):
model = user_input.get(CONF_CHAT_MODEL, RECOMMENDED_CHAT_MODEL)
if model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS)):
errors[CONF_WEB_SEARCH] = "web_search_unsupported_model"
elif user_input.get(
CONF_WEB_SEARCH_USER_LOCATION, RECOMMENDED_WEB_SEARCH_USER_LOCATION
):
user_input.update(await self._get_location_data())
if not errors:
if self._is_new:
@@ -243,68 +215,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
errors=errors or None,
)
async def _get_location_data(self) -> dict[str, str]:
"""Get approximate location data of the user."""
location_data: dict[str, str] = {}
zone_home = self.hass.states.get(ENTITY_ID_HOME)
if zone_home is not None:
client = await self.hass.async_add_executor_job(
partial(
anthropic.AsyncAnthropic,
api_key=self._get_entry().data[CONF_API_KEY],
)
)
location_schema = vol.Schema(
{
vol.Optional(
CONF_WEB_SEARCH_CITY,
description="Free text input for the city, e.g. `San Francisco`",
): str,
vol.Optional(
CONF_WEB_SEARCH_REGION,
description="Free text input for the region, e.g. `California`",
): str,
}
)
response = await client.messages.create(
model=RECOMMENDED_CHAT_MODEL,
messages=[
{
"role": "user",
"content": "Where are the following coordinates located: "
f"({zone_home.attributes[ATTR_LATITUDE]},"
f" {zone_home.attributes[ATTR_LONGITUDE]})? Please respond "
"only with a JSON object using the following schema:\n"
f"{convert(location_schema)}",
},
{
"role": "assistant",
"content": "{", # hints the model to skip any preamble
},
],
max_tokens=RECOMMENDED_MAX_TOKENS,
)
_LOGGER.debug("Model response: %s", response.content)
location_data = location_schema(
json.loads(
"{"
+ "".join(
block.text
for block in response.content
if isinstance(block, anthropic.types.TextBlock)
)
)
or {}
)
if self.hass.config.country:
location_data[CONF_WEB_SEARCH_COUNTRY] = self.hass.config.country
location_data[CONF_WEB_SEARCH_TIMEZONE] = self.hass.config.time_zone
_LOGGER.debug("Location data: %s", location_data)
return location_data
async_step_user = async_step_set_options
async_step_reconfigure = async_step_set_options
@@ -363,18 +273,6 @@ def anthropic_config_option_schema(
CONF_THINKING_BUDGET,
default=RECOMMENDED_THINKING_BUDGET,
): int,
vol.Optional(
CONF_WEB_SEARCH,
default=RECOMMENDED_WEB_SEARCH,
): bool,
vol.Optional(
CONF_WEB_SEARCH_MAX_USES,
default=RECOMMENDED_WEB_SEARCH_MAX_USES,
): int,
vol.Optional(
CONF_WEB_SEARCH_USER_LOCATION,
default=RECOMMENDED_WEB_SEARCH_USER_LOCATION,
): bool,
}
)
return schema

View File

@@ -18,26 +18,10 @@ RECOMMENDED_TEMPERATURE = 1.0
CONF_THINKING_BUDGET = "thinking_budget"
RECOMMENDED_THINKING_BUDGET = 0
MIN_THINKING_BUDGET = 1024
CONF_WEB_SEARCH = "web_search"
RECOMMENDED_WEB_SEARCH = False
CONF_WEB_SEARCH_USER_LOCATION = "user_location"
RECOMMENDED_WEB_SEARCH_USER_LOCATION = False
CONF_WEB_SEARCH_MAX_USES = "web_search_max_uses"
RECOMMENDED_WEB_SEARCH_MAX_USES = 5
CONF_WEB_SEARCH_CITY = "city"
CONF_WEB_SEARCH_REGION = "region"
CONF_WEB_SEARCH_COUNTRY = "country"
CONF_WEB_SEARCH_TIMEZONE = "timezone"
NON_THINKING_MODELS = [
"claude-3-5", # Both sonnet and haiku
"claude-3-opus",
"claude-3-haiku",
]
WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
"claude-3-opus",
"claude-3-5-sonnet-20240620",
"claude-3-5-sonnet-20241022",
THINKING_MODELS = [
"claude-3-7-sonnet",
"claude-sonnet-4-0",
"claude-opus-4-0",
"claude-opus-4-1",
]

View File

@@ -1,17 +1,12 @@
"""Base entity for Anthropic."""
from collections.abc import AsyncGenerator, Callable, Iterable
from dataclasses import dataclass, field
import json
from typing import Any
import anthropic
from anthropic import AsyncStream
from anthropic.types import (
CitationsDelta,
CitationsWebSearchResultLocation,
CitationWebSearchResultLocationParam,
ContentBlockParam,
InputJSONDelta,
MessageDeltaUsage,
MessageParam,
@@ -21,16 +16,11 @@ from anthropic.types import (
RawContentBlockStopEvent,
RawMessageDeltaEvent,
RawMessageStartEvent,
RawMessageStopEvent,
RedactedThinkingBlock,
RedactedThinkingBlockParam,
ServerToolUseBlock,
ServerToolUseBlockParam,
SignatureDelta,
TextBlock,
TextBlockParam,
TextCitation,
TextCitationParam,
TextDelta,
ThinkingBlock,
ThinkingBlockParam,
@@ -39,15 +29,9 @@ from anthropic.types import (
ThinkingDelta,
ToolParam,
ToolResultBlockParam,
ToolUnionParam,
ToolUseBlock,
ToolUseBlockParam,
Usage,
WebSearchTool20250305Param,
WebSearchToolRequestErrorParam,
WebSearchToolResultBlock,
WebSearchToolResultBlockParam,
WebSearchToolResultError,
)
from anthropic.types.message_create_params import MessageCreateParamsStreaming
from voluptuous_openapi import convert
@@ -64,21 +48,14 @@ from .const import (
CONF_MAX_TOKENS,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_WEB_SEARCH,
CONF_WEB_SEARCH_CITY,
CONF_WEB_SEARCH_COUNTRY,
CONF_WEB_SEARCH_MAX_USES,
CONF_WEB_SEARCH_REGION,
CONF_WEB_SEARCH_TIMEZONE,
CONF_WEB_SEARCH_USER_LOCATION,
DOMAIN,
LOGGER,
MIN_THINKING_BUDGET,
NON_THINKING_MODELS,
RECOMMENDED_CHAT_MODEL,
RECOMMENDED_MAX_TOKENS,
RECOMMENDED_TEMPERATURE,
RECOMMENDED_THINKING_BUDGET,
THINKING_MODELS,
)
# Max number of back and forth with the LLM to generate a response
@@ -96,69 +73,6 @@ def _format_tool(
)
@dataclass(slots=True)
class CitationDetails:
"""Citation details for a content part."""
index: int = 0
"""Start position of the text."""
length: int = 0
"""Length of the relevant data."""
citations: list[TextCitationParam] = field(default_factory=list)
"""Citations for the content part."""
@dataclass(slots=True)
class ContentDetails:
"""Native data for AssistantContent."""
citation_details: list[CitationDetails] = field(default_factory=list)
def has_content(self) -> bool:
"""Check if there is any content."""
return any(detail.length > 0 for detail in self.citation_details)
def has_citations(self) -> bool:
"""Check if there are any citations."""
return any(detail.citations for detail in self.citation_details)
def add_citation_detail(self) -> None:
"""Add a new citation detail."""
if not self.citation_details or self.citation_details[-1].length > 0:
self.citation_details.append(
CitationDetails(
index=self.citation_details[-1].index
+ self.citation_details[-1].length
if self.citation_details
else 0
)
)
def add_citation(self, citation: TextCitation) -> None:
"""Add a citation to the current detail."""
if not self.citation_details:
self.citation_details.append(CitationDetails())
citation_param: TextCitationParam | None = None
if isinstance(citation, CitationsWebSearchResultLocation):
citation_param = CitationWebSearchResultLocationParam(
type="web_search_result_location",
title=citation.title,
url=citation.url,
cited_text=citation.cited_text,
encrypted_index=citation.encrypted_index,
)
if citation_param:
self.citation_details[-1].citations.append(citation_param)
def delete_empty(self) -> None:
"""Delete empty citation details."""
self.citation_details = [
detail for detail in self.citation_details if detail.citations
]
def _convert_content(
chat_content: Iterable[conversation.Content],
) -> list[MessageParam]:
@@ -167,31 +81,15 @@ def _convert_content(
for content in chat_content:
if isinstance(content, conversation.ToolResultContent):
if content.tool_name == "web_search":
tool_result_block: ContentBlockParam = WebSearchToolResultBlockParam(
type="web_search_tool_result",
tool_use_id=content.tool_call_id,
content=content.tool_result["content"]
if "content" in content.tool_result
else WebSearchToolRequestErrorParam(
type="web_search_tool_result_error",
error_code=content.tool_result.get("error_code", "unavailable"), # type: ignore[typeddict-item]
),
)
external_tool = True
else:
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json.dumps(content.tool_result),
)
external_tool = False
if not messages or messages[-1]["role"] != (
"assistant" if external_tool else "user"
):
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json.dumps(content.tool_result),
)
if not messages or messages[-1]["role"] != "user":
messages.append(
MessageParam(
role="assistant" if external_tool else "user",
role="user",
content=[tool_result_block],
)
)
@@ -253,56 +151,13 @@ def _convert_content(
redacted_thinking_block
)
if content.content:
current_index = 0
for detail in (
content.native.citation_details
if isinstance(content.native, ContentDetails)
else [CitationDetails(length=len(content.content))]
):
if detail.index > current_index:
# Add text block for any text without citations
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[current_index : detail.index],
)
)
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[
detail.index : detail.index + detail.length
],
citations=detail.citations,
)
if detail.citations
else TextBlockParam(
type="text",
text=content.content[
detail.index : detail.index + detail.length
],
)
)
current_index = detail.index + detail.length
if current_index < len(content.content):
# Add text block for any remaining text without citations
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=content.content[current_index:],
)
)
messages[-1]["content"].append( # type: ignore[union-attr]
TextBlockParam(type="text", text=content.content)
)
if content.tool_calls:
messages[-1]["content"].extend( # type: ignore[union-attr]
[
ServerToolUseBlockParam(
type="server_tool_use",
id=tool_call.id,
name="web_search",
input=tool_call.tool_args,
)
if tool_call.external and tool_call.tool_name == "web_search"
else ToolUseBlockParam(
ToolUseBlockParam(
type="tool_use",
id=tool_call.id,
name=tool_call.tool_name,
@@ -318,12 +173,10 @@ def _convert_content(
return messages
async def _transform_stream( # noqa: C901 - This is complex, but better to have it in one place
async def _transform_stream(
chat_log: conversation.ChatLog,
stream: AsyncStream[MessageStreamEvent],
) -> AsyncGenerator[
conversation.AssistantContentDeltaDict | conversation.ToolResultContentDeltaDict
]:
) -> AsyncGenerator[conversation.AssistantContentDeltaDict]:
"""Transform the response stream into HA format.
A typical stream of responses might look something like the following:
@@ -356,13 +209,11 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
if stream is None:
raise TypeError("Expected a stream of messages")
current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = None
current_tool_block: ToolUseBlockParam | None = None
current_tool_args: str
content_details = ContentDetails()
content_details.add_citation_detail()
input_usage: Usage | None = None
has_content = False
has_native = False
first_block: bool
async for response in stream:
LOGGER.debug("Received response: %s", response)
@@ -371,7 +222,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
if response.message.role != "assistant":
raise ValueError("Unexpected message role")
input_usage = response.message.usage
first_block = True
elif isinstance(response, RawContentBlockStartEvent):
if isinstance(response.content_block, ToolUseBlock):
current_tool_block = ToolUseBlockParam(
@@ -382,37 +232,17 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
)
current_tool_args = ""
elif isinstance(response.content_block, TextBlock):
if ( # Do not start a new assistant content just for citations, concatenate consecutive blocks with citations instead.
first_block
or (
not content_details.has_citations()
and response.content_block.citations is None
and content_details.has_content()
)
):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
if has_content:
yield {"role": "assistant"}
has_native = False
first_block = False
content_details.add_citation_detail()
has_content = True
if response.content_block.text:
content_details.citation_details[-1].length += len(
response.content_block.text
)
yield {"content": response.content_block.text}
elif isinstance(response.content_block, ThinkingBlock):
if first_block or has_native:
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
if has_native:
yield {"role": "assistant"}
has_native = False
first_block = False
has_content = False
elif isinstance(response.content_block, RedactedThinkingBlock):
LOGGER.debug(
"Some of Claudes internal reasoning has been automatically "
@@ -420,60 +250,15 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
"responses"
)
if has_native:
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
first_block = False
has_content = False
yield {"native": response.content_block}
has_native = True
elif isinstance(response.content_block, ServerToolUseBlock):
current_tool_block = ServerToolUseBlockParam(
type="server_tool_use",
id=response.content_block.id,
name=response.content_block.name,
input="",
)
current_tool_args = ""
elif isinstance(response.content_block, WebSearchToolResultBlock):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {
"role": "tool_result",
"tool_call_id": response.content_block.tool_use_id,
"tool_name": "web_search",
"tool_result": {
"type": "web_search_tool_result_error",
"error_code": response.content_block.content.error_code,
}
if isinstance(
response.content_block.content, WebSearchToolResultError
)
else {
"content": [
{
"type": "web_search_result",
"encrypted_content": block.encrypted_content,
"page_age": block.page_age,
"title": block.title,
"url": block.url,
}
for block in response.content_block.content
]
},
}
first_block = True
elif isinstance(response, RawContentBlockDeltaEvent):
if isinstance(response.delta, InputJSONDelta):
current_tool_args += response.delta.partial_json
elif isinstance(response.delta, TextDelta):
content_details.citation_details[-1].length += len(response.delta.text)
yield {"content": response.delta.text}
elif isinstance(response.delta, ThinkingDelta):
yield {"thinking_content": response.delta.thinking}
@@ -486,8 +271,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
)
}
has_native = True
elif isinstance(response.delta, CitationsDelta):
content_details.add_citation(response.delta.citation)
elif isinstance(response, RawContentBlockStopEvent):
if current_tool_block is not None:
tool_args = json.loads(current_tool_args) if current_tool_args else {}
@@ -498,7 +281,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
id=current_tool_block["id"],
tool_name=current_tool_block["name"],
tool_args=tool_args,
external=current_tool_block["type"] == "server_tool_use",
)
]
}
@@ -508,12 +290,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
chat_log.async_trace(_create_token_stats(input_usage, usage))
if response.delta.stop_reason == "refusal":
raise HomeAssistantError("Potential policy violation detected")
elif isinstance(response, RawMessageStopEvent):
if content_details.has_citations():
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
def _create_token_stats(
@@ -561,11 +337,21 @@ class AnthropicBaseLLMEntity(Entity):
"""Generate an answer for the chat log."""
options = self.subentry.data
tools: list[ToolParam] | None = None
if chat_log.llm_api:
tools = [
_format_tool(tool, chat_log.llm_api.custom_serializer)
for tool in chat_log.llm_api.tools
]
system = chat_log.content[0]
if not isinstance(system, conversation.SystemContent):
raise TypeError("First message must be a system message")
messages = _convert_content(chat_log.content[1:])
client = self.entry.runtime_data
thinking_budget = options.get(CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET)
model = options.get(CONF_CHAT_MODEL, RECOMMENDED_CHAT_MODEL)
model_args = MessageCreateParamsStreaming(
@@ -575,10 +361,10 @@ class AnthropicBaseLLMEntity(Entity):
system=system.content,
stream=True,
)
thinking_budget = options.get(CONF_THINKING_BUDGET, RECOMMENDED_THINKING_BUDGET)
if tools:
model_args["tools"] = tools
if (
not model.startswith(tuple(NON_THINKING_MODELS))
model.startswith(tuple(THINKING_MODELS))
and thinking_budget >= MIN_THINKING_BUDGET
):
model_args["thinking"] = ThinkingConfigEnabledParam(
@@ -590,34 +376,6 @@ class AnthropicBaseLLMEntity(Entity):
CONF_TEMPERATURE, RECOMMENDED_TEMPERATURE
)
tools: list[ToolUnionParam] = []
if chat_log.llm_api:
tools = [
_format_tool(tool, chat_log.llm_api.custom_serializer)
for tool in chat_log.llm_api.tools
]
if options.get(CONF_WEB_SEARCH):
web_search = WebSearchTool20250305Param(
name="web_search",
type="web_search_20250305",
max_uses=options.get(CONF_WEB_SEARCH_MAX_USES),
)
if options.get(CONF_WEB_SEARCH_USER_LOCATION):
web_search["user_location"] = {
"type": "approximate",
"city": options.get(CONF_WEB_SEARCH_CITY, ""),
"region": options.get(CONF_WEB_SEARCH_REGION, ""),
"country": options.get(CONF_WEB_SEARCH_COUNTRY, ""),
"timezone": options.get(CONF_WEB_SEARCH_TIMEZONE, ""),
}
tools.append(web_search)
if tools:
model_args["tools"] = tools
client = self.entry.runtime_data
# To prevent infinite loops, we limit the number of iterations
for _iteration in range(MAX_TOOL_ITERATIONS):
try:

View File
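
The _transform_stream hunks above accumulate tool-call arguments from partial-JSON deltas and only parse them when the content block stops. A toy, library-free sketch of that accumulation pattern; the event classes here are invented stand-ins for the SDK's InputJSONDelta and RawContentBlockStopEvent.

import json
from dataclasses import dataclass


@dataclass
class ArgsDelta:
    """Stand-in for a partial-JSON tool-arguments delta."""

    partial_json: str


@dataclass
class BlockStop:
    """Stand-in for a content-block stop event."""


def collect_tool_args(events) -> dict:
    """Accumulate JSON fragments, parse once on stop."""
    buffer = ""
    for event in events:
        if isinstance(event, ArgsDelta):
            buffer += event.partial_json
        elif isinstance(event, BlockStop):
            return json.loads(buffer) if buffer else {}
    return {}


args = collect_tool_args(
    [ArgsDelta('{"entity_id": "li'), ArgsDelta('ght.kitchen"}'), BlockStop()]
)
assert args == {"entity_id": "light.kitchen"}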

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/anthropic",
"integration_type": "service",
"iot_class": "cloud_polling",
"requirements": ["anthropic==0.69.0"]
"requirements": ["anthropic==0.62.0"]
}

View File

@@ -35,17 +35,11 @@
"temperature": "Temperature",
"llm_hass_api": "[%key:common::config_flow::data::llm_hass_api%]",
"recommended": "Recommended model settings",
"thinking_budget": "Thinking budget",
"web_search": "Enable web search",
"web_search_max_uses": "Maximum web searches",
"user_location": "Include home location"
"thinking_budget_tokens": "Thinking budget"
},
"data_description": {
"prompt": "Instruct how the LLM should respond. This can be a template.",
"thinking_budget": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking.",
"web_search": "The web search tool gives Claude direct access to real-time web content, allowing it to answer questions with up-to-date information beyond its knowledge cutoff",
"web_search_max_uses": "Limit the number of searches performed per response",
"user_location": "Localize search results based on home location"
"thinking_budget_tokens": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking."
}
}
},
@@ -54,8 +48,7 @@
"entry_not_loaded": "Cannot add things while the configuration is disabled."
},
"error": {
"thinking_budget_too_large": "Maximum tokens must be greater than the thinking budget.",
"web_search_unsupported_model": "Web search is not supported by the selected model. Please choose a compatible model or disable web search."
"thinking_budget_too_large": "Maximum tokens must be greater than the thinking budget."
}
}
}

View File

@@ -5,9 +5,14 @@ from __future__ import annotations
import asyncio
import logging
from random import randrange
import sys
from typing import Any, cast
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
from homeassistant.components import zeroconf
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -24,11 +29,7 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryNotReady,
HomeAssistantError,
)
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.dispatcher import async_dispatcher_send
@@ -42,18 +43,6 @@ from .const import (
SIGNAL_DISCONNECTED,
)
if sys.version_info < (3, 14):
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
else:
class DeviceListener:
"""Dummy class."""
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME_TV = "Apple TV"
@@ -64,41 +53,31 @@ BACKOFF_TIME_UPPER_LIMIT = 300 # Five minutes
PLATFORMS = [Platform.MEDIA_PLAYER, Platform.REMOTE]
if sys.version_info < (3, 14):
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
else:
AUTH_EXCEPTIONS = ()
CONNECTION_TIMEOUT_EXCEPTIONS = ()
DEVICE_EXCEPTIONS = ()
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
type AppleTvConfigEntry = ConfigEntry[AppleTVManager]
async def async_setup_entry(hass: HomeAssistant, entry: AppleTvConfigEntry) -> bool:
"""Set up a config entry for Apple TV."""
if sys.version_info >= (3, 14):
raise HomeAssistantError(
"Apple TV is not supported on Python 3.14. Please use Python 3.13."
)
manager = AppleTVManager(hass, entry)
if manager.is_on:

View File

@@ -7,7 +7,7 @@
"documentation": "https://www.home-assistant.io/integrations/apple_tv",
"iot_class": "local_push",
"loggers": ["pyatv", "srptools"],
"requirements": ["pyatv==0.16.1;python_version<'3.14'"],
"requirements": ["pyatv==0.16.1"],
"zeroconf": [
"_mediaremotetv._tcp.local.",
"_companion-link._tcp.local.",

View File
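
The manifest hunk above drops the environment marker from the pyatv requirement. Such markers are evaluated per PEP 508 against the running interpreter; a quick check of what the removed marker does, assuming the third-party packaging library is available:

from packaging.requirements import Requirement

req = Requirement("pyatv==0.16.1; python_version < '3.14'")
print(req.name, req.specifier)  # pyatv ==0.16.1
print(req.marker)               # python_version < "3.14"
print(req.marker.evaluate())    # True on Python 3.13, False on Python 3.14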

@@ -7,8 +7,6 @@ from typing import Any
from pyaprilaire.const import Attribute
from homeassistant.components.climate import (
ATTR_TARGET_TEMP_HIGH,
ATTR_TARGET_TEMP_LOW,
FAN_AUTO,
FAN_ON,
PRESET_AWAY,
@@ -18,12 +16,7 @@ from homeassistant.components.climate import (
HVACAction,
HVACMode,
)
from homeassistant.const import (
ATTR_TEMPERATURE,
PRECISION_HALVES,
PRECISION_WHOLE,
UnitOfTemperature,
)
from homeassistant.const import PRECISION_HALVES, PRECISION_WHOLE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -239,15 +232,15 @@ class AprilaireClimate(BaseAprilaireEntity, ClimateEntity):
cool_setpoint = 0
heat_setpoint = 0
if temperature := kwargs.get(ATTR_TEMPERATURE):
if temperature := kwargs.get("temperature"):
if self.coordinator.data.get(Attribute.MODE) == 3:
cool_setpoint = temperature
else:
heat_setpoint = temperature
else:
if target_temp_low := kwargs.get(ATTR_TARGET_TEMP_LOW):
if target_temp_low := kwargs.get("target_temp_low"):
heat_setpoint = target_temp_low
if target_temp_high := kwargs.get(ATTR_TARGET_TEMP_HIGH):
if target_temp_high := kwargs.get("target_temp_high"):
cool_setpoint = target_temp_high
if cool_setpoint == 0 and heat_setpoint == 0:

View File

@@ -41,8 +41,6 @@ from .pipeline import (
async_setup_pipeline_store,
async_update_pipeline,
)
from .select import AssistPipelineSelect, VadSensitivitySelect
from .vad import VadSensitivity
from .websocket_api import async_register_websocket_api
__all__ = (
@@ -53,14 +51,11 @@ __all__ = (
"SAMPLE_CHANNELS",
"SAMPLE_RATE",
"SAMPLE_WIDTH",
"AssistPipelineSelect",
"AudioSettings",
"Pipeline",
"PipelineEvent",
"PipelineEventType",
"PipelineNotFound",
"VadSensitivity",
"VadSensitivitySelect",
"WakeWordSettings",
"async_create_default_pipeline",
"async_get_pipelines",

View File

@@ -3,17 +3,17 @@
from __future__ import annotations
from abc import ABC, abstractmethod
from collections import namedtuple
from collections.abc import Awaitable, Callable, Coroutine
import functools
import logging
from typing import Any, NamedTuple
from typing import Any
from aioasuswrt.asuswrt import AsusWrt as AsusWrtLegacy
from aiohttp import ClientSession
from asusrouter import AsusRouter, AsusRouterError
from asusrouter.config import ARConfigKey
from asusrouter.modules.client import AsusClient
from asusrouter.modules.connection import ConnectionState
from asusrouter.modules.client import AsusClient, ConnectionState
from asusrouter.modules.data import AsusData
from asusrouter.modules.homeassistant import convert_to_ha_data, convert_to_ha_sensors
from asusrouter.tools.connection import get_cookie_jar
@@ -61,14 +61,7 @@ SENSORS_TYPE_RATES = "sensors_rates"
SENSORS_TYPE_TEMPERATURES = "sensors_temperatures"
SENSORS_TYPE_UPTIME = "sensors_uptime"
class WrtDevice(NamedTuple):
"""WrtDevice structure."""
ip: str | None
name: str | None
conneted_to: str | None
WrtDevice = namedtuple("WrtDevice", ["ip", "name", "connected_to"]) # noqa: PYI024
_LOGGER = logging.getLogger(__name__)
@@ -87,7 +80,7 @@ def handle_errors_and_zip[_AsusWrtBridgeT: AsusWrtBridge](
"""Run library methods and zip results or manage exceptions."""
@functools.wraps(func)
async def _wrapper(self: _AsusWrtBridgeT) -> dict[str, str]:
async def _wrapper(self: _AsusWrtBridgeT) -> dict[str, Any]:
try:
data = await func(self)
except exceptions as exc:

View File

@@ -10,6 +10,8 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AsusWrtConfigEntry
from .router import AsusWrtDevInfo, AsusWrtRouter
ATTR_LAST_TIME_REACHABLE = "last_time_reachable"
DEFAULT_DEVICE_NAME = "Unknown device"
@@ -56,6 +58,8 @@ def add_entities(
class AsusWrtDevice(ScannerEntity):
"""Representation of a AsusWrt device."""
_unrecorded_attributes = frozenset({ATTR_LAST_TIME_REACHABLE})
_attr_should_poll = False
def __init__(self, router: AsusWrtRouter, device: AsusWrtDevInfo) -> None:
@@ -93,6 +97,11 @@ class AsusWrtDevice(ScannerEntity):
def async_on_demand_update(self) -> None:
"""Update state."""
self._device = self._router.devices[self._device.mac]
self._attr_extra_state_attributes = {}
if self._device.last_activity:
self._attr_extra_state_attributes[ATTR_LAST_TIME_REACHABLE] = (
self._device.last_activity.isoformat(timespec="seconds")
)
self.async_write_ha_state()
async def async_added_to_hass(self) -> None:

View File

@@ -2,7 +2,9 @@
from __future__ import annotations
from typing import Any
from typing import Any, TypeVar
T = TypeVar("T", dict[str, Any], list[Any], None)
TRANSLATION_MAP = {
"wan_rx": "sensor_rx_bytes",
@@ -34,7 +36,7 @@ def clean_dict(raw: dict[str, Any]) -> dict[str, Any]:
return {k: v for k, v in raw.items() if v is not None or k.endswith("state")}
def translate_to_legacy[T: (dict[str, Any], list[Any], None)](raw: T) -> T:
def translate_to_legacy(raw: T) -> T:
"""Translate raw data to legacy format for dicts and lists."""
if raw is None:

View File
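
The helpers hunk above trades a PEP 695 constrained type parameter for a module-level TypeVar. The two spellings are equivalent for type checkers; a self-contained illustration (not the integration's actual translate_to_legacy helper):

from typing import Any, TypeVar

T = TypeVar("T", dict[str, Any], list[Any], None)


def passthrough_typevar(raw: T) -> T:
    """Pre-3.12 spelling with a module-level constrained TypeVar."""
    return raw


def passthrough_pep695[U: (dict[str, Any], list[Any], None)](raw: U) -> U:
    """PEP 695 spelling (Python 3.12+); the parameter stays local to the function."""
    return raw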

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/autarco",
"iot_class": "cloud_polling",
"requirements": ["autarco==3.2.0"]
"requirements": ["autarco==3.1.0"]
}

View File

@@ -136,22 +136,17 @@ class WellKnownOAuthInfoView(HomeAssistantView):
url_prefix = get_url(hass, require_current_request=True)
except NoURLAvailableError:
url_prefix = ""
metadata = {
"authorization_endpoint": f"{url_prefix}/auth/authorize",
"token_endpoint": f"{url_prefix}/auth/token",
"revocation_endpoint": f"{url_prefix}/auth/revoke",
"response_types_supported": ["code"],
"service_documentation": (
"https://developers.home-assistant.io/docs/auth_api"
),
}
# Add issuer only when we have a valid base URL (RFC 8414 compliance)
if url_prefix:
metadata["issuer"] = url_prefix
return self.json(metadata)
return self.json(
{
"authorization_endpoint": f"{url_prefix}/auth/authorize",
"token_endpoint": f"{url_prefix}/auth/token",
"revocation_endpoint": f"{url_prefix}/auth/revoke",
"response_types_supported": ["code"],
"service_documentation": (
"https://developers.home-assistant.io/docs/auth_api"
),
}
)
class AuthProvidersView(HomeAssistantView):

View File

@@ -26,6 +26,9 @@ async def async_setup_entry(
if CONF_HOST in config_entry.data:
coordinator = AwairLocalDataUpdateCoordinator(hass, config_entry, session)
config_entry.async_on_unload(
config_entry.add_update_listener(_async_update_listener)
)
else:
coordinator = AwairCloudDataUpdateCoordinator(hass, config_entry, session)
@@ -33,11 +36,6 @@ async def async_setup_entry(
config_entry.runtime_data = coordinator
if CONF_HOST in config_entry.data:
config_entry.async_on_unload(
config_entry.add_update_listener(_async_update_listener)
)
await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)
return True

View File

@@ -17,7 +17,6 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import frame
from homeassistant.util import slugify
from homeassistant.util.async_iterator import AsyncIteratorReader, AsyncIteratorWriter
from . import util
from .agent import BackupAgent
@@ -145,7 +144,7 @@ class DownloadBackupView(HomeAssistantView):
return Response(status=HTTPStatus.NOT_FOUND)
else:
stream = await agent.async_download_backup(backup_id)
reader = cast(IO[bytes], AsyncIteratorReader(hass.loop, stream))
reader = cast(IO[bytes], util.AsyncIteratorReader(hass, stream))
worker_done_event = asyncio.Event()
@@ -153,7 +152,7 @@ class DownloadBackupView(HomeAssistantView):
"""Call by the worker thread when it's done."""
hass.loop.call_soon_threadsafe(worker_done_event.set)
stream = AsyncIteratorWriter(hass.loop)
stream = util.AsyncIteratorWriter(hass)
worker = threading.Thread(
target=util.decrypt_backup,
args=[backup, reader, stream, password, on_done, 0, []],

View File

@@ -38,7 +38,6 @@ from homeassistant.helpers import (
)
from homeassistant.helpers.json import json_bytes
from homeassistant.util import dt as dt_util, json as json_util
from homeassistant.util.async_iterator import AsyncIteratorReader
from . import util as backup_util
from .agent import (
@@ -73,6 +72,7 @@ from .models import (
)
from .store import BackupStore
from .util import (
AsyncIteratorReader,
DecryptedBackupStreamer,
EncryptedBackupStreamer,
make_backup_dir,
@@ -1525,7 +1525,7 @@ class BackupManager:
reader = await self.hass.async_add_executor_job(open, path.as_posix(), "rb")
else:
backup_stream = await agent.async_download_backup(backup_id)
reader = cast(IO[bytes], AsyncIteratorReader(self.hass.loop, backup_stream))
reader = cast(IO[bytes], AsyncIteratorReader(self.hass, backup_stream))
try:
await self.hass.async_add_executor_job(
validate_password_stream, reader, password

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine
from concurrent.futures import CancelledError, Future
import copy
from dataclasses import dataclass, replace
from io import BytesIO
@@ -13,7 +14,7 @@ from pathlib import Path, PurePath
from queue import SimpleQueue
import tarfile
import threading
from typing import IO, Any, cast
from typing import IO, Any, Self, cast
import aiohttp
from securetar import SecureTarError, SecureTarFile, SecureTarReadError
@@ -22,11 +23,6 @@ from homeassistant.backup_restore import password_to_key
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import dt as dt_util
from homeassistant.util.async_iterator import (
Abort,
AsyncIteratorReader,
AsyncIteratorWriter,
)
from homeassistant.util.json import JsonObjectType, json_loads_object
from .const import BUF_SIZE, LOGGER
@@ -63,6 +59,12 @@ class BackupEmpty(DecryptError):
_message = "No tar files found in the backup."
class AbortCipher(HomeAssistantError):
"""Abort the cipher operation."""
_message = "Abort cipher operation."
def make_backup_dir(path: Path) -> None:
"""Create a backup directory if it does not exist."""
path.mkdir(exist_ok=True)
@@ -164,6 +166,106 @@ def validate_password(path: Path, password: str | None) -> bool:
return False
class AsyncIteratorReader:
"""Wrap an AsyncIterator."""
def __init__(self, hass: HomeAssistant, stream: AsyncIterator[bytes]) -> None:
"""Initialize the wrapper."""
self._aborted = False
self._hass = hass
self._stream = stream
self._buffer: bytes | None = None
self._next_future: Future[bytes | None] | None = None
self._pos: int = 0
async def _next(self) -> bytes | None:
"""Get the next chunk from the iterator."""
return await anext(self._stream, None)
def abort(self) -> None:
"""Abort the reader."""
self._aborted = True
if self._next_future is not None:
self._next_future.cancel()
def read(self, n: int = -1, /) -> bytes:
"""Read data from the iterator."""
result = bytearray()
while n < 0 or len(result) < n:
if not self._buffer:
self._next_future = asyncio.run_coroutine_threadsafe(
self._next(), self._hass.loop
)
if self._aborted:
self._next_future.cancel()
raise AbortCipher
try:
self._buffer = self._next_future.result()
except CancelledError as err:
raise AbortCipher from err
self._pos = 0
if not self._buffer:
# The stream is exhausted
break
chunk = self._buffer[self._pos : self._pos + n]
result.extend(chunk)
n -= len(chunk)
self._pos += len(chunk)
if self._pos == len(self._buffer):
self._buffer = None
return bytes(result)
def close(self) -> None:
"""Close the iterator."""
class AsyncIteratorWriter:
"""Wrap an AsyncIterator."""
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the wrapper."""
self._aborted = False
self._hass = hass
self._pos: int = 0
self._queue: asyncio.Queue[bytes | None] = asyncio.Queue(maxsize=1)
self._write_future: Future[bytes | None] | None = None
def __aiter__(self) -> Self:
"""Return the iterator."""
return self
async def __anext__(self) -> bytes:
"""Get the next chunk from the iterator."""
if data := await self._queue.get():
return data
raise StopAsyncIteration
def abort(self) -> None:
"""Abort the writer."""
self._aborted = True
if self._write_future is not None:
self._write_future.cancel()
def tell(self) -> int:
"""Return the current position in the iterator."""
return self._pos
def write(self, s: bytes, /) -> int:
"""Write data to the iterator."""
self._write_future = asyncio.run_coroutine_threadsafe(
self._queue.put(s), self._hass.loop
)
if self._aborted:
self._write_future.cancel()
raise AbortCipher
try:
self._write_future.result()
except CancelledError as err:
raise AbortCipher from err
self._pos += len(s)
return len(s)
def validate_password_stream(
input_stream: IO[bytes],
password: str | None,
@@ -240,7 +342,7 @@ def decrypt_backup(
finally:
# Write an empty chunk to signal the end of the stream
output_stream.write(b"")
except Abort:
except AbortCipher:
LOGGER.debug("Cipher operation aborted")
finally:
on_done(error)
@@ -328,7 +430,7 @@ def encrypt_backup(
finally:
# Write an empty chunk to signal the end of the stream
output_stream.write(b"")
except Abort:
except AbortCipher:
LOGGER.debug("Cipher operation aborted")
finally:
on_done(error)
@@ -455,8 +557,8 @@ class _CipherBackupStreamer:
self._hass.loop.call_soon_threadsafe(worker_status.done.set)
stream = await self._open_stream()
reader = AsyncIteratorReader(self._hass.loop, stream)
writer = AsyncIteratorWriter(self._hass.loop)
reader = AsyncIteratorReader(self._hass, stream)
writer = AsyncIteratorWriter(self._hass)
worker = threading.Thread(
target=self._cipher_func,
args=[
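
The new `AsyncIteratorReader` and `AsyncIteratorWriter` accept the `HomeAssistant` object and bridge a worker thread to the event loop with `asyncio.run_coroutine_threadsafe()`, wrapping `anext()` in a small coroutine so it can be scheduled from the thread. A stdlib-only sketch of that bridge, assuming Python 3.10+; the names here are illustrative, not the integration's API:

import asyncio


async def produce():
    # Pretend this is a backup download stream.
    yield b"abc"
    yield b"def"


async def _next(stream):
    # Wrap anext() in a real coroutine so it can be scheduled thread-safely,
    # mirroring AsyncIteratorReader._next() in the diff above.
    return await anext(stream, None)


def consume_in_thread(loop: asyncio.AbstractEventLoop, stream) -> bytes:
    out = bytearray()
    while True:
        future = asyncio.run_coroutine_threadsafe(_next(stream), loop)
        chunk = future.result()  # blocks the worker thread, not the event loop
        if chunk is None:        # stream exhausted
            break
        out.extend(chunk)
    return bytes(out)


async def main() -> None:
    loop = asyncio.get_running_loop()
    data = await asyncio.to_thread(consume_in_thread, loop, produce())
    print(data)  # b'abcdef'


asyncio.run(main())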

View File

@@ -73,12 +73,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: BangOlufsenConfigEntry)
# Add the websocket and API client
entry.runtime_data = BangOlufsenData(websocket, client)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
# Start WebSocket connection once the platforms have been loaded.
# This ensures that the initial WebSocket notifications are dispatched to entities
# Start WebSocket connection
await client.connect_notifications(remote_control=True, reconnect=True)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
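
The two orderings in the hunk above differ in whether the WebSocket connection is started before or after the platforms are forwarded; the longer comment explains that connecting afterwards ensures the initial notifications reach entities that are already registered. A stdlib-only sketch of why that ordering matters (the `Dispatcher` class is illustrative, not the integration's dispatcher):

# Sketch of the ordering concern in the hunk above: if the event source starts
# before any subscriber is registered, the first notifications are dropped.
import asyncio


class Dispatcher:
    def __init__(self) -> None:
        self._subscribers: list[asyncio.Queue[str]] = []

    def subscribe(self) -> asyncio.Queue[str]:
        queue: asyncio.Queue[str] = asyncio.Queue()
        self._subscribers.append(queue)
        return queue

    def dispatch(self, event: str) -> None:
        for queue in self._subscribers:
            queue.put_nowait(event)


async def main() -> None:
    dispatcher = Dispatcher()
    dispatcher.dispatch("volume_changed")    # nobody listening yet: lost
    queue = dispatcher.subscribe()           # "platform loaded"
    dispatcher.dispatch("playback_started")  # now it reaches the entity
    print(queue.get_nowait())                # playback_started


asyncio.run(main())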

View File

@@ -125,8 +125,7 @@ async def async_setup_entry(
async_add_entities(
new_entities=[
BangOlufsenMediaPlayer(config_entry, config_entry.runtime_data.client)
],
update_before_add=True,
]
)
# Register actions.
@@ -267,8 +266,34 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
self._software_status.software_version,
)
# Get overall device state once. This is handled by WebSocket events the rest of the time.
product_state = await self._client.get_product_state()
# Get volume information.
if product_state.volume:
self._volume = product_state.volume
# Get all playback information.
# Ensure that the metadata is not None upon startup
if product_state.playback:
if product_state.playback.metadata:
self._playback_metadata = product_state.playback.metadata
self._remote_leader = product_state.playback.metadata.remote_leader
if product_state.playback.progress:
self._playback_progress = product_state.playback.progress
if product_state.playback.source:
self._source_change = product_state.playback.source
if product_state.playback.state:
self._playback_state = product_state.playback.state
# Set initial state
if self._playback_state.value:
self._state = self._playback_state.value
self._attr_media_position_updated_at = utcnow()
# Get the highest resolution available of the given images.
self._media_image = get_highest_resolution_artwork(self._playback_metadata)
# If the device has been updated with new sources, then the API will fail here.
await self._async_update_sources()

View File

@@ -3,12 +3,16 @@ beolink_allstandby:
entity:
integration: bang_olufsen
domain: media_player
device:
integration: bang_olufsen
beolink_expand:
target:
entity:
integration: bang_olufsen
domain: media_player
device:
integration: bang_olufsen
fields:
all_discovered:
required: false
@@ -33,6 +37,8 @@ beolink_join:
entity:
integration: bang_olufsen
domain: media_player
device:
integration: bang_olufsen
fields:
jid_options:
collapsed: false
@@ -65,12 +71,16 @@ beolink_leave:
entity:
integration: bang_olufsen
domain: media_player
device:
integration: bang_olufsen
beolink_unexpand:
target:
entity:
integration: bang_olufsen
domain: media_player
device:
integration: bang_olufsen
fields:
jid_options:
collapsed: false

View File

@@ -146,7 +146,7 @@
},
"state": {
"title": "Add a Bayesian sensor",
"description": "Add an observation which evaluates to `True` when the value of the sensor exactly matches *'To state'*. When `False`, it will update the prior with probabilities that are the inverse of those set below. This behavior can be overridden by adding observations for the same entity's other states.",
"description": "Add an observation which evaluates to `True` when the value of the sensor exactly matches *'To state'*. When `False`, it will update the prior with probabilities that are the inverse of those set below. This behaviour can be overridden by adding observations for the same entity's other states.",
"data": {
"name": "[%key:common::config_flow::data::name%]",

View File

@@ -57,7 +57,6 @@ from .api import (
_get_manager,
async_address_present,
async_ble_device_from_address,
async_clear_address_from_match_history,
async_current_scanners,
async_discovered_service_info,
async_get_advertisement_callback,
@@ -113,9 +112,9 @@ __all__ = [
"BluetoothServiceInfo",
"BluetoothServiceInfoBleak",
"HaBluetoothConnector",
"HomeAssistantRemoteScanner",
"async_address_present",
"async_ble_device_from_address",
"async_clear_address_from_match_history",
"async_current_scanners",
"async_discovered_service_info",
"async_get_advertisement_callback",

View File

@@ -193,20 +193,6 @@ def async_rediscover_address(hass: HomeAssistant, address: str) -> None:
_get_manager(hass).async_rediscover_address(address)
@hass_callback
def async_clear_address_from_match_history(hass: HomeAssistant, address: str) -> None:
"""Clear an address from the integration matcher history.
This allows future advertisements from this address to trigger discovery
even if the advertisement content has changed but the service data UUIDs
remain the same.
Unlike async_rediscover_address, this does not immediately re-trigger
discovery with the current advertisement in history.
"""
_get_manager(hass).async_clear_address_from_match_history(address)
@hass_callback
def async_register_scanner(
hass: HomeAssistant,
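
The removed docstrings describe a per-address match history: once an advertisement has triggered discovery, later identical advertisements are suppressed until the address is cleared, and clearing does not re-trigger discovery by itself. A purely illustrative, stdlib-only sketch of that bookkeeping (not the habluetooth matcher API):

class IntegrationMatcher:
    def __init__(self) -> None:
        self._matched: dict[str, set[str]] = {}

    def match(self, address: str, fingerprint: str) -> bool:
        seen = self._matched.setdefault(address, set())
        if fingerprint in seen:
            return False        # already triggered discovery for this payload
        seen.add(fingerprint)
        return True

    def clear_address(self, address: str) -> None:
        # Forget the history only; nothing is re-triggered immediately.
        self._matched.pop(address, None)


matcher = IntegrationMatcher()
print(matcher.match("AA:BB", "adv-1"))   # True  -> discovery fires
print(matcher.match("AA:BB", "adv-1"))   # False -> suppressed by history
matcher.clear_address("AA:BB")
print(matcher.match("AA:BB", "adv-2"))   # True  -> a new advertisement matches again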

View File

@@ -120,19 +120,6 @@ class HomeAssistantBluetoothManager(BluetoothManager):
if service_info := self._all_history.get(address):
self._async_trigger_matching_discovery(service_info)
@hass_callback
def async_clear_address_from_match_history(self, address: str) -> None:
"""Clear an address from the integration matcher history.
This allows future advertisements from this address to trigger discovery
even if the advertisement content has changed but the service data UUIDs
remain the same.
Unlike async_rediscover_address, this does not immediately re-trigger
discovery with the current advertisement in history.
"""
self._integration_matcher.async_clear_address(address)
def _discover_service_info(self, service_info: BluetoothServiceInfoBleak) -> None:
matched_domains = self._integration_matcher.match_domains(service_info)
if self._debug:

View File

@@ -19,8 +19,8 @@
"bleak-retry-connector==4.4.3",
"bluetooth-adapters==2.1.0",
"bluetooth-auto-recovery==1.5.3",
"bluetooth-data-tools==1.28.3",
"dbus-fast==2.44.5",
"habluetooth==5.7.0"
"bluetooth-data-tools==1.28.2",
"dbus-fast==2.44.3",
"habluetooth==5.6.4"
]
}

View File

@@ -7,14 +7,12 @@ from typing import Any
from evolutionhttp import BryantEvolutionLocalClient
from homeassistant.components.climate import (
ATTR_TARGET_TEMP_HIGH,
ATTR_TARGET_TEMP_LOW,
ClimateEntity,
ClimateEntityFeature,
HVACAction,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.const import UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import DeviceInfo
@@ -210,24 +208,24 @@ class BryantEvolutionClimate(ClimateEntity):
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set new target temperature."""
if value := kwargs.get(ATTR_TARGET_TEMP_HIGH):
temp = int(value)
if kwargs.get("target_temp_high"):
temp = int(kwargs["target_temp_high"])
if not await self._client.set_cooling_setpoint(temp):
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="failed_to_set_clsp"
)
self._attr_target_temperature_high = temp
if value := kwargs.get(ATTR_TARGET_TEMP_LOW):
temp = int(value)
if kwargs.get("target_temp_low"):
temp = int(kwargs["target_temp_low"])
if not await self._client.set_heating_setpoint(temp):
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="failed_to_set_htsp"
)
self._attr_target_temperature_low = temp
if value := kwargs.get(ATTR_TEMPERATURE):
temp = int(value)
if kwargs.get("temperature"):
temp = int(kwargs["temperature"])
fn = (
self._client.set_heating_setpoint
if self.hvac_mode == HVACMode.HEAT
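
Both sides of the hunk above read optional setpoints from `kwargs` and only act on the ones present; note that a plain truthiness test such as `if kwargs.get("temperature")` also skips a legitimate value of 0. A small sketch of the same dispatch using explicit `is not None` checks (the function name and return shape are illustrative only):

# Sketch of handling optional setpoint kwargs as in async_set_temperature above.
ATTR_TARGET_TEMP_HIGH = "target_temp_high"
ATTR_TARGET_TEMP_LOW = "target_temp_low"
ATTR_TEMPERATURE = "temperature"


def apply_setpoints(**kwargs: float) -> dict[str, int]:
    applied: dict[str, int] = {}
    if (value := kwargs.get(ATTR_TARGET_TEMP_HIGH)) is not None:
        applied["cooling"] = int(value)
    if (value := kwargs.get(ATTR_TARGET_TEMP_LOW)) is not None:
        applied["heating"] = int(value)
    if (value := kwargs.get(ATTR_TEMPERATURE)) is not None:
        applied["single"] = int(value)
    return applied


print(apply_setpoints(target_temp_high=75, target_temp_low=68))
print(apply_setpoints(temperature=0))  # 0 is still applied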

View File

@@ -3,20 +3,15 @@
from __future__ import annotations
from datetime import datetime
from functools import partial
import logging
from typing import Any
import caldav
from caldav.lib.error import DAVError
import requests
import voluptuous as vol
from homeassistant.components.calendar import (
ENTITY_ID_FORMAT,
PLATFORM_SCHEMA as CALENDAR_PLATFORM_SCHEMA,
CalendarEntity,
CalendarEntityFeature,
CalendarEvent,
is_offset_reached,
)
@@ -28,7 +23,6 @@ from homeassistant.const import (
CONF_VERIFY_SSL,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity import async_generate_entity_id
from homeassistant.helpers.entity_platform import (
@@ -181,8 +175,6 @@ async def async_setup_entry(
class WebDavCalendarEntity(CoordinatorEntity[CalDavUpdateCoordinator], CalendarEntity):
"""A device for getting the next Task from a WebDav Calendar."""
_attr_supported_features = CalendarEntityFeature.CREATE_EVENT
def __init__(
self,
name: str | None,
@@ -211,31 +203,6 @@ class WebDavCalendarEntity(CoordinatorEntity[CalDavUpdateCoordinator], CalendarE
"""Get all events in a specific time frame."""
return await self.coordinator.async_get_events(hass, start_date, end_date)
async def async_create_event(self, **kwargs: Any) -> None:
"""Create a new event in the calendar."""
_LOGGER.debug("Event: %s", kwargs)
item_data: dict[str, Any] = {
"summary": kwargs["summary"],
"dtstart": kwargs["dtstart"],
"dtend": kwargs["dtend"],
}
if description := kwargs.get("description"):
item_data["description"] = description
if location := kwargs.get("location"):
item_data["location"] = location
if rrule := kwargs.get("rrule"):
item_data["rrule"] = rrule
_LOGGER.debug("ICS data %s", item_data)
try:
await self.hass.async_add_executor_job(
partial(self.coordinator.calendar.add_event, **item_data),
)
except (requests.ConnectionError, DAVError) as err:
raise HomeAssistantError(f"CalDAV save error: {err}") from err
@callback
def _handle_coordinator_update(self) -> None:
"""Update event data."""

View File

@@ -315,7 +315,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass.http.register_view(CalendarListView(component))
hass.http.register_view(CalendarEventView(component))
frontend.async_register_built_in_panel(hass, "calendar", "calendar", "mdi:calendar")
frontend.async_register_built_in_panel(
hass, "calendar", "calendar", "hass:calendar"
)
websocket_api.async_register_command(hass, handle_calendar_event_create)
websocket_api.async_register_command(hass, handle_calendar_event_delete)

View File

@@ -169,7 +169,7 @@ class CalendarEventListener:
def __init__(
self,
hass: HomeAssistant,
job: HassJob[..., Coroutine[Any, Any, None] | Any],
job: HassJob[..., Coroutine[Any, Any, None]],
trigger_data: dict[str, Any],
fetcher: QueuedEventFetcher,
) -> None:

View File

@@ -51,6 +51,12 @@ from homeassistant.const import (
from homeassistant.core import Event, HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.helpers.deprecation import (
DeprecatedConstantEnum,
all_with_deprecated_constants,
check_if_deprecated_constant,
dir_with_deprecated_constants,
)
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.event import async_track_time_interval
@@ -74,10 +80,7 @@ from .const import (
StreamType,
)
from .helper import get_camera_from_entity_id
from .img_util import (
TurboJPEGSingleton, # noqa: F401
scale_jpeg_camera_image,
)
from .img_util import scale_jpeg_camera_image
from .prefs import (
CameraPreferences,
DynamicStreamSettings, # noqa: F401
@@ -115,6 +118,12 @@ ATTR_FILENAME: Final = "filename"
ATTR_MEDIA_PLAYER: Final = "media_player"
ATTR_FORMAT: Final = "format"
# These constants are deprecated as of Home Assistant 2024.10
# Please use the StreamType enum instead.
_DEPRECATED_STATE_RECORDING = DeprecatedConstantEnum(CameraState.RECORDING, "2025.10")
_DEPRECATED_STATE_STREAMING = DeprecatedConstantEnum(CameraState.STREAMING, "2025.10")
_DEPRECATED_STATE_IDLE = DeprecatedConstantEnum(CameraState.IDLE, "2025.10")
class CameraEntityFeature(IntFlag):
"""Supported features of the camera entity."""
@@ -1108,3 +1117,11 @@ async def async_handle_record_service(
duration=service_call.data[CONF_DURATION],
lookback=service_call.data[CONF_LOOKBACK],
)
# These can be removed if no deprecated constants are in this module anymore
__getattr__ = partial(check_if_deprecated_constant, module_globals=globals())
__dir__ = partial(
dir_with_deprecated_constants, module_globals_keys=[*globals().keys()]
)
__all__ = all_with_deprecated_constants(globals())
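
The deprecation block above relies on module-level `__getattr__`/`__dir__` (PEP 562) so that importing an old constant still works while being flagged. A stdlib-only sketch of that mechanism, assuming Python 3.11+ for `StrEnum`; the Home Assistant helpers layer more reporting on top, which is omitted here:

# Sketch of the module-level deprecation pattern behind the helpers above.
# Save as a module (e.g. camera_states.py) and import STATE_RECORDING to exercise it.
import warnings
from enum import StrEnum


class CameraState(StrEnum):
    RECORDING = "recording"
    STREAMING = "streaming"
    IDLE = "idle"


_DEPRECATED = {
    "STATE_RECORDING": CameraState.RECORDING,
    "STATE_STREAMING": CameraState.STREAMING,
    "STATE_IDLE": CameraState.IDLE,
}


def __getattr__(name: str):
    if name in _DEPRECATED:
        warnings.warn(
            f"{name} is deprecated, use CameraState instead",
            DeprecationWarning,
            stacklevel=2,
        )
        return _DEPRECATED[name].value
    raise AttributeError(f"module has no attribute {name!r}")


def __dir__() -> list[str]:
    return [*globals(), *_DEPRECATED]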

View File

@@ -31,7 +31,7 @@ async def async_setup_entry(
for location_id, location in coordinator.data["locations"].items()
]
async_add_entities(alarms)
async_add_entities(alarms, True)
class CanaryAlarm(

View File

@@ -68,7 +68,8 @@ async def async_setup_entry(
for location_id, location in coordinator.data["locations"].items()
for device in location.devices
if device.is_online
)
),
True,
)

View File

@@ -80,7 +80,7 @@ async def async_setup_entry(
if device_type.get("name") in sensor_type[4]
)
async_add_entities(sensors)
async_add_entities(sensors, True)
class CanarySensor(CoordinatorEntity[CanaryDataUpdateCoordinator], SensorEntity):

View File

@@ -4,6 +4,5 @@
"codeowners": [],
"documentation": "https://www.home-assistant.io/integrations/citybikes",
"iot_class": "cloud_polling",
"quality_scale": "legacy",
"requirements": ["python-citybikes==0.3.3"]
"quality_scale": "legacy"
}

View File

@@ -5,11 +5,8 @@ from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
import sys
import aiohttp
from citybikes import __version__ as CITYBIKES_CLIENT_VERSION
from citybikes.asyncio import Client as CitybikesClient
import voluptuous as vol
from homeassistant.components.sensor import (
@@ -18,18 +15,21 @@ from homeassistant.components.sensor import (
SensorEntity,
)
from homeassistant.const import (
APPLICATION_NAME,
ATTR_ID,
ATTR_LATITUDE,
ATTR_LOCATION,
ATTR_LONGITUDE,
ATTR_NAME,
CONF_LATITUDE,
CONF_LONGITUDE,
CONF_NAME,
CONF_RADIUS,
EVENT_HOMEASSISTANT_CLOSE,
UnitOfLength,
__version__,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import PlatformNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.entity import async_generate_entity_id
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.event import async_track_time_interval
@@ -40,33 +40,31 @@ from homeassistant.util.unit_system import US_CUSTOMARY_SYSTEM
_LOGGER = logging.getLogger(__name__)
HA_USER_AGENT = (
f"{APPLICATION_NAME}/{__version__} "
f"python-citybikes/{CITYBIKES_CLIENT_VERSION} "
f"Python/{sys.version_info[0]}.{sys.version_info[1]}"
)
ATTR_UID = "uid"
ATTR_LATITUDE = "latitude"
ATTR_LONGITUDE = "longitude"
ATTR_EMPTY_SLOTS = "empty_slots"
ATTR_EXTRA = "extra"
ATTR_FREE_BIKES = "free_bikes"
ATTR_NETWORK = "network"
ATTR_NETWORKS_LIST = "networks"
ATTR_STATIONS_LIST = "stations"
ATTR_TIMESTAMP = "timestamp"
ATTR_UID = "uid"
CONF_NETWORK = "network"
CONF_STATIONS_LIST = "stations"
DEFAULT_ENDPOINT = "https://api.citybik.es/{uri}"
PLATFORM = "citybikes"
MONITORED_NETWORKS = "monitored-networks"
DATA_CLIENT = "client"
NETWORKS_URI = "v2/networks"
REQUEST_TIMEOUT = aiohttp.ClientTimeout(total=5)
REQUEST_TIMEOUT = 5 # In seconds; argument to asyncio.timeout
SCAN_INTERVAL = timedelta(minutes=5) # Timely, and doesn't suffocate the API
STATIONS_URI = "v2/networks/{uid}?fields=network.stations"
CITYBIKES_ATTRIBUTION = (
"Information provided by the CityBikes Project (https://citybik.es/#about)"
)
@@ -89,6 +87,72 @@ PLATFORM_SCHEMA = vol.All(
),
)
NETWORK_SCHEMA = vol.Schema(
{
vol.Required(ATTR_ID): cv.string,
vol.Required(ATTR_NAME): cv.string,
vol.Required(ATTR_LOCATION): vol.Schema(
{
vol.Required(ATTR_LATITUDE): cv.latitude,
vol.Required(ATTR_LONGITUDE): cv.longitude,
},
extra=vol.REMOVE_EXTRA,
),
},
extra=vol.REMOVE_EXTRA,
)
NETWORKS_RESPONSE_SCHEMA = vol.Schema(
{vol.Required(ATTR_NETWORKS_LIST): [NETWORK_SCHEMA]}
)
STATION_SCHEMA = vol.Schema(
{
vol.Required(ATTR_FREE_BIKES): cv.positive_int,
vol.Required(ATTR_EMPTY_SLOTS): vol.Any(cv.positive_int, None),
vol.Required(ATTR_LATITUDE): cv.latitude,
vol.Required(ATTR_LONGITUDE): cv.longitude,
vol.Required(ATTR_ID): cv.string,
vol.Required(ATTR_NAME): cv.string,
vol.Required(ATTR_TIMESTAMP): cv.string,
vol.Optional(ATTR_EXTRA): vol.Schema(
{vol.Optional(ATTR_UID): cv.string}, extra=vol.REMOVE_EXTRA
),
},
extra=vol.REMOVE_EXTRA,
)
STATIONS_RESPONSE_SCHEMA = vol.Schema(
{
vol.Required(ATTR_NETWORK): vol.Schema(
{vol.Required(ATTR_STATIONS_LIST): [STATION_SCHEMA]}, extra=vol.REMOVE_EXTRA
)
}
)
class CityBikesRequestError(Exception):
"""Error to indicate a CityBikes API request has failed."""
async def async_citybikes_request(hass, uri, schema):
"""Perform a request to CityBikes API endpoint, and parse the response."""
try:
session = async_get_clientsession(hass)
async with asyncio.timeout(REQUEST_TIMEOUT):
req = await session.get(DEFAULT_ENDPOINT.format(uri=uri))
json_response = await req.json()
return schema(json_response)
except (TimeoutError, aiohttp.ClientError):
_LOGGER.error("Could not connect to CityBikes API endpoint")
except ValueError:
_LOGGER.error("Received non-JSON data from CityBikes API endpoint")
except vol.Invalid as err:
_LOGGER.error("Received unexpected JSON from CityBikes API endpoint: %s", err)
raise CityBikesRequestError
async def async_setup_platform(
hass: HomeAssistant,
@@ -111,14 +175,6 @@ async def async_setup_platform(
radius, UnitOfLength.FEET, UnitOfLength.METERS
)
client = CitybikesClient(user_agent=HA_USER_AGENT, timeout=REQUEST_TIMEOUT)
hass.data[PLATFORM][DATA_CLIENT] = client
async def _async_close_client(event):
await client.close()
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_CLOSE, _async_close_client)
# Create a single instance of CityBikesNetworks.
networks = hass.data.setdefault(CITYBIKES_NETWORKS, CityBikesNetworks(hass))
@@ -138,10 +194,10 @@ async def async_setup_platform(
devices = []
for station in network.stations:
dist = location_util.distance(
latitude, longitude, station.latitude, station.longitude
latitude, longitude, station[ATTR_LATITUDE], station[ATTR_LONGITUDE]
)
station_id = station.id
station_uid = str(station.extra.get(ATTR_UID, ""))
station_id = station[ATTR_ID]
station_uid = str(station.get(ATTR_EXTRA, {}).get(ATTR_UID, ""))
if radius > dist or stations_list.intersection((station_id, station_uid)):
if name:
@@ -160,7 +216,6 @@ class CityBikesNetworks:
def __init__(self, hass):
"""Initialize the networks instance."""
self.hass = hass
self.client = hass.data[PLATFORM][DATA_CLIENT]
self.networks = None
self.networks_loading = asyncio.Condition()
@@ -169,21 +224,24 @@ class CityBikesNetworks:
try:
await self.networks_loading.acquire()
if self.networks is None:
self.networks = await self.client.networks.fetch()
except aiohttp.ClientError as err:
networks = await async_citybikes_request(
self.hass, NETWORKS_URI, NETWORKS_RESPONSE_SCHEMA
)
self.networks = networks[ATTR_NETWORKS_LIST]
except CityBikesRequestError as err:
raise PlatformNotReady from err
else:
result = None
minimum_dist = None
for network in self.networks:
network_latitude = network.location.latitude
network_longitude = network.location.longitude
network_latitude = network[ATTR_LOCATION][ATTR_LATITUDE]
network_longitude = network[ATTR_LOCATION][ATTR_LONGITUDE]
dist = location_util.distance(
latitude, longitude, network_latitude, network_longitude
)
if minimum_dist is None or dist < minimum_dist:
minimum_dist = dist
result = network.id
result = network[ATTR_ID]
return result
finally:
@@ -199,20 +257,22 @@ class CityBikesNetwork:
self.network_id = network_id
self.stations = []
self.ready = asyncio.Event()
self.client = hass.data[PLATFORM][DATA_CLIENT]
async def async_refresh(self, now=None):
"""Refresh the state of the network."""
try:
network = await self.client.network(uid=self.network_id).fetch()
except aiohttp.ClientError as err:
if now is None:
network = await async_citybikes_request(
self.hass,
STATIONS_URI.format(uid=self.network_id),
STATIONS_RESPONSE_SCHEMA,
)
self.stations = network[ATTR_NETWORK][ATTR_STATIONS_LIST]
self.ready.set()
except CityBikesRequestError as err:
if now is not None:
self.ready.clear()
else:
raise PlatformNotReady from err
self.ready.clear()
return
self.stations = network.stations
self.ready.set()
class CityBikesStation(SensorEntity):
@@ -230,13 +290,16 @@ class CityBikesStation(SensorEntity):
async def async_update(self) -> None:
"""Update station state."""
station = next(s for s in self._network.stations if s.id == self._station_id)
self._attr_name = station.name
self._attr_native_value = station.free_bikes
for station in self._network.stations:
if station[ATTR_ID] == self._station_id:
station_data = station
break
self._attr_name = station_data.get(ATTR_NAME)
self._attr_native_value = station_data.get(ATTR_FREE_BIKES)
self._attr_extra_state_attributes = {
ATTR_UID: station.extra.get(ATTR_UID),
ATTR_LATITUDE: station.latitude,
ATTR_LONGITUDE: station.longitude,
ATTR_EMPTY_SLOTS: station.empty_slots,
ATTR_TIMESTAMP: station.timestamp,
ATTR_UID: station_data.get(ATTR_EXTRA, {}).get(ATTR_UID),
ATTR_LATITUDE: station_data.get(ATTR_LATITUDE),
ATTR_LONGITUDE: station_data.get(ATTR_LONGITUDE),
ATTR_EMPTY_SLOTS: station_data.get(ATTR_EMPTY_SLOTS),
ATTR_TIMESTAMP: station_data.get(ATTR_TIMESTAMP),
}
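
The hunk above validates the CityBikes API responses with voluptuous schemas that drop unknown fields (`extra=vol.REMOVE_EXTRA`). A small sketch of that validation style, assuming `voluptuous` is installed; the field names follow the diff, but the payload is made up:

import voluptuous as vol

STATION_SCHEMA = vol.Schema(
    {
        vol.Required("free_bikes"): int,
        vol.Required("empty_slots"): vol.Any(int, None),
        vol.Required("latitude"): float,
        vol.Required("longitude"): float,
        vol.Required("id"): str,
        vol.Required("name"): str,
    },
    extra=vol.REMOVE_EXTRA,  # silently drop fields the integration does not use
)

payload = {
    "free_bikes": 3,
    "empty_slots": 7,
    "latitude": 52.37,
    "longitude": 4.89,
    "id": "abc123",
    "name": "Central Station",
    "extra_field": "ignored",
}

station = STATION_SCHEMA(payload)
print(station)                       # validated dict
assert "extra_field" not in station  # removed by REMOVE_EXTRA

try:
    STATION_SCHEMA({"free_bikes": "three"})
except vol.Invalid as err:
    print(f"rejected: {err}")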

View File

@@ -53,6 +53,7 @@ from .const import (
CONF_ACME_SERVER,
CONF_ALEXA,
CONF_ALIASES,
CONF_CLOUDHOOK_SERVER,
CONF_COGNITO_CLIENT_ID,
CONF_ENTITY_CONFIG,
CONF_FILTER,
@@ -129,6 +130,7 @@ CONFIG_SCHEMA = vol.Schema(
vol.Optional(CONF_ACCOUNT_LINK_SERVER): str,
vol.Optional(CONF_ACCOUNTS_SERVER): str,
vol.Optional(CONF_ACME_SERVER): str,
vol.Optional(CONF_CLOUDHOOK_SERVER): str,
vol.Optional(CONF_RELAYER_SERVER): str,
vol.Optional(CONF_REMOTESTATE_SERVER): str,
vol.Optional(CONF_SERVICEHANDLERS_SERVER): str,

View File

@@ -19,7 +19,7 @@ from homeassistant.components.alexa import (
errors as alexa_errors,
smart_home as alexa_smart_home,
)
from homeassistant.components.camera import async_register_ice_servers
from homeassistant.components.camera.webrtc import async_register_ice_servers
from homeassistant.components.google_assistant import smart_home as ga
from homeassistant.const import __version__ as HA_VERSION
from homeassistant.core import Context, HassJob, HomeAssistant, callback

View File

@@ -78,6 +78,7 @@ CONF_USER_POOL_ID = "user_pool_id"
CONF_ACCOUNT_LINK_SERVER = "account_link_server"
CONF_ACCOUNTS_SERVER = "accounts_server"
CONF_ACME_SERVER = "acme_server"
CONF_CLOUDHOOK_SERVER = "cloudhook_server"
CONF_RELAYER_SERVER = "relayer_server"
CONF_REMOTESTATE_SERVER = "remotestate_server"
CONF_SERVICEHANDLERS_SERVER = "servicehandlers_server"

View File

@@ -12,9 +12,7 @@ from hass_nabucasa.google_report_state import ErrorResponse
from homeassistant.components.binary_sensor import BinarySensorDeviceClass
from homeassistant.components.google_assistant import DOMAIN as GOOGLE_DOMAIN
from homeassistant.components.google_assistant.helpers import ( # pylint: disable=hass-component-root-import
AbstractConfig,
)
from homeassistant.components.google_assistant.helpers import AbstractConfig
from homeassistant.components.homeassistant.exposed_entities import (
async_expose_entity,
async_get_assistant_settings,

View File

@@ -13,6 +13,6 @@
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["acme", "hass_nabucasa", "snitun"],
"requirements": ["hass-nabucasa==1.4.0"],
"requirements": ["hass-nabucasa==1.1.1"],
"single_config_entry": true
}

View File

@@ -11,7 +11,7 @@ from hass_nabucasa.voice import MAP_VOICE, Gender
from homeassistant.auth.const import GROUP_ID_ADMIN
from homeassistant.auth.models import User
from homeassistant.components import webhook
from homeassistant.components.google_assistant.http import ( # pylint: disable=hass-component-root-import
from homeassistant.components.google_assistant.http import (
async_get_users as async_get_google_assistant_users,
)
from homeassistant.core import HomeAssistant, callback

View File

@@ -1,105 +0,0 @@
rules:
# Bronze
action-setup:
status: exempt
comment: |
The integration does not provide any actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage:
status: todo
comment: |
Stale docstring and test name: `test_form_home` and reusing result.
Extract `async_setup_entry` into own fixture.
Avoid importing `config_flow` in tests.
Test reauth with errors
config-flow:
status: todo
comment: |
The config flow misses data descriptions.
Make use of Electricity Maps zone keys in country code as dropdown.
Make use of location selector for coordinates.
dependency-transparency: done
docs-actions:
status: exempt
comment: |
The integration does not provide any actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: |
Entities of this integration do not explicitly subscribe to events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: todo
# Silver
action-exceptions:
status: exempt
comment: |
The integration does not provide any actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: |
The integration does not provide any additional options.
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: todo
reauthentication-flow: done
test-coverage:
status: todo
comment: |
Use `hass.config_entries.async_setup` instead of assert await `async_setup_component(hass, DOMAIN, {})`
`test_sensor` could use `snapshot_platform`
# Gold
devices: done
diagnostics: done
discovery-update-info:
status: exempt
comment: |
This integration cannot be discovered; it connects to a cloud service.
discovery:
status: exempt
comment: |
This integration cannot be discovered; it connects to a cloud service.
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: exempt
comment: |
The integration connects to a single service per configuration entry.
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: |
This integration does not raise any repairable issues.
stale-devices:
status: exempt
comment: |
This integration connects to a single device per configuration entry.
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done

View File

@@ -15,7 +15,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .coordinator import ComelitConfigEntry, ComelitVedoSystem
from .utils import DeviceType, new_device_listener
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -30,19 +29,23 @@ async def async_setup_entry(
coordinator = cast(ComelitVedoSystem, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitVedoBinarySensorEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data["alarm_zones"].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
known_devices: set[int] = set()
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, "alarm_zones")
)
def _check_device() -> None:
current_devices = set(coordinator.data["alarm_zones"])
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities(
ComelitVedoBinarySensorEntity(
coordinator, device, config_entry.entry_id
)
for device in coordinator.data["alarm_zones"].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
class ComelitVedoBinarySensorEntity(

View File

@@ -7,14 +7,21 @@ from typing import Any, cast
from aiocomelit import ComelitSerialBridgeObject
from aiocomelit.const import COVER, STATE_COVER, STATE_OFF, STATE_ON
from homeassistant.components.cover import CoverDeviceClass, CoverEntity, CoverState
from homeassistant.components.cover import (
STATE_CLOSED,
STATE_CLOSING,
STATE_OPEN,
STATE_OPENING,
CoverDeviceClass,
CoverEntity,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from .coordinator import ComelitConfigEntry, ComelitSerialBridge
from .entity import ComelitBridgeBaseEntity
from .utils import DeviceType, bridge_api_call, new_device_listener
from .utils import bridge_api_call
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -29,19 +36,21 @@ async def async_setup_entry(
coordinator = cast(ComelitSerialBridge, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitCoverEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[dev_type].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
known_devices: set[int] = set()
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, COVER)
)
def _check_device() -> None:
current_devices = set(coordinator.data[COVER])
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities(
ComelitCoverEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[COVER].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
class ComelitCoverEntity(ComelitBridgeBaseEntity, RestoreEntity, CoverEntity):
@@ -121,9 +130,9 @@ class ComelitCoverEntity(ComelitBridgeBaseEntity, RestoreEntity, CoverEntity):
await super().async_added_to_hass()
if (state := await self.async_get_last_state()) is not None:
if state.state == CoverState.CLOSED:
self._last_action = STATE_COVER.index(CoverState.CLOSING)
if state.state == CoverState.OPEN:
self._last_action = STATE_COVER.index(CoverState.OPENING)
if state.state == STATE_CLOSED:
self._last_action = STATE_COVER.index(STATE_CLOSING)
if state.state == STATE_OPEN:
self._last_action = STATE_COVER.index(STATE_OPENING)
self._attr_is_closed = state.state == CoverState.CLOSED
self._attr_is_closed = state.state == STATE_CLOSED

View File

@@ -12,7 +12,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ComelitConfigEntry, ComelitSerialBridge
from .entity import ComelitBridgeBaseEntity
from .utils import DeviceType, bridge_api_call, new_device_listener
from .utils import bridge_api_call
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -27,19 +27,21 @@ async def async_setup_entry(
coordinator = cast(ComelitSerialBridge, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitLightEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[dev_type].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
known_devices: set[int] = set()
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, LIGHT)
)
def _check_device() -> None:
current_devices = set(coordinator.data[LIGHT])
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities(
ComelitLightEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[LIGHT].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
class ComelitLightEntity(ComelitBridgeBaseEntity, LightEntity):

View File

@@ -20,7 +20,6 @@ from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .coordinator import ComelitConfigEntry, ComelitSerialBridge, ComelitVedoSystem
from .entity import ComelitBridgeBaseEntity
from .utils import DeviceType, new_device_listener
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -66,22 +65,24 @@ async def async_setup_bridge_entry(
coordinator = cast(ComelitSerialBridge, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitBridgeSensorEntity(
coordinator, device, config_entry.entry_id, sensor_desc
)
for sensor_desc in SENSOR_BRIDGE_TYPES
for device in coordinator.data[dev_type].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
known_devices: set[int] = set()
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, OTHER)
)
def _check_device() -> None:
current_devices = set(coordinator.data[OTHER])
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities(
ComelitBridgeSensorEntity(
coordinator, device, config_entry.entry_id, sensor_desc
)
for sensor_desc in SENSOR_BRIDGE_TYPES
for device in coordinator.data[OTHER].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
async def async_setup_vedo_entry(
@@ -93,22 +94,24 @@ async def async_setup_vedo_entry(
coordinator = cast(ComelitVedoSystem, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitVedoSensorEntity(
coordinator, device, config_entry.entry_id, sensor_desc
)
for sensor_desc in SENSOR_VEDO_TYPES
for device in coordinator.data["alarm_zones"].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
known_devices: set[int] = set()
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, "alarm_zones")
)
def _check_device() -> None:
current_devices = set(coordinator.data["alarm_zones"])
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities(
ComelitVedoSensorEntity(
coordinator, device, config_entry.entry_id, sensor_desc
)
for sensor_desc in SENSOR_VEDO_TYPES
for device in coordinator.data["alarm_zones"].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
class ComelitBridgeSensorEntity(ComelitBridgeBaseEntity, SensorEntity):

View File

@@ -13,7 +13,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ComelitConfigEntry, ComelitSerialBridge
from .entity import ComelitBridgeBaseEntity
from .utils import DeviceType, bridge_api_call, new_device_listener
from .utils import bridge_api_call
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -28,20 +28,35 @@ async def async_setup_entry(
coordinator = cast(ComelitSerialBridge, config_entry.runtime_data)
def _add_new_entities(new_devices: list[DeviceType], dev_type: str) -> None:
"""Add entities for new monitors."""
entities = [
ComelitSwitchEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[dev_type].values()
if device in new_devices
]
if entities:
async_add_entities(entities)
entities: list[ComelitSwitchEntity] = []
entities.extend(
ComelitSwitchEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[IRRIGATION].values()
)
entities.extend(
ComelitSwitchEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[OTHER].values()
)
async_add_entities(entities)
for dev_type in (IRRIGATION, OTHER):
config_entry.async_on_unload(
new_device_listener(coordinator, _add_new_entities, dev_type)
)
known_devices: dict[str, set[int]] = {
dev_type: set() for dev_type in (IRRIGATION, OTHER)
}
def _check_device() -> None:
for dev_type in (IRRIGATION, OTHER):
current_devices = set(coordinator.data[dev_type])
new_devices = current_devices - known_devices[dev_type]
if new_devices:
known_devices[dev_type].update(new_devices)
async_add_entities(
ComelitSwitchEntity(coordinator, device, config_entry.entry_id)
for device in coordinator.data[dev_type].values()
if device.index in new_devices
)
_check_device()
config_entry.async_on_unload(coordinator.async_add_listener(_check_device))
class ComelitSwitchEntity(ComelitBridgeBaseEntity, SwitchEntity):

View File

@@ -4,11 +4,7 @@ from collections.abc import Awaitable, Callable, Coroutine
from functools import wraps
from typing import Any, Concatenate
from aiocomelit.api import (
ComelitSerialBridgeObject,
ComelitVedoAreaObject,
ComelitVedoZoneObject,
)
from aiocomelit import ComelitSerialBridgeObject
from aiocomelit.exceptions import CannotAuthenticate, CannotConnect, CannotRetrieveData
from aiohttp import ClientSession, CookieJar
@@ -23,11 +19,8 @@ from homeassistant.helpers import (
)
from .const import _LOGGER, DOMAIN
from .coordinator import ComelitBaseCoordinator
from .entity import ComelitBridgeBaseEntity
DeviceType = ComelitSerialBridgeObject | ComelitVedoAreaObject | ComelitVedoZoneObject
async def async_client_session(hass: HomeAssistant) -> ClientSession:
"""Return a new aiohttp session."""
@@ -120,41 +113,3 @@ def bridge_api_call[_T: ComelitBridgeBaseEntity, **_P](
self.coordinator.config_entry.async_start_reauth(self.hass)
return cmd_wrapper
def new_device_listener(
coordinator: ComelitBaseCoordinator,
new_devices_callback: Callable[
[
list[
ComelitSerialBridgeObject
| ComelitVedoAreaObject
| ComelitVedoZoneObject
],
str,
],
None,
],
data_type: str,
) -> Callable[[], None]:
"""Subscribe to coordinator updates to check for new devices."""
known_devices: dict[str, list[int]] = {}
def _check_devices() -> None:
"""Check for new devices and call callback with any new monitors."""
if not coordinator.data:
return
new_devices: list[DeviceType] = []
for _id in coordinator.data[data_type]:
if _id not in (id_list := known_devices.get(data_type, [])):
known_devices.update({data_type: [*id_list, _id]})
new_devices.append(coordinator.data[data_type][_id])
if new_devices:
new_devices_callback(new_devices, data_type)
# Check for devices immediately
_check_devices()
return coordinator.async_add_listener(_check_devices)
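
`new_device_listener()` above re-checks the coordinator data on every update and hands only the newly seen devices to a callback. A stdlib-only sketch of that idea; the `Coordinator` class here is a stand-in, not the Home Assistant DataUpdateCoordinator:

from collections.abc import Callable


class Coordinator:
    def __init__(self) -> None:
        self.data: dict[str, dict[int, str]] = {"alarm_zones": {}}
        self._listeners: list[Callable[[], None]] = []

    def async_add_listener(self, listener: Callable[[], None]) -> Callable[[], None]:
        self._listeners.append(listener)
        return lambda: self._listeners.remove(listener)

    def refresh(self, new_data: dict[str, dict[int, str]]) -> None:
        self.data = new_data
        for listener in self._listeners:
            listener()


def new_device_listener(
    coordinator: Coordinator,
    new_devices_callback: Callable[[list[str], str], None],
    data_type: str,
) -> Callable[[], None]:
    known: set[int] = set()

    def _check_devices() -> None:
        current = set(coordinator.data.get(data_type, {}))
        if new_ids := current - known:
            known.update(new_ids)
            new_devices_callback(
                [coordinator.data[data_type][i] for i in sorted(new_ids)], data_type
            )

    _check_devices()  # pick up devices already present
    return coordinator.async_add_listener(_check_devices)


coordinator = Coordinator()
unsub = new_device_listener(
    coordinator, lambda devs, kind: print(f"new {kind}: {devs}"), "alarm_zones"
)
coordinator.refresh({"alarm_zones": {1: "Entrance", 2: "Garage"}})
coordinator.refresh({"alarm_zones": {1: "Entrance", 2: "Garage", 3: "Garden"}})
unsub()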

View File

@@ -49,7 +49,7 @@ CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the config component."""
frontend.async_register_built_in_panel(
hass, "config", "config", "mdi:cog", require_admin=True
hass, "config", "config", "hass:cog", require_admin=True
)
for panel in SECTIONS:

View File

@@ -6,9 +6,7 @@ from typing import Any
import uuid
from homeassistant.components.automation import DOMAIN as AUTOMATION_DOMAIN
from homeassistant.components.automation.config import ( # pylint: disable=hass-component-root-import
async_validate_config_item,
)
from homeassistant.components.automation.config import async_validate_config_item
from homeassistant.config import AUTOMATION_CONFIG_PATH
from homeassistant.const import CONF_ID, SERVICE_RELOAD
from homeassistant.core import HomeAssistant, callback

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Callable
from http import HTTPStatus
import logging
from typing import Any, NoReturn
from aiohttp import web
@@ -24,12 +23,7 @@ from homeassistant.helpers.data_entry_flow import (
FlowManagerResourceView,
)
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.json import (
JSON_DUMP,
find_paths_unserializable_data,
json_bytes,
json_fragment,
)
from homeassistant.helpers.json import json_fragment
from homeassistant.loader import (
Integration,
IntegrationNotFound,
@@ -37,9 +31,6 @@ from homeassistant.loader import (
async_get_integrations,
async_get_loaded_integration,
)
from homeassistant.util.json import format_unserializable_data
_LOGGER = logging.getLogger(__name__)
@callback
@@ -411,40 +402,18 @@ def config_entries_flow_subscribe(
connection.subscriptions[msg["id"]] = hass.config_entries.flow.async_subscribe_flow(
async_on_flow_init_remove
)
try:
serialized_flows = [
json_bytes({"type": None, "flow_id": flw["flow_id"], "flow": flw})
for flw in hass.config_entries.flow.async_progress()
if flw["context"]["source"]
not in (
config_entries.SOURCE_RECONFIGURE,
config_entries.SOURCE_USER,
)
]
except (ValueError, TypeError):
# If we can't serialize, we'll filter out unserializable flows
serialized_flows = []
for flw in hass.config_entries.flow.async_progress():
if flw["context"]["source"] in (
config_entries.SOURCE_RECONFIGURE,
config_entries.SOURCE_USER,
):
continue
try:
serialized_flows.append(
json_bytes({"type": None, "flow_id": flw["flow_id"], "flow": flw})
)
except (ValueError, TypeError):
_LOGGER.error(
"Unable to serialize to JSON. Bad data found at %s",
format_unserializable_data(
find_paths_unserializable_data(flw, dump=JSON_DUMP)
),
)
continue
connection.send_message(
websocket_api.messages.construct_event_message(
msg["id"], b"".join((b"[", b",".join(serialized_flows), b"]"))
websocket_api.event_message(
msg["id"],
[
{"type": None, "flow_id": flw["flow_id"], "flow": flw}
for flw in hass.config_entries.flow.async_progress()
if flw["context"]["source"]
not in (
config_entries.SOURCE_RECONFIGURE,
config_entries.SOURCE_USER,
)
],
)
)
connection.send_result(msg["id"])
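
One side of the hunk above pre-serializes each flow to JSON bytes individually, so a single unserializable flow is logged and skipped instead of failing the whole subscription payload. A stdlib-only sketch of that strategy, with plain `json` standing in for the `json_bytes` helper:

import json
import logging

_LOGGER = logging.getLogger(__name__)

flows = [
    {"flow_id": "1", "flow": {"handler": "hue"}},
    {"flow_id": "2", "flow": {"handler": object()}},  # not JSON serializable
]

serialized: list[bytes] = []
for flw in flows:
    try:
        serialized.append(
            json.dumps({"type": None, "flow_id": flw["flow_id"], "flow": flw}).encode()
        )
    except (ValueError, TypeError):
        _LOGGER.error("Unable to serialize flow %s, skipping", flw["flow_id"])
        continue

payload = b"".join((b"[", b",".join(serialized), b"]"))
print(payload)  # only the serializable flow remains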

View File

@@ -5,9 +5,7 @@ from __future__ import annotations
from typing import Any
from homeassistant.components.script import DOMAIN as SCRIPT_DOMAIN
from homeassistant.components.script.config import ( # pylint: disable=hass-component-root-import
async_validate_config_item,
)
from homeassistant.components.script.config import async_validate_config_item
from homeassistant.config import SCRIPT_CONFIG_PATH
from homeassistant.const import SERVICE_RELOAD
from homeassistant.core import HomeAssistant, callback

View File

@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/control4",
"iot_class": "local_polling",
"loggers": ["pyControl4"],
"requirements": ["pyControl4==1.5.0"],
"requirements": ["pyControl4==1.2.0"],
"ssdp": [
{
"st": "c4:director"

Some files were not shown because too many files have changed in this diff.