Compare commits

..

68 Commits

Author SHA1 Message Date
Erik Montnemery 0b75f7ed8c Merge branch 'dev' into exclude_ai_task_from_bootstrap 2026-04-20 13:44:06 +02:00
epenet b43d6a70da Fix cookie file suppression in verisure (#168609) 2026-04-20 13:41:41 +02:00
Erik Montnemery b5caabcbae Fix quantum_gateway tests (#168610) 2026-04-20 13:37:37 +02:00
Raphael Hehl 9bb46494d3 unifi: implement parallel-updates quality scale rule (#168563) 2026-04-20 13:26:17 +02:00
Erik Montnemery ca066b94c5 Deprecate legacy device tracker (#168387) 2026-04-20 13:20:36 +02:00
Erik Montnemery 8de6fa63cd Allow passing a set of event types to logbook.async_subscribe_events (#168163) 2026-04-20 13:19:31 +02:00
Erik Montnemery 866f41791a Deprecate support for local installation of dependencies (#168164) 2026-04-20 13:19:13 +02:00
epenet 5b3d2f823f Always load all platforms in sfr_box (#168594)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 13:05:51 +02:00
Ariel Ebersberger e1d38fa237 Fix flaky airtouch5 test for Python 3.14.3 (#168366) 2026-04-20 12:51:18 +02:00
epenet 8eef269ce3 Use runtime_data in upb integration (#168600)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 12:50:22 +02:00
David Bonnes 8afee640ef Remove device ids from extra_state_attrs of Evohome's Button entities (#168517)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-20 12:17:09 +02:00
Maciej Bieniek 10fd51b34d Add ice phenomena sensor to IMGW-PIB integration (#168548) 2026-04-20 12:07:26 +02:00
Copilot a1cde0308a Clarify Copilot review guidance for validated entity action inputs (#168449)
Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2026-04-20 11:56:21 +02:00
Erik 1acabe5c06 Update snapshots 2026-04-20 11:48:30 +02:00
Erik Montnemery 5c14025e70 Sort keys in dict returned by async_get_system_info (#168585) 2026-04-20 11:31:03 +02:00
Erik 55edd74762 Exclude ai_task from bootstrap 2026-04-20 11:22:05 +02:00
epenet 7e5762dcee Use runtime_data in ukraine_alarm integration (#168597)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 11:09:10 +02:00
Franck Nijhof c425b69373 Set parallel updates for Tailscale platforms (#168596) 2026-04-20 11:09:06 +02:00
Thijs W. f73ee29ffb Add seek support to frontier_silicon (#168483) 2026-04-20 11:05:42 +02:00
Erik Montnemery db9c5a6df4 Adjust repair text about unsupported installation method (#168156)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-20 11:03:48 +02:00
J. Nick Koston 00d16864e3 Bump habluetooth to 6.1.0 (#168576) 2026-04-20 04:01:16 -05:00
Erik Montnemery 2fb22e5654 Use hass_tmp_config_dir fixture in device_tracker tests (#168582) 2026-04-20 10:58:23 +02:00
epenet 65e09c3213 Use runtime_data in toon integration (#168591)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 10:51:13 +02:00
SeifEddineMezned 86eece57c8 Fix grammar and clarity in strings.json (#168577) 2026-04-20 10:47:37 +02:00
trevorvey e449e28ff5 Updated H590 input source mapping (#168523) 2026-04-20 10:41:49 +02:00
TheJulianJES 6e5b72ea87 Bump matter-python-client to 0.6.0 (#168312) 2026-04-20 10:39:38 +02:00
dependabot[bot] 450aa6d73b Bump actions/cache from 5.0.4 to 5.0.5 (#168583)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-20 10:39:15 +02:00
dependabot[bot] 953fda87c8 Bump j178/prek-action from 2.0.1 to 2.0.2 (#168584)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-20 10:39:01 +02:00
epenet 4b38b79ac5 Use runtime_data in todoist integration (#168590)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 10:37:58 +02:00
epenet 7acc412902 Use runtime_data in thethingsnetwork integration (#168589)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 10:28:36 +02:00
Nick Berardi bf2364e4cb Add First Alert app selection to Lyric auth (#168427)
Co-authored-by: Erwin Douna <e.douna@gmail.com>
2026-04-20 10:20:34 +02:00
Amit Finkelstein 2a6fba3990 Bump hdate to 1.2.1 (#168538) 2026-04-20 10:10:21 +02:00
epenet 6a8220a9df Use runtime_data in tami4 integration (#168587)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 09:56:18 +02:00
Øyvind Matheson Wergeland b005fb236f Improve nobo_hub config entry setup (#168550)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: Norbert Rittel <norbert@rittel.de>
2026-04-20 09:08:49 +02:00
renovate[bot] 528f7625f4 Update zizmor (#168581) 2026-04-20 08:34:51 +02:00
Franck Nijhof 0358696028 Update tailscale to 0.7.0 (#168544)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 08:31:26 +02:00
Franck Nijhof ca4b4de20e Migrate Tailscale to use runtime_data (#168556) 2026-04-20 08:30:53 +02:00
Ronald van der Meer 34530810db Enable strict typing for Duco integration (#168572) 2026-04-20 08:30:25 +02:00
Michael c201275fef Migrate Zeversolar to use runtime_data (#168574) 2026-04-20 08:29:45 +02:00
Tomer ef2fa67c36 Victron GX: dedupe strings.json (#168460) 2026-04-20 08:26:43 +02:00
Fabian Munkes 0af4dfb7fd Add initial support for PlayerOptions: Select entities to Music Assistant (#167974)
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
2026-04-20 05:14:21 +02:00
Thomas55555 894b3bd6a4 Add suggested uom to mop_drying_remaining_time in roborock (#168516) 2026-04-20 03:13:14 +02:00
Franck Nijhof a317bf9ed1 Remove leftover YAML import code from PVOutput config flow (#168560) 2026-04-19 22:29:34 +02:00
Ronald van der Meer 18a4440668 Handle rate limit error separately in Duco fan platform (#168558) 2026-04-19 22:26:15 +02:00
Franck Nijhof fc45201f93 Update pvo to v3.0.0 (#168561) 2026-04-19 21:53:11 +02:00
Franck Nijhof 829d3da432 Update peblar to v0.5.1 (#168386)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-19 21:43:26 +02:00
Franck Nijhof 94cc1e6aed Migrate Rituals Perfume Genie to use runtime_data (#168564) 2026-04-19 21:40:50 +02:00
Yuval Weiss 746846fa74 Add Broadlink infrared emitter support to native infrared platform (#168385) 2026-04-19 14:54:59 -04:00
J. Nick Koston 59986f2a13 Revert "Bump habluetooth to 6.0.0" (#168552) 2026-04-19 20:14:46 +02:00
Franck Nijhof 025a5d31ae Mark reconfiguration-flow as exempt for Twente Milieu (#168040)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-19 14:22:02 +02:00
Raphael Hehl 77bd066a71 Improve UniFi quality scale: has-entity-name (#168490)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-04-19 14:13:11 +02:00
Erwin Douna 4ac4ead186 Firefly III update installation instructions (#168529) 2026-04-19 14:10:52 +02:00
Maciej Bieniek a532c72459 Bump imgw_pib to 2.1.1 (#168534) 2026-04-19 14:09:27 +02:00
Maciej Bieniek 4a0152b4d7 Bump aioshelly to 13.24.0 (#168533) 2026-04-19 14:09:01 +02:00
Ronald van der Meer 75e9608631 Add zeroconf discovery to Duco integration (#168439) 2026-04-19 14:08:55 +02:00
Simone Chemelli 5301f1d49e Bump aioamazondevices to 13.4.3 (#168536) 2026-04-19 14:06:04 +02:00
Ronald van der Meer 8f75131829 Bump python-duco-client to 0.3.2 (#168528)
Co-authored-by: Erwin Douna <e.douna@gmail.com>
2026-04-19 11:35:04 +02:00
Artem Draft 7ba4b92fa8 Long polling Bravia TV in standby mode (#167364) 2026-04-19 10:30:47 +02:00
David Knowles d9b4d633d2 Bump pydrawise to 2026.4.0 (#168500) 2026-04-19 08:51:46 +02:00
Denis Shulyaka 93b236ff94 Remove Temperature parameter from Anthropic integration (#168504) 2026-04-19 08:51:02 +02:00
superdingo101 2ec1b12a94 Fix local_calendar recurring event creation failing with UNTIL when HA timezone is non-UTC (#167735) 2026-04-18 17:42:57 -07:00
Paulus Schoutsen 0c2dd5b02f Include uid, recurrence_id, and rrule in calendar event subscription data (#168318)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
Co-authored-by: allenporter <6026418+allenporter@users.noreply.github.com>
Co-authored-by: Petar Petrov <MindFreeze@users.noreply.github.com>
2026-04-19 00:30:01 +02:00
Alistair Francis 5b255d476a husqvarna_automower_ble: Wait for product data (#168426)
Signed-off-by: Alistair Francis <alistair@alistair23.me>
2026-04-18 21:18:05 +02:00
Denis Shulyaka 476a04dcb2 Add Claude Opus 4.7 support (#168496) 2026-04-18 16:58:31 +02:00
Denis Shulyaka 2fde105979 Bump anthropic to 0.96.0 (#168487) 2026-04-18 14:47:11 +02:00
David Bonnes 8adc9600f2 Refactor how Evohome allocates names and IDs (#168485)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-18 14:46:48 +02:00
Khole e7d26d8a60 Hive - Bump pyhive-integration to 1.0.9 (#168489) 2026-04-18 14:39:31 +02:00
Denis Shulyaka 9fa2430e5f Get deprecated Anthropic models from SDK (#168464)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2026-04-18 13:19:59 +03:00
178 changed files with 3011 additions and 1184 deletions
+2
@@ -12,6 +12,8 @@ description: Everything you need to know to build, test and review Home Assistan
- When looking for examples, prefer integrations with the platinum or gold quality scale level first.
- Polling intervals are NOT user-configurable. Never add scan_interval, update_interval, or polling frequency options to config flows or config entries.
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in UI. Exception: helper integrations may allow custom names.
- For entity actions and entity services, avoid requesting redundant defensive checks for fields already enforced by Home Assistant validation schemas and entity filters; only request extra guards when values bypass validation or are transformed unsafely.
- When validation guarantees a key is present, prefer direct dictionary indexing (`data["key"]`) over `.get("key")` so invalid assumptions fail fast.
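The indexing guideline above can be sketched with a minimal example (the `brightness` field and scaling function are hypothetical, not from any real integration):

```python
def scale_brightness(data: dict) -> int:
    """Scale a schema-validated brightness value (0-255) to percent."""
    # The action schema guarantees "brightness" is present, so index
    # directly: if that contract is ever broken, a KeyError surfaces the
    # bug immediately instead of .get("brightness") handing None to the
    # arithmetic below and failing later (or not at all).
    return round(data["brightness"] / 255 * 100)

print(scale_brightness({"brightness": 128}))  # -> 50

try:
    scale_brightness({})  # contract violation fails fast
except KeyError as err:
    print(f"missing key: {err}")
```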
The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
+3
@@ -32,6 +32,9 @@ Prefer concrete types (for example, `HomeAssistant`, `MockConfigEntry`, etc.) ov
Integrations with Platinum or Gold level in the Integration Quality Scale reflect a high standard of code quality and maintainability. When looking for examples of something, these are good places to start. The level is indicated in the manifest.json of the integration.
When reviewing entity actions, do not suggest extra defensive checks for input fields that are already validated by Home Assistant's service/action schemas and entity selection filters. Suggest additional guards only when data bypasses those validators or is transformed into a less-safe form.
When validation guarantees a dict key exists, prefer direct key access (`data["key"]`) instead of `.get("key")` so contract violations are surfaced instead of silently masked.
# Skills
+24 -24
@@ -282,7 +282,7 @@ jobs:
echo "::add-matcher::.github/workflows/matchers/check-executables-have-shebangs.json"
echo "::add-matcher::.github/workflows/matchers/codespell.json"
- name: Run prek
uses: j178/prek-action@53276d8b0d10f8b6672aa85b4588c6921d0370cc # v2.0.1
uses: j178/prek-action@cbc2f23eb5539cf20d82d1aabd0d0ecbcc56f4e3 # v2.0.2
env:
PREK_SKIP: no-commit-to-branch,mypy,pylint,gen_requirements_all,hassfest,hassfest-metadata,hassfest-mypy-config,zizmor
RUFF_OUTPUT_FORMAT: github
@@ -303,7 +303,7 @@ jobs:
with:
persist-credentials: false
- name: Run zizmor
uses: j178/prek-action@53276d8b0d10f8b6672aa85b4588c6921d0370cc # v2.0.1
uses: j178/prek-action@cbc2f23eb5539cf20d82d1aabd0d0ecbcc56f4e3 # v2.0.2
with:
extra-args: --all-files zizmor
@@ -366,7 +366,7 @@ jobs:
echo "key=uv-${UV_CACHE_VERSION}-${uv_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
key: >-
@@ -374,7 +374,7 @@ jobs:
needs.info.outputs.python_cache_key }}
- name: Restore uv wheel cache
if: steps.cache-venv.outputs.cache-hit != 'true'
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: ${{ env.UV_CACHE_DIR }}
key: >-
@@ -386,7 +386,7 @@ jobs:
env.HA_SHORT_VERSION }}-
- name: Check if apt cache exists
id: cache-apt-check
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
lookup-only: ${{ steps.cache-venv.outputs.cache-hit == 'true' }}
path: |
@@ -432,7 +432,7 @@ jobs:
fi
- name: Save apt cache
if: steps.cache-apt-check.outputs.cache-hit != 'true'
uses: actions/cache/save@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/save@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -486,7 +486,7 @@ jobs:
&& github.event.inputs.audit-licenses-only != 'true'
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -517,7 +517,7 @@ jobs:
check-latest: true
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -554,7 +554,7 @@ jobs:
check-latest: true
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -645,7 +645,7 @@ jobs:
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -696,7 +696,7 @@ jobs:
check-latest: true
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -749,7 +749,7 @@ jobs:
check-latest: true
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -806,7 +806,7 @@ jobs:
echo "key=mypy-${MYPY_CACHE_VERSION}-${mypy_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -814,7 +814,7 @@ jobs:
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }}
- name: Restore mypy cache
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: .mypy_cache
key: >-
@@ -856,7 +856,7 @@ jobs:
- base
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -889,7 +889,7 @@ jobs:
check-latest: true
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -932,7 +932,7 @@ jobs:
group: ${{ fromJson(needs.info.outputs.test_groups) }}
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -966,7 +966,7 @@ jobs:
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -1084,7 +1084,7 @@ jobs:
mariadb-group: ${{ fromJson(needs.info.outputs.mariadb_groups) }}
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -1119,7 +1119,7 @@ jobs:
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -1242,7 +1242,7 @@ jobs:
postgresql-group: ${{ fromJson(needs.info.outputs.postgresql_groups) }}
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -1279,7 +1279,7 @@ jobs:
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
@@ -1425,7 +1425,7 @@ jobs:
group: ${{ fromJson(needs.info.outputs.test_groups) }}
steps:
- name: Restore apt cache
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: |
${{ env.APT_CACHE_DIR }}
@@ -1459,7 +1459,7 @@ jobs:
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
uses: actions/cache/restore@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5.0.5
with:
path: venv
fail-on-cache-miss: true
+1 -1
@@ -18,7 +18,7 @@ repos:
exclude_types: [csv, json, html]
exclude: ^tests/fixtures/|homeassistant/generated/|tests/components/.*/snapshots/
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.23.1
rev: v1.24.0
hooks:
- id: zizmor
args:
+1
@@ -178,6 +178,7 @@ homeassistant.components.dropbox.*
homeassistant.components.droplet.*
homeassistant.components.dsmr.*
homeassistant.components.duckdns.*
homeassistant.components.duco.*
homeassistant.components.dunehd.*
homeassistant.components.duotecno.*
homeassistant.components.easyenergy.*
+3
@@ -22,3 +22,6 @@ Prefer concrete types (for example, `HomeAssistant`, `MockConfigEntry`, etc.) ov
## Good practices
Integrations with Platinum or Gold level in the Integration Quality Scale reflect a high standard of code quality and maintainability. When looking for examples of something, these are good places to start. The level is indicated in the manifest.json of the integration.
When reviewing entity actions, do not suggest extra defensive checks for input fields that are already validated by Home Assistant's service/action schemas and entity selection filters. Suggest additional guards only when data bypasses those validators or is transformed into a less-safe form.
When validation guarantees a dict key exists, prefer direct key access (`data["key"]`) instead of `.get("key")` so contract violations are surfaced instead of silently masked.
+5 -2
@@ -238,9 +238,12 @@ DEFAULT_INTEGRATIONS = {
"timer",
#
# Base platforms:
# Note: Calendar and todo are not included to prevent them from registering
# Note:
# - AI task is not included to not give the perception that AI functionality
# is mandatory with Home Assistant.
# - Calendar and todo are not included to prevent them from registering
# their frontend panels when there are no calendar or todo integrations.
*(BASE_PLATFORMS - {"calendar", "todo"}),
*(BASE_PLATFORMS - {"ai_task", "calendar", "todo"}),
#
# Integrations providing triggers and conditions for base platforms:
"air_quality",
@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==13.4.1"]
"requirements": ["aioamazondevices==13.4.3"]
}
+17 -10
@@ -2,6 +2,8 @@
from __future__ import annotations
from anthropic.resources.messages.messages import DEPRECATED_MODELS
from homeassistant.config_entries import ConfigSubentry
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
@@ -13,13 +15,7 @@ from homeassistant.helpers import (
)
from homeassistant.helpers.typing import ConfigType
from .const import (
CONF_CHAT_MODEL,
DEFAULT_CONVERSATION_NAME,
DEPRECATED_MODELS,
DOMAIN,
LOGGER,
)
from .const import CONF_CHAT_MODEL, DEFAULT_CONVERSATION_NAME, DOMAIN, LOGGER
from .coordinator import AnthropicConfigEntry, AnthropicCoordinator
PLATFORMS = (Platform.AI_TASK, Platform.CONVERSATION)
@@ -44,9 +40,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: AnthropicConfigEntry) ->
entry.async_on_unload(entry.add_update_listener(async_update_options))
for subentry in entry.subentries.values():
if (model := subentry.data.get(CONF_CHAT_MODEL)) and model.startswith(
tuple(DEPRECATED_MODELS)
):
if (model := subentry.data.get(CONF_CHAT_MODEL)) and model in DEPRECATED_MODELS:
ir.async_create_issue(
hass,
DOMAIN,
@@ -236,6 +230,19 @@ async def async_migrate_entry(hass: HomeAssistant, entry: AnthropicConfigEntry)
)
hass.config_entries.async_update_entry(entry, minor_version=3)
if entry.version == 2 and entry.minor_version == 3:
# Remove Temperature parameter
CONF_TEMPERATURE = "temperature"
for subentry in entry.subentries.values():
data = subentry.data.copy()
if CONF_TEMPERATURE not in data:
continue
data.pop(CONF_TEMPERATURE, None)
hass.config_entries.async_update_subentry(entry, subentry, data=data)
hass.config_entries.async_update_entry(entry, minor_version=4)
LOGGER.debug(
"Migration to version %s:%s successful", entry.version, entry.minor_version
)
@@ -50,7 +50,6 @@ from .const import (
CONF_PROMPT,
CONF_PROMPT_CACHING,
CONF_RECOMMENDED,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_THINKING_EFFORT,
CONF_TOOL_SEARCH,
@@ -109,7 +108,7 @@ class AnthropicConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Anthropic."""
VERSION = 2
MINOR_VERSION = 3
MINOR_VERSION = 4
async def async_step_user(
self, user_input: dict[str, Any] | None = None
@@ -324,10 +323,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
): SelectSelector(
SelectSelectorConfig(options=self._get_model_list(), custom_value=True)
),
vol.Optional(
CONF_TEMPERATURE,
default=DEFAULT[CONF_TEMPERATURE],
): NumberSelector(NumberSelectorConfig(min=0, max=1, step=0.05)),
vol.Optional(
CONF_PROMPT_CACHING,
default=DEFAULT[CONF_PROMPT_CACHING],
@@ -431,6 +426,8 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
effort_options.append("medium")
if effort_capability.high.supported:
effort_options.append("high")
if effort_capability.xhigh and effort_capability.xhigh.supported:
effort_options.append("xhigh")
if effort_capability.max.supported:
effort_options.append("max")
step_schema[
@@ -15,7 +15,6 @@ CONF_CHAT_MODEL = "chat_model"
CONF_CODE_EXECUTION = "code_execution"
CONF_MAX_TOKENS = "max_tokens"
CONF_PROMPT_CACHING = "prompt_caching"
CONF_TEMPERATURE = "temperature"
CONF_THINKING_BUDGET = "thinking_budget"
CONF_THINKING_EFFORT = "thinking_effort"
CONF_TOOL_SEARCH = "tool_search"
@@ -43,7 +42,6 @@ DEFAULT = {
CONF_CODE_EXECUTION: False,
CONF_MAX_TOKENS: 3000,
CONF_PROMPT_CACHING: PromptCaching.PROMPT.value,
CONF_TEMPERATURE: 1.0,
CONF_THINKING_BUDGET: MIN_THINKING_BUDGET,
CONF_THINKING_EFFORT: "low",
CONF_TOOL_SEARCH: False,
@@ -64,7 +62,3 @@ TOOL_SEARCH_UNSUPPORTED_MODELS = [
"claude-3",
"claude-haiku",
]
DEPRECATED_MODELS = [
"claude-3",
]
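The hunks above replace the old `startswith(tuple(DEPRECATED_MODELS))` prefix check with exact membership, now that `DEPRECATED_MODELS` comes from the Anthropic SDK. A minimal sketch of the behavioral difference (the mapping contents below are hypothetical, though the repair flow's `DEPRECATED_MODELS[model]` lookup implies a model-id-to-retirement-date mapping):

```python
# Hypothetical stand-in for anthropic's DEPRECATED_MODELS mapping.
DEPRECATED_MODELS = {"claude-3-opus-20240229": "2026-01-05"}

def is_deprecated(model: str) -> bool:
    # Exact membership: unlike the old prefix match against "claude-3",
    # a newer model such as "claude-3-5-sonnet-latest" is not flagged
    # just because it shares a prefix with a deprecated one.
    return model in DEPRECATED_MODELS

print(is_deprecated("claude-3-opus-20240229"))   # True
print(is_deprecated("claude-3-5-sonnet-latest")) # False
```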
+4 -9
@@ -98,7 +98,6 @@ from .const import (
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
CONF_PROMPT_CACHING,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
CONF_THINKING_EFFORT,
CONF_TOOL_SEARCH,
@@ -762,13 +761,12 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
CONF_THINKING_EFFORT, DEFAULT[CONF_THINKING_EFFORT]
)
if thinking_effort != "none":
model_args["thinking"] = ThinkingConfigAdaptiveParam(type="adaptive")
model_args["thinking"] = ThinkingConfigAdaptiveParam(
type="adaptive", display="summarized"
)
model_args["output_config"] = OutputConfigParam(effort=thinking_effort)
else:
model_args["thinking"] = ThinkingConfigDisabledParam(type="disabled")
model_args["temperature"] = options.get(
CONF_TEMPERATURE, DEFAULT[CONF_TEMPERATURE]
)
else:
thinking_budget = options.get(
CONF_THINKING_BUDGET, DEFAULT[CONF_THINKING_BUDGET]
@@ -779,13 +777,10 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
and thinking_budget >= MIN_THINKING_BUDGET
):
model_args["thinking"] = ThinkingConfigEnabledParam(
type="enabled", budget_tokens=thinking_budget
type="enabled", display="summarized", budget_tokens=thinking_budget
)
else:
model_args["thinking"] = ThinkingConfigDisabledParam(type="disabled")
model_args["temperature"] = options.get(
CONF_TEMPERATURE, DEFAULT[CONF_TEMPERATURE]
)
if (
self.model_info.capabilities
@@ -9,5 +9,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "silver",
"requirements": ["anthropic==0.92.0"]
"requirements": ["anthropic==0.96.0"]
}
@@ -6,6 +6,7 @@ from collections.abc import Iterator
from typing import TYPE_CHECKING
import anthropic
from anthropic.resources.messages.messages import DEPRECATED_MODELS
import voluptuous as vol
from homeassistant import data_entry_flow
@@ -19,7 +20,7 @@ from homeassistant.helpers.selector import (
SelectSelectorConfig,
)
from .const import CONF_CHAT_MODEL, DEPRECATED_MODELS, DOMAIN
from .const import CONF_CHAT_MODEL, DOMAIN
from .coordinator import model_alias
if TYPE_CHECKING:
@@ -63,7 +64,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
model_list = [
model_option
for model_option in await self.get_model_list(client)
if not model_option["value"].startswith(tuple(DEPRECATED_MODELS))
if model_option["value"] not in DEPRECATED_MODELS
]
self._model_list_cache[entry.entry_id] = model_list
@@ -105,6 +106,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
"model": model,
"subentry_name": subentry.title,
"subentry_type": self._format_subentry_type(subentry.subentry_type),
"retirement_date": DEPRECATED_MODELS[model],
},
)
@@ -131,7 +133,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
continue
for subentry in entry.subentries.values():
model = subentry.data.get(CONF_CHAT_MODEL)
if model and model.startswith(tuple(DEPRECATED_MODELS)):
if model and model in DEPRECATED_MODELS:
yield entry.entry_id, subentry.subentry_id
async def _async_next_target(
@@ -158,7 +160,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
continue
model = subentry.data.get(CONF_CHAT_MODEL)
if not model or not model.startswith(tuple(DEPRECATED_MODELS)):
if not model or model not in DEPRECATED_MODELS:
continue
self._current_entry_id = entry_id
@@ -219,7 +219,7 @@
"data_description": {
"chat_model": "Select the new model to use."
},
"description": "You are updating {subentry_name} ({subentry_type}) in {entry_name}. The current model {model} is deprecated. Select a supported model to continue.",
"description": "You are updating {subentry_name} ({subentry_type}) in {entry_name}. The current model {model} is deprecated and will reach end-of-life on {retirement_date}. Select a supported model to continue.",
"title": "Update model"
}
}
@@ -241,7 +241,8 @@
"low": "[%key:common::state::low%]",
"max": "Max",
"medium": "[%key:common::state::medium%]",
"none": "None"
"none": "None",
"xhigh": "X-High"
}
}
}
@@ -21,6 +21,6 @@
"bluetooth-auto-recovery==1.5.3",
"bluetooth-data-tools==1.28.4",
"dbus-fast==4.0.4",
"habluetooth==6.0.0"
"habluetooth==6.1.0"
]
}
@@ -7,9 +7,11 @@ from typing import Final
from aiohttp import CookieJar
from pybravia import BraviaClient
from homeassistant.components import ssdp
from homeassistant.const import CONF_HOST, CONF_MAC, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_create_clientsession
from homeassistant.helpers.service_info.ssdp import SsdpServiceInfo
from .const import CONF_USE_SSL
from .coordinator import BraviaTVConfigEntry, BraviaTVCoordinator
@@ -46,6 +48,19 @@ async def async_setup_entry(
await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)
async def async_ssdp_callback(
discovery_info: SsdpServiceInfo, change: ssdp.SsdpChange
) -> None:
await coordinator.async_request_refresh()
config_entry.async_on_unload(
await ssdp.async_register_callback(
hass,
async_ssdp_callback,
{"nt": "urn:schemas-upnp-org:device:MediaRenderer:1", "_host": host},
)
)
return True
@@ -173,6 +173,9 @@ class BraviaTVCoordinator(DataUpdateCoordinator[None]):
power_status = await self.client.get_power_status()
self.is_on = power_status == "active"
self.skipped_updates = 0
self.update_interval = (
timedelta(seconds=120) if power_status == "standby" else SCAN_INTERVAL
)
if not self.system_info:
self.system_info = await self.client.get_system_info()
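The coordinator hunk above slows polling to 120 seconds while the TV reports standby. The interval-selection logic in isolation (the 10-second `SCAN_INTERVAL` is a placeholder, not the integration's actual constant):

```python
from datetime import timedelta

SCAN_INTERVAL = timedelta(seconds=10)  # hypothetical active-mode interval

def next_interval(power_status: str) -> timedelta:
    # Mirrors the coordinator change: long-poll every 120 s in standby,
    # otherwise fall back to the normal scan interval.
    return timedelta(seconds=120) if power_status == "standby" else SCAN_INTERVAL

print(next_interval("standby").total_seconds())  # -> 120.0
print(next_interval("active") == SCAN_INTERVAL)  # -> True
```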
@@ -6,6 +6,7 @@ DOMAIN = "broadlink"
DOMAINS_AND_TYPES = {
Platform.CLIMATE: {"HYS"},
Platform.INFRARED: {"RM4MINI", "RM4PRO", "RMMINI", "RMMINIB", "RMPRO"},
Platform.LIGHT: {"LB1", "LB2"},
Platform.REMOTE: {"RM4MINI", "RM4PRO", "RMMINI", "RMMINIB", "RMPRO"},
Platform.SELECT: {"HYS"},
@@ -44,3 +45,6 @@ DEVICE_TYPES = set.union(*DOMAINS_AND_TYPES.values())
DEFAULT_PORT = 80
DEFAULT_TIMEOUT = 5
# Broadlink IR packet format - repeat count byte offset
IR_PACKET_REPEAT_INDEX = 1
@@ -0,0 +1,184 @@
"""Infrared platform for Broadlink remotes."""
from __future__ import annotations
from typing import TYPE_CHECKING
from broadlink.exceptions import BroadlinkException
from broadlink.remote import pulses_to_data as _bl_pulses_to_data
import infrared_protocols
from homeassistant.components.infrared import InfraredCommand, InfraredEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, IR_PACKET_REPEAT_INDEX
from .entity import BroadlinkEntity
if TYPE_CHECKING:
from .device import BroadlinkDevice
PARALLEL_UPDATES = 1
class BroadlinkIRCommand(InfraredCommand):
"""Raw IR command with optional Broadlink hardware repeat count.
This class lets you send raw timing data through a Broadlink infrared
entity. The repeat_count maps directly to the Broadlink packet repeat
byte: the device will re-transmit the entire IR burst that many
additional times after the first transmission.
Use this when you have existing Broadlink-encoded IR data (e.g. from
IR code databases like SmartIR) and want to use it with the new
infrared platform.
Protocol-aware commands (infrared_protocols.NECCommand, LgTVCommand,
etc.) manage repeats *inside* get_raw_timings() and should use the
default repeat=0. Only BroadlinkIRCommand should set hardware repeat.
Example: Migrating IR code database base64 codes to the infrared platform:
import base64
from broadlink.remote import data_to_pulses
from homeassistant.components.broadlink.infrared import BroadlinkIRCommand
from homeassistant.components.broadlink.const import IR_PACKET_REPEAT_INDEX
# Decode base64 IR code (e.g. from IR code database)
packet_data = base64.b64decode(b64_code)
repeat_count = packet_data[IR_PACKET_REPEAT_INDEX]
# Parse Broadlink packet to microsecond timings
pulses = data_to_pulses(packet_data)
timings = list(zip(pulses[::2], pulses[1::2]))
if len(pulses) % 2:
timings.append((pulses[-1], 0))
# Create command
cmd = BroadlinkIRCommand(timings, repeat_count=repeat_count)
await infrared.async_send_command(hass, entity_id, cmd)
"""
# Standard IR carrier frequency. Broadlink hardware handles the carrier
# internally, so this value is informational only.
MODULATION = 38000
def __init__(
self,
timings: list[tuple[int, int]],
repeat_count: int = 0,
) -> None:
"""Initialize with timing pairs and optional repeat count.
Args:
timings: List of (mark_us, space_us) pairs in microseconds.
repeat_count: Broadlink hardware repeat count (0 = send once).
Must be 0-255 (the hardware repeat byte is a single unsigned byte).
Raises:
ValueError: If repeat_count is outside the 0-255 range.
"""
if not 0 <= repeat_count <= 255:
raise ValueError(f"repeat_count must be 0-255, got {repeat_count}")
super().__init__(modulation=self.MODULATION, repeat_count=repeat_count)
self._timings = [
infrared_protocols.Timing(high_us=high, low_us=low) for high, low in timings
]
def get_raw_timings(self) -> list[infrared_protocols.Timing]:
"""Return timing pairs for transmission."""
return self._timings
def timings_to_broadlink_packet(
timings: list[tuple[int, int]],
repeat: int = 0,
) -> bytes:
"""Convert raw timing pairs (high_us, low_us) to a Broadlink IR packet.
Args:
timings: List of (mark_us, space_us) pairs in microseconds.
repeat: Number of extra repeats (0 = send once).
Returns:
Binary packet ready for Broadlink send_data().
"""
if not 0 <= repeat <= 255:
raise ValueError(f"repeat must be 0-255, got {repeat}")
# Flatten (mark, space) pairs into a pulse list, omitting any zero-length spaces
pulses: list[int] = []
for high_us, low_us in timings:
pulses.append(high_us)
if low_us:
pulses.append(low_us)
# Use broadlink library's encoder (tick=32.84 µs)
packet = bytearray(_bl_pulses_to_data(pulses))
packet[IR_PACKET_REPEAT_INDEX] = repeat
return bytes(packet)
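Taken on its own, the pulse-flattening and repeat-byte patching done by `timings_to_broadlink_packet` can be sketched without the broadlink dependency (helper names below are illustrative; the real function feeds the pulse list to `broadlink.remote.pulses_to_data` before patching the repeat byte):

```python
IR_PACKET_REPEAT_INDEX = 1  # repeat-count byte offset, as in the diff's const.py


def flatten_timings(timings: list[tuple[int, int]]) -> list[int]:
    """Flatten (mark_us, space_us) pairs, omitting zero-length spaces."""
    pulses: list[int] = []
    for high_us, low_us in timings:
        pulses.append(high_us)
        if low_us:
            pulses.append(low_us)
    return pulses


def set_hardware_repeat(packet: bytes, repeat: int) -> bytes:
    """Patch the hardware repeat count into an already-encoded packet."""
    if not 0 <= repeat <= 255:
        raise ValueError(f"repeat must be 0-255, got {repeat}")
    buf = bytearray(packet)
    buf[IR_PACKET_REPEAT_INDEX] = repeat
    return bytes(buf)


# A NEC-style header pair plus one mark with no trailing space.
print(flatten_timings([(9000, 4500), (560, 0)]))  # → [9000, 4500, 560]
```

Note how a zero-length space is simply dropped rather than emitted, matching the loop in the function above.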
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Broadlink infrared entity."""
device = hass.data[DOMAIN].devices[config_entry.entry_id]
async_add_entities([BroadlinkInfraredEntity(device)])
class BroadlinkInfraredEntity(BroadlinkEntity, InfraredEntity):
"""Broadlink infrared transmitter entity."""
_attr_has_entity_name = True
_attr_translation_key = "infrared"
def __init__(self, device: BroadlinkDevice) -> None:
"""Initialize the entity."""
super().__init__(device)
self._attr_unique_id = f"{device.unique_id}-infrared"
async def async_send_command(self, command: InfraredCommand) -> None:
"""Send an IR command via the Broadlink device.
Handles two types of repeat behavior:
1. Protocol-aware commands (NECCommand, etc.): These encode repeats
(like NEC repeat codes) inside their get_raw_timings() data. The
Broadlink packet is sent with repeat=0.
2. BroadlinkIRCommand: Carries Broadlink hardware repeat count,
which tells the device to re-transmit the entire burst N times.
This is used for protocols/commands that need multiple full frame
transmissions (e.g. legacy SmartIR data).
Using an isinstance check ensures protocol-level repeats (already in
the timing data) are not conflated with hardware repeats.
"""
timings = [
(timing.high_us, timing.low_us) for timing in command.get_raw_timings()
]
# Only BroadlinkIRCommand uses Broadlink hardware repeat. Protocol-aware
# commands (NECCommand, etc.) encode repeats inside get_raw_timings()
# and must use hardware repeat=0 to avoid double-repeating.
if isinstance(command, BroadlinkIRCommand):
repeat = command.repeat_count
else:
repeat = 0
packet = timings_to_broadlink_packet(timings, repeat=repeat)
try:
await self._device.async_request(self._device.api.send_data, packet)
except (BroadlinkException, OSError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="send_command_failed",
translation_placeholders={"error": str(err)},
) from err
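The repeat dispatch described in the docstring above can be illustrated standalone (the class names below are stand-ins for the real `InfraredCommand` hierarchy, not part of the diff):

```python
class Command:
    """Stand-in for a protocol-aware command; repeats live in its timings."""

    repeat_count = 0


class RawCommand(Command):
    """Stand-in for BroadlinkIRCommand, carrying a hardware repeat count."""

    def __init__(self, repeat_count: int) -> None:
        self.repeat_count = repeat_count


def hardware_repeat(command: Command) -> int:
    """Only raw commands pass their repeat count through to the hardware."""
    return command.repeat_count if isinstance(command, RawCommand) else 0


print(hardware_repeat(RawCommand(3)), hardware_repeat(Command()))  # → 3 0
```

A protocol-aware command always resolves to hardware repeat 0, so its in-timings repeats are never doubled by the device.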
@@ -3,6 +3,7 @@
"name": "Broadlink",
"codeowners": ["@danielhiversen", "@felipediel", "@L-I-Am", "@eifinger"],
"config_flow": true,
"dependencies": ["infrared"],
"dhcp": [
{
"registered_devices": true
@@ -49,6 +49,11 @@
}
},
"entity": {
"infrared": {
"infrared": {
"name": "IR transmitter"
}
},
"select": {
"day_of_week": {
"name": "Day of week",
@@ -77,5 +82,10 @@
"name": "Total consumption"
}
}
},
"exceptions": {
"send_command_failed": {
"message": "Failed to send IR command: {error}"
}
}
}
@@ -738,10 +738,7 @@ class CalendarEntity(Entity):
listener(None)
return
event_list: list[JsonValueType] = [
dataclasses.asdict(event, dict_factory=_list_events_dict_factory)
for event in events
]
event_list: list[JsonValueType] = [event.as_dict() for event in events]
listener(event_list)
async def async_get_events(
@@ -6,6 +6,7 @@ import asyncio
from collections.abc import Callable, Coroutine, Sequence
from datetime import datetime, timedelta
import hashlib
import logging
from types import ModuleType
from typing import Any, Final, Protocol, final
@@ -82,6 +83,8 @@ from .const import (
SourceType,
)
_LOGGER = logging.getLogger(__name__)
SERVICE_SEE: Final = "see"
SOURCE_TYPES = [cls.value for cls in SourceType]
@@ -128,6 +131,8 @@ SERVICE_SEE_PAYLOAD_SCHEMA: Final[vol.Schema] = vol.Schema(
YAML_DEVICES: Final = "known_devices.yaml"
EVENT_NEW_DEVICE: Final = "device_tracker_new_device"
DATA_LEGACY_TRACKERS: Final = "device_tracker.legacy_trackers"
class SeeCallback(Protocol):
"""Protocol type for DeviceTracker.see callback."""
@@ -243,8 +248,19 @@ async def _async_setup_integration(
tracker = await get_tracker(hass, config)
tracker_future.set_result(tracker)
warned_called_see = False
async def async_see_service(call: ServiceCall) -> None:
"""Service to see a device."""
nonlocal warned_called_see
if not warned_called_see:
_LOGGER.warning(
"The %s.%s action is deprecated and will be removed in "
"Home Assistant Core 2027.5",
DOMAIN,
SERVICE_SEE,
)
warned_called_see = True
# Temp workaround for iOS, introduced in 0.65
data = dict(call.data)
data.pop("hostname", None)
@@ -327,6 +343,18 @@ class DeviceTrackerPlatform:
try:
scanner = None
setup: bool | None = None
legacy_trackers = hass.data.setdefault(DATA_LEGACY_TRACKERS, set())
if full_name not in legacy_trackers:
legacy_trackers.add(full_name)
_LOGGER.warning(
"The legacy device tracker platform %s is being set up; legacy "
"device trackers are deprecated and will be removed in Home "
"Assistant Core 2027.5, please migrate to an integration which "
"uses a modern config entry based device tracker",
full_name,
)
if hasattr(self.platform, "async_get_scanner"):
scanner = await self.platform.async_get_scanner(
hass, {DOMAIN: self.config}
@@ -13,6 +13,7 @@ from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import DOMAIN
@@ -31,6 +32,46 @@ class DucoConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 1
MINOR_VERSION = 1
_host: str
_box_name: str
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
"""Handle zeroconf discovery."""
try:
box_name, mac = await self._validate_input(discovery_info.host)
except DucoConnectionError:
return self.async_abort(reason="cannot_connect")
except DucoError:
_LOGGER.exception("Unexpected error discovering Duco box via zeroconf")
return self.async_abort(reason="unknown")
await self.async_set_unique_id(format_mac(mac))
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
self._host = discovery_info.host
self._box_name = box_name
self.context["title_placeholders"] = {"name": box_name}
return await self.async_step_discovery_confirm()
async def async_step_discovery_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm discovery."""
if user_input is not None:
return self.async_create_entry(
title=self._box_name,
data={CONF_HOST: self._host},
)
self._set_confirm_only()
return self.async_show_form(
step_id="discovery_confirm",
description_placeholders={"name": self._box_name},
)
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -46,7 +87,7 @@ class DucoConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected error connecting to Duco box")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(format_mac(mac))
await self.async_set_unique_id(format_mac(mac), raise_on_progress=False)
self._abort_if_unique_id_configured()
return self.async_create_entry(
@@ -2,7 +2,9 @@
from __future__ import annotations
from duco.exceptions import DucoError
import logging
from duco.exceptions import DucoError, DucoRateLimitError
from duco.models import Node, NodeType, VentilationState
from homeassistant.components.fan import FanEntity, FanEntityFeature
@@ -15,6 +17,8 @@ from .const import DOMAIN
from .coordinator import DucoConfigEntry, DucoCoordinator
from .entity import DucoEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 1
# Permanent speed states ordered low → high.
@@ -118,6 +122,12 @@ class DucoVentilationFanEntity(DucoEntity, FanEntity):
await self.coordinator.client.async_set_ventilation_state(
self._node_id, state
)
except DucoRateLimitError as err:
_LOGGER.warning("Duco write rate limit exceeded for node %s", self._node_id)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="rate_limit_exceeded",
) from err
except DucoError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
@@ -8,5 +8,11 @@
"iot_class": "local_polling",
"loggers": ["duco"],
"quality_scale": "bronze",
"requirements": ["python-duco-client==0.3.1"]
"requirements": ["python-duco-client==0.3.2"],
"zeroconf": [
{
"name": "duco [[][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][]].*",
"type": "_http._tcp.local."
}
]
}
@@ -46,18 +46,8 @@ rules:
# Gold
devices: done
diagnostics: done
discovery-update-info:
status: todo
comment: >-
DHCP host updating to be implemented in a follow-up PR.
The device hostname follows the pattern duco_<last 6 chars of MAC>
(e.g. duco_061293), which can be used for DHCP hostname matching.
discovery:
status: todo
comment: >-
Device can be discovered via DHCP. The hostname follows the pattern
duco_<last 6 chars of MAC> (e.g. duco_061293). To be implemented
in a follow-up PR together with discovery-update-info.
discovery-update-info: done
discovery: done
docs-data-update: done
docs-examples: done
docs-known-limitations: done
@@ -86,4 +76,4 @@ rules:
# Platinum
async-dependency: done
inject-websession: done
strict-typing: todo
strict-typing: done
@@ -1,13 +1,19 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"discovery_confirm": {
"description": "Do you want to set up {name}?"
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
@@ -70,6 +76,9 @@
},
"failed_to_set_state": {
"message": "Failed to set ventilation state: {error}"
},
"rate_limit_exceeded": {
"message": "The Duco device has reached its daily write limit. Try again tomorrow."
}
}
}
@@ -9,10 +9,11 @@ from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import EVOHOME_DATA
from .coordinator import EvoDataUpdateCoordinator
from .entity import EvoEntity, is_valid_zone, unique_zone_id
from .entity import is_valid_zone, unique_zone_id
async def async_setup_platform(
@@ -40,15 +41,22 @@ async def async_setup_platform(
async_add_entities(entities)
for entity in entities:
await entity.update_attrs()
class EvoResetButtonBase(EvoEntity, ButtonEntity):
"""Base for reset button entities."""
class EvoResetButtonBase(CoordinatorEntity[EvoDataUpdateCoordinator], ButtonEntity):
"""Base for Evohome's Button entities."""
_attr_entity_category = EntityCategory.CONFIG
_evo_state_attr_names = ()
_evo_device: evo.ControlSystem | evo.HotWater | evo.Zone
def __init__(
self,
coordinator: EvoDataUpdateCoordinator,
evo_device: evo.ControlSystem | evo.HotWater | evo.Zone,
) -> None:
"""Initialize an Evohome reset button entity."""
super().__init__(coordinator, context=evo_device.id)
self._evo_device = evo_device
async def async_press(self) -> None:
"""Reset the Evohome entity to its base operating mode."""
@@ -58,10 +66,7 @@ class EvoResetButtonBase(EvoEntity, ButtonEntity):
class EvoResetSystemButton(EvoResetButtonBase):
"""Button entity for system reset."""
_attr_translation_key = "reset_system_mode"
_evo_device: evo.ControlSystem
_evo_id_attr = "system_id"
def __init__(
self,
@@ -78,10 +83,7 @@ class EvoResetSystemButton(EvoResetButtonBase):
class EvoResetDhwButton(EvoResetButtonBase):
"""Button entity for DHW override reset."""
_attr_translation_key = "clear_dhw_override"
_evo_device: evo.HotWater
_evo_id_attr = "dhw_id"
def __init__(
self,
@@ -98,10 +100,7 @@ class EvoResetDhwButton(EvoResetButtonBase):
class EvoResetZoneButton(EvoResetButtonBase):
"""Button entity for zone override reset."""
_attr_translation_key = "clear_zone_override"
_evo_device: evo.Zone
_evo_id_attr = "zone_id"
def __init__(
self,
@@ -40,11 +40,7 @@ def unique_zone_id(evo_device: evo.Zone) -> str:
class EvoEntity(CoordinatorEntity[EvoDataUpdateCoordinator]):
"""Base for any evohome-compatible entity (controller, DHW, zone).
This includes the controller, (1 to 12) heating zones and (optionally) a
DHW controller.
"""
"""Base for Evohome's Climate & WaterHeater entities."""
_evo_device: evo.ControlSystem | evo.HotWater | evo.Zone
_evo_id_attr: str
@@ -19,7 +19,7 @@
"data_description": {
"api_key": "The new API access token for authenticating with Firefly III"
},
"description": "The access token for your Firefly III instance is invalid and needs to be updated. Go to **Options > Profile** and select the **OAuth** tab. Create a new personal access token and copy it (it will only display once)."
"description": "The access token for your Firefly III instance is invalid and needs to be updated. Go to **Options > Remote access and tokens**. Create a new personal access token and copy it (it will only display once)."
},
"reconfigure": {
"data": {
@@ -46,7 +46,7 @@
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "Verify the SSL certificate of the Firefly III instance"
},
"description": "You can create an API key in the Firefly III UI. Go to **Options > Profile** and select the **OAuth** tab. Create a new personal access token and copy it (it will only display once)."
"description": "You can create an API key in the Firefly III UI. Go to **Options > Remote access and tokens**. Create a new personal access token and copy it (it will only display once)."
}
}
},
@@ -26,6 +26,7 @@ from homeassistant.components.media_player import (
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util import dt as dt_util
from . import FrontierSiliconConfigEntry
from .browse_media import browse_node, browse_top_level
@@ -118,7 +119,8 @@ class AFSAPIDevice(MediaPlayerEntity):
features |= MediaPlayerEntityFeature.REPEAT_SET
if self.__play_caps & PlayCaps.SHUFFLE:
features |= MediaPlayerEntityFeature.SHUFFLE_SET
if self.__play_caps & PlayCaps.SEEK:
features |= MediaPlayerEntityFeature.SEEK
if self._supports_sound_mode:
features |= MediaPlayerEntityFeature.SELECT_SOUND_MODE
@@ -223,6 +225,21 @@ class AFSAPIDevice(MediaPlayerEntity):
self._attr_is_volume_muted = await afsapi.get_mute()
self._attr_media_image_url = await afsapi.get_play_graphic()
if self.__play_caps and self.__play_caps & PlayCaps.SEEK:
position_ms = await afsapi.get_play_position()
duration_ms = await afsapi.get_play_duration()
self._attr_media_position = (
position_ms // 1000 if position_ms is not None else None
)
self._attr_media_duration = (
duration_ms // 1000 if duration_ms is not None else None
)
self._attr_media_position_updated_at = dt_util.utcnow()
else:
self._attr_media_position = None
self._attr_media_duration = None
self._attr_media_position_updated_at = None
if self._supports_sound_mode:
try:
eq_preset = await afsapi.get_eq_preset()
@@ -247,6 +264,9 @@ class AFSAPIDevice(MediaPlayerEntity):
self._attr_is_volume_muted = None
self._attr_media_image_url = None
self._attr_sound_mode = None
self._attr_media_position = None
self._attr_media_duration = None
self._attr_media_position_updated_at = None
self._attr_volume_level = None
@@ -334,6 +354,10 @@ class AFSAPIDevice(MediaPlayerEntity):
"""Set shuffle mode."""
await self.fs_device.set_play_shuffle(shuffle)
async def async_media_seek(self, position: float) -> None:
"""Seek to a position in seconds."""
await self.fs_device.set_play_position(int(position * 1000))
async def async_browse_media(
self,
media_content_type: MediaType | str | None = None,
@@ -81,6 +81,7 @@ MODEL_INPUTS = {
"XLR 2",
"Analog 1",
"Analog 2",
"Analog 3",
"BNC",
"Coaxial",
"Optical 1",
@@ -10,5 +10,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["apyhiveapi"],
"requirements": ["pyhive-integration==1.0.8"]
"requirements": ["pyhive-integration==1.0.9"]
}
@@ -452,6 +452,16 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
"arch": arch,
},
)
if not info["docker"] and not info["virtualenv"]:
ir.async_create_issue(
hass,
DOMAIN,
"unsupported_local_deps",
learn_more_url=DEPRECATION_URL,
is_fixable=False,
severity=IssueSeverity.WARNING,
translation_key="unsupported_local_deps",
)
# Delay deprecation check to make sure installation method is determined correctly
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STARTED, _async_check_deprecation)
@@ -106,12 +106,12 @@
"title": "[%key:component::homeassistant::issues::deprecated_architecture::title%]"
},
"deprecated_method": {
"description": "This system is using the {installation_type} installation type, which has been deprecated and will become unsupported following the release of Home Assistant 2025.12. While you can continue using your current setup after that point, we strongly recommend migrating to a supported installation method.",
"title": "Deprecation notice: Installation method"
"description": "This system is using the {installation_type} installation type, which has been unsupported since Home Assistant 2025.12. To continue receiving updates and support, migrate to a supported installation method.",
"title": "Unsupported installation method"
},
"deprecated_method_architecture": {
"description": "This system is using the {installation_type} installation type, and 32-bit hardware (`{arch}`), both of which have been deprecated and will no longer be supported after the release of Home Assistant 2025.12.",
"title": "Deprecation notice"
"description": "This system is using the {installation_type} installation type, and 32-bit hardware (`{arch}`), both of which have been unsupported since Home Assistant 2025.12. To continue receiving updates and support, migrate to supported hardware and use a supported installation method.",
"title": "Unsupported installation method and architecture"
},
"deprecated_os_aarch64": {
"description": "This system is running on a 32-bit operating system (`armv7`), which has been deprecated and will no longer receive updates after the release of Home Assistant 2025.12. To continue using Home Assistant on this hardware, you will need to install a 64-bit operating system. Please refer to our [installation guide]({installation_guide}).",
@@ -203,6 +203,10 @@
}
},
"title": "Storage corruption detected for {storage_key}"
},
"unsupported_local_deps": {
"description": "This system is running Home Assistant outside a virtual environment or a Docker container. This is not supported and will not work after the release of Home Assistant 2026.11.",
"title": "Deprecation notice: Installation method"
}
},
"services": {
@@ -11,7 +11,8 @@ from automower_ble.protocol import ResponseResult
from bleak import BleakError
from bleak_retry_connector import get_device
from gardena_bluetooth.const import ScanService
from gardena_bluetooth.parse import ManufacturerData, ProductType
from gardena_bluetooth.parse import ProductType
from gardena_bluetooth.scan import async_get_manufacturer_data
import voluptuous as vol
from homeassistant.components import bluetooth
@@ -37,43 +38,6 @@ USER_SCHEMA = vol.Schema(
REAUTH_SCHEMA = BLUETOOTH_SCHEMA
def _is_supported(discovery_info: BluetoothServiceInfo):
"""Check if device is supported."""
if ScanService not in discovery_info.service_uuids:
LOGGER.debug(
"Unsupported device, missing service %s: %s", ScanService, discovery_info
)
return False
if not (data := discovery_info.manufacturer_data.get(ManufacturerData.company)):
LOGGER.debug(
"Unsupported device, missing manufacturer data %s: %s",
ManufacturerData.company,
discovery_info,
)
return False
manufacturer_data = ManufacturerData.decode(data)
product_type = ProductType.from_manufacturer_data(manufacturer_data)
# Some mowers only expose the serial number in the manufacturer data
# and not the product type, so we allow None here as well.
if product_type not in (ProductType.MOWER, ProductType.UNKNOWN):
LOGGER.debug("Unsupported device: %s (%s)", manufacturer_data, discovery_info)
return False
if not manufacturer_data.pairable:
LOGGER.error(
"The mower does not appear to be pairable. "
"Ensure the mower is in pairing mode before continuing. "
"If the mower isn't pairable you will receive authentication "
"errors and be unable to connect"
)
LOGGER.debug("Supported device: %s", manufacturer_data)
return True
def _pin_valid(pin: str) -> bool:
"""Check if the pin is valid."""
try:
@@ -91,6 +55,32 @@ class HusqvarnaAutomowerBleConfigFlow(ConfigFlow, domain=DOMAIN):
address: str | None = None
mower_name: str = ""
pin: str | None = None
pairable: bool | None = None
async def _is_supported(self, discovery_info: BluetoothServiceInfo):
"""Check if device is supported."""
if ScanService not in discovery_info.service_uuids:
LOGGER.debug(
"Unsupported device, missing service %s: %s",
ScanService,
discovery_info,
)
return False
manufacturer_data = (
await async_get_manufacturer_data({discovery_info.address})
)[discovery_info.address]
if manufacturer_data.product_type != ProductType.MOWER:
LOGGER.debug(
"Unsupported device: %s (%s)", manufacturer_data, discovery_info
)
return False
self.pairable = manufacturer_data.pairable
LOGGER.debug("Supported device: %s", manufacturer_data)
return True
async def async_step_bluetooth(
self, discovery_info: BluetoothServiceInfo
@@ -98,7 +88,7 @@ class HusqvarnaAutomowerBleConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle the bluetooth discovery step."""
LOGGER.debug("Discovered device: %s", discovery_info)
if not _is_supported(discovery_info):
if not await self._is_supported(discovery_info):
return self.async_abort(reason="no_devices_found")
self.context["title_placeholders"] = {
@@ -122,6 +112,13 @@ class HusqvarnaAutomowerBleConfigFlow(ConfigFlow, domain=DOMAIN):
errors["base"] = "invalid_pin"
else:
self.pin = user_input[CONF_PIN]
if self.pairable is False:
LOGGER.warning(
"The mower does not appear to be pairable. "
"Ensure the mower is in pairing mode before continuing. "
"If the mower isn't pairable you will receive authentication "
"errors and be unable to connect"
)
return await self.check_mower(user_input)
return self.async_show_form(
@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["pydrawise"],
"requirements": ["pydrawise==2026.3.0"]
"requirements": ["pydrawise==2026.4.0"]
}
@@ -4,6 +4,9 @@
"hydrological_alert": {
"default": "mdi:alert-octagon-outline"
},
"ice_phenomena": {
"default": "mdi:snowflake"
},
"water_flow": {
"default": "mdi:waves-arrow-right"
},
@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "platinum",
"requirements": ["imgw_pib==2.1.0"]
"requirements": ["imgw_pib==2.1.1"]
}
@@ -16,7 +16,12 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import UnitOfLength, UnitOfTemperature, UnitOfVolumeFlowRate
from homeassistant.const import (
PERCENTAGE,
UnitOfLength,
UnitOfTemperature,
UnitOfVolumeFlowRate,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -60,6 +65,14 @@ SENSOR_TYPES: tuple[ImgwPibSensorEntityDescription, ...] = (
value=lambda data: data.hydrological_alert.value,
attrs=gen_alert_attributes,
),
ImgwPibSensorEntityDescription(
key="ice_phenomena",
translation_key="ice_phenomena",
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.ice_phenomena.value,
suggested_display_precision=0,
),
ImgwPibSensorEntityDescription(
key="water_flow",
translation_key="water_flow",
@@ -59,6 +59,9 @@
}
}
},
"ice_phenomena": {
"name": "Ice phenomena"
},
"water_flow": {
"name": "Water flow"
},
@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/jewish_calendar",
"iot_class": "calculated",
"loggers": ["hdate"],
"requirements": ["hdate[astral]==1.1.2"],
"requirements": ["hdate[astral]==1.2.1"],
"single_config_entry": true
}
@@ -197,6 +197,12 @@ def _parse_event(event: dict[str, Any]) -> Event:
and value.tzinfo is not None
):
event[key] = dt_util.as_local(value).replace(tzinfo=None)
# UNTIL in the rrule must be floating (timezone-naive) to match the
# floating dtstart used by the ical library. Strip tzinfo from UNTIL
# if present, converting to local time first.
if (rrule_obj := event.get(EVENT_RRULE)) and isinstance(rrule_obj, Recur):
if isinstance(rrule_obj.until, datetime) and rrule_obj.until.tzinfo is not None:
rrule_obj.until = dt_util.as_local(rrule_obj.until).replace(tzinfo=None)
try:
return Event(**event)
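The UNTIL normalization above can be checked in isolation; `dt_util.as_local` is replaced here by an explicit `astimezone` call with a fixed zone for determinism:

```python
from datetime import datetime, timezone

# An aware UNTIL, as parsed from an rrule like "...;UNTIL=20260420T120000Z".
until = datetime(2026, 4, 20, 12, 0, tzinfo=timezone.utc)

# Convert to a concrete zone, then drop tzinfo so UNTIL floats like the
# timezone-naive dtstart expected by the ical library.
floating = until.astimezone(timezone.utc).replace(tzinfo=None)

print(floating, floating.tzinfo)  # → 2026-04-20 12:00:00 None
```

Comparing a floating UNTIL against a floating dtstart avoids the aware-vs-naive `TypeError` that mixing the two would raise.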
@@ -2,7 +2,7 @@
from __future__ import annotations
from collections.abc import Callable, Mapping
from collections.abc import Callable, Collection, Mapping
from typing import Any
from homeassistant.components.sensor import ATTR_STATE_CLASS, NON_NUMERIC_DEVICE_CLASSES
@@ -75,12 +75,12 @@ def _async_config_entries_for_ids(
def async_determine_event_types(
hass: HomeAssistant, entity_ids: list[str] | None, device_ids: list[str] | None
) -> tuple[EventType[Any] | str, ...]:
) -> set[EventType[Any] | str]:
"""Reduce the event types based on the entity ids and device ids."""
logbook_config: LogbookConfig = hass.data[DOMAIN]
external_events = logbook_config.external_events
if not entity_ids and not device_ids:
return (*BUILT_IN_EVENTS, *external_events)
return {*BUILT_IN_EVENTS, *external_events}
interested_domains: set[str] = set()
for entry_id in _async_config_entries_for_ids(hass, entity_ids, device_ids):
@@ -93,16 +93,16 @@ def async_determine_event_types(
# to add them since we have historically included
# them when matching only on entities
#
intrested_event_types: set[EventType[Any] | str] = {
interested_event_types: set[EventType[Any] | str] = {
external_event
for external_event, domain_call in external_events.items()
if domain_call[0] in interested_domains
} | AUTOMATION_EVENTS
if entity_ids:
# We also allow entity_ids to be recorded via manual logbook entries.
intrested_event_types.add(EVENT_LOGBOOK_ENTRY)
interested_event_types.add(EVENT_LOGBOOK_ENTRY)
return tuple(intrested_event_types)
return interested_event_types
@callback
@@ -187,7 +187,7 @@ def async_subscribe_events(
hass: HomeAssistant,
subscriptions: list[CALLBACK_TYPE],
target: Callable[[Event[Any]], None],
event_types: tuple[EventType[Any] | str, ...],
event_types: Collection[EventType[Any] | str],
entities_filter: Callable[[str], bool] | None,
entity_ids: list[str] | None,
device_ids: list[str] | None,
@@ -2,7 +2,7 @@
from __future__ import annotations
from collections.abc import Callable, Generator, Sequence
from collections.abc import Callable, Collection, Generator, Sequence
from dataclasses import dataclass, field
from datetime import datetime as dt
import logging
@@ -126,7 +126,7 @@ class EventProcessor:
def __init__(
self,
hass: HomeAssistant,
event_types: tuple[EventType[Any] | str, ...],
event_types: Collection[EventType[Any] | str],
entity_ids: list[str] | None = None,
device_ids: list[str] | None = None,
context_id: str | None = None,
@@ -20,7 +20,6 @@ from homeassistant.helpers.event import async_track_point_in_utc_time
from homeassistant.helpers.json import json_bytes
from homeassistant.util import dt as dt_util
from homeassistant.util.async_ import create_eager_task
from homeassistant.util.event_type import EventType
from .const import DOMAIN
from .helpers import (
@@ -366,16 +365,11 @@ async def ws_event_stream(
# cache parent user_ids as they fire. Historical queries don't — the
# context_only join fetches them by context_id regardless of type.
# Unfiltered streams already include it via BUILT_IN_EVENTS.
live_event_types: tuple[EventType[Any] | str, ...] = (
event_types
if EVENT_CALL_SERVICE in event_types
else (*event_types, EVENT_CALL_SERVICE)
)
async_subscribe_events(
hass,
subscriptions,
_queue_or_cancel,
live_event_types,
{*event_types, EVENT_CALL_SERVICE},
entities_filter,
entity_ids,
device_ids,
@@ -46,6 +46,11 @@ class LyricLocalOAuth2Implementation(
):
"""Lyric Local OAuth2 implementation."""
@property
def extra_authorize_data(self) -> dict:
"""Prompt the user to choose between Resideo and First Alert apps."""
return {"appSelect": "1"}
async def _token_request(self, data: dict) -> dict:
"""Make a token request."""
session = async_get_clientsession(self.hass)
@@ -8,6 +8,6 @@
"documentation": "https://www.home-assistant.io/integrations/matter",
"integration_type": "hub",
"iot_class": "local_push",
"requirements": ["matter-python-client==0.4.1"],
"requirements": ["matter-python-client==0.6.0"],
"zeroconf": ["_matter._tcp.local.", "_matterc._udp.local."]
}
@@ -53,6 +53,7 @@ PLATFORMS = [
Platform.BUTTON,
Platform.MEDIA_PLAYER,
Platform.NUMBER,
Platform.SELECT,
Platform.SWITCH,
Platform.TEXT,
]
@@ -0,0 +1,123 @@
"""Music Assistant select platform."""
from __future__ import annotations
from typing import Final
from music_assistant_client.client import MusicAssistantClient
from music_assistant_models.player import PlayerOption, PlayerOptionType
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import MusicAssistantConfigEntry
from .entity import MusicAssistantPlayerOptionEntity
from .helpers import catch_musicassistant_error
PLAYER_OPTIONS_SELECT: Final[dict[str, bool]] = {
# translation_key: enabled_by_default
"dimmer": False,
"equalizer_mode": False,
"link_audio_delay": True,
"link_audio_quality": False,
"link_control": False,
"sleep": False,
"surround_decoder_type": False,
"tone_control_mode": True,
}
async def async_setup_entry(
hass: HomeAssistant,
entry: MusicAssistantConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Music Assistant Select Entities (Player Options) from Config Entry."""
mass = entry.runtime_data.mass
def add_player(player_id: str) -> None:
"""Handle add player."""
player = mass.players.get(player_id)
if player is None:
return
entities: list[MusicAssistantPlayerConfigSelect] = []
for player_option in player.options:
if (
not player_option.read_only
and player_option.type
!= PlayerOptionType.BOOLEAN # these always go to switch
and player_option.options
):
# We ignore entities whose translation key for the base name is unknown.
# However, we accept a translation_key that is missing from strings.json
# for the entity's state: these options are often created dynamically,
# depend on a specific player, and may not be known to the provider
# developer. In that case, the frontend falls back to showing the
# state's bare translation key.
if player_option.translation_key not in PLAYER_OPTIONS_SELECT:
continue
entities.append(
MusicAssistantPlayerConfigSelect(
mass,
player_id,
player_option=player_option,
entity_description=SelectEntityDescription(
key=player_option.key,
translation_key=player_option.translation_key,
entity_registry_enabled_default=PLAYER_OPTIONS_SELECT[
player_option.translation_key
],
),
)
)
async_add_entities(entities)
# register callback to add players when they are discovered
entry.runtime_data.platform_handlers.setdefault(Platform.SELECT, add_player)
class MusicAssistantPlayerConfigSelect(MusicAssistantPlayerOptionEntity, SelectEntity):
"""Representation of a select entity to control player-provider-dependent settings."""
def __init__(
self,
mass: MusicAssistantClient,
player_id: str,
player_option: PlayerOption,
entity_description: SelectEntityDescription,
) -> None:
"""Initialize MusicAssistantPlayerConfigSelect."""
# this was verified already in the entry callback
assert player_option.options is not None
# we have to define the dicts before initializing the parent, as this
# then calls self.on_player_option_update
self._option_translation_key_to_key_mapping = {
option.translation_key: option.key for option in player_option.options
}
self._option_key_to_translation_key_mapping = {
option.key: option.translation_key for option in player_option.options
}
super().__init__(mass, player_id, player_option)
self.entity_description = entity_description
self._attr_options = list(self._option_translation_key_to_key_mapping.keys())
@catch_musicassistant_error
async def async_select_option(self, option: str) -> None:
"""Select an option."""
await self.mass.players.set_option(
self.player_id,
self.mass_option_key,
self._option_translation_key_to_key_mapping[option],
)
def on_player_option_update(self, player_option: PlayerOption) -> None:
"""Update on player option update."""
self._attr_current_option = (
self._option_key_to_translation_key_mapping.get(player_option.value)
if isinstance(player_option.value, str)
else None
)
@@ -147,6 +147,80 @@
"name": "Treble"
}
},
"select": {
"dimmer": {
"name": "Dimmer",
"state": {
"auto": "[%key:common::state::auto%]"
}
},
"equalizer_mode": {
"name": "Equalizer mode",
"state": {
"auto": "[%key:common::state::auto%]",
"bypass": "Bypass",
"manual": "[%key:common::state::manual%]"
}
},
"link_audio_delay": {
"name": "Link audio delay",
"state": {
"audio_sync": "Audio synchronization",
"audio_sync_off": "Audio synchronization off",
"audio_sync_on": "Audio synchronization on",
"balanced": "Balanced",
"lip_sync": "Lip synchronization"
}
},
"link_audio_quality": {
"name": "Link audio quality",
"state": {
"compressed": "Compressed",
"uncompressed": "Uncompressed"
}
},
"link_control": {
"name": "Link control",
"state": {
"speed": "Speed",
"stability": "Stability",
"standard": "Standard"
}
},
"sleep": {
"name": "Sleep timer",
"state": {
"0": "[%key:common::state::off%]",
"30": "30 minutes",
"60": "60 minutes",
"90": "90 minutes",
"120": "120 minutes"
}
},
"surround_decoder_type": {
"name": "Surround decoder type",
"state": {
"auto": "[%key:common::state::auto%]",
"dolby_pl": "Dolby ProLogic",
"dolby_pl2x_game": "Dolby ProLogic 2x Game",
"dolby_pl2x_movie": "Dolby ProLogic 2x Movie",
"dolby_pl2x_music": "Dolby ProLogic 2x Music",
"dolby_surround": "Dolby Surround",
"dts_neo6_cinema": "DTS Neo:6 Cinema",
"dts_neo6_music": "DTS Neo:6 Music",
"dts_neural_x": "DTS Neural:X",
"toggle": "[%key:common::action::toggle%]"
}
},
"tone_control_mode": {
"name": "Tone control mode",
"state": {
"auto": "[%key:common::state::auto%]",
"bypass": "Bypass",
"manual": "[%key:common::state::manual%]"
}
}
},
"switch": {
"adaptive_drc": {
"name": "Adaptive DRC"
@@ -7,9 +7,10 @@ from pynobo import nobo
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_IP_ADDRESS, EVENT_HOMEASSISTANT_STOP, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.util import dt as dt_util
from .const import CONF_AUTO_DISCOVERED, CONF_SERIAL
from .const import CONF_AUTO_DISCOVERED, CONF_SERIAL, DOMAIN
PLATFORMS = [Platform.CLIMATE, Platform.SELECT, Platform.SENSOR]
@@ -20,16 +21,51 @@ async def async_setup_entry(hass: HomeAssistant, entry: NoboHubConfigEntry) -> b
"""Set up Nobø Ecohub from a config entry."""
serial = entry.data[CONF_SERIAL]
discover = entry.data[CONF_AUTO_DISCOVERED]
ip_address = None if discover else entry.data[CONF_IP_ADDRESS]
hub = nobo(
serial=serial,
ip=ip_address,
discover=discover,
synchronous=False,
timezone=dt_util.get_default_time_zone(),
)
await hub.connect()
stored_ip = entry.data[CONF_IP_ADDRESS]
auto_discovered = entry.data[CONF_AUTO_DISCOVERED]
async def _connect(ip: str) -> nobo:
hub = nobo(
serial=serial,
ip=ip,
discover=False,
synchronous=False,
timezone=dt_util.get_default_time_zone(),
)
await hub.connect()
return hub
try:
hub = await _connect(stored_ip)
except OSError as err:
if not auto_discovered:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="cannot_connect_manual",
translation_placeholders={"serial": serial, "ip": stored_ip},
) from err
# The stored IP may be stale for an auto-discovered entry; try UDP
# rediscovery to pick up a new DHCP lease.
discovered = await nobo.async_discover_hubs(serial=serial)
if not discovered:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="hub_not_found",
translation_placeholders={"serial": serial},
) from err
new_ip, _ = next(iter(discovered))
try:
hub = await _connect(new_ip)
except OSError as rediscover_err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="cannot_connect_rediscovered",
translation_placeholders={"ip": new_ip},
) from rediscover_err
if new_ip != stored_ip:
hass.config_entries.async_update_entry(
entry, data={**entry.data, CONF_IP_ADDRESS: new_ip}
)
async def _async_close(event):
"""Close the Nobø Ecohub socket connection when HA stops."""
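The fallback flow in the hunk above (connect to the stored IP, and only for auto-discovered entries retry after rediscovery) can be sketched as a small standalone coroutine. This is plain asyncio with the connect and discover callables injected; the real integration uses pynobo, and none of the names below come from that library:

```python
import asyncio
from collections.abc import Awaitable, Callable


async def connect_with_rediscovery(
    stored_ip: str,
    auto_discovered: bool,
    connect: Callable[[str], Awaitable[object]],
    discover: Callable[[], Awaitable[list[str]]],
) -> tuple[object, str]:
    """Connect to the stored IP, falling back to rediscovery if allowed.

    Returns the connected hub object and the IP that worked, so the
    caller can persist a changed address (as the diff does with
    async_update_entry).
    """
    try:
        return await connect(stored_ip), stored_ip
    except OSError:
        # Manually configured entries never had a discovery step to
        # repeat, so surface the failure to the caller.
        if not auto_discovered:
            raise
        candidates = await discover()
        if not candidates:
            raise
        new_ip = candidates[0]
        return await connect(new_ip), new_ip
```

Returning the working IP alongside the hub keeps the "persist new address" decision with the caller instead of hiding a config-entry write inside the connect helper.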
@@ -47,6 +47,17 @@
}
}
},
"exceptions": {
"cannot_connect_manual": {
"message": "Unable to connect to Nobø Ecohub with serial {serial} at {ip}. If the hub has moved to a new IP address, remove and re-add the integration."
},
"cannot_connect_rediscovered": {
"message": "Unable to connect to Nobø Ecohub at rediscovered IP {ip}; will retry."
},
"hub_not_found": {
"message": "Nobø Ecohub with serial {serial} not found on the network. The hub may be offline or on a different subnet; will retry."
}
},
"options": {
"step": {
"init": {
@@ -7,6 +7,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "platinum",
"requirements": ["peblar==0.4.0"],
"requirements": ["peblar==0.5.1"],
"zeroconf": [{ "name": "pblr-*", "type": "_http._tcp.local." }]
}
@@ -32,8 +32,6 @@ class PVOutputFlowHandler(ConfigFlow, domain=DOMAIN):
VERSION = 1
imported_name: str | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -56,7 +54,7 @@ class PVOutputFlowHandler(ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(str(user_input[CONF_SYSTEM_ID]))
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=self.imported_name or str(user_input[CONF_SYSTEM_ID]),
title=str(user_input[CONF_SYSTEM_ID]),
data={
CONF_SYSTEM_ID: user_input[CONF_SYSTEM_ID],
CONF_API_KEY: user_input[CONF_API_KEY],
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/pvoutput",
"integration_type": "device",
"iot_class": "cloud_polling",
"requirements": ["pvo==2.2.1"]
"requirements": ["pvo==3.0.0"]
}
@@ -13,8 +13,8 @@ from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import ACCOUNT_HASH, DOMAIN, UPDATE_INTERVAL
from .coordinator import RitualsDataUpdateCoordinator
from .const import ACCOUNT_HASH, UPDATE_INTERVAL
from .coordinator import RitualsConfigEntry, RitualsDataUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -27,7 +27,7 @@ PLATFORMS = [
]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: RitualsConfigEntry) -> bool:
"""Set up Rituals Perfume Genie from a config entry."""
# Initiate reauth for old config entries which don't have username / password in the entry data
if CONF_EMAIL not in entry.data or CONF_PASSWORD not in entry.data:
@@ -87,19 +87,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
]
)
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinators
entry.runtime_data = coordinators
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: RitualsConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@callback
@@ -12,13 +12,11 @@ from homeassistant.components.binary_sensor import (
BinarySensorEntity,
BinarySensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry
from .entity import DiffuserEntity
PARALLEL_UPDATES = 0
@@ -45,13 +43,11 @@ ENTITY_DESCRIPTIONS = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: RitualsConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the diffuser binary sensors."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
config_entry.entry_id
]
coordinators = config_entry.runtime_data
async_add_entities(
RitualsBinarySensorEntity(coordinator, description)
@@ -15,11 +15,13 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
type RitualsConfigEntry = ConfigEntry[dict[str, RitualsDataUpdateCoordinator]]
class RitualsDataUpdateCoordinator(DataUpdateCoordinator[None]):
"""Class to manage fetching Rituals Perfume Genie device data from single endpoint."""
config_entry: ConfigEntry
config_entry: RitualsConfigEntry
def __init__(
self,
@@ -5,11 +5,9 @@ from __future__ import annotations
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry
TO_REDACT = {
"hublot",
@@ -18,15 +16,12 @@ TO_REDACT = {
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
hass: HomeAssistant, entry: RitualsConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
entry.entry_id
]
return {
"diffusers": [
async_redact_data(coordinator.diffuser.data, TO_REDACT)
for coordinator in coordinators.values()
for coordinator in entry.runtime_data.values()
]
}
@@ -9,12 +9,10 @@ from typing import Any
from pyrituals import Diffuser
from homeassistant.components.number import NumberEntity, NumberEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry
from .entity import DiffuserEntity
PARALLEL_UPDATES = 1
@@ -42,13 +40,11 @@ ENTITY_DESCRIPTIONS = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: RitualsConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the diffuser numbers."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
config_entry.entry_id
]
coordinators = config_entry.runtime_data
async_add_entities(
RitualsNumberEntity(coordinator, description)
for coordinator in coordinators.values()
@@ -8,13 +8,11 @@ from dataclasses import dataclass
from pyrituals import Diffuser
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory, UnitOfArea
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry, RitualsDataUpdateCoordinator
from .entity import DiffuserEntity
PARALLEL_UPDATES = 1
@@ -45,13 +43,11 @@ ENTITY_DESCRIPTIONS = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: RitualsConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the diffuser select entities."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
config_entry.entry_id
]
coordinators = config_entry.runtime_data
async_add_entities(
RitualsSelectEntity(coordinator, description)
@@ -12,13 +12,11 @@ from homeassistant.components.sensor import (
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import PERCENTAGE, EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry
from .entity import DiffuserEntity
PARALLEL_UPDATES = 0
@@ -61,13 +59,11 @@ ENTITY_DESCRIPTIONS = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: RitualsConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the diffuser sensors."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
config_entry.entry_id
]
coordinators = config_entry.runtime_data
async_add_entities(
RitualsSensorEntity(coordinator, description)
@@ -9,12 +9,10 @@ from typing import Any
from pyrituals import Diffuser
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RitualsDataUpdateCoordinator
from .coordinator import RitualsConfigEntry, RitualsDataUpdateCoordinator
from .entity import DiffuserEntity
PARALLEL_UPDATES = 1
@@ -43,13 +41,11 @@ ENTITY_DESCRIPTIONS = (
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: RitualsConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the diffuser switch."""
coordinators: dict[str, RitualsDataUpdateCoordinator] = hass.data[DOMAIN][
config_entry.entry_id
]
coordinators = config_entry.runtime_data
async_add_entities(
RitualsSwitchEntity(coordinator, description)
@@ -257,6 +257,7 @@ SENSOR_DESCRIPTIONS = [
RoborockSensorDescription(
key="mop_clean_remaining",
native_unit_of_measurement=UnitOfTime.SECONDS,
suggested_unit_of_measurement=UnitOfTime.HOURS,
device_class=SensorDeviceClass.DURATION,
value_fn=lambda data: data.status.rdt,
translation_key="mop_drying_remaining_time",
@@ -3,7 +3,6 @@
from __future__ import annotations
import asyncio
from typing import TYPE_CHECKING
from sfrbox_api.bridge import SFRBox
from sfrbox_api.exceptions import SFRBoxAuthenticationError, SFRBoxError
@@ -14,14 +13,13 @@ from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN, PLATFORMS, PLATFORMS_WITH_AUTH
from .const import DOMAIN, PLATFORMS
from .coordinator import SFRConfigEntry, SFRDataUpdateCoordinator, SFRRuntimeData
async def async_setup_entry(hass: HomeAssistant, entry: SFRConfigEntry) -> bool:
"""Set up SFR box as config entry."""
box = SFRBox(ip=entry.data[CONF_HOST], client=async_get_clientsession(hass))
platforms = PLATFORMS
has_auth = False
if (username := entry.data.get(CONF_USERNAME)) and (
password := entry.data.get(CONF_PASSWORD)
@@ -39,11 +37,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: SFRConfigEntry) -> bool:
translation_key="unknown_error",
translation_placeholders={"error": str(err)},
) from err
platforms = PLATFORMS_WITH_AUTH
has_auth = True
data = SFRRuntimeData(
box=box,
has_authentication=has_auth,
dsl=SFRDataUpdateCoordinator(
hass, entry, box, "dsl", lambda b: b.dsl_get_info()
),
@@ -65,8 +63,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: SFRConfigEntry) -> bool:
# Preload system information
await data.system.async_config_entry_first_refresh()
system_info = data.system.data
if TYPE_CHECKING:
assert system_info is not None
# Preload other coordinators (based on net infrastructure)
tasks = [data.wan.async_config_entry_first_refresh()]
@@ -91,15 +87,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: SFRConfigEntry) -> bool:
)
entry.runtime_data = data
await hass.config_entries.async_forward_entry_setups(entry, platforms)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: SFRConfigEntry) -> bool:
"""Unload a config entry."""
if entry.data.get(CONF_USERNAME) and entry.data.get(CONF_PASSWORD):
return await hass.config_entries.async_unload_platforms(
entry, PLATFORMS_WITH_AUTH
)
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import TYPE_CHECKING
from sfrbox_api.models import DslInfo, FtthInfo, VoipInfo, WanInfo
@@ -88,8 +87,6 @@ async def async_setup_entry(
"""Set up the sensors."""
data = entry.runtime_data
system_info = data.system.data
if TYPE_CHECKING:
assert system_info is not None
entities: list[SFRBoxBinarySensor] = [
SFRBoxBinarySensor(data.wan, description, system_info)
@@ -5,7 +5,7 @@ from __future__ import annotations
from collections.abc import Awaitable, Callable, Coroutine
from dataclasses import dataclass
from functools import wraps
from typing import TYPE_CHECKING, Any, Concatenate
from typing import Any, Concatenate
from sfrbox_api.bridge import SFRBox
from sfrbox_api.exceptions import SFRBoxError
@@ -78,9 +78,11 @@ async def async_setup_entry(
) -> None:
"""Set up the buttons."""
data = entry.runtime_data
if not data.has_authentication:
# All buttons currently require authentication
return
system_info = data.system.data
if TYPE_CHECKING:
assert system_info is not None
entities = [
SFRBoxButton(data.box, description, system_info) for description in BUTTON_TYPES
@@ -7,5 +7,4 @@ DEFAULT_USERNAME = "admin"
DOMAIN = "sfr_box"
PLATFORMS = [Platform.BINARY_SENSOR, Platform.SENSOR]
PLATFORMS_WITH_AUTH = [*PLATFORMS, Platform.BUTTON]
PLATFORMS = [Platform.BINARY_SENSOR, Platform.BUTTON, Platform.SENSOR]
@@ -29,6 +29,7 @@ class SFRRuntimeData:
"""Runtime data for SFR Box."""
box: SFRBox
has_authentication: bool
dsl: SFRDataUpdateCoordinator[DslInfo]
ftth: SFRDataUpdateCoordinator[FtthInfo]
system: SFRDataUpdateCoordinator[SystemInfo]
@@ -2,7 +2,6 @@
from collections.abc import Callable
from dataclasses import dataclass
from typing import TYPE_CHECKING
from sfrbox_api.models import DslInfo, SystemInfo, VoipInfo, WanInfo
@@ -236,8 +235,6 @@ async def async_setup_entry(
"""Set up the sensors."""
data = entry.runtime_data
system_info = data.system.data
if TYPE_CHECKING:
assert system_info is not None
entities: list[SFRBoxSensor] = [
SFRBoxSensor(data.system, description, system_info)
@@ -74,7 +74,7 @@ def async_setup_block_attribute_entities(
for block in coordinator.device.blocks:
for sensor_id in block.sensor_ids:
description = sensors.get((cast(str, block.type), sensor_id))
description = sensors.get((block.type, sensor_id))
if description is None:
continue
@@ -17,7 +17,7 @@
"iot_class": "local_push",
"loggers": ["aioshelly"],
"quality_scale": "platinum",
"requirements": ["aioshelly==13.23.1"],
"requirements": ["aioshelly==13.24.0"],
"zeroconf": [
{
"name": "shelly*",
@@ -122,7 +122,7 @@ def get_block_number_of_channels(device: BlockDevice, block: Block) -> int:
def get_block_custom_name(device: BlockDevice, block: Block | None) -> str | None:
"""Get custom name from device settings."""
if block and (key := cast(str, block.type) + "s") and key in device.settings:
if block and (key := block.type + "s") and key in device.settings:
assert block.channel
if name := device.settings[key][int(block.channel)].get("name"):
@@ -2,30 +2,25 @@
from __future__ import annotations
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from .coordinator import TailscaleDataUpdateCoordinator
from .coordinator import TailscaleConfigEntry, TailscaleDataUpdateCoordinator
PLATFORMS = [Platform.BINARY_SENSOR, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: TailscaleConfigEntry) -> bool:
"""Set up Tailscale from a config entry."""
coordinator = TailscaleDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: TailscaleConfigEntry) -> bool:
"""Unload Tailscale config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
del hass.data[DOMAIN][entry.entry_id]
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -12,14 +12,15 @@ from homeassistant.components.binary_sensor import (
BinarySensorEntity,
BinarySensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import TailscaleConfigEntry
from .entity import TailscaleEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class TailscaleBinarySensorEntityDescription(BinarySensorEntityDescription):
@@ -97,11 +98,11 @@ BINARY_SENSORS: tuple[TailscaleBinarySensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: TailscaleConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Tailscale binary sensors based on a config entry."""
coordinator = hass.data[DOMAIN][entry.entry_id]
coordinator = entry.runtime_data
async_add_entities(
TailscaleBinarySensorEntity(
coordinator=coordinator,
@@ -14,13 +14,15 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_TAILNET, DOMAIN, LOGGER, SCAN_INTERVAL
type TailscaleConfigEntry = ConfigEntry[TailscaleDataUpdateCoordinator]
class TailscaleDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Device]]):
"""The Tailscale Data Update Coordinator."""
config_entry: ConfigEntry
config_entry: TailscaleConfigEntry
def __init__(self, hass: HomeAssistant, config_entry: ConfigEntry) -> None:
def __init__(self, hass: HomeAssistant, config_entry: TailscaleConfigEntry) -> None:
"""Initialize the Tailscale coordinator."""
session = async_get_clientsession(hass)
self.tailscale = Tailscale(
@@ -6,12 +6,11 @@ import json
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant
from .const import CONF_TAILNET, DOMAIN
from .coordinator import TailscaleDataUpdateCoordinator
from .const import CONF_TAILNET
from .coordinator import TailscaleConfigEntry
TO_REDACT = {
CONF_API_KEY,
@@ -22,16 +21,19 @@ TO_REDACT = {
"hostname",
"machine_key",
"name",
"node_id",
"node_key",
"tailnet_lock_key",
"user",
}
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
hass: HomeAssistant, entry: TailscaleConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator: TailscaleDataUpdateCoordinator = hass.data[DOMAIN][entry.entry_id]
# Round-trip via JSON to trigger serialization
devices = [json.loads(device.to_json()) for device in coordinator.data.values()]
devices = [
json.loads(device.to_json()) for device in entry.runtime_data.data.values()
]
return async_redact_data({"devices": devices}, TO_REDACT)
@@ -6,15 +6,13 @@ from tailscale import Device as TailscaleDevice
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import TailscaleDataUpdateCoordinator
class TailscaleEntity(CoordinatorEntity):
class TailscaleEntity(CoordinatorEntity[TailscaleDataUpdateCoordinator]):
"""Defines a Tailscale base entity."""
_attr_has_entity_name = True
@@ -22,7 +20,7 @@ class TailscaleEntity(CoordinatorEntity):
def __init__(
self,
*,
coordinator: DataUpdateCoordinator,
coordinator: TailscaleDataUpdateCoordinator,
device: TailscaleDevice,
description: EntityDescription,
) -> None:
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/tailscale",
"integration_type": "hub",
"iot_class": "cloud_polling",
"requirements": ["tailscale==0.6.2"]
"requirements": ["tailscale==0.7.0"]
}
@@ -13,14 +13,15 @@ from homeassistant.components.sensor import (
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import TailscaleConfigEntry
from .entity import TailscaleEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class TailscaleSensorEntityDescription(SensorEntityDescription):
@@ -54,11 +55,11 @@ SENSORS: tuple[TailscaleSensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: TailscaleConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Tailscale sensors based on a config entry."""
coordinator = hass.data[DOMAIN][entry.entry_id]
coordinator = entry.runtime_data
async_add_entities(
TailscaleSensorEntity(
coordinator=coordinator,
@@ -4,18 +4,17 @@ from __future__ import annotations
from Tami4EdgeAPI import Tami4EdgeAPI, exceptions
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError, ConfigEntryNotReady
from .const import API, CONF_REFRESH_TOKEN, COORDINATOR, DOMAIN
from .coordinator import Tami4EdgeCoordinator
from .const import CONF_REFRESH_TOKEN
from .coordinator import Tami4ConfigEntry, Tami4EdgeCoordinator
PLATFORMS: list[Platform] = [Platform.BUTTON, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: Tami4ConfigEntry) -> bool:
"""Set up tami4 from a config entry."""
refresh_token = entry.data.get(CONF_REFRESH_TOKEN)
@@ -29,19 +28,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
coordinator = Tami4EdgeCoordinator(hass, entry, api)
await coordinator.async_config_entry_first_refresh()
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = {
API: api,
COORDINATOR: coordinator,
}
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: Tami4ConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -8,12 +8,11 @@ from Tami4EdgeAPI import Tami4EdgeAPI
 from Tami4EdgeAPI.drink import Drink

 from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.entity import EntityDescription
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

-from .const import API, DOMAIN
+from .coordinator import Tami4ConfigEntry
 from .entity import Tami4EdgeBaseEntity

 _LOGGER = logging.getLogger(__name__)
@@ -42,12 +41,12 @@ BOIL_WATER_BUTTON = Tami4EdgeButtonEntityDescription(

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: Tami4ConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Perform the setup for Tami4Edge."""
-    api: Tami4EdgeAPI = hass.data[DOMAIN][entry.entry_id][API]
+    api = entry.runtime_data.api

     buttons: list[Tami4EdgeBaseEntity] = [Tami4EdgeButton(api, BOIL_WATER_BUTTON)]

     device = await hass.async_add_executor_job(api.get_device)
@@ -3,5 +3,3 @@
 DOMAIN = "tami4"
 CONF_PHONE = "phone"
 CONF_REFRESH_TOKEN = "refresh_token"
-API = "api"
-COORDINATOR = "coordinator"
@@ -13,6 +13,8 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
 _LOGGER = logging.getLogger(__name__)

+type Tami4ConfigEntry = ConfigEntry[Tami4EdgeCoordinator]
+

 @dataclass
 class FlattenedWaterQuality:
@@ -37,10 +39,10 @@ class FlattenedWaterQuality:
 class Tami4EdgeCoordinator(DataUpdateCoordinator[FlattenedWaterQuality]):
     """Tami4Edge water quality coordinator."""

-    config_entry: ConfigEntry
+    config_entry: Tami4ConfigEntry

     def __init__(
-        self, hass: HomeAssistant, config_entry: ConfigEntry, api: Tami4EdgeAPI
+        self, hass: HomeAssistant, config_entry: Tami4ConfigEntry, api: Tami4EdgeAPI
     ) -> None:
         """Initialize the water quality coordinator."""
         super().__init__(
@@ -50,12 +52,12 @@ class Tami4EdgeCoordinator(DataUpdateCoordinator[FlattenedWaterQuality]):
             name="Tami4Edge water quality coordinator",
             update_interval=timedelta(minutes=60),
         )
-        self._api = api
+        self.api = api

     async def _async_update_data(self) -> FlattenedWaterQuality:
         """Fetch data from the API endpoint."""
         try:
-            device = await self.hass.async_add_executor_job(self._api.get_device)
+            device = await self.hass.async_add_executor_job(self.api.get_device)
             return FlattenedWaterQuality(device.water_quality)
         except exceptions.APIRequestFailedException as ex:
@@ -2,22 +2,18 @@
 import logging

 from Tami4EdgeAPI import Tami4EdgeAPI

 from homeassistant.components.sensor import (
     SensorDeviceClass,
     SensorEntity,
     SensorEntityDescription,
     SensorStateClass,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import UnitOfVolume
 from homeassistant.core import HomeAssistant, callback
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
 from homeassistant.helpers.update_coordinator import CoordinatorEntity

-from .const import API, COORDINATOR, DOMAIN
-from .coordinator import Tami4EdgeCoordinator
+from .coordinator import Tami4ConfigEntry, Tami4EdgeCoordinator
 from .entity import Tami4EdgeBaseEntity

 _LOGGER = logging.getLogger(__name__)
@@ -53,18 +49,15 @@ ENTITY_DESCRIPTIONS = [

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: Tami4ConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Perform the setup for Tami4Edge."""
-    data = hass.data[DOMAIN][entry.entry_id]
-    api: Tami4EdgeAPI = data[API]
-    coordinator: Tami4EdgeCoordinator = data[COORDINATOR]
+    coordinator = entry.runtime_data

     async_add_entities(
         Tami4EdgeSensorEntity(
             coordinator=coordinator,
-            api=api,
             entity_description=entity_description,
         )
         for entity_description in ENTITY_DESCRIPTIONS
@@ -81,11 +74,10 @@ class Tami4EdgeSensorEntity(
     def __init__(
         self,
         coordinator: Tami4EdgeCoordinator,
-        api: Tami4EdgeAPI,
         entity_description: SensorEntityDescription,
     ) -> None:
         """Initialize the Tami4Edge sensor entity."""
-        Tami4EdgeBaseEntity.__init__(self, api, entity_description)
+        Tami4EdgeBaseEntity.__init__(self, coordinator.api, entity_description)
         CoordinatorEntity.__init__(self, coordinator)

         self._update_attr()
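A side effect of the sensor diff above: the separate `api` constructor argument disappears because the coordinator now exposes its client as a public `api` attribute (renamed from `self._api`), so entities pull the client off the coordinator. A sketch with simplified stand-in classes (the real ones subclass `DataUpdateCoordinator` and `CoordinatorEntity`):

```python
from dataclasses import dataclass


@dataclass
class Tami4EdgeCoordinator:
    """Stand-in for the real coordinator."""

    api: object  # public (was self._api), so platforms and entities can reach it


class Tami4EdgeSensorEntity:
    """Stand-in entity; the real one also mixes in CoordinatorEntity."""

    def __init__(self, coordinator: Tami4EdgeCoordinator) -> None:
        # Previously the entity took a separate `api` argument; now it
        # reads the client off the coordinator it already holds.
        self.coordinator = coordinator
        self.api = coordinator.api
```

This keeps a single source of truth for the client and shortens every call site that constructs an entity.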
@@ -2,17 +2,16 @@
 import logging

-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_API_KEY, CONF_HOST
 from homeassistant.core import HomeAssistant

-from .const import DOMAIN, PLATFORMS, TTN_API_HOST
-from .coordinator import TTNCoordinator
+from .const import PLATFORMS, TTN_API_HOST
+from .coordinator import TTNConfigEntry, TTNCoordinator

 _LOGGER = logging.getLogger(__name__)


-async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_setup_entry(hass: HomeAssistant, entry: TTNConfigEntry) -> bool:
     """Establish connection with The Things Network."""
     _LOGGER.debug(
@@ -25,14 +24,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     await coordinator.async_config_entry_first_refresh()

-    hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
+    entry.runtime_data = coordinator

     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

     return True


-async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_unload_entry(hass: HomeAssistant, entry: TTNConfigEntry) -> bool:
     """Unload a config entry."""
     _LOGGER.debug(
@@ -41,8 +40,4 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
         entry.data.get(CONF_HOST, TTN_API_HOST),
     )
-    # Unload entities created for each supported platform
-    unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
-    if unload_ok:
-        del hass.data[DOMAIN][entry.entry_id]
-    return True
+    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
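Worth noting in the TTN unload change above: besides dropping the `hass.data` cleanup, it fixes the old code's unconditional `return True`, which reported success even when platform unload failed. A minimal sketch of the simplified shape, where `unload_platforms` is a hypothetical stand-in for `hass.config_entries.async_unload_platforms`:

```python
import asyncio
from collections.abc import Awaitable, Callable


async def async_unload_entry(
    unload_platforms: Callable[[], Awaitable[bool]],
) -> bool:
    # With entry.runtime_data there is no hass.data bookkeeping to undo,
    # so unload reduces to propagating the platform-unload result.
    return await unload_platforms()


async def failing_unload() -> bool:
    """Simulates a platform that refuses to unload."""
    return False


async def ok_unload() -> bool:
    """Simulates a clean platform unload."""
    return True
```

Unlike the old version, a failed platform unload now surfaces as `False` to the config-entry machinery.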
@@ -15,13 +15,15 @@ from .const import CONF_APP_ID, POLLING_PERIOD_S
 _LOGGER = logging.getLogger(__name__)

+type TTNConfigEntry = ConfigEntry[TTNCoordinator]
+

 class TTNCoordinator(DataUpdateCoordinator[TTNClient.DATA_TYPE]):
     """TTN coordinator."""

-    config_entry: ConfigEntry
+    config_entry: TTNConfigEntry

-    def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None:
+    def __init__(self, hass: HomeAssistant, entry: TTNConfigEntry) -> None:
         """Initialize my coordinator."""
         super().__init__(
             hass,
@@ -5,12 +5,12 @@ import logging
 from ttn_client import TTNSensorValue

 from homeassistant.components.sensor import SensorEntity
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
 from homeassistant.helpers.typing import StateType

-from .const import CONF_APP_ID, DOMAIN
+from .const import CONF_APP_ID
+from .coordinator import TTNConfigEntry
 from .entity import TTNEntity

 _LOGGER = logging.getLogger(__name__)
@@ -18,12 +18,12 @@ _LOGGER = logging.getLogger(__name__)

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: TTNConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Add entities for TTN."""
-    coordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data

     sensors: set[tuple[str, str]] = set()
@@ -5,12 +5,10 @@ import logging
 from todoist_api_python.api_async import TodoistAPIAsync

-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_TOKEN, Platform
 from homeassistant.core import HomeAssistant

-from .const import DOMAIN
-from .coordinator import TodoistCoordinator
+from .coordinator import TodoistConfigEntry, TodoistCoordinator

 _LOGGER = logging.getLogger(__name__)
@@ -20,7 +18,7 @@ SCAN_INTERVAL = datetime.timedelta(minutes=1)
 PLATFORMS: list[Platform] = [Platform.CALENDAR, Platform.TODO]


-async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_setup_entry(hass: HomeAssistant, entry: TodoistConfigEntry) -> bool:
     """Set up todoist from a config entry."""
     token = entry.data[CONF_TOKEN]
@@ -28,17 +26,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     coordinator = TodoistCoordinator(hass, _LOGGER, entry, SCAN_INTERVAL, api, token)
     await coordinator.async_config_entry_first_refresh()

-    hass.data.setdefault(DOMAIN, {})
-    hass.data[DOMAIN][entry.entry_id] = coordinator
+    entry.runtime_data = coordinator

     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

     return True


-async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_unload_entry(hass: HomeAssistant, entry: TodoistConfigEntry) -> bool:
     """Unload a config entry."""
-    if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
-        hass.data[DOMAIN].pop(entry.entry_id)
-    return unload_ok
+    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -16,7 +16,6 @@ from homeassistant.components.calendar import (
     CalendarEntity,
     CalendarEvent,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_ID, CONF_NAME, CONF_TOKEN, EVENT_HOMEASSISTANT_STOP
 from homeassistant.core import Event, HomeAssistant, ServiceCall, callback
 from homeassistant.exceptions import ServiceValidationError
@@ -60,7 +59,7 @@ from .const import (
     START,
     SUMMARY,
 )
-from .coordinator import TodoistCoordinator, flatten_async_pages
+from .coordinator import TodoistConfigEntry, TodoistCoordinator, flatten_async_pages
 from .types import CalData, CustomProject, ProjectData, TodoistEvent
 from .util import parse_due_date
@@ -116,11 +115,11 @@ SCAN_INTERVAL = timedelta(minutes=1)

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: TodoistConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Set up the Todoist calendar platform config entry."""
-    coordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data
     projects = await coordinator.async_get_projects()
     labels = await coordinator.async_get_labels()
@@ -15,6 +15,8 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
 from .const import MAX_PAGE_SIZE

+type TodoistConfigEntry = ConfigEntry[TodoistCoordinator]
+
 T = TypeVar("T")
@@ -12,23 +12,21 @@ from homeassistant.components.todo import (
     TodoListEntity,
     TodoListEntityFeature,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant, callback
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
 from homeassistant.helpers.update_coordinator import CoordinatorEntity

-from .const import DOMAIN
-from .coordinator import TodoistCoordinator
+from .coordinator import TodoistConfigEntry, TodoistCoordinator
 from .util import parse_due_date


 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: TodoistConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Set up the Todoist todo platform config entry."""
-    coordinator: TodoistCoordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data
     projects = await coordinator.async_get_projects()

     async_add_entities(
         TodoistTodoListEntity(coordinator, entry.entry_id, project.id, project.name)
@@ -21,7 +21,7 @@ from homeassistant.helpers.config_entry_oauth2_flow import (
 from homeassistant.helpers.typing import ConfigType

 from .const import CONF_AGREEMENT_ID, CONF_MIGRATE, DEFAULT_SCAN_INTERVAL, DOMAIN
-from .coordinator import ToonDataUpdateCoordinator
+from .coordinator import ToonConfigEntry, ToonDataUpdateCoordinator
 from .oauth2 import register_oauth2_implementations

 PLATFORMS = [
@@ -94,7 +94,7 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     return True


-async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_setup_entry(hass: HomeAssistant, entry: ToonConfigEntry) -> bool:
     """Set up Toon from a config entry."""
     try:
         implementation = await async_get_config_entry_implementation(hass, entry)
@@ -111,8 +111,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     )
     await coordinator.async_config_entry_first_refresh()

-    hass.data.setdefault(DOMAIN, {})
-    hass.data[DOMAIN][entry.entry_id] = coordinator
+    entry.runtime_data = coordinator

     # Register device for the Meter Adapter, since it will have no entities.
     device_registry = dr.async_get(hass)
@@ -145,17 +144,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     return True


-async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_unload_entry(hass: HomeAssistant, entry: ToonConfigEntry) -> bool:
     """Unload Toon config entry."""
     # Remove webhooks registration
-    await hass.data[DOMAIN][entry.entry_id].unregister_webhook()
+    await entry.runtime_data.unregister_webhook()

-    # Unload entities for this entry/device.
-    unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
-
-    # Cleanup
-    if unload_ok:
-        del hass.data[DOMAIN][entry.entry_id]
-
-    return unload_ok
+    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -9,12 +9,11 @@ from homeassistant.components.binary_sensor import (
     BinarySensorEntity,
     BinarySensorEntityDescription,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

-from .const import DOMAIN
-from .coordinator import ToonDataUpdateCoordinator
+from .coordinator import ToonConfigEntry, ToonDataUpdateCoordinator
 from .entity import (
     ToonBoilerDeviceEntity,
     ToonBoilerModuleDeviceEntity,
@@ -26,11 +25,11 @@ from .entity import (

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: ToonConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Set up a Toon binary sensor based on a config entry."""
-    coordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data

     entities = [
         description.cls(coordinator, description)
@@ -21,24 +21,23 @@ from homeassistant.components.climate import (
     HVACAction,
     HVACMode,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

-from . import ToonDataUpdateCoordinator
 from .const import DEFAULT_MAX_TEMP, DEFAULT_MIN_TEMP, DOMAIN
+from .coordinator import ToonConfigEntry, ToonDataUpdateCoordinator
 from .entity import ToonDisplayDeviceEntity
 from .helpers import toon_exception_handler


 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: ToonConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Set up a Toon binary sensors based on a config entry."""
-    coordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data

     async_add_entities([ToonThermostatDevice(coordinator)])
@@ -24,14 +24,16 @@ from .const import CONF_CLOUDHOOK_URL, DEFAULT_SCAN_INTERVAL, DOMAIN
 _LOGGER = logging.getLogger(__name__)

+type ToonConfigEntry = ConfigEntry[ToonDataUpdateCoordinator]
+

 class ToonDataUpdateCoordinator(DataUpdateCoordinator[Status]):
     """Class to manage fetching Toon data from single endpoint."""

-    config_entry: ConfigEntry
+    config_entry: ToonConfigEntry

     def __init__(
-        self, hass: HomeAssistant, entry: ConfigEntry, session: OAuth2Session
+        self, hass: HomeAssistant, entry: ToonConfigEntry, session: OAuth2Session
     ) -> None:
         """Initialize global Toon data updater."""
         self.session = session
@@ -10,7 +10,6 @@ from homeassistant.components.sensor import (
     SensorEntityDescription,
     SensorStateClass,
 )
-from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import (
     PERCENTAGE,
     UnitOfEnergy,
@@ -22,7 +21,7 @@ from homeassistant.core import HomeAssistant
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

 from .const import CURRENCY_EUR, DOMAIN, VOLUME_CM3, VOLUME_LMIN
-from .coordinator import ToonDataUpdateCoordinator
+from .coordinator import ToonConfigEntry, ToonDataUpdateCoordinator
 from .entity import (
     ToonBoilerDeviceEntity,
     ToonDisplayDeviceEntity,
@@ -37,11 +36,11 @@ from .entity import (

 async def async_setup_entry(
     hass: HomeAssistant,
-    entry: ConfigEntry,
+    entry: ToonConfigEntry,
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Set up Toon sensors based on a config entry."""
-    coordinator = hass.data[DOMAIN][entry.entry_id]
+    coordinator = entry.runtime_data

     entities = [
         description.cls(coordinator, description) for description in SENSOR_ENTITIES
