Compare commits

...

146 Commits

Author SHA1 Message Date
Paulus Schoutsen
d084e70aff 2023.3.4 (#89647) 2023-03-14 00:10:23 -04:00
puddly
69582b7ecb Bump ZHA dependencies (#89667)
* Bump `zha-quirks` library and account for `setup_quirks` signature

* Bump other ZHA dependencies

* Revert zigpy bump
2023-03-13 22:06:05 -04:00
Paulus Schoutsen
160518350f Bump SQLAlchemy to 2.0.6 (#89650) 2023-03-13 14:54:27 -04:00
Paulus Schoutsen
daa5718a80 Bumped version to 2023.3.4 2023-03-13 13:26:50 -04:00
tomrennen
f5562e93ac Improved "ON" state check for Use room sensor for cooling (#89634) 2023-03-13 13:26:44 -04:00
Erik Montnemery
d2f90236d1 Rename modules named repairs.py which are not repairs platforms (#89618) 2023-03-13 13:26:43 -04:00
J. Nick Koston
65c614421a Increase maximum aiohttp connections to 4096 (#89611)
fixes #89408
2023-03-13 13:26:41 -04:00
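For reference, a minimal sketch of what raising an aiohttp connection cap looks like; the 4096 figure comes from the commit title above, while the helper name and any other connector arguments are purely illustrative and not Home Assistant's actual code:

```python
import aiohttp

async def make_session() -> aiohttp.ClientSession:
    # limit= caps the total number of simultaneous connections held by the connector.
    connector = aiohttp.TCPConnector(limit=4096)
    return aiohttp.ClientSession(connector=connector)
```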
Eugenio Panadero
22922da607 Bump aiopvpc to 4.1.0 (#89593) 2023-03-13 13:26:40 -04:00
J. Nick Koston
ca0304ffc4 Fix get_significant_states_with_session query looking at legacy columns (#89558) 2023-03-13 13:26:39 -04:00
Robert Svensson
950a1f6e9e Bump pydeconz to v110 (#89527)
* Bump pydeconz to v109

* Bump pydeconz to v110 for additional color modes
2023-03-13 13:26:38 -04:00
rappenze
1e7f58d859 Fix bug in fibaro cover (#89502) 2023-03-13 13:26:37 -04:00
J. Nick Koston
7cb4620671 Fix data migration never finishing when database has invalid datetimes (#89474)
* Fix data migration never finishing when database has invalid datetimes

If there were impossible datetime values in the database (likely
from a manual SQLite to MySQL conversion), the migration would
never complete

* Update homeassistant/components/recorder/migration.py
2023-03-13 13:26:36 -04:00
Kevin Worrel
8c2569d2ce Reconnect on any ScreenLogic exception (#89269)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-03-13 13:26:34 -04:00
Arjan
6ebd493c4d Fix gtfs with 2023.3 (sqlalchemy update) (#89175) 2023-03-13 13:26:33 -04:00
Jan Stienstra
990ecbba72 Recode Home Assistant instance name to ascii for Jellyfin (#87368)
Recode instance name to ascii
2023-03-13 13:26:32 -04:00
Paulus Schoutsen
ddde17606d 2023.3.3 (#89459) 2023-03-09 14:40:06 -05:00
Paulus Schoutsen
3fba181e7b Bumped version to 2023.3.3 2023-03-09 13:30:46 -05:00
Erik Montnemery
da79bf8534 Fix Dormakaba dKey deadbolt binary sensor (#89447)
* Fix Dormakaba dKey deadbolt binary sensor

* Spelling
2023-03-09 13:18:23 -05:00
Paul Bottein
83e2cc32b7 Update frontend to 20230309.0 (#89446) 2023-03-09 13:18:22 -05:00
Joakim Sørensen
c7fb404a17 Add paths for add-on changelog and documentation (#89411) 2023-03-09 13:18:21 -05:00
Jan Bouwhuis
f1e114380a Allow enum as MQTT sensor device_class (#89391) 2023-03-09 13:18:20 -05:00
Brandon Rothweiler
04e4a644cb Bump pymazda to 0.3.8 (#89387) 2023-03-09 13:18:19 -05:00
Dillon Fearns
e606c2e227 Bump roombapy to 1.6.6 (#89366)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-03-09 13:18:17 -05:00
Jan Bouwhuis
ebf95feff3 Fix MQTT rgb light brightness scaling (#89264)
* Normalize received RGB colors to 100% brightness

* Assert on rgb_color attribute

* Use max for RGB to get brightness

* Avoid division and add clamp

* remove clamp

Co-authored-by: Erik Montnemery <erik@montnemery.com>

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>
2023-03-09 13:18:15 -05:00
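The commit body above settles on using the largest RGB channel as the brightness value. A small sketch of that normalization, assuming a plain 0-255 payload; the real MQTT light also scales against the configured brightness range, which is omitted here:

```python
def normalize_rgb(rgb: tuple[int, int, int]) -> tuple[tuple[int, int, int], int]:
    """Split a received RGB payload into a full-brightness color and a brightness value."""
    brightness = max(rgb)
    if brightness == 0:
        return (0, 0, 0), 0
    # Scale every channel so the largest one reaches 255; this avoids the division/clamp
    # issues discussed in the commit body.
    r, g, b = (min(round(channel * 255 / brightness), 255) for channel in rgb)
    return (r, g, b), brightness

print(normalize_rgb((128, 64, 0)))  # ((255, 128, 0), 128)
```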
Franck Nijhof
3dca4c2f23 2023.3.2 (#89381) 2023-03-08 18:35:50 +01:00
Franck Nijhof
3f8f38f2df Bumped version to 2023.3.2 2023-03-08 16:24:08 +01:00
epenet
0844a0b269 Fix invalid state class in litterrobot (#89380) 2023-03-08 16:23:30 +01:00
Franck Nijhof
b65180d20a Improve Supervisor API handling (#89379) 2023-03-08 16:23:26 +01:00
starkillerOG
7f8a9697f0 Fix setting Reolink focus (#89374)
fix setting focus
2023-03-08 16:23:22 +01:00
J. Nick Koston
563bd4a0dd Fix bluetooth history and device expire running in the executor (#89342) 2023-03-08 16:23:18 +01:00
Florent Thoumie
29b5ef31c1 Recreate iaqualink httpx client upon service exception (#89341) 2023-03-08 16:23:13 +01:00
Renat Sibgatulin
863f8b727d Remove invalid device class in air-Q integration (#89329)
Remove device_class from sensors using inconsistent units
2023-03-08 16:23:09 +01:00
J. Nick Koston
83ed8cf689 Fix thread diagnostics loading blocking the event loop (#89307)
* Fix thread diagnostics loading blocking the event loop

* patch target
2023-03-08 16:23:06 +01:00
Tom Harris
52cd2f9429 Fix Insteon open issues with adding devices by address and missing events (#89305)
* Add missing events

* Bump dependencies

* Update for code review
2023-03-08 16:23:02 +01:00
puddly
74d3b2374b Clean ZHA radio path with trailing whitespace (#89299)
* Clean config flow entries with trailing whitespace

* Rewrite the config entry at runtime, without upgrading

* Skip intermediate `data = config_entry.data` variable

* Perform a deepcopy to ensure the config entry will actually be updated
2023-03-08 16:22:58 +01:00
epenet
f982af2412 Ignore DSL entities if SFR box is not adsl (#89291) 2023-03-08 16:22:53 +01:00
luar123
0b5ddd9cbf Bump python-snapcast to 2.3.2 (#89259) 2023-03-08 16:22:49 +01:00
J. Nick Koston
8d1aa0132e Make sql subqueries threadsafe (#89254)
* Make sql subqueries threadsafe

fixes #89224

* fix join outside of lambda

* move statement generation into a separate function to make it easier to test

* add cache key tests

* no need to mock hass
2023-03-08 16:22:45 +01:00
J. Nick Koston
d737b97c91 Bump sqlalchemy to 2.0.5post1 (#89253)
changelog: https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-2.0.5

mostly bugfixes for 2.x regressions
2023-03-08 16:22:41 +01:00
Marc Mueller
0fac12866d Fix conditional check (#89231) 2023-03-08 16:22:38 +01:00
Bram Kragten
e3fe71f76e Update frontend to 20230306.0 (#89227) 2023-03-08 16:22:34 +01:00
J. Nick Koston
eba1bfad51 Bump aioesphomeapi to 13.4.2 (#89210) 2023-03-08 16:22:30 +01:00
Franck Nijhof
1a0a385e03 Fix Tuya Python 3.11 compatibility issue (#89189) 2023-03-08 16:22:26 +01:00
MarkGodwin
c9999cd08c Fix host IP and scheme entry issues in TP-Link Omada (#89130)
Fixing host IP and scheme entry issues
2023-03-08 16:22:22 +01:00
rappenze
8252aeead2 Bump pyfibaro version to 0.6.9 (#89120) 2023-03-08 16:22:18 +01:00
J. Nick Koston
c27a69ef85 Handle InnoDB deadlocks during migration (#89073)
* Handle slow InnoDB rollback when encountering duplicates during migration

fixes #89069

* adjust

* fix mock

* tests

* return on success
2023-03-08 16:22:15 +01:00
J. Nick Koston
d4c28a1f4a Cache transient templates compiles provided via api (#89065)
* Cache transient templates compiles provided via api

partially fixes #89047 (there is more going on here)

* add a bit more coverage just to be sure

* switch method

* Revert "switch method"

This reverts commit 0e9e1c8cbe.

* tweak

* hold hass

* empty for github flakey
2023-03-08 16:22:10 +01:00
Andrew Westrope
322eb4bd83 Check type key of zone exists in geniushub (#86798)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2023-03-08 16:22:05 +01:00
Paulus Schoutsen
f0f12fd14a 2023.3.1 (#89059) 2023-03-02 15:53:50 -05:00
Mitch
1836e35717 Bump nuheat to 1.0.1 (#88958) 2023-03-02 15:15:15 -05:00
Paulus Schoutsen
4eb55146be Bumped version to 2023.3.1 2023-03-02 14:22:23 -05:00
Jan Bouwhuis
b1ee6e304e Fix check on non numeric custom sensor device classes (#89052)
* Custom device classes are not numeric

* Update homeassistant/components/sensor/__init__.py

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>

* Add test

* Update homeassistant/components/sensor/__init__.py

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2023-03-02 14:22:12 -05:00
Paul Bottein
d0b195516b Update frontend to 20230302.0 (#89042) 2023-03-02 14:22:11 -05:00
Franck Nijhof
a867f1d3c8 Update orjson to 3.8.7 (#89037) 2023-03-02 14:22:09 -05:00
Matthias Alphart
f7eaeb7a39 Fix KNX Keyfile upload (#89029)
* Fix KNX Keyfile upload

* use shutil.move instead
2023-03-02 14:22:08 -05:00
Erik Montnemery
3e961d3e17 Bump py-dormakaba-dkey to 1.0.4 (#88992) 2023-03-02 14:22:07 -05:00
Mitch
c28e16fa8b Bump requests to 2.28.2 (#88956)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2023-03-02 14:22:06 -05:00
Toni Juvani
e2e8d74aa6 Update pyTibber to 0.27.0 (#86940)
* Update pyTibber to 0.27.0

* Handle new exceptions
2023-03-02 14:22:05 -05:00
Franck Nijhof
8a9fbd650a 2023.3.0 (#88979) 2023-03-01 19:53:46 +01:00
Erik Montnemery
243725efe3 Tweak OTBR tests (#88839) 2023-03-01 17:53:38 +01:00
Franck Nijhof
8d59489da8 Bumped version to 2023.3.0 2023-03-01 17:25:44 +01:00
Stefan Agner
c146413a1a Add Home Assistant with space as brand (#88976) 2023-03-01 17:25:08 +01:00
Bram Kragten
a46d63a11b Update frontend to 20230301.0 (#88975) 2023-03-01 17:25:05 +01:00
mkmer
db4f6fb94d Bump Aiosomecomfort to 0.0.11 (#88970) 2023-03-01 17:25:01 +01:00
Erik Montnemery
c50c920589 Revert "Add state_class = MEASUREMENT to Derivative sensor (#88408)" (#88952) 2023-03-01 17:24:56 +01:00
starkillerOG
fe22aa0b4b Motion Blinds DHCP restrict (#88919)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-03-01 17:23:00 +01:00
Aaron Godfrey
a0162e4986 Fix todoist filtering custom projects by labels (#87904)
* Fix filtering custom projects by labels.

* Don't lowercase the label.

* Labels are case-sensitive, don't lowercase them.
2023-03-01 17:22:56 +01:00
RogerSelwyn
62c5cf51f5 Fix geniushub heating hvac action (#87531) 2023-03-01 17:22:53 +01:00
Frédéric Guardia
89aebba3ab Fix Google Assistant temperature attribute (#85921) 2023-03-01 17:22:48 +01:00
Paulus Schoutsen
6c73b9024b Bumped version to 2023.3.0b7 2023-02-28 22:18:39 -05:00
Michael Hansen
59a9ace171 Update intent sentences package (#88933)
* Actually use translated state names in response

* Change test result now that locks are excluded from HassTurnOn

* Bump home-assistant-intents and hassil versions
2023-02-28 22:18:32 -05:00
PatrickGlesner
e751948bc8 Update Tado services.yaml defaults (#88929)
Update services.yaml

Deletes default values in 'time_period' and 'requested_overlay' fields in 'set_climate_timer'.
2023-02-28 22:18:31 -05:00
djtimca
702646427d Bump auroranoaa to 0.0.3 (#88927)
* Bump aurora_api version to fix issues with NOAA conversion values. Fix #82587

* update requirements for aurora.

* Add state_class to aurora sensor.

* Fixed environment to run requirements_all script.

* Revert "Add state_class to aurora sensor."

This reverts commit 213e21e842.
2023-02-28 22:18:30 -05:00
Tom Harris
8a605b1377 Bump pyinsteon to 1.3.3 (#88925)
Bump pyinsteon
2023-02-28 22:18:29 -05:00
Erik Montnemery
8eb8415d3f Bump py-dormakaba-dkey to 1.0.3 (#88924)
* Bump py-dormakaba-dkey to 1.0.3

* Log unexpected errors in config flow
2023-02-28 22:18:28 -05:00
Volker Stolz
9f3f71d0c3 Introduce a UUID configuration option for API token (#88765)
* Introduce a UUID configuration option for API token. (#86547)

If the uuid is configured, it will be used in the HTTP headers. Otherwise,
we'll hash the salted instance URL, which should be good enough(tm).

* Generate random 6-digit uuid on startup.
2023-02-28 22:18:28 -05:00
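The commit body above describes the token-selection rule only in prose. A hedged sketch of that rule follows; the function name, salt handling, and hash choice are assumptions for illustration, not the integration's actual code:

```python
import hashlib

def resolve_api_token(configured_uuid: str | None, instance_url: str, salt: str) -> str:
    # Prefer an explicitly configured UUID; otherwise derive a stable token by
    # hashing the salted instance URL, as the commit message describes.
    if configured_uuid:
        return configured_uuid
    return hashlib.sha256(f"{salt}{instance_url}".encode()).hexdigest()
```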
Paulus Schoutsen
b82da9418d Bumped version to 2023.3.0b6 2023-02-28 12:13:24 -05:00
Erik Montnemery
38cf725075 Fix Dormakaba dKey binary sensor (#88922) 2023-02-28 12:12:52 -05:00
Franck Nijhof
04cedab8d4 Small improvements to middleware filter (#88921)
Small improvements middleware filter
2023-02-28 12:12:51 -05:00
Erik Montnemery
2238a3f201 Reset state of template cover on error (#88915) 2023-02-28 12:12:50 -05:00
Marcel van der Veldt
f58ca17926 Bump aiohue library to version 4.6.2 (#88907)
* Bump aiohue library to 4.6.2

* Fix long press (fixed in aiohue lib)

* fix test
2023-02-28 12:12:48 -05:00
Marcel van der Veldt
d5e517b874 Do not create Area for Hue zones (#88904)
Do not create HA area for Hue zones
2023-02-28 12:12:47 -05:00
Bram Kragten
f9eeb4f4d8 Fix string for OTBR config flow abort (#88902) 2023-02-28 12:12:46 -05:00
Marcel van der Veldt
86d5e4aaa8 Fix removal of non device-bound resources in Hue (#88897)
Fix removal of non device-bound resources (like entertainment areas)
2023-02-28 12:12:45 -05:00
b-uwe
a56935ed7c Add virtual integration for HELTUN (#88892) 2023-02-28 12:12:44 -05:00
Erik Montnemery
fc56c958c3 Only allow channel 15 during configuration of OTBR (#88874)
* Only allow channel 15 during automatic configuration of OTBR

* Also force channel 15 when creating a new network
2023-02-28 12:12:43 -05:00
Erik Montnemery
a8e1dc8962 Create repairs issue if Thread network is insecure (#88888)
* Bump python-otbr-api to 1.0.5

* Create repairs issue if Thread network is insecure

* Address review comments
2023-02-28 12:12:11 -05:00
Erik Montnemery
32b138b6c6 Add WS API for creating a Thread network (#88830)
* Add WS API for creating a Thread network

* Add tests
2023-02-28 12:11:14 -05:00
Erik Montnemery
2112c66804 Add confirm step to thread zeroconf flow (#88869)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2023-02-28 12:08:18 -05:00
Paulus Schoutsen
72c0526d87 Bumped version to 2023.3.0b5 2023-02-27 20:58:22 -05:00
Matthias Alphart
9ed4e01e94 Update xknx to 2.6.0 (#88864) 2023-02-27 20:58:11 -05:00
Paul Bottein
dcf1ecfeb5 Update frontend to 20230227.0 (#88857) 2023-02-27 20:58:10 -05:00
Klaas Schoute
b72224ceff Bump odp-amsterdam to v5.1.0 (#88847) 2023-02-27 20:58:09 -05:00
Erik Montnemery
96ad5c9666 Add thread user flow (#88842) 2023-02-27 20:58:09 -05:00
Erik Montnemery
00b59c142a Fix sensor unit conversion bug (#88825)
* Fix sensor unit conversion bug

* Ensure the correct unit is stored in the entity registry
2023-02-27 20:58:08 -05:00
Michael Davie
b054c81e13 Bump env_canada to 0.5.29 (#88821) 2023-02-27 20:58:07 -05:00
puddly
b0cbcad440 Bump ZHA dependencies (#88799)
* Bump ZHA dependencies

* Use `importlib.metadata.version` to get package versions
2023-02-27 20:58:06 -05:00
stickpin
bafe552af6 Upgrade caldav to 1.2.0 (#88791) 2023-02-27 20:58:05 -05:00
stickpin
d399855e50 Upgrade caldav to 1.1.3 (#88681)
* Update caldav to 1.1.3

* update caldav to 1.1.3

* update caldav to 1.1.3

---------

Co-authored-by: Allen Porter <allen@thebends.org>
2023-02-27 20:58:03 -05:00
mkmer
d26f430766 Bump aiosomecomfort to 0.0.10 (#88766) 2023-02-27 20:56:46 -05:00
Erik Montnemery
f2e4943a53 Catch CancelledError when setting up components (#88635)
* Catch CancelledError when setting up components

* Catch CancelledError when setting up components

* Also catch SystemExit
2023-02-27 20:56:45 -05:00
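The commit above only names the exception types now caught during component setup. A minimal sketch of the idea; the wrapper name and return convention are assumptions, not Home Assistant's actual setup code:

```python
import asyncio

async def run_setup(setup_coro) -> bool:
    # CancelledError and SystemExit do not derive from Exception, so they must be
    # listed explicitly to be treated like any other failed setup.
    try:
        return bool(await setup_coro)
    except (asyncio.CancelledError, SystemExit, Exception):
        return False
```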
Bouwe Westerdijk
6512cd901f Correct Plugwise gas_consumed_interval sensor (#87449)
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
2023-02-27 20:56:45 -05:00
Paulus Schoutsen
fbe1524f6c Bumped version to 2023.3.0b4 2023-02-26 22:37:34 -05:00
J. Nick Koston
95e337277c Avoid starting a bluetooth poll when Home Assistant is stopping (#88819)
* Avoid starting a bluetooth poll when Home Assistant is stopping

* tests
2023-02-26 22:37:26 -05:00
J. Nick Koston
1503674bd6 Prevent integrations from retrying setup once shutdown has started (#88818)
* Prevent integrations from retrying setup once shutdown has started

* coverage
2023-02-26 22:37:25 -05:00
J. Nick Koston
ab6bd75b70 Fix flux_led discovery running at shutdown (#88817) 2023-02-26 22:37:24 -05:00
J. Nick Koston
2fff836bd4 Fix lock services not removing entity fields (#88805) 2023-02-26 22:37:23 -05:00
J. Nick Koston
d8850758f1 Fix unifiprotect discovery running at shutdown (#88802)
* Fix unifiprotect discovery running at shutdown

Move the discovery start into `async_setup` so we only
start discovery once regardless of how many config entries
for unifiprotect they have (or how many times they reload).

Always make discovery a background task so it does not get
to block shutdown

* missing decorator
2023-02-26 22:37:22 -05:00
J. Nick Koston
0449856064 Bump yalexs-ble to 2.0.4 (#88798)
changelog: https://github.com/bdraco/yalexs-ble/compare/v2.0.3...v2.0.4
2023-02-26 22:37:21 -05:00
starkillerOG
e48089e0c9 Do not block on reolink firmware check fail (#88797)
Do not block on firmware check fail
2023-02-26 22:37:20 -05:00
starkillerOG
a7e081f70d Simplify reolink update unique_id (#88794)
simplify unique_id
2023-02-26 22:37:19 -05:00
Paulus Schoutsen
fe181425d8 Check circular dependencies (#88778) 2023-02-26 22:37:18 -05:00
Joakim Plate
8c7b29db25 Update nibe library to 2.0.0 (#88769) 2023-02-26 22:37:17 -05:00
J. Nick Koston
aaa5bb9f86 Fix checking if a package is installed on py3.11 (#88768)
pkg_resources is abandoned and we need to move away
from using it https://github.com/pypa/pkg_resources

In the meantime we need to keep it working. This fixes
a new exception in py3.11 when a module is not installed,
which allows proper fallback to pkg_resources.Requirement.parse
when needed

```
2023-02-25 15:46:21.101 ERROR (MainThread) [aiohttp.server] Error handling request
Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/aiohttp/web_protocol.py", line 433, in _handle_request
    resp = await request_handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/aiohttp/web_app.py", line 504, in _handle
    resp = await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/aiohttp/web_middlewares.py", line 117, in impl
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/security_filter.py", line 60, in security_filter_middleware
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/forwarded.py", line 100, in forwarded_middleware
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/request_context.py", line 28, in request_context_middleware
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/ban.py", line 80, in ban_middleware
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/auth.py", line 235, in auth_middleware
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/view.py", line 146, in handle
    result = await result
             ^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/config/config_entries.py", line 148, in post
    return await super().post(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/components/http/data_validator.py", line 72, in wrapper
    result = await method(view, request, data, *args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/helpers/data_entry_flow.py", line 71, in post
    result = await self._flow_mgr.async_init(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/config_entries.py", line 826, in async_init
    flow, result = await task
                   ^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/config_entries.py", line 844, in _async_init
    flow = await self.async_create_flow(handler, context=context, data=data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/config_entries.py", line 950, in async_create_flow
    await async_process_deps_reqs(self.hass, self._hass_config, integration)
  File "/Users/bdraco/home-assistant/homeassistant/setup.py", line 384, in async_process_deps_reqs
    await requirements.async_get_integration_with_requirements(
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 52, in async_get_integration_with_requirements
    return await manager.async_get_integration_with_requirements(domain)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 171, in async_get_integration_with_requirements
    await self._async_process_integration(integration, done)
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 186, in _async_process_integration
    await self.async_process_requirements(
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 252, in async_process_requirements
    await self._async_process_requirements(name, missing)
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 284, in _async_process_requirements
    installed, failures = await self.hass.async_add_executor_job(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/requirements.py", line 113, in _install_requirements_if_missing
    if pkg_util.is_installed(req) or _install_with_retry(req, kwargs):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bdraco/home-assistant/homeassistant/util/package.py", line 40, in is_installed
    pkg_resources.get_distribution(package)
  File "/opt/homebrew/lib/python3.11/site-packages/pkg_resources/__init__.py", line 478, in get_distribution
    dist = get_provider(dist)
           ^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/pkg_resources/__init__.py", line 354, in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
                                            ~~~~~~~~~~~~~~~~~~~~~~~~~^^^
IndexError: list index out of range
```
2023-02-26 22:37:17 -05:00
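The commit above works around an IndexError raised by pkg_resources on Python 3.11 when a package is missing. A hedged sketch of a py3.11-safe check in the same spirit; this is not the actual homeassistant.util.package.is_installed implementation:

```python
from importlib.metadata import PackageNotFoundError, version

import pkg_resources

def is_installed(requirement_str: str) -> bool:
    """Return True if the installed version of a package satisfies the requirement."""
    try:
        req = pkg_resources.Requirement.parse(requirement_str)
    except ValueError:
        return False
    try:
        installed_version = version(req.project_name)
    except PackageNotFoundError:
        # Package is simply missing; no need to touch pkg_resources internals.
        return False
    # Requirement.__contains__ accepts a version string and checks it against the specifier.
    return installed_version in req
```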
J. Nick Koston
5b78e0c4ff Restore previous behavior of only waiting for new tasks at shutdown (#88740)
* Restore previous behavior of only waiting for new tasks at shutdown

* cleanup

* do a swap instead

* await canceled tasks

* await canceled tasks

* fix

* not needed since we no longer clear

* log it

* reword

* wait for airvisual

* tests
2023-02-26 22:37:16 -05:00
Franck Nijhof
2063dbf00d Bumped version to 2023.3.0b3 2023-02-25 12:07:47 +01:00
Joakim Sørensen
91a03ab83d Remove homeassistant_hardware after dependency from zha (#88751) 2023-02-25 12:07:25 +01:00
J. Nick Koston
ed8f538890 Prevent new discovery flows from being created when stopping (#88743) 2023-02-25 12:07:22 +01:00
J. Nick Koston
6196607c5d Make hass.async_stop an untracked task (#88738) 2023-02-25 12:07:19 +01:00
J. Nick Koston
833ccafb76 Log futures that are blocking shutdown stages (#88736) 2023-02-25 12:07:15 +01:00
mkmer
ca539d0a09 Add missing reauth strings to Honeywell (#88733)
Add missing reauth strings
2023-02-25 12:07:12 +01:00
Austin Mroczek
0e3e954000 Bump total_connect_client to v2023.2 (#88729)
* bump total_connect_client to v2023.2

* Trigger Build
2023-02-25 12:07:09 +01:00
avee87
4ef96c76e4 Fix log message in recorder on total_increasing reset (#88710) 2023-02-25 12:07:05 +01:00
Álvaro Fernández Rojas
d5b0c1faa0 Update aioqsw v0.3.2 (#88695)
Signed-off-by: Álvaro Fernández Rojas <noltari@gmail.com>
2023-02-25 12:07:02 +01:00
Arturo
2405908cdd Fix matter light color capabilities bit map (#88693)
* Adds matter light color capabilities bit map

* Fixed matter light hue and saturation test
2023-02-25 12:06:58 +01:00
Paulus Schoutsen
b6e50135f5 Bumped version to 2023.3.0b2 2023-02-24 21:41:02 -05:00
Bram Kragten
64197aa5f5 Update frontend to 20230224.0 (#88721) 2023-02-24 21:40:56 -05:00
J. Nick Koston
5a2d7a5dd4 Reduce overhead to save json data to postgresql (#88717)
* Reduce overhead to strip nulls from json

* Reduce overhead to strip nulls from json

* small cleanup
2023-02-24 21:40:55 -05:00
J. Nick Koston
2d6f84b2a8 Fix timeout in purpleapi test (#88715)
https://github.com/home-assistant/core/actions/runs/4264644494/jobs/7423099757
2023-02-24 21:40:54 -05:00
J. Nick Koston
0c6a469218 Fix migration failing when existing data has duplicates (#88712) 2023-02-24 21:40:53 -05:00
J. Nick Koston
e69271cb46 Bump aioesphomeapi to 13.4.1 (#88703)
changelog: https://github.com/esphome/aioesphomeapi/releases/tag/v13.4.1
2023-02-24 21:40:52 -05:00
Michael Hansen
02bd3f897d Make a copy of matching states so translated state names can be used (#88683) 2023-02-24 21:40:51 -05:00
J. Nick Koston
64ad5326dd Bump mopeka_iot_ble to 0.4.1 (#88680)
* Bump mopeka_iot_ble to 0.4.1

closes #88232

* adjust tests
2023-02-24 21:40:50 -05:00
puddly
74696a3fac Name the Yellow-internal radio and multi-PAN addon as ZHA serial ports (#88208)
* Expose the Yellow-internal radio and multi-PAN addon as named serial ports

* Remove the serial number if it isn't available

* Use consistent names for the addon and Zigbee radio

* Add `homeassistant_hardware` and `_yellow` as `after_dependencies`

* Handle `hassio` not existing when listing serial ports

* Add unit tests
2023-02-24 21:40:49 -05:00
Paulus Schoutsen
70e1d14da0 Bumped version to 2023.3.0b1 2023-02-23 15:00:13 -05:00
Bram Kragten
25f066d476 Update frontend to 20230223.0 (#88677) 2023-02-23 15:00:07 -05:00
Marcel van der Veldt
5adf1dcc90 Fix support for Bridge(d) and composed devices in Matter (#88662)
* Refactor discovery of entities to support composed and bridged devices

* Bump library version to 3.1.0

* move discovery schemas to platforms

* optimize a tiny bit

* simplify even more

* fixed bug in light platform

* fix color control logic

* fix some issues

* Update homeassistant/components/matter/discovery.py

Co-authored-by: Paulus Schoutsen <balloob@gmail.com>

* fix some tests

* fix light test

---------

Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2023-02-23 15:00:05 -05:00
epenet
0fb28dcf9e Add missing async_setup_entry mock in openuv (#88661) 2023-02-23 15:00:04 -05:00
Allen Porter
2fddbcedcf Fix local calendar issue with events created with fixed UTC offsets (#88650)
Fix issue with events created with UTC offsets
2023-02-23 15:00:03 -05:00
J. Nick Koston
951df3df57 Fix untrapped exceptions during Yale Access Bluetooth first setup (#88642) 2023-02-23 15:00:02 -05:00
starkillerOG
35142e456a Bump reolink-aio to 0.5.1 and check if update supported (#88641) 2023-02-23 15:00:01 -05:00
Paulus Schoutsen
cfaba87dd6 Error checking for OTBR (#88620)
* Error checking for OTBR

* Other errors in flow too

* Tests
2023-02-23 15:00:00 -05:00
Erik Montnemery
2db8d4b73a Bump python-otbr-api to 1.0.4 (#88613)
* Bump python-otbr-api to 1.0.4

* Adjust tests
2023-02-23 14:59:59 -05:00
Raman Gupta
0d2006bf33 Add support for firmware target in zwave_js FirmwareUploadView (#88523)
* Add support for firmware target in zwave_js FirmwareUploadView

fix

* Update tests/components/zwave_js/test_api.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Update tests/components/zwave_js/test_api.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Update tests/components/zwave_js/test_api.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Update tests/components/zwave_js/test_api.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* fix types

* Switch back to using Any

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2023-02-23 14:59:58 -05:00
puddly
45547d226e Disable the ZHA bellows UART thread when connecting to a TCP coordinator (#88202)
Disable the bellows UART thread when connecting to a TCP coordinator
2023-02-23 14:59:56 -05:00
Franck Nijhof
cebc6dd096 Bumped version to 2023.3.0b0 2023-02-22 20:44:37 +01:00
218 changed files with 4066 additions and 1568 deletions

View File

@@ -1100,6 +1100,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/smhi/ @gjohansson-ST
/tests/components/smhi/ @gjohansson-ST
/homeassistant/components/sms/ @ocalvo
/homeassistant/components/snapcast/ @luar123
/homeassistant/components/snooz/ @AustinBrunkhorst
/tests/components/snooz/ @AustinBrunkhorst
/homeassistant/components/solaredge/ @frenck

View File

@@ -0,0 +1,5 @@
{
"domain": "heltun",
"name": "HELTUN",
"iot_standards": ["zwave"]
}

View File

@@ -68,7 +68,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="co",
name="CO",
device_class=SensorDeviceClass.CO,
native_unit_of_measurement=CONCENTRATION_MILLIGRAMS_PER_CUBIC_METER,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("co"),
@@ -289,7 +288,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="tvoc",
name="VOC",
device_class=SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_BILLION,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("tvoc"),
@@ -297,7 +295,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="tvoc_ionsc",
name="VOC (Industrial)",
device_class=SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_BILLION,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("tvoc_ionsc"),

View File

@@ -1,5 +1,6 @@
"""Rest API for Home Assistant."""
import asyncio
from functools import lru_cache
from http import HTTPStatus
import logging
@@ -350,6 +351,12 @@ class APIComponentsView(HomeAssistantView):
return self.json(request.app["hass"].config.components)
@lru_cache
def _cached_template(template_str: str, hass: ha.HomeAssistant) -> template.Template:
"""Return a cached template."""
return template.Template(template_str, hass)
class APITemplateView(HomeAssistantView):
"""View to handle Template requests."""
@@ -362,7 +369,7 @@ class APITemplateView(HomeAssistantView):
raise Unauthorized()
try:
data = await request.json()
tpl = template.Template(data["template"], request.app["hass"])
tpl = _cached_template(data["template"], request.app["hass"])
return tpl.async_render(variables=data.get("variables"), parse_result=False)
except (ValueError, TemplateError) as ex:
return self.json_message(
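The hunk above memoizes template compilation with functools.lru_cache, keyed on the raw template string plus the hass object (which must be hashable for this to work). A toy sketch of the same caching behavior, with a stand-in for template.Template:

```python
from functools import lru_cache

@lru_cache  # bare decorator form, as in the hunk above; the default maxsize of 128 applies
def compile_template(template_str: str) -> str:
    # Stand-in for template.Template(template_str, hass): the expensive compile step
    # runs once per distinct template string.
    print(f"compiling {template_str!r}")
    return template_str.upper()

compile_template("{{ states('sensor.demo') }}")
compile_template("{{ states('sensor.demo') }}")  # served from the cache; nothing is printed
```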

View File

@@ -28,5 +28,5 @@
"documentation": "https://www.home-assistant.io/integrations/august",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==1.2.7", "yalexs_ble==2.0.2"]
"requirements": ["yalexs==1.2.7", "yalexs_ble==2.0.4"]
}

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/aurora",
"iot_class": "cloud_polling",
"loggers": ["auroranoaa"],
"requirements": ["auroranoaa==0.0.2"]
"requirements": ["auroranoaa==0.0.3"]
}

View File

@@ -60,7 +60,7 @@ from .const import (
DEFAULT_PROBABILITY_THRESHOLD,
)
from .helpers import Observation
from .repairs import raise_mirrored_entries, raise_no_prob_given_false
from .issues import raise_mirrored_entries, raise_no_prob_given_false
_LOGGER = logging.getLogger(__name__)

View File

@@ -1,4 +1,4 @@
"""Helpers for generating repairs."""
"""Helpers for generating issues."""
from __future__ import annotations
from homeassistant.core import HomeAssistant

View File

@@ -106,6 +106,8 @@ class ActiveBluetoothDataUpdateCoordinator(
def needs_poll(self, service_info: BluetoothServiceInfoBleak) -> bool:
"""Return true if time to try and poll."""
if self.hass.is_stopping:
return False
poll_age: float | None = None
if self._last_poll:
poll_age = monotonic_time_coarse() - self._last_poll

View File

@@ -99,6 +99,8 @@ class ActiveBluetoothProcessorCoordinator(
def needs_poll(self, service_info: BluetoothServiceInfoBleak) -> bool:
"""Return true if time to try and poll."""
if self.hass.is_stopping:
return False
poll_age: float | None = None
if self._last_poll:
poll_age = monotonic_time_coarse() - self._last_poll

View File

@@ -227,20 +227,21 @@ class BaseHaRemoteScanner(BaseHaScanner):
self.hass, self._async_expire_devices, timedelta(seconds=30)
)
cancel_stop = self.hass.bus.async_listen(
EVENT_HOMEASSISTANT_STOP, self._save_history
EVENT_HOMEASSISTANT_STOP, self._async_save_history
)
self._async_setup_scanner_watchdog()
@hass_callback
def _cancel() -> None:
self._save_history()
self._async_save_history()
self._async_stop_scanner_watchdog()
cancel_track()
cancel_stop()
return _cancel
def _save_history(self, event: Event | None = None) -> None:
@hass_callback
def _async_save_history(self, event: Event | None = None) -> None:
"""Save the history."""
self._storage.async_set_advertisement_history(
self.source,
@@ -252,6 +253,7 @@ class BaseHaRemoteScanner(BaseHaScanner):
),
)
@hass_callback
def _async_expire_devices(self, _datetime: datetime.datetime) -> None:
"""Expire old devices."""
now = MONOTONIC_TIME()

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/caldav",
"iot_class": "cloud_polling",
"loggers": ["caldav", "vobject"],
"requirements": ["caldav==1.1.1"]
"requirements": ["caldav==1.2.0"]
}

View File

@@ -66,6 +66,55 @@ SCAN_INTERVAL = datetime.timedelta(seconds=60)
# Don't support rrules more often than daily
VALID_FREQS = {"DAILY", "WEEKLY", "MONTHLY", "YEARLY"}
def _has_consistent_timezone(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Verify that all datetime values have a consistent timezone."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys that are datetime values have the same timezone."""
tzinfos = []
for key in keys:
if not (value := obj.get(key)) or not isinstance(value, datetime.datetime):
return obj
tzinfos.append(value.tzinfo)
uniq_values = groupby(tzinfos)
if len(list(uniq_values)) > 1:
raise vol.Invalid("Expected all values to have the same timezone")
return obj
return validate
def _as_local_timezone(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Convert all datetime values to the local timezone."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys that are datetime values have the same timezone."""
for k in keys:
if (value := obj.get(k)) and isinstance(value, datetime.datetime):
obj[k] = dt.as_local(value)
return obj
return validate
def _is_sorted(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Verify that the specified values are sequential."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys in the dict are in order."""
values = []
for k in keys:
if not (value := obj.get(k)):
return obj
values.append(value)
if all(values) and values != sorted(values):
raise vol.Invalid(f"Values were not in order: {values}")
return obj
return validate
CREATE_EVENT_SERVICE = "create_event"
CREATE_EVENT_SCHEMA = vol.All(
cv.has_at_least_one_key(EVENT_START_DATE, EVENT_START_DATETIME, EVENT_IN),
@@ -98,6 +147,10 @@ CREATE_EVENT_SCHEMA = vol.All(
),
},
),
_has_consistent_timezone(EVENT_START_DATETIME, EVENT_END_DATETIME),
_as_local_timezone(EVENT_START_DATETIME, EVENT_END_DATETIME),
_is_sorted(EVENT_START_DATE, EVENT_END_DATE),
_is_sorted(EVENT_START_DATETIME, EVENT_END_DATETIME),
)
@@ -441,36 +494,6 @@ def _has_same_type(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
return validate
def _has_consistent_timezone(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Verify that all datetime values have a consistent timezone."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys that are datetime values have the same timezone."""
values = [obj[k] for k in keys]
if all(isinstance(value, datetime.datetime) for value in values):
uniq_values = groupby(value.tzinfo for value in values)
if len(list(uniq_values)) > 1:
raise vol.Invalid(
f"Expected all values to have the same timezone: {values}"
)
return obj
return validate
def _is_sorted(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Verify that the specified values are sequential."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys in the dict are in order."""
values = [obj[k] for k in keys]
if values != sorted(values):
raise vol.Invalid(f"Values were not in order: {values}")
return obj
return validate
@websocket_api.websocket_command(
{
vol.Required("type"): "calendar/event/create",
@@ -486,6 +509,7 @@ def _is_sorted(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
},
_has_same_type(EVENT_START, EVENT_END),
_has_consistent_timezone(EVENT_START, EVENT_END),
_as_local_timezone(EVENT_START, EVENT_END),
_is_sorted(EVENT_START, EVENT_END),
)
),
@@ -582,6 +606,7 @@ async def handle_calendar_event_delete(
},
_has_same_type(EVENT_START, EVENT_END),
_has_consistent_timezone(EVENT_START, EVENT_END),
_as_local_timezone(EVENT_START, EVENT_END),
_is_sorted(EVENT_START, EVENT_END),
)
),

View File

@@ -227,7 +227,21 @@ class DefaultAgent(AbstractConversationAgent):
intent_response: intent.IntentResponse,
recognize_result: RecognizeResult,
) -> str:
all_states = intent_response.matched_states + intent_response.unmatched_states
# Make copies of the states here so we can add translated names for responses.
matched: list[core.State] = []
for state in intent_response.matched_states:
state_copy = core.State.from_dict(state.as_dict())
if state_copy is not None:
matched.append(state_copy)
unmatched: list[core.State] = []
for state in intent_response.unmatched_states:
state_copy = core.State.from_dict(state.as_dict())
if state_copy is not None:
unmatched.append(state_copy)
all_states = matched + unmatched
domains = {state.domain for state in all_states}
translations = await translation.async_get_translations(
self.hass, language, "state", domains
@@ -243,9 +257,9 @@ class DefaultAgent(AbstractConversationAgent):
# This is available in the response template as "state".
state1: core.State | None = None
if intent_response.matched_states:
state1 = intent_response.matched_states[0]
state1 = matched[0]
elif intent_response.unmatched_states:
state1 = intent_response.unmatched_states[0]
state1 = unmatched[0]
# Render response template
speech = response_template.async_render(
@@ -262,13 +276,11 @@ class DefaultAgent(AbstractConversationAgent):
"query": {
# Entity states that matched the query (e.g, "on")
"matched": [
template.TemplateState(self.hass, state)
for state in intent_response.matched_states
template.TemplateState(self.hass, state) for state in matched
],
# Entity states that did not match the query
"unmatched": [
template.TemplateState(self.hass, state)
for state in intent_response.unmatched_states
template.TemplateState(self.hass, state) for state in unmatched
],
},
}

View File

@@ -7,5 +7,5 @@
"integration_type": "system",
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": ["hassil==1.0.5", "home-assistant-intents==2023.2.22"]
"requirements": ["hassil==1.0.6", "home-assistant-intents==2023.2.28"]
}

View File

@@ -8,7 +8,7 @@
"iot_class": "local_push",
"loggers": ["pydeconz"],
"quality_scale": "platinum",
"requirements": ["pydeconz==108"],
"requirements": ["pydeconz==110"],
"ssdp": [
{
"manufacturer": "Royal Philips Electronics",

View File

@@ -8,11 +8,7 @@ from typing import TYPE_CHECKING
import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA,
SensorEntity,
SensorStateClass,
)
from homeassistant.components.sensor import PLATFORM_SCHEMA, SensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
ATTR_UNIT_OF_MEASUREMENT,
@@ -135,7 +131,6 @@ class DerivativeSensor(RestoreEntity, SensorEntity):
_attr_icon = ICON
_attr_should_poll = False
_attr_state_class = SensorStateClass.MEASUREMENT
def __init__(
self,

View File

@@ -19,7 +19,7 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import CONF_ASSOCIATION_DATA, DOMAIN, UPDATE_SECONDS
from .models import DormakabaDkeyData
PLATFORMS: list[Platform] = [Platform.LOCK, Platform.SENSOR]
PLATFORMS: list[Platform] = [Platform.BINARY_SENSOR, Platform.LOCK, Platform.SENSOR]
_LOGGER = logging.getLogger(__name__)

View File

@@ -45,9 +45,10 @@ BINARY_SENSOR_DESCRIPTIONS = (
),
DormakabaDkeyBinarySensorDescription(
key="security_locked",
name="Dead bolt",
name="Deadbolt",
device_class=BinarySensorDeviceClass.LOCK,
is_on=lambda state: state.unlock_status != UnlockStatus.SECURITY_LOCKED,
is_on=lambda state: state.unlock_status
not in (UnlockStatus.SECURITY_LOCKED, UnlockStatus.UNLOCKED_SECURITY_LOCKED),
),
)

View File

@@ -132,7 +132,8 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
try:
association_data = await lock.associate(user_input["activation_code"])
except BleakError:
except BleakError as err:
_LOGGER.warning("BleakError", exc_info=err)
return self.async_abort(reason="cannot_connect")
except dkey_errors.InvalidActivationCode:
errors["base"] = "invalid_code"

View File

@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/dormakaba_dkey",
"integration_type": "device",
"iot_class": "local_polling",
"requirements": ["py-dormakaba-dkey==1.0.2"]
"requirements": ["py-dormakaba-dkey==1.0.4"]
}

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from datetime import datetime, timedelta
from random import randint
from enturclient import EnturPublicTransportData
import voluptuous as vol
@@ -22,7 +23,7 @@ from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import Throttle
import homeassistant.util.dt as dt_util
API_CLIENT_NAME = "homeassistant-homeassistant"
API_CLIENT_NAME = "homeassistant-{}"
CONF_STOP_IDS = "stop_ids"
CONF_EXPAND_PLATFORMS = "expand_platforms"
@@ -105,7 +106,7 @@ async def async_setup_platform(
quays = [s for s in stop_ids if "Quay" in s]
data = EnturPublicTransportData(
API_CLIENT_NAME,
API_CLIENT_NAME.format(str(randint(100000, 999999))),
stops=stops,
quays=quays,
line_whitelist=line_whitelist,

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/environment_canada",
"iot_class": "cloud_polling",
"loggers": ["env_canada"],
"requirements": ["env_canada==0.5.28"]
"requirements": ["env_canada==0.5.29"]
}

View File

@@ -14,6 +14,6 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["aioesphomeapi", "noiseprotocol"],
"requirements": ["aioesphomeapi==13.4.0", "esphome-dashboard-api==1.2.3"],
"requirements": ["aioesphomeapi==13.4.2", "esphome-dashboard-api==1.2.3"],
"zeroconf": ["_esphomelib._tcp.local."]
}

View File

@@ -94,9 +94,9 @@ class FibaroCover(FibaroDevice, CoverEntity):
"""Return if the cover is closed."""
if self._is_open_close_only():
state = self.fibaro_device.state
if not state.has_value or state.str_value.lower() == "unknown":
if not state.has_value or state.str_value().lower() == "unknown":
return None
return state.str_value.lower() == "closed"
return state.str_value().lower() == "closed"
if self.current_cover_position is None:
return None

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["pyfibaro"],
"requirements": ["pyfibaro==0.6.8"]
"requirements": ["pyfibaro==0.6.9"]
}

View File

@@ -87,14 +87,23 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass, STARTUP_SCAN_TIMEOUT
)
@callback
def _async_start_background_discovery(*_: Any) -> None:
"""Run discovery in the background."""
hass.async_create_background_task(_async_discovery(), "flux_led-discovery")
async def _async_discovery(*_: Any) -> None:
async_trigger_discovery(
hass, await async_discover_devices(hass, DISCOVER_SCAN_TIMEOUT)
)
async_trigger_discovery(hass, domain_data[FLUX_LED_DISCOVERY])
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STARTED, _async_discovery)
async_track_time_interval(hass, _async_discovery, DISCOVERY_INTERVAL)
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STARTED, _async_start_background_discovery
)
async_track_time_interval(
hass, _async_start_background_discovery, DISCOVERY_INTERVAL
)
return True

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20230222.0"]
"requirements": ["home-assistant-frontend==20230309.0"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/garages_amsterdam",
"iot_class": "cloud_polling",
"requirements": ["odp-amsterdam==5.0.1"]
"requirements": ["odp-amsterdam==5.1.0"]
}

View File

@@ -41,7 +41,7 @@ async def async_setup_platform(
[
GeniusClimateZone(broker, z)
for z in broker.client.zone_objs
if z.data["type"] in GH_ZONES
if z.data.get("type") in GH_ZONES
]
)
@@ -79,10 +79,10 @@ class GeniusClimateZone(GeniusHeatingZone, ClimateEntity):
def hvac_action(self) -> str | None:
"""Return the current running hvac operation if supported."""
if "_state" in self._zone.data: # only for v3 API
if self._zone.data["output"] == 1:
return HVACAction.HEATING
if not self._zone.data["_state"].get("bIsActive"):
return HVACAction.OFF
if self._zone.data["_state"].get("bOutRequestHeat"):
return HVACAction.HEATING
return HVACAction.IDLE
return None

View File

@@ -42,7 +42,7 @@ async def async_setup_platform(
[
GeniusSwitch(broker, z)
for z in broker.client.zone_objs
if z.data["type"] == GH_ON_OFF_ZONE
if z.data.get("type") == GH_ON_OFF_ZONE
]
)

View File

@@ -48,7 +48,7 @@ async def async_setup_platform(
[
GeniusWaterHeater(broker, z)
for z in broker.client.zone_objs
if z.data["type"] in GH_HEATERS
if z.data.get("type") in GH_HEATERS
]
)

View File

@@ -832,7 +832,7 @@ class TemperatureControlTrait(_Trait):
"temperatureUnitForUX": _google_temp_unit(
self.hass.config.units.temperature_unit
),
"queryOnlyTemperatureSetting": True,
"queryOnlyTemperatureControl": True,
"temperatureRange": {
"minThresholdCelsius": -100,
"maxThresholdCelsius": 100,

View File

@@ -342,12 +342,14 @@ def get_next_departure(
origin_stop_time.departure_time
LIMIT :limit
"""
result = schedule.engine.execute(
result = schedule.engine.connect().execute(
text(sql_query),
origin_station_id=start_station_id,
end_station_id=end_station_id,
today=now_date,
limit=limit,
{
"origin_station_id": start_station_id,
"end_station_id": end_station_id,
"today": now_date,
"limit": limit,
},
)
# Create lookup timetable for today and possibly tomorrow, taking into
@@ -357,7 +359,8 @@ def get_next_departure(
yesterday_start = today_start = tomorrow_start = None
yesterday_last = today_last = ""
for row in result:
for row_cursor in result:
row = row_cursor._asdict()
if row["yesterday"] == 1 and yesterday_date >= row["start_date"]:
extras = {"day": "yesterday", "first": None, "last": False}
if yesterday_start is None:
@@ -800,7 +803,10 @@ class GTFSDepartureSensor(SensorEntity):
@staticmethod
def dict_for_table(resource: Any) -> dict:
"""Return a dictionary for the SQLAlchemy resource given."""
return {col: getattr(resource, col) for col in resource.__table__.columns}
_dict = {}
for column in resource.__table__.columns:
_dict[column.name] = str(getattr(resource, column.name))
return _dict
def append_keys(self, resource: dict, prefix: str | None = None) -> None:
"""Properly format key val pairs to append to attributes."""

View File

@@ -96,7 +96,7 @@ from .handler import ( # noqa: F401
)
from .http import HassIOView
from .ingress import async_setup_ingress_view
from .repairs import SupervisorRepairs
from .issues import SupervisorIssues
from .websocket_api import async_load_websocket_api
_LOGGER = logging.getLogger(__name__)
@@ -123,7 +123,7 @@ DATA_SUPERVISOR_INFO = "hassio_supervisor_info"
DATA_ADDONS_CHANGELOGS = "hassio_addons_changelogs"
DATA_ADDONS_INFO = "hassio_addons_info"
DATA_ADDONS_STATS = "hassio_addons_stats"
DATA_SUPERVISOR_REPAIRS = "supervisor_repairs"
DATA_SUPERVISOR_ISSUES = "supervisor_issues"
HASSIO_UPDATE_INTERVAL = timedelta(minutes=5)
ADDONS_COORDINATOR = "hassio_addons_coordinator"
@@ -581,9 +581,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
hass.config_entries.flow.async_init(DOMAIN, context={"source": "system"})
)
# Start listening for problems with supervisor and making repairs
hass.data[DATA_SUPERVISOR_REPAIRS] = repairs = SupervisorRepairs(hass, hassio)
await repairs.setup()
# Start listening for problems with supervisor and making issues
hass.data[DATA_SUPERVISOR_ISSUES] = issues = SupervisorIssues(hass, hassio)
await issues.setup()
return True

View File

@@ -36,6 +36,7 @@ X_AUTH_TOKEN = "X-Supervisor-Token"
X_INGRESS_PATH = "X-Ingress-Path"
X_HASS_USER_ID = "X-Hass-User-ID"
X_HASS_IS_ADMIN = "X-Hass-Is-Admin"
X_HASS_SOURCE = "X-Hass-Source"
WS_TYPE = "type"
WS_ID = "id"

View File

@@ -17,7 +17,7 @@ from homeassistant.const import SERVER_PORT
from homeassistant.core import HomeAssistant
from homeassistant.loader import bind_hass
from .const import ATTR_DISCOVERY, DOMAIN
from .const import ATTR_DISCOVERY, DOMAIN, X_HASS_SOURCE
_LOGGER = logging.getLogger(__name__)
@@ -445,6 +445,8 @@ class HassIO:
payload=None,
timeout=10,
return_text=False,
*,
source="core.handler",
):
"""Send API command to Hass.io.
@@ -458,7 +460,8 @@ class HassIO:
headers={
aiohttp.hdrs.AUTHORIZATION: (
f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}"
)
),
X_HASS_SOURCE: source,
},
timeout=aiohttp.ClientTimeout(total=timeout),
)

View File

@@ -6,6 +6,7 @@ from http import HTTPStatus
import logging
import os
import re
from urllib.parse import quote, unquote
import aiohttp
from aiohttp import web
@@ -19,13 +20,16 @@ from aiohttp.hdrs import (
TRANSFER_ENCODING,
)
from aiohttp.web_exceptions import HTTPBadGateway
from multidict import istr
from homeassistant.components.http import KEY_AUTHENTICATED, HomeAssistantView
from homeassistant.components.http import (
KEY_AUTHENTICATED,
KEY_HASS_USER,
HomeAssistantView,
)
from homeassistant.components.onboarding import async_is_onboarded
from homeassistant.core import HomeAssistant
from .const import X_HASS_IS_ADMIN, X_HASS_USER_ID
from .const import X_HASS_SOURCE
_LOGGER = logging.getLogger(__name__)
@@ -34,23 +38,53 @@ MAX_UPLOAD_SIZE = 1024 * 1024 * 1024
# pylint: disable=implicit-str-concat
NO_TIMEOUT = re.compile(
r"^(?:"
r"|homeassistant/update"
r"|hassos/update"
r"|hassos/update/cli"
r"|supervisor/update"
r"|addons/[^/]+/(?:update|install|rebuild)"
r"|backups/.+/full"
r"|backups/.+/partial"
r"|backups/[^/]+/(?:upload|download)"
r")$"
)
NO_AUTH_ONBOARDING = re.compile(r"^(?:" r"|supervisor/logs" r"|backups/[^/]+/.+" r")$")
# fmt: off
# Onboarding can upload backups and restore it
PATHS_NOT_ONBOARDED = re.compile(
r"^(?:"
r"|backups/[a-f0-9]{8}(/info|/new/upload|/download|/restore/full|/restore/partial)?"
r"|backups/new/upload"
r")$"
)
NO_AUTH = re.compile(r"^(?:" r"|app/.*" r"|[store\/]*addons/[^/]+/(logo|icon)" r")$")
# Authenticated users manage backups + download logs, changelog and documentation
PATHS_ADMIN = re.compile(
r"^(?:"
r"|backups/[a-f0-9]{8}(/info|/download|/restore/full|/restore/partial)?"
r"|backups/new/upload"
r"|audio/logs"
r"|cli/logs"
r"|core/logs"
r"|dns/logs"
r"|host/logs"
r"|multicast/logs"
r"|observer/logs"
r"|supervisor/logs"
r"|addons/[^/]+/(changelog|documentation|logs)"
r")$"
)
NO_STORE = re.compile(r"^(?:" r"|app/entrypoint.js" r")$")
# Unauthenticated requests come in for Supervisor panel + add-on images
PATHS_NO_AUTH = re.compile(
r"^(?:"
r"|app/.*"
r"|(store/)?addons/[^/]+/(logo|icon)"
r")$"
)
NO_STORE = re.compile(
r"^(?:"
r"|app/entrypoint.js"
r")$"
)
# pylint: enable=implicit-str-concat
# fmt: on
class HassIOView(HomeAssistantView):
@@ -65,38 +99,66 @@ class HassIOView(HomeAssistantView):
self._host = host
self._websession = websession
async def _handle(
self, request: web.Request, path: str
) -> web.Response | web.StreamResponse:
"""Route data to Hass.io."""
hass = request.app["hass"]
if _need_auth(hass, path) and not request[KEY_AUTHENTICATED]:
return web.Response(status=HTTPStatus.UNAUTHORIZED)
return await self._command_proxy(path, request)
delete = _handle
get = _handle
post = _handle
async def _command_proxy(
self, path: str, request: web.Request
) -> web.StreamResponse:
async def _handle(self, request: web.Request, path: str) -> web.StreamResponse:
"""Return a client request with proxy origin for Hass.io supervisor.
This method is a coroutine.
Use cases:
- Onboarding allows restoring backups
- Load Supervisor panel and add-on logo unauthenticated
- User upload/restore backups
"""
headers = _init_header(request)
if path == "backups/new/upload":
# We need to reuse the full content type that includes the boundary
headers[
CONTENT_TYPE
] = request._stored_content_type # pylint: disable=protected-access
# No bullshit
if path != unquote(path):
return web.Response(status=HTTPStatus.BAD_REQUEST)
hass: HomeAssistant = request.app["hass"]
is_admin = request[KEY_AUTHENTICATED] and request[KEY_HASS_USER].is_admin
authorized = is_admin
if is_admin:
allowed_paths = PATHS_ADMIN
elif not async_is_onboarded(hass):
allowed_paths = PATHS_NOT_ONBOARDED
# During onboarding we need the user to manage backups
authorized = True
else:
# Either unauthenticated or not an admin
allowed_paths = PATHS_NO_AUTH
no_auth_path = PATHS_NO_AUTH.match(path)
headers = {
X_HASS_SOURCE: "core.http",
}
if no_auth_path:
if request.method != "GET":
return web.Response(status=HTTPStatus.METHOD_NOT_ALLOWED)
else:
if not allowed_paths.match(path):
return web.Response(status=HTTPStatus.UNAUTHORIZED)
if authorized:
headers[
AUTHORIZATION
] = f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}"
if request.method == "POST":
headers[CONTENT_TYPE] = request.content_type
# _stored_content_type is only computed once `content_type` is accessed
if path == "backups/new/upload":
# We need to reuse the full content type that includes the boundary
headers[
CONTENT_TYPE
] = request._stored_content_type # pylint: disable=protected-access
try:
client = await self._websession.request(
method=request.method,
url=f"http://{self._host}/{path}",
url=f"http://{self._host}/{quote(path)}",
params=request.query,
data=request.content,
headers=headers,
@@ -123,20 +185,8 @@ class HassIOView(HomeAssistantView):
raise HTTPBadGateway()
def _init_header(request: web.Request) -> dict[istr, str]:
"""Create initial header."""
headers = {
AUTHORIZATION: f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}",
CONTENT_TYPE: request.content_type,
}
# Add user data
if request.get("hass_user") is not None:
headers[istr(X_HASS_USER_ID)] = request["hass_user"].id
headers[istr(X_HASS_IS_ADMIN)] = str(int(request["hass_user"].is_admin))
return headers
get = _handle
post = _handle
def _response_header(response: aiohttp.ClientResponse, path: str) -> dict[str, str]:
@@ -164,12 +214,3 @@ def _get_timeout(path: str) -> ClientTimeout:
if NO_TIMEOUT.match(path):
return ClientTimeout(connect=10, total=None)
return ClientTimeout(connect=10, total=300)
def _need_auth(hass: HomeAssistant, path: str) -> bool:
"""Return if a path need authentication."""
if not async_is_onboarded(hass) and NO_AUTH_ONBOARDING.match(path):
return False
if NO_AUTH.match(path):
return False
return True

View File

@@ -3,20 +3,22 @@ from __future__ import annotations
import asyncio
from collections.abc import Iterable
from functools import lru_cache
from ipaddress import ip_address
import logging
import os
from urllib.parse import quote
import aiohttp
from aiohttp import ClientTimeout, hdrs, web
from aiohttp.web_exceptions import HTTPBadGateway, HTTPBadRequest
from multidict import CIMultiDict
from yarl import URL
from homeassistant.components.http import HomeAssistantView
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import X_AUTH_TOKEN, X_INGRESS_PATH
from .const import X_HASS_SOURCE, X_INGRESS_PATH
_LOGGER = logging.getLogger(__name__)
@@ -42,9 +44,19 @@ class HassIOIngress(HomeAssistantView):
self._host = host
self._websession = websession
@lru_cache
def _create_url(self, token: str, path: str) -> str:
"""Create URL to service."""
return f"http://{self._host}/ingress/{token}/{path}"
base_path = f"/ingress/{token}/"
url = f"http://{self._host}{base_path}{quote(path)}"
try:
if not URL(url).path.startswith(base_path):
raise HTTPBadRequest()
except ValueError as err:
raise HTTPBadRequest() from err
return url
async def _handle(
self, request: web.Request, token: str, path: str
@@ -185,10 +197,8 @@ def _init_header(request: web.Request, token: str) -> CIMultiDict | dict[str, st
continue
headers[name] = value
# Inject token / cleanup later on Supervisor
headers[X_AUTH_TOKEN] = os.environ.get("SUPERVISOR_TOKEN", "")
# Ingress information
headers[X_HASS_SOURCE] = "core.ingress"
headers[X_INGRESS_PATH] = f"/api/hassio_ingress/{token}"
# Set X-Forwarded-For
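The _create_url change above quotes the ingress path and rejects URLs whose normalized path escapes the /ingress/<token>/ prefix. A hedged sketch of that guard, assuming (as the hunk does) that yarl's URL parsing normalizes dot segments:

```python
from urllib.parse import quote

from yarl import URL

def build_ingress_url(host: str, token: str, path: str) -> str:
    base_path = f"/ingress/{token}/"
    url = f"http://{host}{base_path}{quote(path)}"
    # If normalization moves the path outside the ingress prefix (e.g. via "../"),
    # treat the request as invalid instead of proxying it.
    if not URL(url).path.startswith(base_path):
        raise ValueError("path escapes the ingress prefix")
    return url

print(build_ingress_url("supervisor", "abc123", "app/index.html"))
```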

View File

@@ -70,11 +70,11 @@ UNHEALTHY_REASONS = {
}
class SupervisorRepairs:
"""Create repairs from supervisor events."""
class SupervisorIssues:
"""Create issues from supervisor events."""
def __init__(self, hass: HomeAssistant, client: HassIO) -> None:
"""Initialize supervisor repairs."""
"""Initialize supervisor issues."""
self._hass = hass
self._client = client
self._unsupported_reasons: set[str] = set()
@@ -87,7 +87,7 @@ class SupervisorRepairs:
@unhealthy_reasons.setter
def unhealthy_reasons(self, reasons: set[str]) -> None:
"""Set unhealthy reasons. Create or delete repairs as necessary."""
"""Set unhealthy reasons. Create or delete issues as necessary."""
for unhealthy in reasons - self.unhealthy_reasons:
if unhealthy in UNHEALTHY_REASONS:
translation_key = f"unhealthy_{unhealthy}"
@@ -119,7 +119,7 @@ class SupervisorRepairs:
@unsupported_reasons.setter
def unsupported_reasons(self, reasons: set[str]) -> None:
"""Set unsupported reasons. Create or delete repairs as necessary."""
"""Set unsupported reasons. Create or delete issues as necessary."""
for unsupported in reasons - UNSUPPORTED_SKIP_REPAIR - self.unsupported_reasons:
if unsupported in UNSUPPORTED_REASONS:
translation_key = f"unsupported_{unsupported}"
@@ -149,18 +149,18 @@ class SupervisorRepairs:
await self.update()
async_dispatcher_connect(
self._hass, EVENT_SUPERVISOR_EVENT, self._supervisor_events_to_repairs
self._hass, EVENT_SUPERVISOR_EVENT, self._supervisor_events_to_issues
)
async def update(self) -> None:
"""Update repairs from Supervisor resolution center."""
"""Update issuess from Supervisor resolution center."""
data = await self._client.get_resolution_info()
self.unhealthy_reasons = set(data[ATTR_UNHEALTHY])
self.unsupported_reasons = set(data[ATTR_UNSUPPORTED])
@callback
def _supervisor_events_to_repairs(self, event: dict[str, Any]) -> None:
"""Create repairs from supervisor events."""
def _supervisor_events_to_issues(self, event: dict[str, Any]) -> None:
"""Create issues from supervisor events."""
if ATTR_WS_EVENT not in event:
return

View File

@@ -1,7 +1,6 @@
{
"domain": "hassio",
"name": "Home Assistant Supervisor",
"after_dependencies": ["panel_custom"],
"codeowners": ["@home-assistant/supervisor"],
"dependencies": ["http"],
"documentation": "https://www.home-assistant.io/integrations/hassio",

View File

@@ -116,6 +116,7 @@ async def websocket_supervisor_api(
method=msg[ATTR_METHOD],
timeout=msg.get(ATTR_TIMEOUT, 10),
payload=msg.get(ATTR_DATA, {}),
source="core.websocket_api",
)
if result.get(ATTR_RESULT) == "error":

View File

@@ -421,6 +421,7 @@ class HoneywellUSThermostat(ClimateEntity):
"""Get the latest state from the service."""
try:
await self._device.refresh()
self._attr_available = True
except (
aiosomecomfort.SomeComfortError,
OSError,
@@ -428,8 +429,10 @@ class HoneywellUSThermostat(ClimateEntity):
try:
await self._data.client.login()
except aiosomecomfort.SomeComfortError:
except aiosomecomfort.AuthError:
self._attr_available = False
await self.hass.async_create_task(
self.hass.config_entries.async_reload(self._data.entry_id)
)
except aiosomecomfort.SomeComfortError:
self._attr_available = False
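
The climate hunk above narrows the re-login failure handling: an `aiosomecomfort.AuthError` now triggers a config entry reload, while any other `SomeComfortError` only marks the entity unavailable. Because Python evaluates `except` clauses in order, the more specific exception must be listed first. A hedged, self-contained illustration (these exception classes are stand-ins, not the real aiosomecomfort hierarchy):

```python
class SomeComfortError(Exception):
    """Stand-in for the library's base error."""

class AuthError(SomeComfortError):
    """Stand-in for the authentication error subclass."""

def handle(err: SomeComfortError) -> str:
    try:
        raise err
    except AuthError:
        # Credentials are bad: trigger a config entry reload / reauth.
        return "reload entry"
    except SomeComfortError:
        # Transient service problem: just mark the entity unavailable.
        return "mark unavailable"

print(handle(AuthError()))         # reload entry
print(handle(SomeComfortError()))  # mark unavailable
```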

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/honeywell",
"iot_class": "cloud_polling",
"loggers": ["somecomfort"],
"requirements": ["aiosomecomfort==0.0.8"]
"requirements": ["aiosomecomfort==0.0.11"]
}

View File

@@ -7,6 +7,13 @@
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]"
}
},
"reauth_confirm": {
"title": "[%key:common::config_flow::title::reauth%]",
"description": "The Honeywell integration needs to re-authenticate your account",
"data": {
"password": "[%key:common::config_flow::data::password%]"
}
}
},
"error": {

View File

@@ -5,6 +5,7 @@ from collections.abc import Awaitable, Callable
import logging
import re
from typing import Final
from urllib.parse import unquote
from aiohttp.web import Application, HTTPBadRequest, Request, StreamResponse, middleware
@@ -39,18 +40,24 @@ FILTERS: Final = re.compile(
def setup_security_filter(app: Application) -> None:
"""Create security filter middleware for the app."""
def _recursive_unquote(value: str) -> str:
"""Handle values that are encoded multiple times."""
if (unquoted := unquote(value)) != value:
unquoted = _recursive_unquote(unquoted)
return unquoted
@middleware
async def security_filter_middleware(
request: Request, handler: Callable[[Request], Awaitable[StreamResponse]]
) -> StreamResponse:
"""Process request and tblock commonly known exploit attempts."""
if FILTERS.search(request.path):
"""Process request and block commonly known exploit attempts."""
if FILTERS.search(_recursive_unquote(request.path)):
_LOGGER.warning(
"Filtered a potential harmful request to: %s", request.raw_path
)
raise HTTPBadRequest
if FILTERS.search(request.query_string):
if FILTERS.search(_recursive_unquote(request.query_string)):
_LOGGER.warning(
"Filtered a request with a potential harmful query string: %s",
request.raw_path,
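
The security-filter hunk decodes the request path repeatedly before matching the exploit filters, so that double- or triple-encoded payloads cannot slip past a single `unquote`. A minimal standalone demonstration:

```python
from urllib.parse import unquote

def recursive_unquote(value: str) -> str:
    """Unquote until the value stops changing."""
    if (unquoted := unquote(value)) != value:
        return recursive_unquote(unquoted)
    return value

payload = "%252e%252e%252fetc%252fpasswd"   # "../etc/passwd" encoded twice
print(unquote(payload))            # %2e%2e%2fetc%2fpasswd (still encoded once)
print(recursive_unquote(payload))  # ../etc/passwd
```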

View File

@@ -35,6 +35,7 @@ TRIGGER_TYPE = {
"remote_double_button_long_press": "both {subtype} released after long press",
"remote_double_button_short_press": "both {subtype} released",
"initial_press": "{subtype} pressed initially",
"long_press": "{subtype} long press",
"repeat": "{subtype} held down",
"short_release": "{subtype} released after short press",
"long_release": "{subtype} released after long press",

View File

@@ -11,6 +11,6 @@
"iot_class": "local_push",
"loggers": ["aiohue"],
"quality_scale": "platinum",
"requirements": ["aiohue==4.6.1"],
"requirements": ["aiohue==4.6.2"],
"zeroconf": ["_hue._tcp.local."]
}

View File

@@ -118,13 +118,14 @@ class HueSceneEntityBase(HueBaseEntity, SceneEntity):
"""Return device (service) info."""
# we create a virtual service/device for Hue scenes
# so we have a parent for grouped lights and scenes
group_type = self.group.type.value.title()
return DeviceInfo(
identifiers={(DOMAIN, self.group.id)},
entry_type=DeviceEntryType.SERVICE,
name=self.group.metadata.name,
manufacturer=self.bridge.api.config.bridge_device.product_data.manufacturer_name,
model=self.group.type.value.title(),
suggested_area=self.group.metadata.name,
suggested_area=self.group.metadata.name if group_type == "Room" else None,
via_device=(DOMAIN, self.bridge.api.config.bridge_device.id),
)

View File

@@ -46,6 +46,7 @@ DEFAULT_BUTTON_EVENT_TYPES = (
ButtonEvent.INITIAL_PRESS,
ButtonEvent.REPEAT,
ButtonEvent.SHORT_RELEASE,
ButtonEvent.LONG_PRESS,
ButtonEvent.LONG_RELEASE,
)

View File

@@ -55,7 +55,13 @@ class HueBaseEntity(Entity):
self._attr_unique_id = resource.id
# device is precreated in main handler
# this attaches the entity to the precreated device
if self.device is not None:
if self.device is None:
# attach all device-less entities to the bridge itself
# e.g. config based sensors like entertainment area
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, bridge.api.config.bridge.bridge_id)},
)
else:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, self.device.id)},
)
@@ -137,17 +143,14 @@ class HueBaseEntity(Entity):
def _handle_event(self, event_type: EventType, resource: HueResource) -> None:
"""Handle status event for this resource (or it's parent)."""
if event_type == EventType.RESOURCE_DELETED:
# remove any services created for zones/rooms
# handle removal of room and zone 'virtual' devices/services
# regular devices are removed automatically by the logic in device.py.
if resource.type in (ResourceTypes.ROOM, ResourceTypes.ZONE):
dev_reg = async_get_device_registry(self.hass)
if device := dev_reg.async_get_device({(DOMAIN, resource.id)}):
dev_reg.async_remove_device(device.id)
if resource.type in (
ResourceTypes.GROUPED_LIGHT,
ResourceTypes.SCENE,
ResourceTypes.SMART_SCENE,
):
# cleanup entities that are not strictly device-bound and have the bridge as parent
if self.device is None:
ent_reg = async_get_entity_registry(self.hass)
ent_reg.async_remove(self.entity_id)
return

View File

@@ -153,6 +153,7 @@ async def async_setup_entry( # noqa: C901
system.serial,
svc_exception,
)
await system.aqualink.close()
else:
cur = system.online
if cur and not prev:

View File

@@ -3,6 +3,7 @@ from __future__ import annotations
from collections.abc import Awaitable
import httpx
from iaqualink.exception import AqualinkServiceException
from homeassistant.exceptions import HomeAssistantError
@@ -12,5 +13,5 @@ async def await_or_reraise(awaitable: Awaitable) -> None:
"""Execute API call while catching service exceptions."""
try:
await awaitable
except AqualinkServiceException as svc_exception:
except (AqualinkServiceException, httpx.HTTPError) as svc_exception:
raise HomeAssistantError(f"Aqualink error: {svc_exception}") from svc_exception

View File

@@ -17,8 +17,8 @@
"iot_class": "local_push",
"loggers": ["pyinsteon", "pypubsub"],
"requirements": [
"pyinsteon==1.3.2",
"insteon-frontend-home-assistant==0.3.2"
"pyinsteon==1.3.4",
"insteon-frontend-home-assistant==0.3.3"
],
"usb": [
{

View File

@@ -1,11 +1,13 @@
"""Utilities used by insteon component."""
import asyncio
from collections.abc import Callable
import logging
from pyinsteon import devices
from pyinsteon.address import Address
from pyinsteon.constants import ALDBStatus, DeviceAction
from pyinsteon.events import OFF_EVENT, OFF_FAST_EVENT, ON_EVENT, ON_FAST_EVENT
from pyinsteon.device_types.device_base import Device
from pyinsteon.events import OFF_EVENT, OFF_FAST_EVENT, ON_EVENT, ON_FAST_EVENT, Event
from pyinsteon.managers.link_manager import (
async_enter_linking_mode,
async_enter_unlinking_mode,
@@ -27,7 +29,7 @@ from homeassistant.const import (
CONF_PLATFORM,
ENTITY_MATCH_ALL,
)
from homeassistant.core import ServiceCall, callback
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.dispatcher import (
async_dispatcher_connect,
@@ -89,49 +91,52 @@ from .schemas import (
_LOGGER = logging.getLogger(__name__)
def add_on_off_event_device(hass, device):
def _register_event(event: Event, listener: Callable) -> None:
"""Register the events raised by a device."""
_LOGGER.debug(
"Registering on/off event for %s %d %s",
str(event.address),
event.group,
event.name,
)
event.subscribe(listener, force_strong_ref=True)
def add_on_off_event_device(hass: HomeAssistant, device: Device) -> None:
"""Register an Insteon device as an on/off event device."""
@callback
def async_fire_group_on_off_event(name, address, group, button):
def async_fire_group_on_off_event(
name: str, address: Address, group: int, button: str
):
# Firing an event when a button is pressed.
if button and button[-2] == "_":
button_id = button[-1].lower()
else:
button_id = None
schema = {CONF_ADDRESS: address}
schema = {CONF_ADDRESS: address, "group": group}
if button_id:
schema[EVENT_CONF_BUTTON] = button_id
if name == ON_EVENT:
event = EVENT_GROUP_ON
if name == OFF_EVENT:
elif name == OFF_EVENT:
event = EVENT_GROUP_OFF
if name == ON_FAST_EVENT:
elif name == ON_FAST_EVENT:
event = EVENT_GROUP_ON_FAST
if name == OFF_FAST_EVENT:
elif name == OFF_FAST_EVENT:
event = EVENT_GROUP_OFF_FAST
else:
event = f"insteon.{name}"
_LOGGER.debug("Firing event %s with %s", event, schema)
hass.bus.async_fire(event, schema)
for group in device.events:
if isinstance(group, int):
for event in device.events[group]:
if event in [
OFF_EVENT,
ON_EVENT,
OFF_FAST_EVENT,
ON_FAST_EVENT,
]:
_LOGGER.debug(
"Registering on/off event for %s %d %s",
str(device.address),
group,
event,
)
device.events[group][event].subscribe(
async_fire_group_on_off_event, force_strong_ref=True
)
for name_or_group, event in device.events.items():
if isinstance(name_or_group, int):
for _, event in device.events[name_or_group].items():
_register_event(event, async_fire_group_on_off_event)
else:
_register_event(event, async_fire_group_on_off_event)
def register_new_device_callback(hass):

View File

@@ -20,10 +20,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry_data[CONF_CLIENT_DEVICE_ID] = entry.entry_id
hass.config_entries.async_update_entry(entry, data=entry_data)
client = create_client(
device_id=entry.data[CONF_CLIENT_DEVICE_ID],
device_name=hass.config.location_name,
)
device_id = entry.data[CONF_CLIENT_DEVICE_ID]
device_name = ascii(hass.config.location_name)
client = create_client(device_id=device_id, device_name=device_name)
try:
user_id, connect_result = await validate_input(hass, dict(entry.data), client)
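
The Jellyfin hunk above runs the instance name through Python's built-in `ascii()` before using it as the client device name, so non-ASCII characters cannot break the client registration. Note that `ascii()` returns a repr-style string, with quotes and backslash escapes included in the value:

```python
name = "Küche HA"
print(ascii(name))  # 'K\xfcche HA' (the surrounding quotes are part of the result)
```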

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from abc import ABC, abstractmethod
from collections.abc import AsyncGenerator
from pathlib import Path
import shutil
from typing import Any, Final
import voluptuous as vol
@@ -549,9 +550,12 @@ class KNXCommonFlow(ABC, FlowHandler):
),
None,
)
_tunnel_identifier = selected_tunnel_ia or self.new_entry_data.get(
CONF_HOST
)
_tunnel_suffix = f" @ {_tunnel_identifier}" if _tunnel_identifier else ""
self.new_title = (
f"{'Secure ' if _if_user_id else ''}"
f"Tunneling @ {selected_tunnel_ia or self.new_entry_data[CONF_HOST]}"
f"{'Secure ' if _if_user_id else ''}Tunneling{_tunnel_suffix}"
)
return self.finish_flow()
@@ -708,7 +712,8 @@ class KNXCommonFlow(ABC, FlowHandler):
else:
dest_path = Path(self.hass.config.path(STORAGE_DIR, DOMAIN))
dest_path.mkdir(exist_ok=True)
file_path.rename(dest_path / DEFAULT_KNX_KEYRING_FILENAME)
dest_file = dest_path / DEFAULT_KNX_KEYRING_FILENAME
shutil.move(file_path, dest_file)
return keyring, errors
keyring, errors = await self.hass.async_add_executor_job(_process_upload)
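
The config-flow hunk replaces `Path.rename` with `shutil.move` when storing the uploaded keyring. `os.rename`/`Path.rename` raises `OSError` (EXDEV, "Invalid cross-device link") when source and destination live on different filesystems, while `shutil.move` falls back to copy-and-delete. A hedged sketch of the pattern (function name and arguments are illustrative):

```python
import shutil
from pathlib import Path

def store_upload(tmp_file: Path, dest_dir: Path, filename: str) -> Path:
    """Move an uploaded file into the destination directory."""
    dest_dir.mkdir(exist_ok=True)
    dest_file = dest_dir / filename
    # shutil.move works across filesystems; Path.rename would fail if tmp_file
    # sits on a different mount (e.g. a tmpfs upload directory).
    shutil.move(tmp_file, dest_file)
    return dest_file
```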

View File

@@ -9,5 +9,5 @@
"iot_class": "local_push",
"loggers": ["xknx"],
"quality_scale": "platinum",
"requirements": ["xknx==2.5.0"]
"requirements": ["xknx==2.6.0"]
}

View File

@@ -84,7 +84,7 @@ def ensure_zone(value):
if value is None:
raise vol.Invalid("zone value is None")
if str(value) not in ZONES is None:
if str(value) not in ZONES:
raise vol.Invalid("zone not valid")
return str(value)
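
The one-line fix above removes an accidental chained comparison: `str(value) not in ZONES is None` parses as `(str(value) not in ZONES) and (ZONES is None)`, which is always false when `ZONES` is defined, so invalid zones were never rejected. A small demonstration:

```python
ZONES = {"1", "2", "3"}
value = "99"

print(value not in ZONES is None)                # False: chained comparison
print((value not in ZONES) and (ZONES is None))  # False: equivalent expansion
print(value not in ZONES)                        # True: the intended check
```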

View File

@@ -140,7 +140,7 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
name="Pet weight",
native_unit_of_measurement=UnitOfMass.POUNDS,
device_class=SensorDeviceClass.WEIGHT,
state_class=SensorStateClass.TOTAL,
state_class=SensorStateClass.MEASUREMENT,
),
],
FeederRobot: [

View File

@@ -15,7 +15,9 @@ from pydantic import ValidationError
import voluptuous as vol
from homeassistant.components.calendar import (
EVENT_END,
EVENT_RRULE,
EVENT_START,
CalendarEntity,
CalendarEntityFeature,
CalendarEvent,
@@ -151,6 +153,21 @@ def _parse_event(event: dict[str, Any]) -> Event:
"""Parse an ical event from a home assistant event dictionary."""
if rrule := event.get(EVENT_RRULE):
event[EVENT_RRULE] = Recur.from_rrule(rrule)
# This function is called with new events created in the local timezone,
# however ical library does not properly return recurrence_ids for
# start dates with a timezone. For now, ensure any datetime is stored as a
# floating local time to ensure we still apply proper local timezone rules.
# This can be removed when ical is updated with a new recurrence_id format
# https://github.com/home-assistant/core/issues/87759
for key in (EVENT_START, EVENT_END):
if (
(value := event[key])
and isinstance(value, datetime)
and value.tzinfo is not None
):
event[key] = dt_util.as_local(value).replace(tzinfo=None)
try:
return Event.parse_obj(event)
except ValidationError as err:
@@ -162,8 +179,12 @@ def _get_calendar_event(event: Event) -> CalendarEvent:
"""Return a CalendarEvent from an API event."""
return CalendarEvent(
summary=event.summary,
start=event.start,
end=event.end,
start=dt_util.as_local(event.start)
if isinstance(event.start, datetime)
else event.start,
end=dt_util.as_local(event.end)
if isinstance(event.end, datetime)
else event.end,
description=event.description,
uid=event.uid,
rrule=event.rrule.as_rrule_str() if event.rrule else None,
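
The calendar hunk stores event start and end datetimes as floating local times (converted to local, then stripped of tzinfo) to work around the recurrence-id limitation referenced in issue 87759. A standalone sketch of the same conversion, using plain `zoneinfo` instead of Home Assistant's `dt_util` (the timezone here is an arbitrary example):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

LOCAL_TZ = ZoneInfo("Europe/Amsterdam")  # stand-in for the configured HA timezone

def as_floating_local(value: datetime) -> datetime:
    """Convert an aware datetime to local time and drop the tzinfo."""
    if value.tzinfo is None:
        return value  # already floating
    return value.astimezone(LOCAL_TZ).replace(tzinfo=None)

aware = datetime(2023, 3, 14, 12, 0, tzinfo=timezone.utc)
print(as_floating_local(aware))  # 2023-03-14 13:00:00 (naive, local wall clock)
```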

View File

@@ -33,6 +33,7 @@ from homeassistant.helpers.config_validation import ( # noqa: F401
)
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.service import remove_entity_service_fields
from homeassistant.helpers.typing import ConfigType, StateType
_LOGGER = logging.getLogger(__name__)
@@ -92,7 +93,7 @@ async def _async_lock(entity: LockEntity, service_call: ServiceCall) -> None:
raise ValueError(
f"Code '{code}' for locking {entity.entity_id} doesn't match pattern {entity.code_format}"
)
await entity.async_lock(**service_call.data)
await entity.async_lock(**remove_entity_service_fields(service_call))
async def _async_unlock(entity: LockEntity, service_call: ServiceCall) -> None:
@@ -102,7 +103,7 @@ async def _async_unlock(entity: LockEntity, service_call: ServiceCall) -> None:
raise ValueError(
f"Code '{code}' for unlocking {entity.entity_id} doesn't match pattern {entity.code_format}"
)
await entity.async_unlock(**service_call.data)
await entity.async_unlock(**remove_entity_service_fields(service_call))
async def _async_open(entity: LockEntity, service_call: ServiceCall) -> None:
@@ -112,7 +113,7 @@ async def _async_open(entity: LockEntity, service_call: ServiceCall) -> None:
raise ValueError(
f"Code '{code}' for opening {entity.entity_id} doesn't match pattern {entity.code_format}"
)
await entity.async_open(**service_call.data)
await entity.async_open(**remove_entity_service_fields(service_call))
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
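
The lock hunk stops forwarding the raw `service_call.data` to `async_lock`, `async_unlock` and `async_open`, and instead strips the entity-targeting fields first, so keys such as `entity_id` are not passed through as unexpected keyword arguments. A hedged, standalone illustration of the idea (a plain dict filter standing in for the actual `remove_entity_service_fields` helper):

```python
# Hypothetical stand-in for homeassistant.helpers.service.remove_entity_service_fields
TARGET_FIELDS = {"entity_id", "device_id", "area_id"}

def strip_target_fields(service_data: dict) -> dict:
    """Return the service data without entity-targeting keys."""
    return {k: v for k, v in service_data.items() if k not in TARGET_FIELDS}

call_data = {"entity_id": "lock.front_door", "code": "1234"}
print(strip_target_fields(call_data))  # {'code': '1234'}
# async_lock(**strip_target_fields(call_data)) then only receives code="1234".
```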

View File

@@ -27,7 +27,7 @@ from .adapter import MatterAdapter
from .addon import get_addon_manager
from .api import async_register_api
from .const import CONF_INTEGRATION_CREATED_ADDON, CONF_USE_ADDON, DOMAIN, LOGGER
from .device_platform import DEVICE_PLATFORM
from .discovery import SUPPORTED_PLATFORMS
from .helpers import MatterEntryData, get_matter, get_node_from_device_entry
CONNECT_TIMEOUT = 10
@@ -101,12 +101,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
matter = MatterAdapter(hass, matter_client, entry)
hass.data[DOMAIN][entry.entry_id] = MatterEntryData(matter, listen_task)
await hass.config_entries.async_forward_entry_setups(entry, DEVICE_PLATFORM)
await hass.config_entries.async_forward_entry_setups(entry, SUPPORTED_PLATFORMS)
await matter.setup_nodes()
# If the listen task is already failed, we need to raise ConfigEntryNotReady
if listen_task.done() and (listen_error := listen_task.exception()) is not None:
await hass.config_entries.async_unload_platforms(entry, DEVICE_PLATFORM)
await hass.config_entries.async_unload_platforms(entry, SUPPORTED_PLATFORMS)
hass.data[DOMAIN].pop(entry.entry_id)
try:
await matter_client.disconnect()
@@ -142,7 +142,9 @@ async def _client_listen(
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, DEVICE_PLATFORM)
unload_ok = await hass.config_entries.async_unload_platforms(
entry, SUPPORTED_PLATFORMS
)
if unload_ok:
matter_entry_data: MatterEntryData = hass.data[DOMAIN].pop(entry.entry_id)

View File

@@ -3,11 +3,6 @@ from __future__ import annotations
from typing import TYPE_CHECKING, cast
from chip.clusters import Objects as all_clusters
from matter_server.client.models.node_device import (
AbstractMatterNodeDevice,
MatterBridgedNodeDevice,
)
from matter_server.common.models import EventType, ServerInfoMessage
from homeassistant.config_entries import ConfigEntry
@@ -17,12 +12,12 @@ from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .const import DOMAIN, ID_TYPE_DEVICE_ID, ID_TYPE_SERIAL, LOGGER
from .device_platform import DEVICE_PLATFORM
from .discovery import async_discover_entities
from .helpers import get_device_id
if TYPE_CHECKING:
from matter_server.client import MatterClient
from matter_server.client.models.node import MatterNode
from matter_server.client.models.node import MatterEndpoint, MatterNode
class MatterAdapter:
@@ -51,12 +46,8 @@ class MatterAdapter:
for node in await self.matter_client.get_nodes():
self._setup_node(node)
def node_added_callback(event: EventType, node: MatterNode | None) -> None:
def node_added_callback(event: EventType, node: MatterNode) -> None:
"""Handle node added event."""
if node is None:
# We can clean this up when we've improved the typing in the library.
# https://github.com/home-assistant-libs/python-matter-server/pull/153
raise RuntimeError("Node added event without node")
self._setup_node(node)
self.config_entry.async_on_unload(
@@ -67,48 +58,32 @@ class MatterAdapter:
"""Set up an node."""
LOGGER.debug("Setting up entities for node %s", node.node_id)
bridge_unique_id: str | None = None
if (
node.aggregator_device_type_instance is not None
and node.root_device_type_instance is not None
and node.root_device_type_instance.get_cluster(
all_clusters.BasicInformation
)
):
# create virtual (parent) device for bridge node device
bridge_device = MatterBridgedNodeDevice(
node.aggregator_device_type_instance
)
self._create_device_registry(bridge_device)
server_info = cast(ServerInfoMessage, self.matter_client.server_info)
bridge_unique_id = get_device_id(server_info, bridge_device)
for node_device in node.node_devices:
self._setup_node_device(node_device, bridge_unique_id)
for endpoint in node.endpoints.values():
# Node endpoints are translated into HA devices
self._setup_endpoint(endpoint)
def _create_device_registry(
self,
node_device: AbstractMatterNodeDevice,
bridge_unique_id: str | None = None,
endpoint: MatterEndpoint,
) -> None:
"""Create a device registry entry."""
"""Create a device registry entry for a MatterNode."""
server_info = cast(ServerInfoMessage, self.matter_client.server_info)
basic_info = node_device.device_info()
device_type_instances = node_device.device_type_instances()
basic_info = endpoint.device_info
name = basic_info.nodeLabel or basic_info.productLabel or basic_info.productName
name = basic_info.nodeLabel
if not name and isinstance(node_device, MatterBridgedNodeDevice):
# fallback name for Bridge
name = "Hub device"
elif not name and device_type_instances:
# use the productName if no node label is present
name = basic_info.productName
# handle bridged devices
bridge_device_id = None
if endpoint.is_bridged_device:
bridge_device_id = get_device_id(
server_info,
endpoint.node.endpoints[0],
)
bridge_device_id = f"{ID_TYPE_DEVICE_ID}_{bridge_device_id}"
node_device_id = get_device_id(
server_info,
node_device,
endpoint,
)
identifiers = {(DOMAIN, f"{ID_TYPE_DEVICE_ID}_{node_device_id}")}
# if available, we also add the serialnumber as identifier
@@ -124,50 +99,21 @@ class MatterAdapter:
sw_version=basic_info.softwareVersionString,
manufacturer=basic_info.vendorName,
model=basic_info.productName,
via_device=(DOMAIN, bridge_unique_id) if bridge_unique_id else None,
via_device=(DOMAIN, bridge_device_id) if bridge_device_id else None,
)
def _setup_node_device(
self, node_device: AbstractMatterNodeDevice, bridge_unique_id: str | None
) -> None:
"""Set up a node device."""
self._create_device_registry(node_device, bridge_unique_id)
def _setup_endpoint(self, endpoint: MatterEndpoint) -> None:
"""Set up a MatterEndpoint as HA Device."""
# pre-create device registry entry
self._create_device_registry(endpoint)
# run platform discovery from device type instances
for instance in node_device.device_type_instances():
created = False
for platform, devices in DEVICE_PLATFORM.items():
entity_descriptions = devices.get(instance.device_type)
if entity_descriptions is None:
continue
if not isinstance(entity_descriptions, list):
entity_descriptions = [entity_descriptions]
entities = []
for entity_description in entity_descriptions:
LOGGER.debug(
"Creating %s entity for %s (%s)",
platform,
instance.device_type.__name__,
hex(instance.device_type.device_type),
)
entities.append(
entity_description.entity_cls(
self.matter_client,
node_device,
instance,
entity_description,
)
)
self.platform_handlers[platform](entities)
created = True
if not created:
LOGGER.warning(
"Found unsupported device %s (%s)",
type(instance).__name__,
hex(instance.device_type.device_type),
)
for entity_info in async_discover_entities(endpoint):
LOGGER.debug(
"Creating %s entity for %s",
entity_info.platform,
entity_info.primary_attribute,
)
new_entity = entity_info.entity_class(
self.matter_client, endpoint, entity_info
)
self.platform_handlers[entity_info.platform]([new_entity])

View File

@@ -1,11 +1,9 @@
"""Matter binary sensors."""
from __future__ import annotations
from dataclasses import dataclass
from functools import partial
from chip.clusters import Objects as clusters
from matter_server.client.models import device_types
from chip.clusters.Objects import uint
from chip.clusters.Types import Nullable, NullValue
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -17,8 +15,9 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .entity import MatterEntity, MatterEntityDescriptionBaseClass
from .entity import MatterEntity
from .helpers import get_matter
from .models import MatterDiscoverySchema
async def async_setup_entry(
@@ -34,60 +33,70 @@ async def async_setup_entry(
class MatterBinarySensor(MatterEntity, BinarySensorEntity):
"""Representation of a Matter binary sensor."""
entity_description: MatterBinarySensorEntityDescription
@callback
def _update_from_device(self) -> None:
"""Update from device."""
self._attr_is_on = self.get_matter_attribute_value(
# We always subscribe to a single value
self.entity_description.subscribe_attributes[0],
)
value: bool | uint | int | Nullable | None
value = self.get_matter_attribute_value(self._entity_info.primary_attribute)
if value in (None, NullValue):
value = None
elif value_convert := self._entity_info.measurement_to_ha:
value = value_convert(value)
self._attr_is_on = value
class MatterOccupancySensor(MatterBinarySensor):
"""Representation of a Matter occupancy sensor."""
_attr_device_class = BinarySensorDeviceClass.OCCUPANCY
@callback
def _update_from_device(self) -> None:
"""Update from device."""
value = self.get_matter_attribute_value(
# We always subscribe to a single value
self.entity_description.subscribe_attributes[0],
)
# Discovery schema(s) to map Matter Attributes to HA entities
DISCOVERY_SCHEMAS = [
# device specific: translate Hue motion sensor to HA motion sensor
# instead of generic occupancy sensor
MatterDiscoverySchema(
platform=Platform.BINARY_SENSOR,
entity_description=BinarySensorEntityDescription(
key="HueMotionSensor",
device_class=BinarySensorDeviceClass.MOTION,
name="Motion",
),
entity_class=MatterBinarySensor,
required_attributes=(clusters.OccupancySensing.Attributes.Occupancy,),
vendor_id=(4107,),
product_name=("Hue motion sensor",),
measurement_to_ha=lambda x: (x & 1 == 1) if x is not None else None,
),
MatterDiscoverySchema(
platform=Platform.BINARY_SENSOR,
entity_description=BinarySensorEntityDescription(
key="ContactSensor",
device_class=BinarySensorDeviceClass.DOOR,
name="Contact",
),
entity_class=MatterBinarySensor,
required_attributes=(clusters.BooleanState.Attributes.StateValue,),
# value is inverted on matter to what we expect
measurement_to_ha=lambda x: not x,
),
MatterDiscoverySchema(
platform=Platform.BINARY_SENSOR,
entity_description=BinarySensorEntityDescription(
key="OccupancySensor",
device_class=BinarySensorDeviceClass.OCCUPANCY,
name="Occupancy",
),
entity_class=MatterBinarySensor,
required_attributes=(clusters.OccupancySensing.Attributes.Occupancy,),
# The first bit = if occupied
self._attr_is_on = (value & 1 == 1) if value is not None else None
@dataclass
class MatterBinarySensorEntityDescription(
BinarySensorEntityDescription,
MatterEntityDescriptionBaseClass,
):
"""Matter Binary Sensor entity description."""
# You can't set default values on inherited data classes
MatterSensorEntityDescriptionFactory = partial(
MatterBinarySensorEntityDescription, entity_cls=MatterBinarySensor
)
DEVICE_ENTITY: dict[
type[device_types.DeviceType],
MatterEntityDescriptionBaseClass | list[MatterEntityDescriptionBaseClass],
] = {
device_types.ContactSensor: MatterSensorEntityDescriptionFactory(
key=device_types.ContactSensor,
name="Contact",
subscribe_attributes=(clusters.BooleanState.Attributes.StateValue,),
device_class=BinarySensorDeviceClass.DOOR,
measurement_to_ha=lambda x: (x & 1 == 1) if x is not None else None,
),
device_types.OccupancySensor: MatterSensorEntityDescriptionFactory(
key=device_types.OccupancySensor,
name="Occupancy",
entity_cls=MatterOccupancySensor,
subscribe_attributes=(clusters.OccupancySensing.Attributes.Occupancy,),
MatterDiscoverySchema(
platform=Platform.BINARY_SENSOR,
entity_description=BinarySensorEntityDescription(
key="BatteryChargeLevel",
device_class=BinarySensorDeviceClass.BATTERY,
name="Battery Status",
),
entity_class=MatterBinarySensor,
required_attributes=(clusters.PowerSource.Attributes.BatChargeLevel,),
# only add binary battery sensor if a regular percentage-based sensor is not available
absent_attributes=(clusters.PowerSource.Attributes.BatPercentRemaining,),
measurement_to_ha=lambda x: x != clusters.PowerSource.Enums.BatChargeLevel.kOk,
),
}
]
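
Several of the schemas above convert the raw Matter value with a small lambda, e.g. `lambda x: (x & 1 == 1)` for the occupancy bitmap, where bit 0 indicates occupancy. Since bitwise `&` binds tighter than `==` in Python, the expression reads as `(x & 1) == 1`:

```python
occupied_bitmap = 0b0000_0101
cleared_bitmap = 0b0000_0100

print(occupied_bitmap & 1 == 1)  # True: bit 0 set, occupancy detected
print(cleared_bitmap & 1 == 1)   # False: bit 0 clear
```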

View File

@@ -1,30 +0,0 @@
"""All mappings of Matter devices to Home Assistant platforms."""
from __future__ import annotations
from typing import TYPE_CHECKING
from homeassistant.const import Platform
from .binary_sensor import DEVICE_ENTITY as BINARY_SENSOR_DEVICE_ENTITY
from .light import DEVICE_ENTITY as LIGHT_DEVICE_ENTITY
from .sensor import DEVICE_ENTITY as SENSOR_DEVICE_ENTITY
from .switch import DEVICE_ENTITY as SWITCH_DEVICE_ENTITY
if TYPE_CHECKING:
from matter_server.client.models.device_types import DeviceType
from .entity import MatterEntityDescriptionBaseClass
DEVICE_PLATFORM: dict[
Platform,
dict[
type[DeviceType],
MatterEntityDescriptionBaseClass | list[MatterEntityDescriptionBaseClass],
],
] = {
Platform.BINARY_SENSOR: BINARY_SENSOR_DEVICE_ENTITY,
Platform.LIGHT: LIGHT_DEVICE_ENTITY,
Platform.SENSOR: SENSOR_DEVICE_ENTITY,
Platform.SWITCH: SWITCH_DEVICE_ENTITY,
}

View File

@@ -0,0 +1,115 @@
"""Map Matter Nodes and Attributes to Home Assistant entities."""
from __future__ import annotations
from collections.abc import Generator
from chip.clusters.Objects import ClusterAttributeDescriptor
from matter_server.client.models.node import MatterEndpoint
from homeassistant.const import Platform
from homeassistant.core import callback
from .binary_sensor import DISCOVERY_SCHEMAS as BINARY_SENSOR_SCHEMAS
from .light import DISCOVERY_SCHEMAS as LIGHT_SCHEMAS
from .models import MatterDiscoverySchema, MatterEntityInfo
from .sensor import DISCOVERY_SCHEMAS as SENSOR_SCHEMAS
from .switch import DISCOVERY_SCHEMAS as SWITCH_SCHEMAS
DISCOVERY_SCHEMAS: dict[Platform, list[MatterDiscoverySchema]] = {
Platform.BINARY_SENSOR: BINARY_SENSOR_SCHEMAS,
Platform.LIGHT: LIGHT_SCHEMAS,
Platform.SENSOR: SENSOR_SCHEMAS,
Platform.SWITCH: SWITCH_SCHEMAS,
}
SUPPORTED_PLATFORMS = tuple(DISCOVERY_SCHEMAS.keys())
@callback
def iter_schemas() -> Generator[MatterDiscoverySchema, None, None]:
"""Iterate over all available discovery schemas."""
for platform_schemas in DISCOVERY_SCHEMAS.values():
yield from platform_schemas
@callback
def async_discover_entities(
endpoint: MatterEndpoint,
) -> Generator[MatterEntityInfo, None, None]:
"""Run discovery on MatterEndpoint and return matching MatterEntityInfo(s)."""
discovered_attributes: set[type[ClusterAttributeDescriptor]] = set()
device_info = endpoint.device_info
for schema in iter_schemas():
# abort if attribute(s) already discovered
if any(x in schema.required_attributes for x in discovered_attributes):
continue
# check vendor_id
if (
schema.vendor_id is not None
and device_info.vendorID not in schema.vendor_id
):
continue
# check product_name
if (
schema.product_name is not None
and device_info.productName not in schema.product_name
):
continue
# check required device_type
if schema.device_type is not None and not any(
x in schema.device_type for x in endpoint.device_types
):
continue
# check absent device_type
if schema.not_device_type is not None and any(
x in schema.not_device_type for x in endpoint.device_types
):
continue
# check endpoint_id
if (
schema.endpoint_id is not None
and endpoint.endpoint_id not in schema.endpoint_id
):
continue
# check required attributes
if schema.required_attributes is not None and not all(
endpoint.has_attribute(None, val_schema)
for val_schema in schema.required_attributes
):
continue
# check for values that may not be present
if schema.absent_attributes is not None and any(
endpoint.has_attribute(None, val_schema)
for val_schema in schema.absent_attributes
):
continue
# all checks passed, this value belongs to an entity
attributes_to_watch = list(schema.required_attributes)
if schema.optional_attributes:
# check optional attributes
for optional_attribute in schema.optional_attributes:
if optional_attribute in attributes_to_watch:
continue
if endpoint.has_attribute(None, optional_attribute):
attributes_to_watch.append(optional_attribute)
yield MatterEntityInfo(
endpoint=endpoint,
platform=schema.platform,
attributes_to_watch=attributes_to_watch,
entity_description=schema.entity_description,
entity_class=schema.entity_class,
measurement_to_ha=schema.measurement_to_ha,
)
# prevent re-discovery of the same attributes
if not schema.allow_multi:
discovered_attributes.update(attributes_to_watch)

View File

@@ -3,90 +3,77 @@ from __future__ import annotations
from abc import abstractmethod
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import TYPE_CHECKING, Any, cast
from chip.clusters.Objects import ClusterAttributeDescriptor
from matter_server.client.models.device_type_instance import MatterDeviceTypeInstance
from matter_server.client.models.node_device import AbstractMatterNodeDevice
from matter_server.common.helpers.util import create_attribute_path
from matter_server.common.models import EventType, ServerInfoMessage
from homeassistant.core import callback
from homeassistant.helpers.entity import DeviceInfo, Entity, EntityDescription
from homeassistant.helpers.entity import DeviceInfo, Entity
from .const import DOMAIN, ID_TYPE_DEVICE_ID
from .helpers import get_device_id, get_operational_instance_id
from .helpers import get_device_id
if TYPE_CHECKING:
from matter_server.client import MatterClient
from matter_server.client.models.node import MatterEndpoint
from .discovery import MatterEntityInfo
LOGGER = logging.getLogger(__name__)
@dataclass
class MatterEntityDescription:
"""Mixin to map a matter device to a Home Assistant entity."""
entity_cls: type[MatterEntity]
subscribe_attributes: tuple
@dataclass
class MatterEntityDescriptionBaseClass(EntityDescription, MatterEntityDescription):
"""For typing a base class that inherits from both entity descriptions."""
class MatterEntity(Entity):
"""Entity class for Matter devices."""
entity_description: MatterEntityDescriptionBaseClass
_attr_should_poll = False
_attr_has_entity_name = True
def __init__(
self,
matter_client: MatterClient,
node_device: AbstractMatterNodeDevice,
device_type_instance: MatterDeviceTypeInstance,
entity_description: MatterEntityDescriptionBaseClass,
endpoint: MatterEndpoint,
entity_info: MatterEntityInfo,
) -> None:
"""Initialize the entity."""
self.matter_client = matter_client
self._node_device = node_device
self._device_type_instance = device_type_instance
self.entity_description = entity_description
self._endpoint = endpoint
self._entity_info = entity_info
self.entity_description = entity_info.entity_description
self._unsubscribes: list[Callable] = []
# for fast lookups we create a mapping to the attribute paths
self._attributes_map: dict[type, str] = {}
# The server info is set when the client connects to the server.
server_info = cast(ServerInfoMessage, self.matter_client.server_info)
# create unique_id based on "Operational Instance Name" and endpoint/device type
node_device_id = get_device_id(server_info, endpoint)
self._attr_unique_id = (
f"{get_operational_instance_id(server_info, self._node_device.node())}-"
f"{device_type_instance.endpoint.endpoint_id}-"
f"{device_type_instance.device_type.device_type}"
f"{node_device_id}-"
f"{endpoint.endpoint_id}-"
f"{entity_info.entity_description.key}-"
f"{entity_info.primary_attribute.cluster_id}-"
f"{entity_info.primary_attribute.attribute_id}"
)
node_device_id = get_device_id(server_info, node_device)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{ID_TYPE_DEVICE_ID}_{node_device_id}")}
)
self._attr_available = self._node_device.node().available
self._attr_available = self._endpoint.node.available
async def async_added_to_hass(self) -> None:
"""Handle being added to Home Assistant."""
await super().async_added_to_hass()
# Subscribe to attribute updates.
for attr_cls in self.entity_description.subscribe_attributes:
for attr_cls in self._entity_info.attributes_to_watch:
attr_path = self.get_matter_attribute_path(attr_cls)
self._attributes_map[attr_cls] = attr_path
self._unsubscribes.append(
self.matter_client.subscribe(
callback=self._on_matter_event,
event_filter=EventType.ATTRIBUTE_UPDATED,
node_filter=self._device_type_instance.node.node_id,
node_filter=self._endpoint.node.node_id,
attr_path_filter=attr_path,
)
)
@@ -95,7 +82,7 @@ class MatterEntity(Entity):
self.matter_client.subscribe(
callback=self._on_matter_event,
event_filter=EventType.NODE_UPDATED,
node_filter=self._device_type_instance.node.node_id,
node_filter=self._endpoint.node.node_id,
)
)
@@ -110,7 +97,7 @@ class MatterEntity(Entity):
@callback
def _on_matter_event(self, event: EventType, data: Any = None) -> None:
"""Call on update."""
self._attr_available = self._device_type_instance.node.available
self._attr_available = self._endpoint.node.available
self._update_from_device()
self.async_write_ha_state()
@@ -124,14 +111,13 @@ class MatterEntity(Entity):
self, attribute: type[ClusterAttributeDescriptor]
) -> Any:
"""Get current value for given attribute."""
return self._device_type_instance.get_attribute_value(None, attribute)
return self._endpoint.get_attribute_value(None, attribute)
@callback
def get_matter_attribute_path(
self, attribute: type[ClusterAttributeDescriptor]
) -> str:
"""Return AttributePath by providing the endpoint and Attribute class."""
endpoint = self._device_type_instance.endpoint.endpoint_id
return create_attribute_path(
endpoint, attribute.cluster_id, attribute.attribute_id
self._endpoint.endpoint_id, attribute.cluster_id, attribute.attribute_id
)

View File

@@ -11,8 +11,7 @@ from homeassistant.helpers import device_registry as dr
from .const import DOMAIN, ID_TYPE_DEVICE_ID
if TYPE_CHECKING:
from matter_server.client.models.node import MatterNode
from matter_server.client.models.node_device import AbstractMatterNodeDevice
from matter_server.client.models.node import MatterEndpoint, MatterNode
from matter_server.common.models import ServerInfoMessage
from .adapter import MatterAdapter
@@ -50,15 +49,21 @@ def get_operational_instance_id(
def get_device_id(
server_info: ServerInfoMessage,
node_device: AbstractMatterNodeDevice,
endpoint: MatterEndpoint,
) -> str:
"""Return HA device_id for the given MatterNodeDevice."""
operational_instance_id = get_operational_instance_id(
server_info, node_device.node()
)
# Append nodedevice(type) to differentiate between a root node
# and bridge within Home Assistant devices.
return f"{operational_instance_id}-{node_device.__class__.__name__}"
"""Return HA device_id for the given MatterEndpoint."""
operational_instance_id = get_operational_instance_id(server_info, endpoint.node)
# Append endpoint ID if this endpoint is a bridged or composed device
if endpoint.is_composed_device:
compose_parent = endpoint.node.get_compose_parent(endpoint.endpoint_id)
assert compose_parent is not None
postfix = str(compose_parent.endpoint_id)
elif endpoint.is_bridged_device:
postfix = str(endpoint.endpoint_id)
else:
# this should be compatible with previous versions
postfix = "MatterNodeDevice"
return f"{operational_instance_id}-{postfix}"
async def get_node_from_device_entry(
@@ -91,8 +96,8 @@ async def get_node_from_device_entry(
(
node
for node in await matter_client.get_nodes()
for node_device in node.node_devices
if get_device_id(server_info, node_device) == device_id
for endpoint in node.endpoints.values()
if get_device_id(server_info, endpoint) == device_id
),
None,
)

View File

@@ -1,9 +1,7 @@
"""Matter light."""
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum
from functools import partial
from enum import IntFlag
from typing import Any
from chip.clusters import Objects as clusters
@@ -24,8 +22,9 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .const import LOGGER
from .entity import MatterEntity, MatterEntityDescriptionBaseClass
from .entity import MatterEntity
from .helpers import get_matter
from .models import MatterDiscoverySchema
from .util import (
convert_to_hass_hs,
convert_to_hass_xy,
@@ -34,32 +33,13 @@ from .util import (
renormalize,
)
class MatterColorMode(Enum):
"""Matter color mode."""
HS = 0
XY = 1
COLOR_TEMP = 2
COLOR_MODE_MAP = {
MatterColorMode.HS: ColorMode.HS,
MatterColorMode.XY: ColorMode.XY,
MatterColorMode.COLOR_TEMP: ColorMode.COLOR_TEMP,
clusters.ColorControl.Enums.ColorMode.kCurrentHueAndCurrentSaturation: ColorMode.HS,
clusters.ColorControl.Enums.ColorMode.kCurrentXAndCurrentY: ColorMode.XY,
clusters.ColorControl.Enums.ColorMode.kColorTemperature: ColorMode.COLOR_TEMP,
}
class MatterColorControlFeatures(Enum):
"""Matter color control features."""
HS = 0 # Hue and saturation (Optional if device is color capable)
EHUE = 1 # Enhanced hue and saturation (Optional if device is color capable)
COLOR_LOOP = 2 # Color loop (Optional if device is color capable)
XY = 3 # XY (Mandatory if device is color capable)
COLOR_TEMP = 4 # Color temperature (Mandatory if device is color capable)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
@@ -73,63 +53,37 @@ async def async_setup_entry(
class MatterLight(MatterEntity, LightEntity):
"""Representation of a Matter light."""
entity_description: MatterLightEntityDescription
def _supports_feature(
self, feature_map: int, feature: MatterColorControlFeatures
) -> bool:
"""Return if device supports given feature."""
return (feature_map & (1 << feature.value)) != 0
def _supports_color_mode(self, color_feature: MatterColorControlFeatures) -> bool:
"""Return if device supports given color mode."""
feature_map = self.get_matter_attribute_value(
clusters.ColorControl.Attributes.FeatureMap,
)
assert isinstance(feature_map, int)
return self._supports_feature(feature_map, color_feature)
def _supports_hs_color(self) -> bool:
"""Return if device supports hs color."""
return self._supports_color_mode(MatterColorControlFeatures.HS)
def _supports_xy_color(self) -> bool:
"""Return if device supports xy color."""
return self._supports_color_mode(MatterColorControlFeatures.XY)
def _supports_color_temperature(self) -> bool:
"""Return if device supports color temperature."""
return self._supports_color_mode(MatterColorControlFeatures.COLOR_TEMP)
def _supports_brightness(self) -> bool:
"""Return if device supports brightness."""
entity_description: LightEntityDescription
@property
def supports_color(self) -> bool:
"""Return if the device supports color control."""
if not self._attr_supported_color_modes:
return False
return (
clusters.LevelControl.Attributes.CurrentLevel
in self.entity_description.subscribe_attributes
ColorMode.HS in self._attr_supported_color_modes
or ColorMode.XY in self._attr_supported_color_modes
)
def _supports_color(self) -> bool:
"""Return if device supports color."""
@property
def supports_color_temperature(self) -> bool:
"""Return if the device supports color temperature control."""
if not self._attr_supported_color_modes:
return False
return ColorMode.COLOR_TEMP in self._attr_supported_color_modes
return (
clusters.ColorControl.Attributes.ColorMode
in self.entity_description.subscribe_attributes
)
@property
def supports_brightness(self) -> bool:
"""Return if the device supports bridghtness control."""
if not self._attr_supported_color_modes:
return False
return ColorMode.BRIGHTNESS in self._attr_supported_color_modes
async def _set_xy_color(self, xy_color: tuple[float, float]) -> None:
"""Set xy color."""
matter_xy = convert_to_matter_xy(xy_color)
LOGGER.debug("Setting xy color to %s", matter_xy)
await self.send_device_command(
clusters.ColorControl.Commands.MoveToColor(
colorX=int(matter_xy[0]),
@@ -144,7 +98,6 @@ class MatterLight(MatterEntity, LightEntity):
matter_hs = convert_to_matter_hs(hs_color)
LOGGER.debug("Setting hs color to %s", matter_hs)
await self.send_device_command(
clusters.ColorControl.Commands.MoveToHueAndSaturation(
hue=int(matter_hs[0]),
@@ -157,7 +110,6 @@ class MatterLight(MatterEntity, LightEntity):
async def _set_color_temp(self, color_temp: int) -> None:
"""Set color temperature."""
LOGGER.debug("Setting color temperature to %s", color_temp)
await self.send_device_command(
clusters.ColorControl.Commands.MoveToColorTemperature(
colorTemperature=color_temp,
@@ -169,8 +121,7 @@ class MatterLight(MatterEntity, LightEntity):
async def _set_brightness(self, brightness: int) -> None:
"""Set brightness."""
LOGGER.debug("Setting brightness to %s", brightness)
level_control = self._device_type_instance.get_cluster(clusters.LevelControl)
level_control = self._endpoint.get_cluster(clusters.LevelControl)
assert level_control is not None
@@ -207,7 +158,7 @@ class MatterLight(MatterEntity, LightEntity):
LOGGER.debug(
"Got xy color %s for %s",
xy_color,
self._device_type_instance,
self.entity_id,
)
return xy_color
@@ -231,7 +182,7 @@ class MatterLight(MatterEntity, LightEntity):
LOGGER.debug(
"Got hs color %s for %s",
hs_color,
self._device_type_instance,
self.entity_id,
)
return hs_color
@@ -248,7 +199,7 @@ class MatterLight(MatterEntity, LightEntity):
LOGGER.debug(
"Got color temperature %s for %s",
color_temp,
self._device_type_instance,
self.entity_id,
)
return int(color_temp)
@@ -256,7 +207,7 @@ class MatterLight(MatterEntity, LightEntity):
def _get_brightness(self) -> int:
"""Get brightness from matter."""
level_control = self._device_type_instance.get_cluster(clusters.LevelControl)
level_control = self._endpoint.get_cluster(clusters.LevelControl)
# We should not get here if brightness is not supported.
assert level_control is not None
@@ -264,7 +215,7 @@ class MatterLight(MatterEntity, LightEntity):
LOGGER.debug( # type: ignore[unreachable]
"Got brightness %s for %s",
level_control.currentLevel,
self._device_type_instance,
self.entity_id,
)
return round(
@@ -284,10 +235,12 @@ class MatterLight(MatterEntity, LightEntity):
assert color_mode is not None
ha_color_mode = COLOR_MODE_MAP[MatterColorMode(color_mode)]
ha_color_mode = COLOR_MODE_MAP[color_mode]
LOGGER.debug(
"Got color mode (%s) for %s", ha_color_mode, self._device_type_instance
"Got color mode (%s) for %s",
ha_color_mode,
self.entity_id,
)
return ha_color_mode
@@ -295,8 +248,8 @@ class MatterLight(MatterEntity, LightEntity):
async def send_device_command(self, command: Any) -> None:
"""Send device command."""
await self.matter_client.send_device_command(
node_id=self._device_type_instance.node.node_id,
endpoint_id=self._device_type_instance.endpoint_id,
node_id=self._endpoint.node.node_id,
endpoint_id=self._endpoint.endpoint_id,
command=command,
)
@@ -308,15 +261,18 @@ class MatterLight(MatterEntity, LightEntity):
color_temp = kwargs.get(ATTR_COLOR_TEMP)
brightness = kwargs.get(ATTR_BRIGHTNESS)
if self._supports_color():
if hs_color is not None and self._supports_hs_color():
if self.supported_color_modes is not None:
if hs_color is not None and ColorMode.HS in self.supported_color_modes:
await self._set_hs_color(hs_color)
elif xy_color is not None and self._supports_xy_color():
elif xy_color is not None and ColorMode.XY in self.supported_color_modes:
await self._set_xy_color(xy_color)
elif color_temp is not None and self._supports_color_temperature():
elif (
color_temp is not None
and ColorMode.COLOR_TEMP in self.supported_color_modes
):
await self._set_color_temp(color_temp)
if brightness is not None and self._supports_brightness():
if brightness is not None and self.supports_brightness:
await self._set_brightness(brightness)
return
@@ -333,107 +289,81 @@ class MatterLight(MatterEntity, LightEntity):
@callback
def _update_from_device(self) -> None:
"""Update from device."""
supports_color = self._supports_color()
supports_color_temperature = (
self._supports_color_temperature() if supports_color else False
)
supports_brightness = self._supports_brightness()
if self._attr_supported_color_modes is None:
supported_color_modes = set()
if supports_color:
supported_color_modes.add(ColorMode.XY)
if self._supports_hs_color():
# work out what (color)features are supported
supported_color_modes: set[ColorMode] = set()
# brightness support
if self._entity_info.endpoint.has_attribute(
None, clusters.LevelControl.Attributes.CurrentLevel
):
supported_color_modes.add(ColorMode.BRIGHTNESS)
# colormode(s)
if self._entity_info.endpoint.has_attribute(
None, clusters.ColorControl.Attributes.ColorMode
):
capabilities = self.get_matter_attribute_value(
clusters.ColorControl.Attributes.ColorCapabilities
)
assert capabilities is not None
if capabilities & ColorCapabilities.kHueSaturationSupported:
supported_color_modes.add(ColorMode.HS)
if supports_color_temperature:
supported_color_modes.add(ColorMode.COLOR_TEMP)
if capabilities & ColorCapabilities.kXYAttributesSupported:
supported_color_modes.add(ColorMode.XY)
if supports_brightness:
supported_color_modes.add(ColorMode.BRIGHTNESS)
if capabilities & ColorCapabilities.kColorTemperatureSupported:
supported_color_modes.add(ColorMode.COLOR_TEMP)
self._attr_supported_color_modes = (
supported_color_modes if supported_color_modes else None
self._attr_supported_color_modes = supported_color_modes
LOGGER.debug(
"Supported color modes: %s for %s",
self._attr_supported_color_modes,
self.entity_id,
)
LOGGER.debug(
"Supported color modes: %s for %s",
self._attr_supported_color_modes,
self._device_type_instance,
)
# set current values
if supports_color:
if self.supports_color:
self._attr_color_mode = self._get_color_mode()
if self._attr_color_mode == ColorMode.HS:
self._attr_hs_color = self._get_hs_color()
else:
self._attr_xy_color = self._get_xy_color()
if supports_color_temperature:
if self.supports_color_temperature:
self._attr_color_temp = self._get_color_temperature()
self._attr_is_on = self.get_matter_attribute_value(
clusters.OnOff.Attributes.OnOff
)
if supports_brightness:
if self.supports_brightness:
self._attr_brightness = self._get_brightness()
@dataclass
class MatterLightEntityDescription(
LightEntityDescription,
MatterEntityDescriptionBaseClass,
):
"""Matter light entity description."""
# This enum should be removed once the ColorControlCapabilities enum is added to the CHIP (Matter) library
# clusters.ColorControl.Bitmap.ColorCapabilities
class ColorCapabilities(IntFlag):
"""Color control capabilities bitmap."""
kHueSaturationSupported = 0x1
kEnhancedHueSupported = 0x2
kColorLoopSupported = 0x4
kXYAttributesSupported = 0x8
kColorTemperatureSupported = 0x10
# You can't set default values on inherited data classes
MatterLightEntityDescriptionFactory = partial(
MatterLightEntityDescription, entity_cls=MatterLight
)
# Mapping of a Matter Device type to Light Entity Description.
# A Matter device type (instance) can consist of multiple attributes.
# For example a Color Light which has an attribute to control brightness
# but also for color.
DEVICE_ENTITY: dict[
type[device_types.DeviceType],
MatterEntityDescriptionBaseClass | list[MatterEntityDescriptionBaseClass],
] = {
device_types.OnOffLight: MatterLightEntityDescriptionFactory(
key=device_types.OnOffLight,
subscribe_attributes=(clusters.OnOff.Attributes.OnOff,),
),
device_types.DimmableLight: MatterLightEntityDescriptionFactory(
key=device_types.DimmableLight,
subscribe_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
),
),
device_types.DimmablePlugInUnit: MatterLightEntityDescriptionFactory(
key=device_types.DimmablePlugInUnit,
subscribe_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
),
),
device_types.ColorTemperatureLight: MatterLightEntityDescriptionFactory(
key=device_types.ColorTemperatureLight,
subscribe_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.ColorMode,
clusters.ColorControl.Attributes.ColorTemperatureMireds,
),
),
device_types.ExtendedColorLight: MatterLightEntityDescriptionFactory(
key=device_types.ExtendedColorLight,
subscribe_attributes=(
clusters.OnOff.Attributes.OnOff,
# Discovery schema(s) to map Matter Attributes to HA entities
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(key="MatterLight"),
entity_class=MatterLight,
required_attributes=(clusters.OnOff.Attributes.OnOff,),
optional_attributes=(
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.ColorMode,
clusters.ColorControl.Attributes.CurrentHue,
@@ -442,5 +372,7 @@ DEVICE_ENTITY: dict[
clusters.ColorControl.Attributes.CurrentY,
clusters.ColorControl.Attributes.ColorTemperatureMireds,
),
# restrict device type to prevent discovery in switch platform
not_device_type=(device_types.OnOffPlugInUnit,),
),
}
]
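
The `ColorCapabilities` bitmap introduced above is checked with plain bitwise AND to decide which `ColorMode`s a light supports. A standalone illustration reusing the same flag values from the hunk:

```python
from enum import IntFlag

class ColorCapabilities(IntFlag):
    """Subset of the Matter ColorControl capabilities bitmap."""
    kHueSaturationSupported = 0x1
    kEnhancedHueSupported = 0x2
    kColorLoopSupported = 0x4
    kXYAttributesSupported = 0x8
    kColorTemperatureSupported = 0x10

# A bulb reporting XY and color temperature support:
capabilities = 0x18
print(bool(capabilities & ColorCapabilities.kXYAttributesSupported))      # True
print(bool(capabilities & ColorCapabilities.kHueSaturationSupported))     # False
print(bool(capabilities & ColorCapabilities.kColorTemperatureSupported))  # True
```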

View File

@@ -6,5 +6,5 @@
"dependencies": ["websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/matter",
"iot_class": "local_push",
"requirements": ["python-matter-server==3.0.0"]
"requirements": ["python-matter-server==3.1.0"]
}

View File

@@ -0,0 +1,109 @@
"""Models used for the Matter integration."""
from collections.abc import Callable
from dataclasses import asdict, dataclass
from typing import Any
from chip.clusters import Objects as clusters
from chip.clusters.Objects import ClusterAttributeDescriptor
from matter_server.client.models.device_types import DeviceType
from matter_server.client.models.node import MatterEndpoint
from homeassistant.const import Platform
from homeassistant.helpers.entity import EntityDescription
class DataclassMustHaveAtLeastOne:
"""A dataclass that must have at least one input parameter that is not None."""
def __post_init__(self) -> None:
"""Post dataclass initialization."""
if all(val is None for val in asdict(self).values()):
raise ValueError("At least one input parameter must not be None")
SensorValueTypes = type[
clusters.uint | int | clusters.Nullable | clusters.float32 | float
]
@dataclass
class MatterEntityInfo:
"""Info discovered from (primary) Matter Attribute to create entity."""
# MatterEndpoint to which the value(s) belongs
endpoint: MatterEndpoint
# the home assistant platform for which an entity should be created
platform: Platform
# All attributes that need to be watched by entity (incl. primary)
attributes_to_watch: list[type[ClusterAttributeDescriptor]]
# the entity description to use
entity_description: EntityDescription
# entity class to use to instantiate the entity
entity_class: type
# [optional] function to call to convert the value from the primary attribute
measurement_to_ha: Callable[[SensorValueTypes], SensorValueTypes] | None = None
@property
def primary_attribute(self) -> type[ClusterAttributeDescriptor]:
"""Return Primary Attribute belonging to the entity."""
return self.attributes_to_watch[0]
@dataclass
class MatterDiscoverySchema:
"""Matter discovery schema.
The Matter endpoint and its (primary) Attribute for an entity must match these conditions.
"""
# specify the hass platform for which this scheme applies (e.g. light, sensor)
platform: Platform
# platform-specific entity description
entity_description: EntityDescription
# entity class to use to instantiate the entity
entity_class: type
# DISCOVERY OPTIONS
# [required] attributes that ALL need to be present
# on the node for this scheme to pass (minimal one == primary)
required_attributes: tuple[type[ClusterAttributeDescriptor], ...]
# [optional] the value's endpoint must contain this devicetype(s)
device_type: tuple[type[DeviceType] | DeviceType, ...] | None = None
# [optional] the value's endpoint must NOT contain this devicetype(s)
not_device_type: tuple[type[DeviceType] | DeviceType, ...] | None = None
# [optional] the endpoint's vendor_id must match ANY of these values
vendor_id: tuple[int, ...] | None = None
# [optional] the endpoint's product_name must match ANY of these values
product_name: tuple[str, ...] | None = None
# [optional] the attribute's endpoint_id must match ANY of these values
endpoint_id: tuple[int, ...] | None = None
# [optional] additional attributes that MAY NOT be present
# on the node for this scheme to pass
absent_attributes: tuple[type[ClusterAttributeDescriptor], ...] | None = None
# [optional] additional attributes that may be present
# these attributes are copied over to attributes_to_watch and
# are not discovered by other entities
optional_attributes: tuple[type[ClusterAttributeDescriptor], ...] | None = None
# [optional] bool to specify if this primary value may be discovered
# by multiple platforms
allow_multi: bool = False
# [optional] function to call to convert the value from the primary attribute
measurement_to_ha: Callable[[Any], Any] | None = None
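
These schema fields drive entity discovery. A minimal sketch of how a schema could be evaluated against an endpoint (assumed logic only; the real matching code lives elsewhere in the integration and is not part of this diff):

def schema_matches(schema, present_attributes: set, endpoint_device_types: set) -> bool:
    """Sketch: return True if an endpoint satisfies a MatterDiscoverySchema."""
    if schema.device_type and not endpoint_device_types & set(schema.device_type):
        return False
    if schema.not_device_type and endpoint_device_types & set(schema.not_device_type):
        return False
    if not set(schema.required_attributes) <= present_attributes:
        return False
    if schema.absent_attributes and present_attributes & set(schema.absent_attributes):
        return False
    return True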

View File

@@ -1,13 +1,8 @@
"""Matter sensors."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from functools import partial
from chip.clusters import Objects as clusters
from chip.clusters.Types import Nullable, NullValue
from matter_server.client.models import device_types
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -27,8 +22,9 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .entity import MatterEntity, MatterEntityDescriptionBaseClass
from .entity import MatterEntity
from .helpers import get_matter
from .models import MatterDiscoverySchema
async def async_setup_entry(
@@ -45,94 +41,94 @@ class MatterSensor(MatterEntity, SensorEntity):
"""Representation of a Matter sensor."""
_attr_state_class = SensorStateClass.MEASUREMENT
entity_description: MatterSensorEntityDescription
@callback
def _update_from_device(self) -> None:
"""Update from device."""
measurement: Nullable | float | None
measurement = self.get_matter_attribute_value(
# We always subscribe to a single value
self.entity_description.subscribe_attributes[0],
)
if measurement == NullValue or measurement is None:
measurement = None
else:
measurement = self.entity_description.measurement_to_ha(measurement)
self._attr_native_value = measurement
value: Nullable | float | None
value = self.get_matter_attribute_value(self._entity_info.primary_attribute)
if value in (None, NullValue):
value = None
elif value_convert := self._entity_info.measurement_to_ha:
value = value_convert(value)
self._attr_native_value = value
@dataclass
class MatterSensorEntityDescriptionMixin:
"""Required fields for sensor device mapping."""
measurement_to_ha: Callable[[float], float]
@dataclass
class MatterSensorEntityDescription(
SensorEntityDescription,
MatterEntityDescriptionBaseClass,
MatterSensorEntityDescriptionMixin,
):
"""Matter Sensor entity description."""
# You can't set default values on inherited data classes
MatterSensorEntityDescriptionFactory = partial(
MatterSensorEntityDescription, entity_cls=MatterSensor
)
DEVICE_ENTITY: dict[
type[device_types.DeviceType],
MatterEntityDescriptionBaseClass | list[MatterEntityDescriptionBaseClass],
] = {
device_types.TemperatureSensor: MatterSensorEntityDescriptionFactory(
key=device_types.TemperatureSensor,
name="Temperature",
measurement_to_ha=lambda x: x / 100,
subscribe_attributes=(
clusters.TemperatureMeasurement.Attributes.MeasuredValue,
# Discovery schema(s) to map Matter Attributes to HA entities
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="TemperatureSensor",
name="Temperature",
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
),
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
),
device_types.PressureSensor: MatterSensorEntityDescriptionFactory(
key=device_types.PressureSensor,
name="Pressure",
measurement_to_ha=lambda x: x / 10,
subscribe_attributes=(clusters.PressureMeasurement.Attributes.MeasuredValue,),
native_unit_of_measurement=UnitOfPressure.KPA,
device_class=SensorDeviceClass.PRESSURE,
),
device_types.FlowSensor: MatterSensorEntityDescriptionFactory(
key=device_types.FlowSensor,
name="Flow",
measurement_to_ha=lambda x: x / 10,
subscribe_attributes=(clusters.FlowMeasurement.Attributes.MeasuredValue,),
native_unit_of_measurement=UnitOfVolumeFlowRate.CUBIC_METERS_PER_HOUR,
),
device_types.HumiditySensor: MatterSensorEntityDescriptionFactory(
key=device_types.HumiditySensor,
name="Humidity",
entity_class=MatterSensor,
required_attributes=(clusters.TemperatureMeasurement.Attributes.MeasuredValue,),
measurement_to_ha=lambda x: x / 100,
subscribe_attributes=(
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="PressureSensor",
name="Pressure",
native_unit_of_measurement=UnitOfPressure.KPA,
device_class=SensorDeviceClass.PRESSURE,
),
entity_class=MatterSensor,
required_attributes=(clusters.PressureMeasurement.Attributes.MeasuredValue,),
measurement_to_ha=lambda x: x / 10,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="FlowSensor",
name="Flow",
native_unit_of_measurement=UnitOfVolumeFlowRate.CUBIC_METERS_PER_HOUR,
device_class=SensorDeviceClass.WATER,  # what is the device class here?
),
entity_class=MatterSensor,
required_attributes=(clusters.FlowMeasurement.Attributes.MeasuredValue,),
measurement_to_ha=lambda x: x / 10,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="HumiditySensor",
name="Humidity",
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.HUMIDITY,
),
entity_class=MatterSensor,
required_attributes=(
clusters.RelativeHumidityMeasurement.Attributes.MeasuredValue,
),
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.HUMIDITY,
measurement_to_ha=lambda x: x / 100,
),
device_types.LightSensor: MatterSensorEntityDescriptionFactory(
key=device_types.LightSensor,
name="Light",
measurement_to_ha=lambda x: round(pow(10, ((x - 1) / 10000)), 1),
subscribe_attributes=(
clusters.IlluminanceMeasurement.Attributes.MeasuredValue,
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="LightSensor",
name="Illuminance",
native_unit_of_measurement=LIGHT_LUX,
device_class=SensorDeviceClass.ILLUMINANCE,
),
native_unit_of_measurement=LIGHT_LUX,
device_class=SensorDeviceClass.ILLUMINANCE,
entity_class=MatterSensor,
required_attributes=(clusters.IlluminanceMeasurement.Attributes.MeasuredValue,),
measurement_to_ha=lambda x: round(pow(10, ((x - 1) / 10000)), 1),
),
}
MatterDiscoverySchema(
platform=Platform.SENSOR,
entity_description=SensorEntityDescription(
key="PowerSource",
name="Battery",
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.BATTERY,
),
entity_class=MatterSensor,
required_attributes=(clusters.PowerSource.Attributes.BatPercentRemaining,),
# value is reported in half-percent units, hence the division by 2
measurement_to_ha=lambda x: int(x / 2),
),
]
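
For reference, worked examples of the two non-trivial measurement_to_ha conversions above (input values are illustrative):

raw_illuminance = 20001                                    # MeasuredValue = 10000 * log10(lux) + 1
print(round(pow(10, (raw_illuminance - 1) / 10000), 1))    # 100.0 lux

raw_battery = 197                                          # BatPercentRemaining in half-percent steps
print(int(raw_battery / 2))                                # 98 (%)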

View File

@@ -1,8 +1,6 @@
"""Matter switches."""
from __future__ import annotations
from dataclasses import dataclass
from functools import partial
from typing import Any
from chip.clusters import Objects as clusters
@@ -18,8 +16,9 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .entity import MatterEntity, MatterEntityDescriptionBaseClass
from .entity import MatterEntity
from .helpers import get_matter
from .models import MatterDiscoverySchema
async def async_setup_entry(
@@ -35,21 +34,19 @@ async def async_setup_entry(
class MatterSwitch(MatterEntity, SwitchEntity):
"""Representation of a Matter switch."""
entity_description: MatterSwitchEntityDescription
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn switch on."""
await self.matter_client.send_device_command(
node_id=self._device_type_instance.node.node_id,
endpoint_id=self._device_type_instance.endpoint_id,
node_id=self._endpoint.node.node_id,
endpoint_id=self._endpoint.endpoint_id,
command=clusters.OnOff.Commands.On(),
)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn switch off."""
await self.matter_client.send_device_command(
node_id=self._device_type_instance.node.node_id,
endpoint_id=self._device_type_instance.endpoint_id,
node_id=self._endpoint.node.node_id,
endpoint_id=self._endpoint.endpoint_id,
command=clusters.OnOff.Commands.Off(),
)
@@ -57,31 +54,21 @@ class MatterSwitch(MatterEntity, SwitchEntity):
def _update_from_device(self) -> None:
"""Update from device."""
self._attr_is_on = self.get_matter_attribute_value(
clusters.OnOff.Attributes.OnOff
self._entity_info.primary_attribute
)
@dataclass
class MatterSwitchEntityDescription(
SwitchEntityDescription,
MatterEntityDescriptionBaseClass,
):
"""Matter Switch entity description."""
# You can't set default values on inherited data classes
MatterSwitchEntityDescriptionFactory = partial(
MatterSwitchEntityDescription, entity_cls=MatterSwitch
)
DEVICE_ENTITY: dict[
type[device_types.DeviceType],
MatterEntityDescriptionBaseClass | list[MatterEntityDescriptionBaseClass],
] = {
device_types.OnOffPlugInUnit: MatterSwitchEntityDescriptionFactory(
key=device_types.OnOffPlugInUnit,
subscribe_attributes=(clusters.OnOff.Attributes.OnOff,),
device_class=SwitchDeviceClass.OUTLET,
# Discovery schema(s) to map Matter Attributes to HA entities
DISCOVERY_SCHEMAS = [
MatterDiscoverySchema(
platform=Platform.SWITCH,
entity_description=SwitchEntityDescription(
key="MatterPlug", device_class=SwitchDeviceClass.OUTLET
),
entity_class=MatterSwitch,
required_attributes=(clusters.OnOff.Attributes.OnOff,),
# restrict device type to prevent discovery by light
# platform which also uses OnOff cluster
not_device_type=(device_types.OnOffLight, device_types.DimmableLight),
),
}
]

View File

@@ -7,5 +7,5 @@
"iot_class": "cloud_polling",
"loggers": ["pymazda"],
"quality_scale": "platinum",
"requirements": ["pymazda==0.3.7"]
"requirements": ["pymazda==0.3.8"]
}

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine
from contextlib import suppress
from functools import wraps
from functools import lru_cache, wraps
from http import HTTPStatus
import logging
import secrets
@@ -365,6 +365,12 @@ async def webhook_stream_camera(
return webhook_response(resp, registration=config_entry.data)
@lru_cache
def _cached_template(template_str: str, hass: HomeAssistant) -> template.Template:
"""Return a cached template."""
return template.Template(template_str, hass)
@WEBHOOK_COMMANDS.register("render_template")
@validate_schema(
{
@@ -381,7 +387,7 @@ async def webhook_render_template(
resp = {}
for key, item in data.items():
try:
tpl = template.Template(item[ATTR_TEMPLATE], hass)
tpl = _cached_template(item[ATTR_TEMPLATE], hass)
resp[key] = tpl.async_render(item.get(ATTR_TEMPLATE_VARIABLES))
except TemplateError as ex:
resp[key] = {"error": str(ex)}
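
The change above memoizes template compilation per (template string, hass) pair. A single-argument standalone illustration of the same functools.lru_cache pattern, using the builtin compile in place of Home Assistant's Template class:

from functools import lru_cache

@lru_cache
def _compile(expr: str):
    print(f"compiling {expr!r}")         # runs once per distinct expression string
    return compile(expr, "<webhook>", "eval")

_compile("1 + 2")                        # compiles
_compile("1 + 2")                        # cache hit, reuses the parsed object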

View File

@@ -21,5 +21,5 @@
"documentation": "https://www.home-assistant.io/integrations/mopeka",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["mopeka_iot_ble==0.4.0"]
"requirements": ["mopeka_iot_ble==0.4.1"]
}

View File

@@ -3,7 +3,7 @@ from __future__ import annotations
from typing import Any
from motionblinds import MotionDiscovery
from motionblinds import MotionDiscovery, MotionGateway
import voluptuous as vol
from homeassistant import config_entries
@@ -86,6 +86,16 @@ class MotionBlindsFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(mac_address)
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.ip})
gateway = MotionGateway(ip=discovery_info.ip, key="abcd1234-56ef-78")
try:
# key not needed for GetDeviceList request
await self.hass.async_add_executor_job(gateway.GetDeviceList)
except Exception: # pylint: disable=broad-except
return self.async_abort(reason="not_motionblinds")
if not gateway.available:
return self.async_abort(reason="not_motionblinds")
short_mac = mac_address[-6:].upper()
self.context["title_placeholders"] = {
"short_mac": short_mac,

View File

@@ -28,7 +28,8 @@
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"connection_error": "[%key:common::config_flow::error::cannot_connect%]"
"connection_error": "[%key:common::config_flow::error::cannot_connect%]",
"not_motionblinds": "Discovered device is not a Motion gateway"
}
},
"options": {

View File

@@ -495,8 +495,12 @@ class MqttLight(MqttEntity, LightEntity, RestoreEntity):
self._attr_color_mode = color_mode
if self._topic[CONF_BRIGHTNESS_STATE_TOPIC] is None:
rgb = convert_color(*color)
percent_bright = float(color_util.color_RGB_to_hsv(*rgb)[2]) / 100.0
self._attr_brightness = min(round(percent_bright * 255), 255)
brightness = max(rgb)
self._attr_brightness = brightness
# Normalize the color to 100% brightness
color = tuple(
min(round(channel / brightness * 255), 255) for channel in color
)
return color
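
A worked example of the normalization above, assuming the incoming color is already RGB: a dim orange reported by the device is split into a brightness value and a full-brightness color (values illustrative):

received = (64, 32, 0)                                    # RGB as reported by the device
brightness = max(received)                                # 64 -> _attr_brightness
normalized = tuple(min(round(c / brightness * 255), 255) for c in received)
print(brightness, normalized)                             # 64 (255, 128, 0)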
@callback

View File

@@ -281,7 +281,7 @@ class MqttSensor(MqttEntity, RestoreSensor):
else:
self._attr_native_value = new_value
return
if self.device_class is None:
if self.device_class in {None, SensorDeviceClass.ENUM}:
self._attr_native_value = new_value
return
if (payload_datetime := dt_util.parse_datetime(new_value)) is None:
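
A minimal sketch of the new branch: an enum (or class-less) sensor keeps the raw payload as-is instead of going through the datetime parsing applied to other device classes.

from homeassistant.components.sensor import SensorDeviceClass

def native_value(payload, device_class):
    # enum and class-less sensors pass the payload through unchanged
    if device_class in {None, SensorDeviceClass.ENUM}:
        return payload
    ...  # otherwise continue with timestamp/date parsing as before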

View File

@@ -8,11 +8,11 @@ from datetime import timedelta
from functools import cached_property
from typing import Any, Generic, TypeVar
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from nibe.connection import Connection
from nibe.connection.modbus import Modbus
from nibe.connection.nibegw import NibeGW, ProductInfo
from nibe.exceptions import CoilNotFoundException, CoilReadException
from nibe.exceptions import CoilNotFoundException, ReadException
from nibe.heatpump import HeatPump, Model, Series
from homeassistant.config_entries import ConfigEntry
@@ -182,7 +182,7 @@ class ContextCoordinator(
return release_update
class Coordinator(ContextCoordinator[dict[int, Coil], int]):
class Coordinator(ContextCoordinator[dict[int, CoilData], int]):
"""Update coordinator for nibe heat pumps."""
config_entry: ConfigEntry
@@ -199,17 +199,18 @@ class Coordinator(ContextCoordinator[dict[int, Coil], int]):
)
self.data = {}
self.seed: dict[int, Coil] = {}
self.seed: dict[int, CoilData] = {}
self.connection = connection
self.heatpump = heatpump
self.task: asyncio.Task | None = None
heatpump.subscribe(heatpump.COIL_UPDATE_EVENT, self._on_coil_update)
def _on_coil_update(self, coil: Coil):
def _on_coil_update(self, data: CoilData):
"""Handle callback on coil updates."""
self.data[coil.address] = coil
self.seed[coil.address] = coil
coil = data.coil
self.data[coil.address] = data
self.seed[coil.address] = data
self.async_update_context_listeners([coil.address])
@property
@@ -246,26 +247,26 @@ class Coordinator(ContextCoordinator[dict[int, Coil], int]):
async def async_write_coil(self, coil: Coil, value: int | float | str) -> None:
"""Write coil and update state."""
coil.value = value
coil = await self.connection.write_coil(coil)
data = CoilData(coil, value)
await self.connection.write_coil(data)
self.data[coil.address] = coil
self.data[coil.address] = data
self.async_update_context_listeners([coil.address])
async def async_read_coil(self, coil: Coil) -> Coil:
async def async_read_coil(self, coil: Coil) -> CoilData:
"""Read coil and update state using callbacks."""
return await self.connection.read_coil(coil)
async def _async_update_data(self) -> dict[int, Coil]:
async def _async_update_data(self) -> dict[int, CoilData]:
self.task = asyncio.current_task()
try:
return await self._async_update_data_internal()
finally:
self.task = None
async def _async_update_data_internal(self) -> dict[int, Coil]:
result: dict[int, Coil] = {}
async def _async_update_data_internal(self) -> dict[int, CoilData]:
result: dict[int, CoilData] = {}
def _get_coils() -> Iterable[Coil]:
for address in sorted(self.context_callbacks.keys()):
@@ -282,10 +283,10 @@ class Coordinator(ContextCoordinator[dict[int, Coil], int]):
yield coil
try:
async for coil in self.connection.read_coils(_get_coils()):
result[coil.address] = coil
self.seed.pop(coil.address, None)
except CoilReadException as exception:
async for data in self.connection.read_coils(_get_coils()):
result[data.coil.address] = data
self.seed.pop(data.coil.address, None)
except ReadException as exception:
if not result:
raise UpdateFailed(f"Failed to update: {exception}") from exception
self.logger.debug(
@@ -329,7 +330,7 @@ class CoilEntity(CoordinatorEntity[Coordinator]):
self.coordinator.data or {}
)
def _async_read_coil(self, coil: Coil):
def _async_read_coil(self, data: CoilData):
"""Update state of entity based on coil data."""
async def _async_write_coil(self, value: int | float | str):
@@ -337,10 +338,9 @@ class CoilEntity(CoordinatorEntity[Coordinator]):
await self.coordinator.async_write_coil(self._coil, value)
def _handle_coordinator_update(self) -> None:
coil = self.coordinator.data.get(self._coil.address)
if coil is None:
data = self.coordinator.data.get(self._coil.address)
if data is None:
return
self._coil = coil
self._async_read_coil(coil)
self._async_read_coil(data)
self.async_write_ha_state()
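
The nibe 2.0 API carries values in CoilData objects rather than mutating Coil. A minimal sketch of the read/write pattern this diff assumes:

from nibe.coil import CoilData

async def write_and_cache(coordinator, coil, value):
    data = CoilData(coil, value)                  # value travels alongside the coil definition
    await coordinator.connection.write_coil(data)
    coordinator.data[coil.address] = data         # cache keyed by coil address, as before
    coordinator.async_update_context_listeners([coil.address])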

View File

@@ -1,7 +1,7 @@
"""The Nibe Heat Pump binary sensors."""
from __future__ import annotations
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from homeassistant.components.binary_sensor import ENTITY_ID_FORMAT, BinarySensorEntity
from homeassistant.config_entries import ConfigEntry
@@ -37,5 +37,5 @@ class BinarySensor(CoilEntity, BinarySensorEntity):
"""Initialize entity."""
super().__init__(coordinator, coil, ENTITY_ID_FORMAT)
def _async_read_coil(self, coil: Coil) -> None:
self._attr_is_on = coil.value == "ON"
def _async_read_coil(self, data: CoilData) -> None:
self._attr_is_on = data.value == "ON"

View File

@@ -139,7 +139,7 @@ class NibeClimateEntity(CoordinatorEntity[Coordinator], ClimateEntity):
mode = HVACMode.OFF
if _get_value(self._coil_use_room_sensor) == "ON":
if _get_value(self._coil_cooling_with_room_sensor) == "ON":
if _get_value(self._coil_cooling_with_room_sensor) != "OFF":
mode = HVACMode.HEAT_COOL
else:
mode = HVACMode.HEAT

View File

@@ -8,10 +8,10 @@ from nibe.connection.nibegw import NibeGW
from nibe.exceptions import (
AddressInUseException,
CoilNotFoundException,
CoilReadException,
CoilReadSendException,
CoilWriteException,
CoilWriteSendException,
ReadException,
ReadSendException,
WriteException,
)
from nibe.heatpump import HeatPump, Model
import voluptuous as vol
@@ -108,13 +108,13 @@ async def validate_nibegw_input(
try:
await connection.verify_connectivity()
except (CoilReadSendException, CoilWriteSendException) as exception:
except (ReadSendException, CoilWriteSendException) as exception:
raise FieldError(str(exception), CONF_IP_ADDRESS, "address") from exception
except CoilNotFoundException as exception:
raise FieldError("Coils not found", "base", "model") from exception
except CoilReadException as exception:
except ReadException as exception:
raise FieldError("Timeout on read from pump", "base", "read") from exception
except CoilWriteException as exception:
except WriteException as exception:
raise FieldError("Timeout on writing to pump", "base", "write") from exception
finally:
await connection.stop()
@@ -147,13 +147,13 @@ async def validate_modbus_input(
try:
await connection.verify_connectivity()
except (CoilReadSendException, CoilWriteSendException) as exception:
except (ReadSendException, CoilWriteSendException) as exception:
raise FieldError(str(exception), CONF_MODBUS_URL, "address") from exception
except CoilNotFoundException as exception:
raise FieldError("Coils not found", "base", "model") from exception
except CoilReadException as exception:
except ReadException as exception:
raise FieldError("Timeout on read from pump", "base", "read") from exception
except CoilWriteException as exception:
except WriteException as exception:
raise FieldError("Timeout on writing to pump", "base", "write") from exception
finally:
await connection.stop()

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/nibe_heatpump",
"iot_class": "local_polling",
"requirements": ["nibe==1.6.0"]
"requirements": ["nibe==2.0.0"]
}

View File

@@ -1,7 +1,7 @@
"""The Nibe Heat Pump numbers."""
from __future__ import annotations
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from homeassistant.components.number import ENTITY_ID_FORMAT, NumberEntity
from homeassistant.config_entries import ConfigEntry
@@ -58,13 +58,13 @@ class Number(CoilEntity, NumberEntity):
self._attr_native_unit_of_measurement = coil.unit
self._attr_native_value = None
def _async_read_coil(self, coil: Coil) -> None:
if coil.value is None:
def _async_read_coil(self, data: CoilData) -> None:
if data.value is None:
self._attr_native_value = None
return
try:
self._attr_native_value = float(coil.value)
self._attr_native_value = float(data.value)
except ValueError:
self._attr_native_value = None

View File

@@ -1,7 +1,7 @@
"""The Nibe Heat Pump select."""
from __future__ import annotations
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from homeassistant.components.select import ENTITY_ID_FORMAT, SelectEntity
from homeassistant.config_entries import ConfigEntry
@@ -40,12 +40,12 @@ class Select(CoilEntity, SelectEntity):
self._attr_options = list(coil.mappings.values())
self._attr_current_option = None
def _async_read_coil(self, coil: Coil) -> None:
if not isinstance(coil.value, str):
def _async_read_coil(self, data: CoilData) -> None:
if not isinstance(data.value, str):
self._attr_current_option = None
return
self._attr_current_option = coil.value
self._attr_current_option = data.value
async def async_select_option(self, option: str) -> None:
"""Support writing value."""

View File

@@ -1,7 +1,7 @@
"""The Nibe Heat Pump sensors."""
from __future__ import annotations
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from homeassistant.components.sensor import (
ENTITY_ID_FORMAT,
@@ -146,5 +146,5 @@ class Sensor(CoilEntity, SensorEntity):
self._attr_native_unit_of_measurement = coil.unit
self._attr_entity_category = EntityCategory.DIAGNOSTIC
def _async_read_coil(self, coil: Coil):
self._attr_native_value = coil.value
def _async_read_coil(self, data: CoilData):
self._attr_native_value = data.value

View File

@@ -3,7 +3,7 @@ from __future__ import annotations
from typing import Any
from nibe.coil import Coil
from nibe.coil import Coil, CoilData
from homeassistant.components.switch import ENTITY_ID_FORMAT, SwitchEntity
from homeassistant.config_entries import ConfigEntry
@@ -40,8 +40,8 @@ class Switch(CoilEntity, SwitchEntity):
super().__init__(coordinator, coil, ENTITY_ID_FORMAT)
self._attr_is_on = None
def _async_read_coil(self, coil: Coil) -> None:
self._attr_is_on = coil.value == "ON"
def _async_read_coil(self, data: CoilData) -> None:
self._attr_is_on = data.value == "ON"
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the entity on."""

View File

@@ -12,5 +12,5 @@
"documentation": "https://www.home-assistant.io/integrations/nuheat",
"iot_class": "cloud_polling",
"loggers": ["nuheat"],
"requirements": ["nuheat==1.0.0"]
"requirements": ["nuheat==1.0.1"]
}

View File

@@ -1,17 +1,22 @@
"""The Open Thread Border Router integration."""
from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine
import dataclasses
from functools import wraps
from typing import Any, Concatenate, ParamSpec, TypeVar
import aiohttp
import python_otbr_api
from python_otbr_api import tlv_parser
from python_otbr_api.pskc import compute_pskc
from homeassistant.components.thread import async_add_dataset
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady, HomeAssistantError
from homeassistant.helpers import issue_registry as ir
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.typing import ConfigType
@@ -21,6 +26,18 @@ from .const import DOMAIN
_R = TypeVar("_R")
_P = ParamSpec("_P")
INSECURE_NETWORK_KEYS = (
# Thread web UI default
bytes.fromhex("00112233445566778899AABBCCDDEEFF"),
)
INSECURE_PASSPHRASES = (
# Thread web UI default
"j01Nme",
# Thread documentation default
"J01NME",
)
def _handle_otbr_error(
func: Callable[Concatenate[OTBRData, _P], Coroutine[Any, Any, _R]]
@@ -44,11 +61,23 @@ class OTBRData:
url: str
api: python_otbr_api.OTBR
@_handle_otbr_error
async def set_enabled(self, enabled: bool) -> None:
"""Enable or disable the router."""
return await self.api.set_enabled(enabled)
@_handle_otbr_error
async def get_active_dataset_tlvs(self) -> bytes | None:
"""Get current active operational dataset in TLVS format, or None."""
return await self.api.get_active_dataset_tlvs()
@_handle_otbr_error
async def create_active_dataset(
self, dataset: python_otbr_api.OperationalDataSet
) -> None:
"""Create an active operational dataset."""
return await self.api.create_active_dataset(dataset)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Open Thread Border Router component."""
@@ -56,17 +85,65 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return True
def _warn_on_default_network_settings(
hass: HomeAssistant, entry: ConfigEntry, dataset_tlvs: bytes
) -> None:
"""Warn user if insecure default network settings are used."""
dataset = tlv_parser.parse_tlv(dataset_tlvs.hex())
insecure = False
if (
network_key := dataset.get(tlv_parser.MeshcopTLVType.NETWORKKEY)
) is not None and bytes.fromhex(network_key) in INSECURE_NETWORK_KEYS:
insecure = True
if (
not insecure
and tlv_parser.MeshcopTLVType.EXTPANID in dataset
and tlv_parser.MeshcopTLVType.NETWORKNAME in dataset
and tlv_parser.MeshcopTLVType.PSKC in dataset
):
ext_pan_id = dataset[tlv_parser.MeshcopTLVType.EXTPANID]
network_name = dataset[tlv_parser.MeshcopTLVType.NETWORKNAME]
pskc = bytes.fromhex(dataset[tlv_parser.MeshcopTLVType.PSKC])
for passphrase in INSECURE_PASSPHRASES:
if pskc == compute_pskc(ext_pan_id, network_name, passphrase):
insecure = True
break
if insecure:
ir.async_create_issue(
hass,
DOMAIN,
f"insecure_thread_network_{entry.entry_id}",
is_fixable=False,
is_persistent=False,
severity=ir.IssueSeverity.WARNING,
translation_key="insecure_thread_network",
)
else:
ir.async_delete_issue(
hass,
DOMAIN,
f"insecure_thread_network_{entry.entry_id}",
)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up an Open Thread Border Router config entry."""
api = python_otbr_api.OTBR(entry.data["url"], async_get_clientsession(hass), 10)
otbrdata = OTBRData(entry.data["url"], api)
try:
dataset = await otbrdata.get_active_dataset_tlvs()
except HomeAssistantError as err:
raise ConfigEntryNotReady from err
if dataset:
await async_add_dataset(hass, entry.title, dataset.hex())
dataset_tlvs = await otbrdata.get_active_dataset_tlvs()
except (
HomeAssistantError,
aiohttp.ClientError,
asyncio.TimeoutError,
) as err:
raise ConfigEntryNotReady("Unable to connect") from err
if dataset_tlvs:
_warn_on_default_network_settings(hass, entry, dataset_tlvs)
await async_add_dataset(hass, entry.title, dataset_tlvs.hex())
hass.data[DOMAIN] = otbrdata
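
The passphrase check works by recomputing the PSKC from the dataset's extended PAN ID and network name. A sketch with hypothetical values, where a network commissioned with the web-UI default passphrase would match:

from python_otbr_api.pskc import compute_pskc

ext_pan_id = "1111111122222222"        # hypothetical extended PAN ID (hex, from the dataset TLVs)
network_name = "OpenThreadDemo"        # hypothetical network name
stored_pskc = compute_pskc(ext_pan_id, network_name, "j01Nme")   # simulate default credentials

for passphrase in ("j01Nme", "J01NME"):
    if stored_pskc == compute_pskc(ext_pan_id, network_name, passphrase):
        print("insecure default passphrase detected")
        break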

View File

@@ -1,9 +1,12 @@
"""Config flow for the Open Thread Border Router integration."""
from __future__ import annotations
import asyncio
import logging
import aiohttp
import python_otbr_api
from python_otbr_api import tlv_parser
import voluptuous as vol
from homeassistant.components.hassio import HassioServiceInfo
@@ -13,7 +16,7 @@ from homeassistant.const import CONF_URL
from homeassistant.data_entry_flow import FlowResult
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN
from .const import DEFAULT_CHANNEL, DOMAIN
_LOGGER = logging.getLogger(__name__)
@@ -27,11 +30,26 @@ class OTBRConfigFlow(ConfigFlow, domain=DOMAIN):
"""Connect to the OTBR and create a dataset if it doesn't have one."""
api = python_otbr_api.OTBR(url, async_get_clientsession(self.hass), 10)
if await api.get_active_dataset_tlvs() is None:
if dataset := await async_get_preferred_dataset(self.hass):
await api.set_active_dataset_tlvs(bytes.fromhex(dataset))
# We currently have no way to know which channel zha is using, assume it's
# the default
zha_channel = DEFAULT_CHANNEL
thread_dataset_channel = None
thread_dataset_tlv = await async_get_preferred_dataset(self.hass)
if thread_dataset_tlv:
dataset = tlv_parser.parse_tlv(thread_dataset_tlv)
if channel_str := dataset.get(tlv_parser.MeshcopTLVType.CHANNEL):
thread_dataset_channel = int(channel_str, base=16)
if thread_dataset_tlv is not None and zha_channel == thread_dataset_channel:
await api.set_active_dataset_tlvs(bytes.fromhex(thread_dataset_tlv))
else:
_LOGGER.debug(
"not importing TLV with channel %s", thread_dataset_channel
)
await api.create_active_dataset(
python_otbr_api.OperationalDataSet(network_name="home-assistant")
python_otbr_api.OperationalDataSet(
channel=zha_channel, network_name="home-assistant"
)
)
await api.set_enabled(True)
@@ -48,7 +66,11 @@ class OTBRConfigFlow(ConfigFlow, domain=DOMAIN):
url = user_input[CONF_URL]
try:
await self._connect_and_create_dataset(url)
except python_otbr_api.OTBRError:
except (
python_otbr_api.OTBRError,
aiohttp.ClientError,
asyncio.TimeoutError,
):
errors["base"] = "cannot_connect"
else:
await self.async_set_unique_id(DOMAIN)
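
The channel comparison relies on the TLV value being a hex string. A worked example with a hypothetical TLV value:

channel_str = "000f"                   # CHANNEL value from the preferred dataset, hex encoded
print(int(channel_str, base=16))       # 15 -> equals DEFAULT_CHANNEL, so the dataset is imported
# any other channel means a fresh dataset is created on channel 15 instead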

View File

@@ -1,3 +1,5 @@
"""Constants for the Open Thread Border Router integration."""
DOMAIN = "otbr"
DEFAULT_CHANNEL = 15

View File

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/otbr",
"integration_type": "service",
"iot_class": "local_polling",
"requirements": ["python-otbr-api==1.0.3"]
"requirements": ["python-otbr-api==1.0.5"]
}

View File

@@ -12,7 +12,13 @@
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]"
"single_instance_allowed": "[%key:common::config_flow::abort::single_instance_allowed%]"
}
},
"issues": {
"insecure_thread_network": {
"title": "Insecure Thread network settings detected",
"description": "Your Thread network is using a default network key or pass phrase.\n\nThis is a security risk, please create a new Thread network."
}
}
}

View File

@@ -1,6 +1,8 @@
"""Websocket API for OTBR."""
from typing import TYPE_CHECKING
import python_otbr_api
from homeassistant.components.websocket_api import (
ActiveConnection,
async_register_command,
@@ -10,7 +12,7 @@ from homeassistant.components.websocket_api import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from .const import DOMAIN
from .const import DEFAULT_CHANNEL, DOMAIN
if TYPE_CHECKING:
from . import OTBRData
@@ -20,6 +22,7 @@ if TYPE_CHECKING:
def async_setup(hass: HomeAssistant) -> None:
"""Set up the OTBR Websocket API."""
async_register_command(hass, websocket_info)
async_register_command(hass, websocket_create_network)
@websocket_command(
@@ -51,3 +54,48 @@ async def websocket_info(
"active_dataset_tlvs": dataset.hex() if dataset else None,
},
)
@websocket_command(
{
"type": "otbr/create_network",
}
)
@async_response
async def websocket_create_network(
hass: HomeAssistant, connection: ActiveConnection, msg: dict
) -> None:
"""Create a new Thread network."""
if DOMAIN not in hass.data:
connection.send_error(msg["id"], "not_loaded", "No OTBR API loaded")
return
# We currently have no way to know which channel zha is using, assume it's
# the default
zha_channel = DEFAULT_CHANNEL
data: OTBRData = hass.data[DOMAIN]
try:
await data.set_enabled(False)
except HomeAssistantError as exc:
connection.send_error(msg["id"], "set_enabled_failed", str(exc))
return
try:
await data.create_active_dataset(
python_otbr_api.OperationalDataSet(
channel=zha_channel, network_name="home-assistant"
)
)
except HomeAssistantError as exc:
connection.send_error(msg["id"], "create_active_dataset_failed", str(exc))
return
try:
await data.set_enabled(True)
except HomeAssistantError as exc:
connection.send_error(msg["id"], "set_enabled_failed", str(exc))
return
connection.send_result(msg["id"])
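
A client exercises the new command with a plain websocket message; a sketch of the round trip (the message id is arbitrary):

request = {"id": 42, "type": "otbr/create_network"}
# success reply: {"id": 42, "type": "result", "success": True, "result": None}
# failures use the error codes above: "not_loaded", "set_enabled_failed",
# "create_active_dataset_failed"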

View File

@@ -17,6 +17,7 @@ from homeassistant.const import (
UnitOfPower,
UnitOfPressure,
UnitOfTemperature,
UnitOfTime,
UnitOfVolume,
)
from homeassistant.core import HomeAssistant
@@ -303,9 +304,9 @@ SENSORS: tuple[SensorEntityDescription, ...] = (
SensorEntityDescription(
key="gas_consumed_interval",
name="Gas consumed interval",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
device_class=SensorDeviceClass.GAS,
state_class=SensorStateClass.TOTAL,
icon="mdi:meter-gas",
native_unit_of_measurement=f"{UnitOfVolume.CUBIC_METERS}/{UnitOfTime.HOURS}",
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key="gas_consumed_cumulative",

View File

@@ -7,5 +7,5 @@
"iot_class": "cloud_polling",
"loggers": ["aiopvpc"],
"quality_scale": "platinum",
"requirements": ["aiopvpc==4.0.1"]
"requirements": ["aiopvpc==4.1.0"]
}

View File

@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/qnap_qsw",
"iot_class": "local_polling",
"loggers": ["aioqsw"],
"requirements": ["aioqsw==0.3.1"]
"requirements": ["aioqsw==0.3.2"]
}

Some files were not shown because too many files have changed in this diff.