* LaCrosse View new endpoint (#137284)

* Switch to new endpoint in LaCrosse View

* Coverage

* Avoid merge conflict

* Switch to UpdateFailed

* Convert coinbase account amounts as floats to properly add them together (#137588)

Convert coinbase account amounts as floats to properly add them together

* Bump ohmepy to 1.2.9 (#137695)

* Bump onedrive_personal_sdk to 0.0.9 (#137729)

* Limit habitica ConfigEntrySelect to integration domain (#137767)

* Limit nordpool ConfigEntrySelect to integration domain (#137768)

* Limit transmission ConfigEntrySelect to integration domain (#137769)

* Fix tplink child updates taking up to 60s (#137782)

* Fix tplink child updates taking up to 60s

fixes #137562

* Revert "Fix tplink child updates taking up to 60s"

This reverts commit 5cd20a120f772b8df96ec32890b071b22135895e.

* Call backup listener during setup in Google Drive (#137789)

* Use the external URL set in Settings > System > Network if my is disabled as redirect URL for Google Drive instructions (#137791)

* Use the external URL set in Settings > System > Network if my is disabled

* fix

* Remove async_get_redirect_uri

* Fix manufacturer_id matching for 0 (#137802)

fix manufacturer_id matching for 0

* Fix DAB radio in Onkyo (#137852)

* Fix LG webOS TV fails to setup when device is off (#137870)

* Fix heos migration (#137887)

* Fix heos migration

* Fix for loop

* Bump pydrawise to 2025.2.0 (#137961)

* Bump aioshelly to version 12.4.2 (#137986)

* Prevent crash if telegram message failed and did not generate an ID (#137989)

Fix #137901 - Regression introduced in 6fdccda2256f92c824a98712ef102b4a77140126

* Bump habiticalib to v0.3.7 (#137993)

* bump habiticalib to 0.3.6

* bump to v0.3.7

* Refresh the nest authentication token on integration start before invoking the pub/sub subscriber (#138003)

* Refresh the nest authentication token on integration start before invoking the pub/sub subscriber

* Apply suggestions from code review

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>

* Use resumable uploads in Google Drive (#138010)

* Use resumable uploads in Google Drive

* tests

* Bump py-synologydsm-api to 2.6.2 (#138060)

bump py-synologydsm-api to 2.6.2

* Handle generic agent exceptions when getting and deleting backups (#138145)

* Handle generic agent exceptions when getting backups

* Update hassio test

* Update delete_backup

* Bump onedrive-personal-sdk to 0.0.10 (#138186)

* Keep one backup per backup agent when executing retention policy (#138189)

* Keep one backup per backup agent when executing retention policy

* Add tests

* Use defaultdict instead of dict.setdefault

* Update hassio tests

* Improve inexogy logging when the update fails (#138210)

* Bump pyheos to v1.0.2 (#138224)

Bump pyheos

* Update frontend to 20250210.0 (#138227)

* Bump version to 2025.2.2

* Bump lacrosse-view to 1.1.1 (#137282)

---------

Co-authored-by: IceBotYT <34712694+IceBotYT@users.noreply.github.com>
Co-authored-by: Nathan Spencer <natekspencer@gmail.com>
Co-authored-by: Dan Raper <me@danr.uk>
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: tronikos <tronikos@users.noreply.github.com>
Co-authored-by: Patrick <14628713+patman15@users.noreply.github.com>
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
Co-authored-by: Shay Levy <levyshay1@gmail.com>
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
Co-authored-by: David Knowles <dknowles2@gmail.com>
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
Co-authored-by: Daniel O'Connor <daniel.oconnor@gmail.com>
Co-authored-by: Manu <4445816+tr4nt0r@users.noreply.github.com>
Co-authored-by: Allen Porter <allen@thebends.org>
Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
Co-authored-by: Michael <35783820+mib1185@users.noreply.github.com>
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Jan-Philipp Benecke <jan-philipp@bnck.me>
Co-authored-by: Andrew Sayre <6730289+andrewsayre@users.noreply.github.com>
Co-authored-by: Bram Kragten <mail@bramkragten.nl>
Commit 2d5a75d4f2 by Franck Nijhof, 2025-02-10 22:08:18 +01:00, committed by GitHub
59 changed files with 999 additions and 547 deletions

homeassistant/components/backup/manager.py

@@ -4,6 +4,7 @@ from __future__ import annotations

 import abc
 import asyncio
+from collections import defaultdict
 from collections.abc import AsyncIterator, Callable, Coroutine
 from dataclasses import dataclass, replace
 from enum import StrEnum
@@ -560,8 +561,15 @@ class BackupManager:
             return_exceptions=True,
         )
         for idx, result in enumerate(list_backups_results):
+            agent_id = agent_ids[idx]
             if isinstance(result, BackupAgentError):
-                agent_errors[agent_ids[idx]] = result
+                agent_errors[agent_id] = result
+                continue
+            if isinstance(result, Exception):
+                agent_errors[agent_id] = result
+                LOGGER.error(
+                    "Unexpected error for %s: %s", agent_id, result, exc_info=result
+                )
                 continue
             if isinstance(result, BaseException):
                 raise result  # unexpected error
@@ -588,7 +596,7 @@ class BackupManager:
                         name=agent_backup.name,
                         with_automatic_settings=with_automatic_settings,
                     )
-                backups[backup_id].agents[agent_ids[idx]] = AgentBackupStatus(
+                backups[backup_id].agents[agent_id] = AgentBackupStatus(
                     protected=agent_backup.protected,
                     size=agent_backup.size,
                 )
@@ -611,8 +619,15 @@ class BackupManager:
             return_exceptions=True,
         )
         for idx, result in enumerate(get_backup_results):
+            agent_id = agent_ids[idx]
             if isinstance(result, BackupAgentError):
-                agent_errors[agent_ids[idx]] = result
+                agent_errors[agent_id] = result
+                continue
+            if isinstance(result, Exception):
+                agent_errors[agent_id] = result
+                LOGGER.error(
+                    "Unexpected error for %s: %s", agent_id, result, exc_info=result
+                )
                 continue
             if isinstance(result, BaseException):
                 raise result  # unexpected error
@@ -640,7 +655,7 @@ class BackupManager:
                     name=result.name,
                     with_automatic_settings=with_automatic_settings,
                 )
-            backup.agents[agent_ids[idx]] = AgentBackupStatus(
+            backup.agents[agent_id] = AgentBackupStatus(
                 protected=result.protected,
                 size=result.size,
             )
@@ -663,10 +678,13 @@ class BackupManager:
             return None
         return with_automatic_settings

-    async def async_delete_backup(self, backup_id: str) -> dict[str, Exception]:
+    async def async_delete_backup(
+        self, backup_id: str, *, agent_ids: list[str] | None = None
+    ) -> dict[str, Exception]:
        """Delete a backup."""
         agent_errors: dict[str, Exception] = {}
-        agent_ids = list(self.backup_agents)
+        if agent_ids is None:
+            agent_ids = list(self.backup_agents)

         delete_backup_results = await asyncio.gather(
             *(
@@ -676,8 +694,15 @@ class BackupManager:
             return_exceptions=True,
         )
         for idx, result in enumerate(delete_backup_results):
+            agent_id = agent_ids[idx]
             if isinstance(result, BackupAgentError):
-                agent_errors[agent_ids[idx]] = result
+                agent_errors[agent_id] = result
+                continue
+            if isinstance(result, Exception):
+                agent_errors[agent_id] = result
+                LOGGER.error(
+                    "Unexpected error for %s: %s", agent_id, result, exc_info=result
+                )
                 continue
             if isinstance(result, BaseException):
                 raise result  # unexpected error
@@ -710,35 +735,71 @@ class BackupManager:
         # Run the include filter first to ensure we only consider backups that
         # should be included in the deletion process.
         backups = include_filter(backups)
-        LOGGER.debug("Total automatic backups: %s", backups)
+        backups_by_agent: dict[str, dict[str, ManagerBackup]] = defaultdict(dict)
+        for backup_id, backup in backups.items():
+            for agent_id in backup.agents:
+                backups_by_agent[agent_id][backup_id] = backup
+        LOGGER.debug("Backups returned by include filter: %s", backups)
+        LOGGER.debug(
+            "Backups returned by include filter by agent: %s",
+            {agent_id: list(backups) for agent_id, backups in backups_by_agent.items()},
+        )

         backups_to_delete = delete_filter(backups)
+        LOGGER.debug("Backups returned by delete filter: %s", backups_to_delete)

         if not backups_to_delete:
             return

         # always delete oldest backup first
-        backups_to_delete = dict(
-            sorted(
-                backups_to_delete.items(),
-                key=lambda backup_item: backup_item[1].date,
-            )
-        )
-
-        if len(backups_to_delete) >= len(backups):
-            # Never delete the last backup.
-            last_backup = backups_to_delete.popitem()
-            LOGGER.debug("Keeping the last backup: %s", last_backup)
-
-        LOGGER.debug("Backups to delete: %s", backups_to_delete)
-
-        if not backups_to_delete:
-            return
-
-        backup_ids = list(backups_to_delete)
-        delete_results = await asyncio.gather(
-            *(self.async_delete_backup(backup_id) for backup_id in backups_to_delete)
-        )
+        backups_to_delete_by_agent: dict[str, dict[str, ManagerBackup]] = defaultdict(
+            dict
+        )
+        for backup_id, backup in sorted(
+            backups_to_delete.items(),
+            key=lambda backup_item: backup_item[1].date,
+        ):
+            for agent_id in backup.agents:
+                backups_to_delete_by_agent[agent_id][backup_id] = backup
+        LOGGER.debug(
+            "Backups returned by delete filter by agent: %s",
+            {
+                agent_id: list(backups)
+                for agent_id, backups in backups_to_delete_by_agent.items()
+            },
+        )
+        for agent_id, to_delete_from_agent in backups_to_delete_by_agent.items():
+            if len(to_delete_from_agent) >= len(backups_by_agent[agent_id]):
+                # Never delete the last backup.
+                last_backup = to_delete_from_agent.popitem()
+                LOGGER.debug(
+                    "Keeping the last backup %s for agent %s", last_backup, agent_id
+                )
+        LOGGER.debug(
+            "Backups to delete by agent: %s",
+            {
+                agent_id: list(backups)
+                for agent_id, backups in backups_to_delete_by_agent.items()
+            },
+        )
+        backup_ids_to_delete: dict[str, set[str]] = defaultdict(set)
+        for agent_id, to_delete in backups_to_delete_by_agent.items():
+            for backup_id in to_delete:
+                backup_ids_to_delete[backup_id].add(agent_id)
+        if not backup_ids_to_delete:
+            return
+        backup_ids = list(backup_ids_to_delete)
+        delete_results = await asyncio.gather(
+            *(
+                self.async_delete_backup(backup_id, agent_ids=list(agent_ids))
+                for backup_id, agent_ids in backup_ids_to_delete.items()
+            )
+        )
         agent_errors = {
             backup_id: error

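Note: the retention hunk above switches from one global count to per-agent bookkeeping, so "never delete the last backup" holds for every agent. A minimal standalone sketch of the same grouping idea, with plain dicts standing in for Home Assistant's ManagerBackup objects (all names illustrative):

from collections import defaultdict

# Illustrative stand-ins: backup_id -> set of agent_ids holding a copy.
backups = {
    "backup-1": {"agent.a", "agent.b"},
    "backup-2": {"agent.a"},
    "backup-3": {"agent.a", "agent.b"},
}
to_delete = ["backup-1", "backup-2", "backup-3"]  # sorted oldest first

backups_by_agent: dict[str, set[str]] = defaultdict(set)
for backup_id, agents in backups.items():
    for agent_id in agents:
        backups_by_agent[agent_id].add(backup_id)

delete_by_agent: dict[str, list[str]] = defaultdict(list)
for backup_id in to_delete:
    for agent_id in backups[backup_id]:
        delete_by_agent[agent_id].append(backup_id)

for agent_id, candidates in delete_by_agent.items():
    if len(candidates) >= len(backups_by_agent[agent_id]):
        kept = candidates.pop()  # drop the newest candidate: keep one per agent
        print(f"keeping {kept} for {agent_id}")

# Invert back to backup_id -> agents to actually delete from.
delete_targets: dict[str, set[str]] = defaultdict(set)
for agent_id, candidates in delete_by_agent.items():
    for backup_id in candidates:
        delete_targets[backup_id].add(agent_id)
print(dict(delete_targets))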
homeassistant/components/bluetooth/match.py

@@ -411,7 +411,7 @@ def ble_device_matches(
     ) and service_data_uuid not in service_info.service_data:
         return False

-    if manufacturer_id := matcher.get(MANUFACTURER_ID):
+    if (manufacturer_id := matcher.get(MANUFACTURER_ID)) is not None:
         if manufacturer_id not in service_info.manufacturer_data:
             return False

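Note: the one-line change matters because a bare walrus "if" tests truthiness, and a manufacturer ID of 0 (a valid Bluetooth company identifier) is falsy in Python, so such matchers were silently skipped. A tiny illustration of the pitfall:

matcher = {"manufacturer_id": 0}  # 0 is a valid Bluetooth company identifier

# Buggy: the walrus result is tested for truthiness, and 0 is falsy.
if manufacturer_id := matcher.get("manufacturer_id"):
    print("matched (buggy form)")  # never reached for 0

# Fixed: only "key absent" (None) should skip the check.
if (manufacturer_id := matcher.get("manufacturer_id")) is not None:
    print("matched (fixed form)")  # prints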
homeassistant/components/coinbase/__init__.py

@@ -140,8 +140,10 @@ def get_accounts(client, version):
             API_ACCOUNT_ID: account[API_V3_ACCOUNT_ID],
             API_ACCOUNT_NAME: account[API_ACCOUNT_NAME],
             API_ACCOUNT_CURRENCY: account[API_ACCOUNT_CURRENCY],
-            API_ACCOUNT_AMOUNT: account[API_ACCOUNT_AVALIABLE][API_ACCOUNT_VALUE]
-            + account[API_ACCOUNT_HOLD][API_ACCOUNT_VALUE],
+            API_ACCOUNT_AMOUNT: (
+                float(account[API_ACCOUNT_AVALIABLE][API_ACCOUNT_VALUE])
+                + float(account[API_ACCOUNT_HOLD][API_ACCOUNT_VALUE])
+            ),
             ACCOUNT_IS_VAULT: account[API_RESOURCE_TYPE] == API_V3_TYPE_VAULT,
         }
         for account in accounts

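Note: the float() conversions above fix addition on values the API presumably returns as strings, where "+" concatenates rather than sums. A tiny illustration with made-up values:

available, hold = "1.50", "0.25"  # API-style string amounts (made up)

print(available + hold)                # 1.500.25 -- concatenation, not a sum
print(float(available) + float(hold))  # 1.75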
homeassistant/components/discovergy/coordinator.py

@@ -44,9 +44,7 @@ class DiscovergyUpdateCoordinator(DataUpdateCoordinator[Reading]):
             )
         except InvalidLogin as err:
             raise ConfigEntryAuthFailed(
-                f"Auth expired while fetching last reading for meter {self.meter.meter_id}"
+                "Auth expired while fetching last reading"
             ) from err
         except (HTTPError, DiscovergyClientError) as err:
-            raise UpdateFailed(
-                f"Error while fetching last reading for meter {self.meter.meter_id}"
-            ) from err
+            raise UpdateFailed(f"Error while fetching last reading: {err}") from err

homeassistant/components/frontend/manifest.json

@@ -21,5 +21,5 @@
   "documentation": "https://www.home-assistant.io/integrations/frontend",
   "integration_type": "system",
   "quality_scale": "internal",
-  "requirements": ["home-assistant-frontend==20250205.0"]
+  "requirements": ["home-assistant-frontend==20250210.0"]
 }

homeassistant/components/google_drive/__init__.py

@@ -7,7 +7,7 @@ from collections.abc import Callable
 from google_drive_api.exceptions import GoogleDriveApiError

 from homeassistant.config_entries import ConfigEntry
-from homeassistant.core import HomeAssistant
+from homeassistant.core import HomeAssistant, callback
 from homeassistant.exceptions import ConfigEntryNotReady
 from homeassistant.helpers import instance_id
 from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -49,6 +49,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoogleDriveConfigEntry)
     except GoogleDriveApiError as err:
         raise ConfigEntryNotReady from err

+    _async_notify_backup_listeners_soon(hass)
+
     return True
@@ -56,10 +58,15 @@ async def async_unload_entry(
     hass: HomeAssistant, entry: GoogleDriveConfigEntry
 ) -> bool:
     """Unload a config entry."""
-    hass.loop.call_soon(_notify_backup_listeners, hass)
+    _async_notify_backup_listeners_soon(hass)
     return True


-def _notify_backup_listeners(hass: HomeAssistant) -> None:
+def _async_notify_backup_listeners(hass: HomeAssistant) -> None:
     for listener in hass.data.get(DATA_BACKUP_AGENT_LISTENERS, []):
         listener()
+
+
+@callback
+def _async_notify_backup_listeners_soon(hass: HomeAssistant) -> None:
+    hass.loop.call_soon(_async_notify_backup_listeners, hass)

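Note: the helper above defers listener notification to the next event-loop iteration via loop.call_soon, so setup and unload return before any backup listener runs. A self-contained sketch of that pattern in plain asyncio (not Home Assistant's API):

import asyncio

listeners = [lambda: print("backup agents changed")]

def _notify() -> None:
    for listener in listeners:
        listener()

def _notify_soon(loop: asyncio.AbstractEventLoop) -> None:
    # Defer to the next event-loop iteration so the caller returns first.
    loop.call_soon(_notify)

async def main() -> None:
    _notify_soon(asyncio.get_running_loop())
    print("setup returned before listeners ran")
    await asyncio.sleep(0)  # yield so the scheduled callback runs

asyncio.run(main())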
homeassistant/components/google_drive/api.py

@@ -146,9 +146,10 @@ class DriveClient:
             backup.backup_id,
             backup_metadata,
         )
-        await self._api.upload_file(
+        await self._api.resumable_upload_file(
             backup_metadata,
             open_stream,
+            backup.size,
             timeout=ClientTimeout(total=_UPLOAD_AND_DOWNLOAD_TIMEOUT),
         )
         _LOGGER.debug(

homeassistant/components/google_drive/application_credentials.py

@@ -2,7 +2,10 @@

 from homeassistant.components.application_credentials import AuthorizationServer
 from homeassistant.core import HomeAssistant
-from homeassistant.helpers import config_entry_oauth2_flow
+from homeassistant.helpers.config_entry_oauth2_flow import (
+    AUTH_CALLBACK_PATH,
+    MY_AUTH_CALLBACK_PATH,
+)


 async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:
@@ -15,9 +18,14 @@ async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:

 async def async_get_description_placeholders(hass: HomeAssistant) -> dict[str, str]:
     """Return description placeholders for the credentials dialog."""
+    if "my" in hass.config.components:
+        redirect_url = MY_AUTH_CALLBACK_PATH
+    else:
+        ha_host = hass.config.external_url or "https://YOUR_DOMAIN:PORT"
+        redirect_url = f"{ha_host}{AUTH_CALLBACK_PATH}"
     return {
         "oauth_consent_url": "https://console.cloud.google.com/apis/credentials/consent",
         "more_info_url": "https://www.home-assistant.io/integrations/google_drive/",
         "oauth_creds_url": "https://console.cloud.google.com/apis/credentials",
-        "redirect_url": config_entry_oauth2_flow.async_get_redirect_uri(hass),
+        "redirect_url": redirect_url,
     }

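Note: the placeholder logic above prefers the My Home Assistant callback when the my integration is loaded and otherwise falls back to the instance's external URL. A standalone sketch of the same decision; the two constant values are assumptions mirroring what config_entry_oauth2_flow provides:

MY_AUTH_CALLBACK_PATH = "https://my.home-assistant.io/redirect/oauth"  # assumed value
AUTH_CALLBACK_PATH = "/auth/external/callback"  # assumed value

def redirect_url(loaded_components: set[str], external_url: str | None) -> str:
    if "my" in loaded_components:
        return MY_AUTH_CALLBACK_PATH
    host = external_url or "https://YOUR_DOMAIN:PORT"
    return f"{host}{AUTH_CALLBACK_PATH}"

print(redirect_url({"my", "frontend"}, None))
print(redirect_url({"frontend"}, "https://example.duckdns.org:8123"))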
homeassistant/components/google_drive/manifest.json

@@ -10,5 +10,5 @@
   "iot_class": "cloud_polling",
   "loggers": ["google_drive_api"],
   "quality_scale": "platinum",
-  "requirements": ["python-google-drive-api==0.0.2"]
+  "requirements": ["python-google-drive-api==0.1.0"]
 }

homeassistant/components/habitica/coordinator.py

@@ -11,6 +11,7 @@ from typing import Any

 from aiohttp import ClientError
 from habiticalib import (
+    Avatar,
     ContentData,
     Habitica,
     HabiticaException,
@@ -19,7 +20,6 @@ from habiticalib import (
     TaskFilter,
     TooManyRequestsError,
     UserData,
-    UserStyles,
 )

 from homeassistant.config_entries import ConfigEntry
@@ -159,12 +159,10 @@ class HabiticaDataUpdateCoordinator(DataUpdateCoordinator[HabiticaData]):
         else:
             await self.async_request_refresh()

-    async def generate_avatar(self, user_styles: UserStyles) -> bytes:
+    async def generate_avatar(self, avatar: Avatar) -> bytes:
         """Generate Avatar."""
-        avatar = BytesIO()
-        await self.habitica.generate_avatar(
-            fp=avatar, user_styles=user_styles, fmt="PNG"
-        )
-        return avatar.getvalue()
+        png = BytesIO()
+        await self.habitica.generate_avatar(fp=png, avatar=avatar, fmt="PNG")
+        return png.getvalue()

homeassistant/components/habitica/diagnostics.py

@@ -23,5 +23,5 @@ async def async_get_config_entry_diagnostics(
             CONF_URL: config_entry.data[CONF_URL],
             CONF_API_USER: config_entry.data[CONF_API_USER],
         },
-        "habitica_data": habitica_data.to_dict()["data"],
+        "habitica_data": habitica_data.to_dict(omit_none=False)["data"],
     }

homeassistant/components/habitica/image.py

@@ -2,10 +2,9 @@

 from __future__ import annotations

-from dataclasses import asdict
 from enum import StrEnum

-from habiticalib import UserStyles
+from habiticalib import Avatar, extract_avatar

 from homeassistant.components.image import ImageEntity, ImageEntityDescription
 from homeassistant.core import HomeAssistant
@@ -45,7 +44,7 @@ class HabiticaImage(HabiticaBase, ImageEntity):
         translation_key=HabiticaImageEntity.AVATAR,
     )
     _attr_content_type = "image/png"
-    _current_appearance: UserStyles | None = None
+    _current_appearance: Avatar | None = None
     _cache: bytes | None = None

     def __init__(
@@ -60,7 +59,7 @@ class HabiticaImage(HabiticaBase, ImageEntity):
     def _handle_coordinator_update(self) -> None:
         """Check if equipped gear and other things have changed since last avatar image generation."""
-        new_appearance = UserStyles.from_dict(asdict(self.coordinator.data.user))
+        new_appearance = extract_avatar(self.coordinator.data.user)
         if self._current_appearance != new_appearance:
             self._current_appearance = new_appearance

homeassistant/components/habitica/manifest.json

@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/habitica",
   "iot_class": "cloud_polling",
   "loggers": ["habiticalib"],
-  "requirements": ["habiticalib==0.3.5"]
+  "requirements": ["habiticalib==0.3.7"]
 }

homeassistant/components/habitica/services.py

@@ -77,7 +77,7 @@ SERVICE_API_CALL_SCHEMA = vol.Schema(

 SERVICE_CAST_SKILL_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
         vol.Required(ATTR_SKILL): cv.string,
         vol.Optional(ATTR_TASK): cv.string,
     }
@@ -85,12 +85,12 @@ SERVICE_CAST_SKILL_SCHEMA = vol.Schema(

 SERVICE_MANAGE_QUEST_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
     }
 )
 SERVICE_SCORE_TASK_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
         vol.Required(ATTR_TASK): cv.string,
         vol.Optional(ATTR_DIRECTION): cv.string,
     }
@@ -98,7 +98,7 @@ SERVICE_SCORE_TASK_SCHEMA = vol.Schema(

 SERVICE_TRANSFORMATION_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
         vol.Required(ATTR_ITEM): cv.string,
         vol.Required(ATTR_TARGET): cv.string,
     }
@@ -106,7 +106,7 @@ SERVICE_TRANSFORMATION_SCHEMA = vol.Schema(
 SERVICE_GET_TASKS_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
         vol.Optional(ATTR_TYPE): vol.All(
             cv.ensure_list, [vol.All(vol.Upper, vol.In({x.name for x in TaskType}))]
         ),
@@ -510,7 +510,9 @@ def async_setup_services(hass: HomeAssistant) -> None:  # noqa: C901
             or (task.notes and keyword in task.notes.lower())
             or any(keyword in item.text.lower() for item in task.checklist)
         ]
-        result: dict[str, Any] = {"tasks": [task.to_dict() for task in response]}
+        result: dict[str, Any] = {
+            "tasks": [task.to_dict(omit_none=False) for task in response]
+        }

         return result

homeassistant/components/heos/__init__.py

@@ -37,24 +37,24 @@ async def async_setup_entry(hass: HomeAssistant, entry: HeosConfigEntry) -> bool
     for device in device_registry.devices.get_devices_for_config_entry_id(
         entry.entry_id
     ):
-        for domain, player_id in device.identifiers:
-            if domain == DOMAIN and not isinstance(player_id, str):
-                # Create set of identifiers excluding this integration
-                identifiers = {
-                    (domain, identifier)
-                    for domain, identifier in device.identifiers
-                    if domain != DOMAIN
-                }
-                migrated_identifiers = {(DOMAIN, str(player_id))}
-                # Add migrated if not already present in another device, which occurs if the user downgraded and then upgraded
-                if not device_registry.async_get_device(migrated_identifiers):
-                    identifiers.update(migrated_identifiers)
-                if len(identifiers) > 0:
-                    device_registry.async_update_device(
-                        device.id, new_identifiers=identifiers
-                    )
-                else:
-                    device_registry.async_remove_device(device.id)
+        for ident in device.identifiers:
+            if ident[0] != DOMAIN or isinstance(ident[1], str):
+                continue
+
+            player_id = int(ident[1])  # type: ignore[unreachable]
+
+            # Create set of identifiers excluding this integration
+            identifiers = {ident for ident in device.identifiers if ident[0] != DOMAIN}
+            migrated_identifiers = {(DOMAIN, str(player_id))}
+            # Add migrated if not already present in another device, which occurs if the user downgraded and then upgraded
+            if not device_registry.async_get_device(migrated_identifiers):
+                identifiers.update(migrated_identifiers)
+            if len(identifiers) > 0:
+                device_registry.async_update_device(
+                    device.id, new_identifiers=identifiers
+                )
+            else:
+                device_registry.async_remove_device(device.id)
             break

     coordinator = HeosCoordinator(hass, entry)

homeassistant/components/heos/manifest.json

@@ -8,7 +8,7 @@
   "iot_class": "local_push",
   "loggers": ["pyheos"],
   "quality_scale": "silver",
-  "requirements": ["pyheos==1.0.1"],
+  "requirements": ["pyheos==1.0.2"],
   "single_config_entry": true,
   "ssdp": [
     {

homeassistant/components/hydrawise/manifest.json

@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/hydrawise",
   "iot_class": "cloud_polling",
   "loggers": ["pydrawise"],
-  "requirements": ["pydrawise==2025.1.0"]
+  "requirements": ["pydrawise==2025.2.0"]
 }

homeassistant/components/lacrosse_view/coordinator.py

@@ -10,8 +10,8 @@ from lacrosse_view import HTTPError, LaCrosse, Location, LoginError, Sensor

 from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
-from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
-from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
+from homeassistant.exceptions import ConfigEntryAuthFailed
+from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

 from .const import SCAN_INTERVAL
@@ -26,6 +26,7 @@ class LaCrosseUpdateCoordinator(DataUpdateCoordinator[list[Sensor]]):
     name: str
     id: str
     hass: HomeAssistant
+    devices: list[Sensor] | None = None

     def __init__(
         self,
@@ -60,24 +61,34 @@ class LaCrosseUpdateCoordinator(DataUpdateCoordinator[list[Sensor]]):
         except LoginError as error:
             raise ConfigEntryAuthFailed from error

+        if self.devices is None:
+            _LOGGER.debug("Getting devices")
+            try:
+                self.devices = await self.api.get_devices(
+                    location=Location(id=self.id, name=self.name),
+                )
+            except HTTPError as error:
+                raise UpdateFailed from error
+
         try:
             # Fetch last hour of data
-            sensors = await self.api.get_sensors(
-                location=Location(id=self.id, name=self.name),
-                tz=self.hass.config.time_zone,
-                start=str(now - 3600),
-                end=str(now),
-            )
-        except HTTPError as error:
-            raise ConfigEntryNotReady from error
-
-        _LOGGER.debug("Got data: %s", sensors)
+            for sensor in self.devices:
+                sensor.data = (
+                    await self.api.get_sensor_status(
+                        sensor=sensor,
+                        tz=self.hass.config.time_zone,
+                    )
+                )["data"]["current"]
+                _LOGGER.debug("Got data: %s", sensor.data)
+        except HTTPError as error:
+            raise UpdateFailed from error

         # Verify that we have permission to read the sensors
-        for sensor in sensors:
+        for sensor in self.devices:
             if not sensor.permissions.get("read", False):
                 raise ConfigEntryAuthFailed(
                     f"This account does not have permission to read {sensor.name}"
                 )

-        return sensors
+        return self.devices

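Note: the coordinator above now discovers devices once, caches them on the instance, and only polls per-sensor status on each refresh, raising UpdateFailed (retryable) rather than ConfigEntryNotReady on transient HTTP errors. A runnable sketch of the caching pattern with generic stand-in names, not the lacrosse_view API:

import asyncio
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    data: dict = field(default_factory=dict)

class FakeApi:
    async def get_devices(self) -> list[Device]:
        return [Device("sensor-1"), Device("sensor-2")]

    async def get_status(self, device: Device) -> dict:
        return {"temperature": 21.5}

class Coordinator:
    def __init__(self, api: FakeApi) -> None:
        self.api = api
        self.devices: list[Device] | None = None  # cached after first refresh

    async def refresh(self) -> list[Device]:
        if self.devices is None:
            # One-time discovery; later refreshes reuse the cached list.
            self.devices = await self.api.get_devices()
        for device in self.devices:
            device.data = await self.api.get_status(device)
        return self.devices

async def main() -> None:
    coordinator = Coordinator(FakeApi())
    print(await coordinator.refresh())

asyncio.run(main())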
homeassistant/components/lacrosse_view/manifest.json

@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/lacrosse_view",
   "iot_class": "cloud_polling",
   "loggers": ["lacrosse_view"],
-  "requirements": ["lacrosse-view==1.0.4"]
+  "requirements": ["lacrosse-view==1.1.1"]
 }

homeassistant/components/lacrosse_view/sensor.py

@@ -45,10 +45,10 @@ class LaCrosseSensorEntityDescription(SensorEntityDescription):

 def get_value(sensor: Sensor, field: str) -> float | int | str | None:
     """Get the value of a sensor field."""
-    field_data = sensor.data.get(field)
+    field_data = sensor.data.get(field) if sensor.data is not None else None
     if field_data is None:
         return None
-    value = field_data["values"][-1]["s"]
+    value = field_data["spot"]["value"]
     try:
         value = float(value)
     except ValueError:
@@ -178,7 +178,7 @@ async def async_setup_entry(
                 continue

             # if the API returns a different unit of measurement from the description, update it
-            if sensor.data.get(field) is not None:
+            if sensor.data is not None and sensor.data.get(field) is not None:
                 native_unit_of_measurement = UNIT_OF_MEASUREMENT_MAP.get(
                     sensor.data[field].get("unit")
                 )
@@ -240,7 +240,9 @@ class LaCrosseViewSensor(
     @property
     def available(self) -> bool:
         """Return True if entity is available."""
+        data = self.coordinator.data[self.index].data
         return (
             super().available
-            and self.entity_description.key in self.coordinator.data[self.index].data
+            and data is not None
+            and self.entity_description.key in data
         )

homeassistant/components/nest/__init__.py

@@ -198,7 +198,16 @@ async def async_setup_entry(hass: HomeAssistant, entry: NestConfigEntry) -> bool
         entry, unique_id=entry.data[CONF_PROJECT_ID]
     )

-    subscriber = await api.new_subscriber(hass, entry)
+    auth = await api.new_auth(hass, entry)
+    try:
+        await auth.async_get_access_token()
+    except AuthException as err:
+        raise ConfigEntryAuthFailed(f"Authentication error: {err!s}") from err
+    except ConfigurationException as err:
+        _LOGGER.error("Configuration error: %s", err)
+        return False
+
+    subscriber = await api.new_subscriber(hass, entry, auth)
     if not subscriber:
         return False

     # Keep media for last N events in memory

homeassistant/components/nest/api.py

@@ -101,9 +101,7 @@ class AccessTokenAuthImpl(AbstractAuth):
     )


-async def new_subscriber(
-    hass: HomeAssistant, entry: NestConfigEntry
-) -> GoogleNestSubscriber | None:
+async def new_auth(hass: HomeAssistant, entry: NestConfigEntry) -> AbstractAuth:
     """Create a GoogleNestSubscriber."""
     implementation = (
         await config_entry_oauth2_flow.async_get_config_entry_implementation(
@@ -114,14 +112,22 @@ async def new_subscriber(
         implementation, config_entry_oauth2_flow.LocalOAuth2Implementation
     ):
         raise TypeError(f"Unexpected auth implementation {implementation}")
-    if (subscription_name := entry.data.get(CONF_SUBSCRIPTION_NAME)) is None:
-        subscription_name = entry.data[CONF_SUBSCRIBER_ID]
-    auth = AsyncConfigEntryAuth(
+    return AsyncConfigEntryAuth(
         aiohttp_client.async_get_clientsession(hass),
         config_entry_oauth2_flow.OAuth2Session(hass, entry, implementation),
         implementation.client_id,
         implementation.client_secret,
     )
+
+
+async def new_subscriber(
+    hass: HomeAssistant,
+    entry: NestConfigEntry,
+    auth: AbstractAuth,
+) -> GoogleNestSubscriber:
+    """Create a GoogleNestSubscriber."""
+    if (subscription_name := entry.data.get(CONF_SUBSCRIPTION_NAME)) is None:
+        subscription_name = entry.data[CONF_SUBSCRIBER_ID]
     return GoogleNestSubscriber(auth, entry.data[CONF_PROJECT_ID], subscription_name)

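Note: the refactor above splits credential creation (new_auth) from subscriber creation (new_subscriber), so setup can validate the token first and fail fast. A rough, runnable sketch of that control flow with stand-in exception types and helpers:

import asyncio

class AuthException(Exception): ...
class ConfigurationException(Exception): ...

async def async_get_access_token() -> str:
    return "access-token"  # stand-in for the real OAuth token refresh

async def async_setup_entry() -> bool:
    try:
        await async_get_access_token()
    except AuthException:
        raise  # Home Assistant would raise ConfigEntryAuthFailed -> reauth flow
    except ConfigurationException:
        return False  # setup fails without retry
    # Credentials are known-good; only now build the pub/sub subscriber.
    print("creating subscriber with validated auth")
    return True

asyncio.run(async_setup_entry())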
homeassistant/components/nordpool/services.py

@@ -41,7 +41,7 @@ ATTR_CURRENCY = "currency"
 SERVICE_GET_PRICES_FOR_DATE = "get_prices_for_date"
 SERVICE_GET_PRICES_SCHEMA = vol.Schema(
     {
-        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
+        vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
         vol.Required(ATTR_DATE): cv.date,
         vol.Optional(ATTR_AREAS): vol.All(vol.In(list(AREAS)), cv.ensure_list, [str]),
         vol.Optional(ATTR_CURRENCY): vol.All(

homeassistant/components/ohme/manifest.json

@@ -7,5 +7,5 @@
   "integration_type": "device",
   "iot_class": "cloud_polling",
   "quality_scale": "silver",
-  "requirements": ["ohme==1.2.8"]
+  "requirements": ["ohme==1.2.9"]
 }

homeassistant/components/onedrive/__init__.py

@@ -133,7 +133,7 @@ async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -
     metadata_file = await client.upload_file(
         backup_folder_id,
         metadata_filename,
-        dumps(metadata),  # type: ignore[arg-type]
+        dumps(metadata),
     )
     metadata_description = {
         "metadata_version": 2,

homeassistant/components/onedrive/backup.py

@@ -168,7 +168,7 @@ class OneDriveBackupAgent(BackupAgent):
         metadata_file = await self._client.upload_file(
             self._folder_id,
             metadata_filename,
-            description,  # type: ignore[arg-type]
+            description,
         )

         # add metadata to the metadata file

homeassistant/components/onedrive/manifest.json

@@ -9,5 +9,5 @@
   "iot_class": "cloud_polling",
   "loggers": ["onedrive_personal_sdk"],
   "quality_scale": "bronze",
-  "requirements": ["onedrive-personal-sdk==0.0.8"]
+  "requirements": ["onedrive-personal-sdk==0.0.10"]
 }

homeassistant/components/onkyo/media_player.py

@@ -92,7 +92,7 @@ SUPPORT_ONKYO = (
 DEFAULT_PLAYABLE_SOURCES = (
     InputSource.from_meaning("FM"),
     InputSource.from_meaning("AM"),
-    InputSource.from_meaning("TUNER"),
+    InputSource.from_meaning("DAB"),
 )

 ATTR_PRESET = "preset"

homeassistant/components/shelly/manifest.json

@@ -8,7 +8,7 @@
   "integration_type": "device",
   "iot_class": "local_push",
   "loggers": ["aioshelly"],
-  "requirements": ["aioshelly==12.4.1"],
+  "requirements": ["aioshelly==12.4.2"],
   "zeroconf": [
     {
       "type": "_http._tcp.local.",

homeassistant/components/synology_dsm/manifest.json

@@ -7,7 +7,7 @@
   "documentation": "https://www.home-assistant.io/integrations/synology_dsm",
   "iot_class": "local_polling",
   "loggers": ["synology_dsm"],
-  "requirements": ["py-synologydsm-api==2.6.0"],
+  "requirements": ["py-synologydsm-api==2.6.2"],
   "ssdp": [
     {
       "manufacturer": "Synology",

homeassistant/components/telegram_bot/__init__.py

@@ -756,7 +756,8 @@ class TelegramNotificationService:
                 message_thread_id=params[ATTR_MESSAGE_THREAD_ID],
                 context=context,
             )
-            msg_ids[chat_id] = msg.id
+            if msg is not None:
+                msg_ids[chat_id] = msg.id
         return msg_ids

     async def delete_message(self, chat_id=None, context=None, **kwargs):

homeassistant/components/tplink/coordinator.py

@@ -46,9 +46,11 @@ class TPLinkDataUpdateCoordinator(DataUpdateCoordinator[None]):
         device: Device,
         update_interval: timedelta,
         config_entry: TPLinkConfigEntry,
+        parent_coordinator: TPLinkDataUpdateCoordinator | None = None,
     ) -> None:
         """Initialize DataUpdateCoordinator to gather data for specific SmartPlug."""
         self.device = device
+        self.parent_coordinator = parent_coordinator

         # The iot HS300 allows a limited number of concurrent requests and
         # fetching the emeter information requires separate ones, so child
@@ -95,6 +97,12 @@ class TPLinkDataUpdateCoordinator(DataUpdateCoordinator[None]):
             ) from ex

         await self._process_child_devices()
+        if not self._update_children:
+            # If the children are not being updated, it means this is an
+            # IotStrip, and we need to tell the children to write state
+            # since the power state is provided by the parent.
+            for child_coordinator in self._child_coordinators.values():
+                child_coordinator.async_set_updated_data(None)

     async def _process_child_devices(self) -> None:
         """Process child devices and remove stale devices."""
@@ -132,7 +140,11 @@ class TPLinkDataUpdateCoordinator(DataUpdateCoordinator[None]):
             # The child coordinators only update energy data so we can
             # set a longer update interval to avoid flooding the device
             child_coordinator = TPLinkDataUpdateCoordinator(
-                self.hass, child, timedelta(seconds=60), self.config_entry
+                self.hass,
+                child,
+                timedelta(seconds=60),
+                self.config_entry,
+                parent_coordinator=self,
             )
             self._child_coordinators[child.device_id] = child_coordinator
         return child_coordinator

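Note: the coordinator change gives each child a reference to its parent so a single parent poll can push state to all children, and child refresh requests are routed to the parent (see the entity.py hunk below). A toy sketch of the wiring, not the integration's real classes:

import asyncio

class ChildCoordinator:
    def __init__(self, name: str, parent: "ParentCoordinator") -> None:
        self.name = name
        self.parent_coordinator = parent

    def set_updated_data(self) -> None:
        print(f"{self.name}: state written from parent poll")

class ParentCoordinator:
    def __init__(self) -> None:
        self.children: list[ChildCoordinator] = []

    async def async_refresh(self) -> None:
        print("parent: polled device once")
        for child in self.children:  # push, instead of each child polling
            child.set_updated_data()

async def request_refresh(coordinator) -> None:
    # Entities always refresh via the parent when one exists.
    parent = getattr(coordinator, "parent_coordinator", None)
    await (parent or coordinator).async_refresh()

async def main() -> None:
    parent = ParentCoordinator()
    child = ChildCoordinator("plug-1", parent)
    parent.children.append(child)
    await request_refresh(child)  # one parent poll updates every child

asyncio.run(main())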
homeassistant/components/tplink/entity.py

@@ -151,7 +151,13 @@ def async_refresh_after[_T: CoordinatedTPLinkEntity, **_P](
                     "exc": str(ex),
                 },
             ) from ex
-        await self.coordinator.async_request_refresh()
+        coordinator = self.coordinator
+        if coordinator.parent_coordinator:
+            # If there is a parent coordinator we need to refresh
+            # the parent as its what provides the power state data
+            # for the child entities.
+            coordinator = coordinator.parent_coordinator
+        await coordinator.async_request_refresh()

     return _async_wrap

homeassistant/components/transmission/__init__.py

@@ -78,7 +78,9 @@ MIGRATION_NAME_TO_KEY = {

 SERVICE_BASE_SCHEMA = vol.Schema(
     {
-        vol.Required(CONF_ENTRY_ID): selector.ConfigEntrySelector(),
+        vol.Required(CONF_ENTRY_ID): selector.ConfigEntrySelector(
+            {"integration": DOMAIN}
+        ),
     }
 )

homeassistant/components/webostv/const.py

@@ -31,6 +31,7 @@ WEBOSTV_EXCEPTIONS = (
     WebOsTvCommandError,
     aiohttp.ClientConnectorError,
     aiohttp.ServerDisconnectedError,
+    aiohttp.WSMessageTypeError,
     asyncio.CancelledError,
     asyncio.TimeoutError,
 )

homeassistant/const.py

@@ -25,7 +25,7 @@ if TYPE_CHECKING:
 APPLICATION_NAME: Final = "HomeAssistant"
 MAJOR_VERSION: Final = 2025
 MINOR_VERSION: Final = 2
-PATCH_VERSION: Final = "1"
+PATCH_VERSION: Final = "2"
 __short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
 __version__: Final = f"{__short_version__}.{PATCH_VERSION}"
 REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 13, 0)

homeassistant/package_constraints.txt

@@ -37,7 +37,7 @@ habluetooth==3.21.1
 hass-nabucasa==0.88.1
 hassil==2.2.3
 home-assistant-bluetooth==1.13.0
-home-assistant-frontend==20250205.0
+home-assistant-frontend==20250210.0
 home-assistant-intents==2025.2.5
 httpx==0.28.1
 ifaddr==0.2.0

pyproject.toml

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

 [project]
 name = "homeassistant"
-version = "2025.2.1"
+version = "2025.2.2"
 license = {text = "Apache-2.0"}
 description = "Open-source home automation platform running on Python 3."
 readme = "README.rst"

requirements_all.txt (generated)

@@ -368,7 +368,7 @@ aioruuvigateway==0.1.0
 aiosenz==1.0.0

 # homeassistant.components.shelly
-aioshelly==12.4.1
+aioshelly==12.4.2

 # homeassistant.components.skybell
 aioskybell==22.7.0
@@ -1097,7 +1097,7 @@ ha-iotawattpy==0.1.2
 ha-philipsjs==3.2.2

 # homeassistant.components.habitica
-habiticalib==0.3.5
+habiticalib==0.3.7

 # homeassistant.components.bluetooth
 habluetooth==3.21.1
@@ -1143,7 +1143,7 @@ hole==0.8.0
 holidays==0.66

 # homeassistant.components.frontend
-home-assistant-frontend==20250205.0
+home-assistant-frontend==20250210.0

 # homeassistant.components.conversation
 home-assistant-intents==2025.2.5
@@ -1281,7 +1281,7 @@ konnected==1.2.0
 krakenex==2.2.2

 # homeassistant.components.lacrosse_view
-lacrosse-view==1.0.4
+lacrosse-view==1.1.1

 # homeassistant.components.eufy
 lakeside==0.13
@@ -1544,7 +1544,7 @@ odp-amsterdam==6.0.2
 oemthermostat==1.1.1

 # homeassistant.components.ohme
-ohme==1.2.8
+ohme==1.2.9

 # homeassistant.components.ollama
 ollama==0.4.7
@@ -1556,7 +1556,7 @@ omnilogic==0.4.5
 ondilo==0.5.0

 # homeassistant.components.onedrive
-onedrive-personal-sdk==0.0.8
+onedrive-personal-sdk==0.0.10

 # homeassistant.components.onvif
 onvif-zeep-async==3.2.5
@@ -1746,7 +1746,7 @@ py-schluter==0.1.7
 py-sucks==0.9.10

 # homeassistant.components.synology_dsm
-py-synologydsm-api==2.6.0
+py-synologydsm-api==2.6.2

 # homeassistant.components.atome
 pyAtome==0.1.1
@@ -1897,7 +1897,7 @@ pydiscovergy==3.0.2
 pydoods==1.0.2

 # homeassistant.components.hydrawise
-pydrawise==2025.1.0
+pydrawise==2025.2.0

 # homeassistant.components.android_ip_webcam
 pydroid-ipcam==2.0.0
@@ -1987,7 +1987,7 @@ pygti==0.9.4
 pyhaversion==22.8.0

 # homeassistant.components.heos
-pyheos==1.0.1
+pyheos==1.0.2

 # homeassistant.components.hive
 pyhive-integration==1.0.1
@@ -2385,7 +2385,7 @@ python-gc100==1.0.3a0
 python-gitlab==1.6.0

 # homeassistant.components.google_drive
-python-google-drive-api==0.0.2
+python-google-drive-api==0.1.0

 # homeassistant.components.analytics_insights
 python-homeassistant-analytics==0.8.1

requirements_test_all.txt (generated)

@@ -350,7 +350,7 @@ aioruuvigateway==0.1.0
 aiosenz==1.0.0

 # homeassistant.components.shelly
-aioshelly==12.4.1
+aioshelly==12.4.2

 # homeassistant.components.skybell
 aioskybell==22.7.0
@@ -938,7 +938,7 @@ ha-iotawattpy==0.1.2
 ha-philipsjs==3.2.2

 # homeassistant.components.habitica
-habiticalib==0.3.5
+habiticalib==0.3.7

 # homeassistant.components.bluetooth
 habluetooth==3.21.1
@@ -972,7 +972,7 @@ hole==0.8.0
 holidays==0.66

 # homeassistant.components.frontend
-home-assistant-frontend==20250205.0
+home-assistant-frontend==20250210.0

 # homeassistant.components.conversation
 home-assistant-intents==2025.2.5
@@ -1083,7 +1083,7 @@ konnected==1.2.0
 krakenex==2.2.2

 # homeassistant.components.lacrosse_view
-lacrosse-view==1.0.4
+lacrosse-view==1.1.1

 # homeassistant.components.laundrify
 laundrify-aio==1.2.2
@@ -1292,7 +1292,7 @@ objgraph==3.5.0
 odp-amsterdam==6.0.2

 # homeassistant.components.ohme
-ohme==1.2.8
+ohme==1.2.9

 # homeassistant.components.ollama
 ollama==0.4.7
@@ -1304,7 +1304,7 @@ omnilogic==0.4.5
 ondilo==0.5.0

 # homeassistant.components.onedrive
-onedrive-personal-sdk==0.0.8
+onedrive-personal-sdk==0.0.10

 # homeassistant.components.onvif
 onvif-zeep-async==3.2.5
@@ -1444,7 +1444,7 @@ py-nightscout==1.2.2
 py-sucks==0.9.10

 # homeassistant.components.synology_dsm
-py-synologydsm-api==2.6.0
+py-synologydsm-api==2.6.2

 # homeassistant.components.hdmi_cec
 pyCEC==0.5.2
@@ -1547,7 +1547,7 @@ pydexcom==0.2.3
 pydiscovergy==3.0.2

 # homeassistant.components.hydrawise
-pydrawise==2025.1.0
+pydrawise==2025.2.0

 # homeassistant.components.android_ip_webcam
 pydroid-ipcam==2.0.0
@@ -1616,7 +1616,7 @@ pygti==0.9.4
 pyhaversion==22.8.0

 # homeassistant.components.heos
-pyheos==1.0.1
+pyheos==1.0.2

 # homeassistant.components.hive
 pyhive-integration==1.0.1
@@ -1930,7 +1930,7 @@ python-fullykiosk==0.0.14
 # python-gammu==3.2.4

 # homeassistant.components.google_drive
-python-google-drive-api==0.0.2
+python-google-drive-api==0.1.0

 # homeassistant.components.analytics_insights
 python-homeassistant-analytics==0.8.1

tests/components/backup/snapshots/test_websocket.ambr

@@ -3697,12 +3697,13 @@
 # ---
 # name: test_delete_with_errors[side_effect1-storage_data0]
   dict({
-    'error': dict({
-      'code': 'home_assistant_error',
-      'message': 'Boom!',
-    }),
     'id': 1,
-    'success': False,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Boom!',
+      }),
+    }),
+    'success': True,
     'type': 'result',
   })
 # ---
@@ -3757,12 +3758,13 @@
 # ---
 # name: test_delete_with_errors[side_effect1-storage_data1]
   dict({
-    'error': dict({
-      'code': 'home_assistant_error',
-      'message': 'Boom!',
-    }),
     'id': 1,
-    'success': False,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Boom!',
+      }),
+    }),
+    'success': True,
     'type': 'result',
   })
 # ---
@@ -4019,12 +4021,89 @@
 # ---
 # name: test_details_with_errors[side_effect0]
   dict({
-    'error': dict({
-      'code': 'home_assistant_error',
-      'message': 'Boom!',
-    }),
     'id': 1,
-    'success': False,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Oops',
+      }),
+      'backup': dict({
+        'addons': list([
+          dict({
+            'name': 'Test',
+            'slug': 'test',
+            'version': '1.0.0',
+          }),
+        ]),
+        'agents': dict({
+          'backup.local': dict({
+            'protected': False,
+            'size': 0,
+          }),
+        }),
+        'backup_id': 'abc123',
+        'database_included': True,
+        'date': '1970-01-01T00:00:00.000Z',
+        'extra_metadata': dict({
+          'instance_id': 'our_uuid',
+          'with_automatic_settings': True,
+        }),
+        'failed_agent_ids': list([
+        ]),
+        'folders': list([
+          'media',
+          'share',
+        ]),
+        'homeassistant_included': True,
+        'homeassistant_version': '2024.12.0',
+        'name': 'Test',
+        'with_automatic_settings': True,
+      }),
+    }),
+    'success': True,
+    'type': 'result',
+  })
+# ---
+# name: test_details_with_errors[side_effect1]
+  dict({
+    'id': 1,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Boom!',
+      }),
+      'backup': dict({
+        'addons': list([
+          dict({
+            'name': 'Test',
+            'slug': 'test',
+            'version': '1.0.0',
+          }),
+        ]),
+        'agents': dict({
+          'backup.local': dict({
+            'protected': False,
+            'size': 0,
+          }),
+        }),
+        'backup_id': 'abc123',
+        'database_included': True,
+        'date': '1970-01-01T00:00:00.000Z',
+        'extra_metadata': dict({
+          'instance_id': 'our_uuid',
+          'with_automatic_settings': True,
+        }),
+        'failed_agent_ids': list([
+        ]),
+        'folders': list([
+          'media',
+          'share',
+        ]),
+        'homeassistant_included': True,
+        'homeassistant_version': '2024.12.0',
+        'name': 'Test',
+        'with_automatic_settings': True,
+      }),
+    }),
+    'success': True,
     'type': 'result',
   })
 # ---
@@ -4542,12 +4621,105 @@
 # ---
 # name: test_info_with_errors[side_effect0]
   dict({
-    'error': dict({
-      'code': 'home_assistant_error',
-      'message': 'Boom!',
-    }),
     'id': 1,
-    'success': False,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Oops',
+      }),
+      'backups': list([
+        dict({
+          'addons': list([
+            dict({
+              'name': 'Test',
+              'slug': 'test',
+              'version': '1.0.0',
+            }),
+          ]),
+          'agents': dict({
+            'backup.local': dict({
+              'protected': False,
+              'size': 0,
+            }),
+          }),
+          'backup_id': 'abc123',
+          'database_included': True,
+          'date': '1970-01-01T00:00:00.000Z',
+          'extra_metadata': dict({
+            'instance_id': 'our_uuid',
+            'with_automatic_settings': True,
+          }),
+          'failed_agent_ids': list([
+          ]),
+          'folders': list([
+            'media',
+            'share',
+          ]),
+          'homeassistant_included': True,
+          'homeassistant_version': '2024.12.0',
+          'name': 'Test',
+          'with_automatic_settings': True,
+        }),
+      ]),
+      'last_attempted_automatic_backup': None,
+      'last_completed_automatic_backup': None,
+      'last_non_idle_event': None,
+      'next_automatic_backup': None,
+      'next_automatic_backup_additional': False,
+      'state': 'idle',
+    }),
+    'success': True,
+    'type': 'result',
+  })
+# ---
+# name: test_info_with_errors[side_effect1]
+  dict({
+    'id': 1,
+    'result': dict({
+      'agent_errors': dict({
+        'domain.test': 'Boom!',
+      }),
+      'backups': list([
+        dict({
+          'addons': list([
+            dict({
+              'name': 'Test',
+              'slug': 'test',
+              'version': '1.0.0',
+            }),
+          ]),
+          'agents': dict({
+            'backup.local': dict({
+              'protected': False,
+              'size': 0,
+            }),
+          }),
+          'backup_id': 'abc123',
+          'database_included': True,
+          'date': '1970-01-01T00:00:00.000Z',
+          'extra_metadata': dict({
+            'instance_id': 'our_uuid',
+            'with_automatic_settings': True,
+          }),
+          'failed_agent_ids': list([
+          ]),
+          'folders': list([
+            'media',
+            'share',
+          ]),
+          'homeassistant_included': True,
+          'homeassistant_version': '2024.12.0',
+          'name': 'Test',
+          'with_automatic_settings': True,
+        }),
+      ]),
+      'last_attempted_automatic_backup': None,
+      'last_completed_automatic_backup': None,
+      'last_non_idle_event': None,
+      'next_automatic_backup': None,
+      'next_automatic_backup_additional': False,
+      'state': 'idle',
+    }),
+    'success': True,
     'type': 'result',
   })
 # ---

View File

@ -6,6 +6,7 @@ from unittest.mock import ANY, AsyncMock, MagicMock, Mock, call, patch
from freezegun.api import FrozenDateTimeFactory from freezegun.api import FrozenDateTimeFactory
import pytest import pytest
from pytest_unordered import unordered
from syrupy import SnapshotAssertion from syrupy import SnapshotAssertion
from homeassistant.components.backup import ( from homeassistant.components.backup import (
@ -20,6 +21,7 @@ from homeassistant.components.backup import (
from homeassistant.components.backup.agent import BackupAgentUnreachableError from homeassistant.components.backup.agent import BackupAgentUnreachableError
from homeassistant.components.backup.const import DATA_MANAGER, DOMAIN from homeassistant.components.backup.const import DATA_MANAGER, DOMAIN
from homeassistant.components.backup.manager import ( from homeassistant.components.backup.manager import (
AgentBackupStatus,
CreateBackupEvent, CreateBackupEvent,
CreateBackupState, CreateBackupState,
ManagerBackup, ManagerBackup,
@ -148,7 +150,8 @@ async def test_info(
@pytest.mark.parametrize( @pytest.mark.parametrize(
"side_effect", [HomeAssistantError("Boom!"), BackupAgentUnreachableError] "side_effect",
[Exception("Oops"), HomeAssistantError("Boom!"), BackupAgentUnreachableError],
) )
async def test_info_with_errors( async def test_info_with_errors(
hass: HomeAssistant, hass: HomeAssistant,
@ -209,7 +212,8 @@ async def test_details(
@pytest.mark.parametrize( @pytest.mark.parametrize(
"side_effect", [HomeAssistantError("Boom!"), BackupAgentUnreachableError] "side_effect",
[Exception("Oops"), HomeAssistantError("Boom!"), BackupAgentUnreachableError],
) )
async def test_details_with_errors( async def test_details_with_errors(
hass: HomeAssistant, hass: HomeAssistant,
@@ -1798,21 +1802,25 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -1837,21 +1845,25 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -1876,11 +1888,13 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
@@ -1905,26 +1919,46 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-5": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -1938,7 +1972,80 @@ async def test_config_schedule_logic(
1,
1,
1,
-[call("backup-1")],
+[
+call(
+"backup-1",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+)
+],
+),
+(
+{
+"type": "backup/config/update",
+"create_backup": {"agent_ids": ["test.test-agent"]},
+"retention": {"copies": 3, "days": None},
+"schedule": {"recurrence": "daily"},
+},
+{
+"backup-1": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-09T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-2": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-10T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-3": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-11T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-4": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-12T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-5": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-12T04:45:00+01:00",
+with_automatic_settings=False,
+spec=ManagerBackup,
+),
+},
+{},
+{},
+"2024-11-11T04:45:00+01:00",
+"2024-11-12T04:45:00+01:00",
+"2024-11-12T04:45:00+01:00",
+1,
+1,
+1,
+[
+call(
+"backup-1",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+)
+],
),
(
{
@@ -1949,26 +2056,31 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-5": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -1982,7 +2094,10 @@ async def test_config_schedule_logic(
1,
1,
2,
-[call("backup-1"), call("backup-2")],
+[
+call("backup-1", agent_ids=["test.test-agent"]),
+call("backup-2", agent_ids=["test.test-agent"]),
+],
),
(
{
@@ -1993,21 +2108,25 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2021,7 +2140,7 @@ async def test_config_schedule_logic(
1,
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
{
@@ -2032,21 +2151,25 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2060,7 +2183,7 @@ async def test_config_schedule_logic(
1,
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
{
@@ -2071,26 +2194,46 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-5": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2104,7 +2247,20 @@ async def test_config_schedule_logic(
1,
1,
3,
-[call("backup-1"), call("backup-2"), call("backup-3")],
+[
+call(
+"backup-1",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+),
+call(
+"backup-2",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+),
+call(
+"backup-3",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+),
+],
),
(
{
@@ -2115,11 +2271,86 @@ async def test_config_schedule_logic(
},
{
"backup-1": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-09T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-2": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-10T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-3": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-11T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-4": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-12T04:45:00+01:00",
+with_automatic_settings=True,
+spec=ManagerBackup,
+),
+"backup-5": MagicMock(
+agents={
+"test.test-agent": MagicMock(spec=AgentBackupStatus),
+"test.test-agent2": MagicMock(spec=AgentBackupStatus),
+},
+date="2024-11-12T04:45:00+01:00",
+with_automatic_settings=False,
+spec=ManagerBackup,
+),
+},
+{},
+{},
+"2024-11-11T04:45:00+01:00",
+"2024-11-12T04:45:00+01:00",
+"2024-11-12T04:45:00+01:00",
+1,
+1,
+3,
+[
+call(
+"backup-1",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+),
+call(
+"backup-2",
+agent_ids=unordered(["test.test-agent", "test.test-agent2"]),
+),
+call("backup-3", agent_ids=["test.test-agent"]),
+],
+),
+(
+{
+"type": "backup/config/update",
+"create_backup": {"agent_ids": ["test.test-agent"]},
+"retention": {"copies": 0, "days": None},
+"schedule": {"recurrence": "daily"},
+},
+{
+"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2259,21 +2490,25 @@ async def test_config_retention_copies_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2295,21 +2530,25 @@ async def test_config_retention_copies_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2331,26 +2570,31 @@ async def test_config_retention_copies_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-5": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2361,7 +2605,7 @@ async def test_config_retention_copies_logic(
1,
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
{
@@ -2372,26 +2616,31 @@ async def test_config_retention_copies_logic(
},
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-5": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-12T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2402,7 +2651,10 @@ async def test_config_retention_copies_logic(
1,
1,
2,
-[call("backup-1"), call("backup-2")],
+[
+call("backup-1", agent_ids=["test.test-agent"]),
+call("backup-2", agent_ids=["test.test-agent"]),
+],
),
],
)
@@ -2517,16 +2769,19 @@ async def test_config_retention_copies_logic_manual_backup(
[],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2539,7 +2794,7 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
# No config update - No cleanup
(
@@ -2547,16 +2802,19 @@ async def test_config_retention_copies_logic_manual_backup(
[],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2584,16 +2842,19 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2606,7 +2867,7 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
None,
@@ -2620,16 +2881,19 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2642,7 +2906,7 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
None,
@@ -2656,16 +2920,19 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2692,21 +2959,25 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2719,7 +2990,10 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
2,
-[call("backup-1"), call("backup-2")],
+[
+call("backup-1", agent_ids=["test.test-agent"]),
+call("backup-2", agent_ids=["test.test-agent"]),
+],
),
(
None,
@@ -2733,16 +3007,19 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2755,7 +3032,7 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
None,
@@ -2769,16 +3046,19 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2791,7 +3071,7 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
1,
-[call("backup-1")],
+[call("backup-1", agent_ids=["test.test-agent"])],
),
(
None,
@@ -2805,21 +3085,25 @@ async def test_config_retention_copies_logic_manual_backup(
],
{
"backup-1": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-09T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-2": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-3": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-11T04:45:00+01:00",
with_automatic_settings=True,
spec=ManagerBackup,
),
"backup-4": MagicMock(
+agents={"test.test-agent": MagicMock(spec=AgentBackupStatus)},
date="2024-11-10T04:45:00+01:00",
with_automatic_settings=False,
spec=ManagerBackup,
@@ -2832,7 +3116,10 @@ async def test_config_retention_copies_logic_manual_backup(
"2024-11-12T12:00:00+01:00",
1,
2,
-[call("backup-1"), call("backup-2")],
+[
+call("backup-1", agent_ids=["test.test-agent"]),
+call("backup-2", agent_ids=["test.test-agent"]),
+],
),
],
)
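
Note: throughout these retention cases, delete assertions now carry agent_ids, and unordered tolerates nondeterministic agent ordering. A minimal sketch of the per-agent retention rule the cases encode, keeping the newest `copies` automatic backups on every agent (illustrative, not the manager's exact implementation):

from collections import defaultdict


def backups_to_delete(backups: dict, copies: int) -> list[tuple[str, list[str]]]:
    """Return (backup_id, agent_ids) pairs to delete, keeping `copies` per agent.

    `backups` maps backup_id to an object with `date` (ISO 8601 string),
    `agents` (dict keyed by agent id), and `with_automatic_settings`.
    """
    automatic = sorted(
        (item for item in backups.items() if item[1].with_automatic_settings),
        key=lambda item: item[1].date,
        reverse=True,  # newest first
    )
    kept_per_agent: defaultdict[str, int] = defaultdict(int)
    to_delete: dict[str, list[str]] = {}
    for backup_id, backup in automatic:
        for agent_id in backup.agents:
            kept_per_agent[agent_id] += 1
            if kept_per_agent[agent_id] > copies:
                # This agent already keeps `copies` newer backups.
                to_delete.setdefault(backup_id, []).append(agent_id)
    return list(to_delete.items())

A backup present on only one agent, like backup-4 in the two-agent cases above, shifts the per-agent counts independently, which is why one of the asserted calls carries a single agent id.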


@@ -136,7 +136,7 @@
}),
),
tuple(
-'upload_file',
+'resumable_upload_file',
tuple(
dict({
'description': '{"addons": [{"name": "Test", "slug": "test", "version": "1.0.0"}], "backup_id": "test-backup", "date": "2025-01-01T01:23:45.678Z", "database_included": true, "extra_metadata": {"with_automatic_settings": false}, "folders": [], "homeassistant_included": true, "homeassistant_version": "2024.12.0", "name": "Test", "protected": false, "size": 987}',
@@ -151,6 +151,7 @@
}),
}),
"CoreBackupReaderWriter.async_receive_backup.<locals>.open_backup() -> 'AsyncIterator[bytes]'",
+987,
),
dict({
'timeout': dict({
@@ -207,7 +208,7 @@
}),
),
tuple(
-'upload_file',
+'resumable_upload_file',
tuple(
dict({
'description': '{"addons": [{"name": "Test", "slug": "test", "version": "1.0.0"}], "backup_id": "test-backup", "date": "2025-01-01T01:23:45.678Z", "database_included": true, "extra_metadata": {"with_automatic_settings": false}, "folders": [], "homeassistant_included": true, "homeassistant_version": "2024.12.0", "name": "Test", "protected": false, "size": 987}',
@@ -222,6 +223,7 @@
}),
}),
"CoreBackupReaderWriter.async_receive_backup.<locals>.open_backup() -> 'AsyncIterator[bytes]'",
+987,
),
dict({
'timeout': dict({


@@ -0,0 +1,36 @@
"""Test the Google Drive application_credentials."""

import pytest

from homeassistant import setup
from homeassistant.components.google_drive.application_credentials import (
    async_get_description_placeholders,
)
from homeassistant.core import HomeAssistant


@pytest.mark.parametrize(
    ("additional_components", "external_url", "expected_redirect_uri"),
    [
        ([], "https://example.com", "https://example.com/auth/external/callback"),
        ([], None, "https://YOUR_DOMAIN:PORT/auth/external/callback"),
        (["my"], "https://example.com", "https://my.home-assistant.io/redirect/oauth"),
    ],
)
async def test_description_placeholders(
    hass: HomeAssistant,
    additional_components: list[str],
    external_url: str | None,
    expected_redirect_uri: str,
) -> None:
    """Test description placeholders."""
    for component in additional_components:
        assert await setup.async_setup_component(hass, component, {})
    hass.config.external_url = external_url
    placeholders = await async_get_description_placeholders(hass)
    assert placeholders == {
        "oauth_consent_url": "https://console.cloud.google.com/apis/credentials/consent",
        "more_info_url": "https://www.home-assistant.io/integrations/google_drive/",
        "oauth_creds_url": "https://console.cloud.google.com/apis/credentials",
        "redirect_url": expected_redirect_uri,
    }
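
Note: the three parametrized cases pin down the redirect-URL precedence: the my component wins when loaded, then the configured external URL, then a documentation placeholder. Logic consistent with those expectations might look like this (a sketch, not the integration's actual helper):

def pick_redirect_url(hass: HomeAssistant) -> str:
    """Choose the OAuth redirect URL shown in the setup instructions."""
    if "my" in hass.config.components:
        # "my" is loaded: the hosted my.home-assistant.io redirect works.
        return "https://my.home-assistant.io/redirect/oauth"
    if hass.config.external_url:
        # Fall back to the URL configured in Settings > System > Network.
        return f"{hass.config.external_url}/auth/external/callback"
    # Nothing configured: show a placeholder the user must fill in.
    return "https://YOUR_DOMAIN:PORT/auth/external/callback"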


@@ -281,7 +281,7 @@ async def test_agents_upload(
snapshot: SnapshotAssertion,
) -> None:
"""Test agent upload backup."""
-mock_api.upload_file = AsyncMock(return_value=None)
+mock_api.resumable_upload_file = AsyncMock(return_value=None)
client = await hass_client()
@@ -306,7 +306,7 @@ async def test_agents_upload(
assert f"Uploading backup: {TEST_AGENT_BACKUP.backup_id}" in caplog.text
assert f"Uploaded backup: {TEST_AGENT_BACKUP.backup_id}" in caplog.text
-mock_api.upload_file.assert_called_once()
+mock_api.resumable_upload_file.assert_called_once()
assert [tuple(mock_call) for mock_call in mock_api.mock_calls] == snapshot
@@ -322,7 +322,7 @@ async def test_agents_upload_create_folder_if_missing(
mock_api.create_file = AsyncMock(
return_value={"id": "new folder id", "name": "Home Assistant"}
)
-mock_api.upload_file = AsyncMock(return_value=None)
+mock_api.resumable_upload_file = AsyncMock(return_value=None)
client = await hass_client()
@@ -348,7 +348,7 @@ async def test_agents_upload_create_folder_if_missing(
assert f"Uploaded backup: {TEST_AGENT_BACKUP.backup_id}" in caplog.text
mock_api.create_file.assert_called_once()
-mock_api.upload_file.assert_called_once()
+mock_api.resumable_upload_file.assert_called_once()
assert [tuple(mock_call) for mock_call in mock_api.mock_calls] == snapshot
@@ -359,7 +359,9 @@ async def test_agents_upload_fail(
mock_api: MagicMock,
) -> None:
"""Test agent upload backup fails."""
-mock_api.upload_file = AsyncMock(side_effect=GoogleDriveApiError("some error"))
+mock_api.resumable_upload_file = AsyncMock(
+side_effect=GoogleDriveApiError("some error")
+)
client = await hass_client()


@@ -143,6 +143,25 @@
"trinkets": 0
}
}
-}
+},
+"webhooks": [
+{
+"id": "43a67e37-1bae-4b11-8d3d-6c4b1b480231",
+"type": "taskActivity",
+"label": "My Webhook",
+"url": "https://some-webhook-url.com",
+"enabled": true,
+"failures": 0,
+"options": {
+"created": false,
+"updated": false,
+"deleted": false,
+"checklistScored": false,
+"scored": true
+},
+"createdAt": "2025-02-08T22:06:08.894Z",
+"updatedAt": "2025-02-08T22:06:17.195Z"
+}
+]
}
}


@@ -8,48 +8,31 @@
'habitica_data': dict({
'tasks': list([
dict({
-'alias': None,
'attribute': 'str',
'byHabitica': False,
'challenge': dict({
-'broken': None,
-'id': None,
-'shortName': None,
-'taskId': None,
-'winner': None,
}),
'checklist': list([
]),
'collapseChecklist': False,
-'completed': None,
'counterDown': 0,
'counterUp': 0,
'createdAt': '2024-10-10T15:57:14.287000+00:00',
-'date': None,
'daysOfMonth': list([
]),
'down': False,
-'everyX': None,
'frequency': 'daily',
'group': dict({
-'assignedDate': None,
'assignedUsers': list([
]),
'assignedUsersDetail': dict({
}),
-'assigningUsername': None,
'completedBy': dict({
-'date': None,
-'userId': None,
}),
-'id': None,
-'managerNotes': None,
-'taskId': None,
}),
'history': list([
]),
'id': '30923acd-3b4c-486d-9ef3-c8f57cf56049',
-'isDue': None,
'nextDue': list([
]),
'notes': 'task notes',
@@ -65,8 +48,6 @@
'th': False,
'w': True,
}),
-'startDate': None,
-'streak': None,
'tags': list([
]),
'text': 'task text',
@@ -77,51 +58,30 @@
'value': 0.0,
'weeksOfMonth': list([
]),
-'yesterDaily': None,
}),
dict({
-'alias': None,
'attribute': 'str',
'byHabitica': True,
'challenge': dict({
-'broken': None,
-'id': None,
-'shortName': None,
-'taskId': None,
-'winner': None,
}),
'checklist': list([
]),
'collapseChecklist': False,
'completed': False,
-'counterDown': None,
-'counterUp': None,
'createdAt': '2024-10-10T15:57:14.290000+00:00',
-'date': None,
'daysOfMonth': list([
]),
-'down': None,
-'everyX': None,
-'frequency': None,
'group': dict({
-'assignedDate': None,
'assignedUsers': list([
]),
'assignedUsersDetail': dict({
}),
-'assigningUsername': None,
'completedBy': dict({
-'date': None,
-'userId': None,
}),
-'id': None,
-'managerNotes': None,
-'taskId': None,
}),
'history': list([
]),
'id': 'e6e06dc6-c887-4b86-b175-b99cc2e20fdf',
-'isDue': None,
'nextDue': list([
]),
'notes': 'task notes',
@@ -137,63 +97,38 @@
'th': False,
'w': True,
}),
-'startDate': None,
-'streak': None,
'tags': list([
]),
'text': 'task text',
'type': 'todo',
-'up': None,
'updatedAt': '2024-11-27T19:34:29.001000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
'value': -6.418582324043852,
'weeksOfMonth': list([
]),
-'yesterDaily': None,
}),
dict({
-'alias': None,
'attribute': 'str',
'byHabitica': False,
'challenge': dict({
-'broken': None,
-'id': None,
-'shortName': None,
-'taskId': None,
-'winner': None,
}),
'checklist': list([
]),
'collapseChecklist': False,
-'completed': None,
-'counterDown': None,
-'counterUp': None,
'createdAt': '2024-10-10T15:57:14.290000+00:00',
-'date': None,
'daysOfMonth': list([
]),
-'down': None,
-'everyX': None,
-'frequency': None,
'group': dict({
-'assignedDate': None,
'assignedUsers': list([
]),
'assignedUsersDetail': dict({
}),
-'assigningUsername': None,
'completedBy': dict({
-'date': None,
-'userId': None,
}),
-'id': None,
-'managerNotes': None,
-'taskId': None,
}),
'history': list([
]),
'id': '2fbf11a5-ab1e-4fb7-97f0-dfb5c45c96a9',
-'isDue': None,
'nextDue': list([
]),
'notes': 'task notes',
@@ -209,106 +144,73 @@
'th': False,
'w': True,
}),
-'startDate': None,
-'streak': None,
'tags': list([
]),
'text': 'task text',
'type': 'reward',
-'up': None,
'updatedAt': '2024-10-10T15:57:14.290000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
'value': 10.0,
'weeksOfMonth': list([
]),
-'yesterDaily': None,
}),
dict({
-'alias': None,
'attribute': 'str',
'byHabitica': False,
'challenge': dict({
-'broken': None,
-'id': None,
-'shortName': None,
-'taskId': None,
-'winner': None,
}),
'checklist': list([
]),
'collapseChecklist': False,
'completed': False,
-'counterDown': None,
-'counterUp': None,
'createdAt': '2024-10-10T15:57:14.304000+00:00',
-'date': None,
'daysOfMonth': list([
]),
-'down': None,
'everyX': 1,
'frequency': 'weekly',
'group': dict({
-'assignedDate': None,
'assignedUsers': list([
]),
'assignedUsersDetail': dict({
}),
-'assigningUsername': None,
'completedBy': dict({
-'date': None,
-'userId': None,
}),
-'id': None,
-'managerNotes': None,
-'taskId': None,
}),
'history': list([
dict({
'completed': True,
'date': '2024-10-30T19:37:01.817000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': 1.0,
}),
dict({
'completed': True,
'date': '2024-10-31T23:33:14.890000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': 1.9747,
}),
dict({
'completed': False,
'date': '2024-11-05T18:25:04.730000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': 1.024043774264157,
}),
dict({
'completed': False,
'date': '2024-11-21T15:09:07.573000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': 0.049944135963563174,
}),
dict({
'completed': False,
'date': '2024-11-22T00:41:21.228000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': -0.9487768368544092,
}),
dict({
'completed': False,
'date': '2024-11-27T19:34:28.973000+00:00',
'isDue': True,
-'scoredDown': None,
-'scoredUp': None,
'value': -1.973387732005249,
}),
]),
@@ -341,7 +243,6 @@
]),
'text': 'task text',
'type': 'daily',
-'up': None,
'updatedAt': '2024-11-27T19:34:29.001000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
'value': -1.973387732005249,
@@ -352,60 +253,23 @@
]),
'user': dict({
'achievements': dict({
-'backToBasics': None,
-'boneCollector': None,
'challenges': list([
]),
'completedTask': True,
'createdTask': True,
-'dustDevil': None,
-'fedPet': None,
-'goodAsGold': None,
-'hatchedPet': None,
-'joinedChallenge': None,
-'joinedGuild': None,
-'partyUp': None,
'perfect': 2,
-'primedForPainting': None,
-'purchasedEquipment': None,
'quests': dict({
-'atom1': None,
-'atom2': None,
-'atom3': None,
-'bewilder': None,
-'burnout': None,
-'dilatory': None,
-'dilatory_derby': None,
-'dysheartener': None,
-'evilsanta': None,
-'evilsanta2': None,
-'gryphon': None,
-'harpy': None,
-'stressbeast': None,
-'vice1': None,
-'vice3': None,
}),
-'seeingRed': None,
-'shadyCustomer': None,
'streak': 0,
-'tickledPink': None,
'ultimateGearSets': dict({
'healer': False,
'rogue': False,
'warrior': False,
'wizard': False,
}),
-'violetsAreBlue': None,
}),
'auth': dict({
-'apple': None,
-'facebook': None,
-'google': None,
'local': dict({
-'email': None,
-'has_password': None,
-'lowerCaseUsername': None,
-'username': None,
}),
'timestamps': dict({
'created': '2024-10-10T15:57:01.106000+00:00',
@@ -414,17 +278,11 @@
}),
}),
'backer': dict({
-'npc': None,
-'tier': None,
-'tokensApplied': None,
}),
'balance': 0.0,
'challenges': list([
]),
'contributor': dict({
-'contributions': None,
-'level': None,
-'text': None,
}),
'extra': dict({
}),
@@ -433,23 +291,17 @@
'armoireEnabled': True,
'armoireOpened': False,
'cardReceived': False,
-'chatRevoked': None,
-'chatShadowMuted': None,
'classSelected': False,
'communityGuidelinesAccepted': True,
'cronCount': 6,
'customizationsNotification': True,
'dropsEnabled': False,
'itemsEnabled': True,
-'lastFreeRebirth': None,
'lastNewStuffRead': '',
'lastWeeklyRecap': '2024-10-10T15:57:01.106000+00:00',
-'lastWeeklyRecapDiscriminator': None,
'levelDrops': dict({
}),
-'mathUpdates': None,
'newStuff': False,
-'onboardingEmailsPhase': None,
'rebirthEnabled': False,
'recaptureEmailsPhase': 0,
'rewrite': True,
@@ -508,101 +360,53 @@
'history': dict({
'exp': list([
dict({
-'completed': None,
'date': '2024-10-30T19:37:01.970000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 24.0,
}),
dict({
-'completed': None,
'date': '2024-10-31T23:33:14.972000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 48.0,
}),
dict({
-'completed': None,
'date': '2024-11-05T18:25:04.681000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 66.0,
}),
dict({
-'completed': None,
'date': '2024-11-21T15:09:07.501000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 66.0,
}),
dict({
-'completed': None,
'date': '2024-11-22T00:41:21.137000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 66.0,
}),
dict({
-'completed': None,
'date': '2024-11-27T19:34:28.887000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': 66.0,
}),
]),
'todos': list([
dict({
-'completed': None,
'date': '2024-10-30T19:37:01.970000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -5.0,
}),
dict({
-'completed': None,
'date': '2024-10-31T23:33:14.972000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -10.129783523135325,
}),
dict({
-'completed': None,
'date': '2024-11-05T18:25:04.681000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -16.396221153338182,
}),
dict({
-'completed': None,
'date': '2024-11-21T15:09:07.501000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -22.8326979965846,
}),
dict({
-'completed': None,
'date': '2024-11-22T00:41:21.137000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -29.448636229365235,
}),
dict({
-'completed': None,
'date': '2024-11-27T19:34:28.887000+00:00',
-'isDue': None,
-'scoredDown': None,
-'scoredUp': None,
'value': -36.25425987861077,
}),
]),
@@ -643,23 +447,13 @@
'gear': dict({
'costume': dict({
'armor': 'armor_base_0',
-'back': None,
-'body': None,
-'eyewear': None,
'head': 'head_base_0',
-'headAccessory': None,
'shield': 'shield_base_0',
-'weapon': None,
}),
'equipped': dict({
'armor': 'armor_base_0',
-'back': None,
-'body': None,
-'eyewear': None,
'head': 'head_base_0',
-'headAccessory': None,
'shield': 'shield_base_0',
-'weapon': None,
}),
'owned': dict({
'armor_special_bardRobes': True,
@@ -736,7 +530,6 @@
}),
'lastCron': '2024-11-27T19:34:28.887000+00:00',
'loginIncentives': 6,
-'needsCron': None,
'newMessages': dict({
}),
'notifications': list([
@@ -747,7 +540,6 @@
'orderAscending': 'ascending',
'quest': dict({
'RSVPNeeded': True,
-'completed': None,
'key': 'dustbunnies',
'progress': dict({
'collect': dict({
@@ -759,37 +551,31 @@
}),
}),
'permissions': dict({
-'challengeAdmin': None,
-'coupons': None,
-'fullAccess': None,
-'moderator': None,
-'news': None,
-'userSupport': None,
}),
'pinnedItems': list([
dict({
-'Type': 'marketGear',
'path': 'gear.flat.weapon_warrior_0',
+'type': 'marketGear',
}),
dict({
-'Type': 'marketGear',
'path': 'gear.flat.armor_warrior_1',
+'type': 'marketGear',
}),
dict({
-'Type': 'marketGear',
'path': 'gear.flat.shield_warrior_1',
+'type': 'marketGear',
}),
dict({
-'Type': 'marketGear',
'path': 'gear.flat.head_warrior_1',
+'type': 'marketGear',
}),
dict({
-'Type': 'potion',
'path': 'potion',
+'type': 'potion',
}),
dict({
-'Type': 'armoire',
'path': 'armoire',
+'type': 'armoire',
}),
]),
'pinnedItemsOrder': list([
@@ -798,7 +584,6 @@
'advancedCollapsed': False,
'allocationMode': 'flat',
'autoEquip': True,
-'automaticAllocation': None,
'background': 'violet',
'chair': 'none',
'costume': False,
@@ -888,9 +673,6 @@
}),
}),
'profile': dict({
-'blurb': None,
-'imageUrl': None,
-'name': None,
}),
'purchased': dict({
'ads': False,
@@ -904,21 +686,11 @@
}),
'hair': dict({
}),
-'mobileChat': None,
'plan': dict({
'consecutive': dict({
-'count': None,
-'gemCapExtra': None,
-'offset': None,
-'trinkets': None,
}),
-'dateUpdated': None,
-'extraMonths': None,
-'gemsBought': None,
'mysteryItems': list([
]),
-'perkMonthCount': None,
-'quantity': None,
}),
'shirt': dict({
}),
@@ -928,81 +700,73 @@
}),
'pushDevices': list([
]),
-'secret': None,
'stats': dict({
-'Class': 'warrior',
-'Int': 0,
-'Str': 0,
'buffs': dict({
-'Int': 0,
-'Str': 0,
'con': 0,
+'int': 0,
'per': 0,
'seafoam': False,
'shinySeed': False,
'snowball': False,
'spookySparkles': False,
'stealth': 0,
+'str': 0,
'streaks': False,
}),
+'class': 'warrior',
'con': 0,
'exp': 41,
'gp': 11.100978952781748,
'hp': 25.40000000000002,
+'int': 0,
'lvl': 2,
'maxHealth': 50,
'maxMP': 32,
'mp': 32.0,
'per': 0,
'points': 2,
+'str': 0,
'toNextLevel': 50,
'training': dict({
-'Int': 0,
-'Str': 0.0,
'con': 0,
+'int': 0,
'per': 0,
+'str': 0.0,
}),
}),
'tags': list([
dict({
'challenge': True,
-'group': None,
'id': 'c1a35186-9895-4ac0-9cd7-49e7bb875695',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': '53d1deb8-ed2b-4f94-bbfc-955e9e92aa98',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': '29bf6a99-536f-446b-838f-a81d41e1ed4d',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': '1b1297e7-4fd8-460a-b148-e92d7bcfa9a5',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': '05e6cf40-48ea-415a-9b8b-e2ecad258ef6',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': 'fe53f179-59d8-4c28-9bf7-b9068ab552a4',
'name': 'tag',
}),
dict({
'challenge': True,
-'group': None,
'id': 'c44e9e8c-4bff-42df-98d5-1a1a7b69eada',
'name': 'tag',
}),


@@ -661,8 +661,8 @@ async def test_agent_get_backup(
         (
             SupervisorBadRequestError("blah"),
             {
-                "success": False,
-                "error": {"code": "unknown_error", "message": "Unknown error"},
+                "success": True,
+                "result": {"agent_errors": {"hassio.local": "blah"}, "backup": None},
             },
         ),
         (
@@ -733,8 +733,8 @@ async def test_agent_delete_backup(
         (
             SupervisorBadRequestError("blah"),
             {
-                "success": False,
-                "error": {"code": "unknown_error", "message": "Unknown error"},
+                "success": True,
+                "result": {"agent_errors": {"hassio.local": "blah"}},
             },
         ),
         (
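
Both expected WebSocket responses above flip from a failed command to a successful one that carries the per-agent failure under result.agent_errors. A hedged sketch of that aggregation pattern; the names (agents, async_get_backup) are illustrative, not the backup manager's actual API:

import asyncio
from typing import Any

async def get_backup_from_agents(agents: dict[str, Any], backup_id: str) -> dict[str, Any]:
    """Query every agent, collecting exceptions instead of raising them."""
    agent_errors: dict[str, str] = {}
    backup = None
    results = await asyncio.gather(
        *(agent.async_get_backup(backup_id) for agent in agents.values()),
        return_exceptions=True,
    )
    for agent_id, result in zip(agents, results):
        if isinstance(result, Exception):
            agent_errors[agent_id] = str(result)  # e.g. {"hassio.local": "blah"}
        elif result is not None:
            backup = result
    return {"agent_errors": agent_errors, "backup": backup}

With a single hassio.local agent raising SupervisorBadRequestError("blah"), this yields the {"agent_errors": {"hassio.local": "blah"}, "backup": None} payload the test now expects.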


@@ -10,6 +10,9 @@ from aiohasupervisor.models import HomeAssistantUpdateOptions, StoreAddonUpdate
 import pytest
 
 from homeassistant.components.backup import BackupManagerError, ManagerBackup
+
+# pylint: disable-next=hass-component-root-import
+from homeassistant.components.backup.manager import AgentBackupStatus
 from homeassistant.components.hassio import DOMAIN
 from homeassistant.components.hassio.const import REQUEST_REFRESH_DELAY
 from homeassistant.const import __version__ as HAVERSION
@@ -348,34 +351,40 @@ async def test_update_addon_with_backup(
         (
             {
                 "backup-1": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-10T04:45:00+01:00",
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-2": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     with_automatic_settings=False,
                     spec=ManagerBackup,
                 ),
                 "backup-3": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "other"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-4": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "other"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-5": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "test"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-6": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-12T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "test"},
                     with_automatic_settings=True,


@@ -9,6 +9,9 @@ from aiohasupervisor.models import HomeAssistantUpdateOptions, StoreAddonUpdate
 import pytest
 
 from homeassistant.components.backup import BackupManagerError, ManagerBackup
+
+# pylint: disable-next=hass-component-root-import
+from homeassistant.components.backup.manager import AgentBackupStatus
 from homeassistant.components.hassio import DOMAIN
 from homeassistant.components.hassio.const import (
     ATTR_DATA,
@@ -467,34 +470,40 @@ async def test_update_addon_with_backup(
         (
             {
                 "backup-1": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-10T04:45:00+01:00",
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-2": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     with_automatic_settings=False,
                     spec=ManagerBackup,
                 ),
                 "backup-3": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "other"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-4": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "other"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-5": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-11T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "test"},
                     with_automatic_settings=True,
                     spec=ManagerBackup,
                 ),
                 "backup-6": MagicMock(
+                    agents={"hassio.local": MagicMock(spec=AgentBackupStatus)},
                     date="2024-11-12T04:45:00+01:00",
                     extra_metadata={"supervisor.addon_update": "test"},
                     with_automatic_settings=True,
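
Both copies of this parametrized test now attach an agents mapping ({"hassio.local": ...}) to every mocked ManagerBackup. A hedged sketch of retention logic that would consume such a mapping, keeping the newest backup each agent still holds; the function and its policy are illustrative, only the .date and .agents attributes mirror the mocks above:

from collections import defaultdict
from typing import Any

def backups_to_delete(backups: dict[str, Any]) -> set[str]:
    """Return backup ids that no agent needs to retain (illustrative policy)."""
    by_agent: dict[str, list[tuple[str, Any]]] = defaultdict(list)
    for backup_id, backup in backups.items():
        for agent_id in backup.agents:
            by_agent[agent_id].append((backup_id, backup))
    keep: set[str] = set()
    for held in by_agent.values():
        # spare the most recent backup per agent
        keep.add(max(held, key=lambda item: item[1].date)[0])
    return set(backups) - keep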


@@ -110,6 +110,7 @@ def system_info_fixture() -> HeosSystem:
         "1.0.0",
         "127.0.0.1",
         NetworkType.WIRED,
+        True,
     )
     return HeosSystem(
         "user@user.com",
@@ -123,6 +124,7 @@ def system_info_fixture() -> HeosSystem:
             "1.0.0",
             "127.0.0.2",
             NetworkType.WIFI,
+            True,
         ),
     ],
 )
@@ -140,6 +142,7 @@ def players_fixture() -> dict[int, HeosPlayer]:
         model="HEOS Drive HS2" if i == 1 else "Speaker",
         serial="123456",
         version="1.0.0",
+        supported_version=True,
         line_out=LineOutLevelType.VARIABLE,
         is_muted=False,
         available=True,


@@ -105,6 +105,7 @@
     'name': 'Test Player',
     'network': 'wired',
     'serial': '**REDACTED**',
+    'supported_version': True,
     'version': '1.0.0',
   }),
   'hosts': list([
@@ -114,6 +115,7 @@
       'name': 'Test Player',
       'network': 'wired',
       'serial': '**REDACTED**',
+      'supported_version': True,
       'version': '1.0.0',
     }),
     dict({
@@ -122,6 +124,7 @@
       'name': 'Test Player 2',
       'network': 'wifi',
       'serial': '**REDACTED**',
+      'supported_version': True,
       'version': '1.0.0',
     }),
   ]),
@@ -133,6 +136,7 @@
     'name': 'Test Player',
     'network': 'wired',
     'serial': '**REDACTED**',
+    'supported_version': True,
     'version': '1.0.0',
   }),
 ]),
@@ -371,6 +375,7 @@
   'serial': '**REDACTED**',
   'shuffle': False,
   'state': 'stop',
+  'supported_version': True,
   'version': '1.0.0',
   'volume': 25,
 }),


@@ -15,7 +15,13 @@ TEST_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={"Temperature": {"values": [{"s": "2"}], "unit": "degrees_celsius"}},
+    data={
+        "data": {
+            "current": {
+                "Temperature": {"spot": {"value": "2"}, "unit": "degrees_celsius"}
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -26,7 +32,13 @@ TEST_NO_PERMISSION_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={"Temperature": {"values": [{"s": "2"}], "unit": "degrees_celsius"}},
+    data={
+        "data": {
+            "current": {
+                "Temperature": {"spot": {"value": "2"}, "unit": "degrees_celsius"}
+            }
+        }
+    },
     permissions={"read": False},
     model="Test",
 )
@@ -37,7 +49,16 @@ TEST_UNSUPPORTED_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["SomeUnsupportedField"],
     location=Location(id="1", name="Test"),
-    data={"SomeUnsupportedField": {"values": [{"s": "2"}], "unit": "degrees_celsius"}},
+    data={
+        "data": {
+            "current": {
+                "SomeUnsupportedField": {
+                    "spot": {"value": "2"},
+                    "unit": "degrees_celsius",
+                }
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -48,7 +69,13 @@ TEST_FLOAT_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={"Temperature": {"values": [{"s": "2.3"}], "unit": "degrees_celsius"}},
+    data={
+        "data": {
+            "current": {
+                "Temperature": {"spot": {"value": "2.3"}, "unit": "degrees_celsius"}
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -59,7 +86,9 @@ TEST_STRING_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["WetDry"],
     location=Location(id="1", name="Test"),
-    data={"WetDry": {"values": [{"s": "dry"}], "unit": "wet_dry"}},
+    data={
+        "data": {"current": {"WetDry": {"spot": {"value": "dry"}, "unit": "wet_dry"}}}
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -70,7 +99,13 @@ TEST_ALREADY_FLOAT_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["HeatIndex"],
     location=Location(id="1", name="Test"),
-    data={"HeatIndex": {"values": [{"s": 2.3}], "unit": "degrees_fahrenheit"}},
+    data={
+        "data": {
+            "current": {
+                "HeatIndex": {"spot": {"value": 2.3}, "unit": "degrees_fahrenheit"}
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -81,7 +116,13 @@ TEST_ALREADY_INT_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["WindSpeed"],
     location=Location(id="1", name="Test"),
-    data={"WindSpeed": {"values": [{"s": 2}], "unit": "kilometers_per_hour"}},
+    data={
+        "data": {
+            "current": {
+                "WindSpeed": {"spot": {"value": 2}, "unit": "kilometers_per_hour"}
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
@@ -92,7 +133,7 @@ TEST_NO_FIELD_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={},
+    data={"data": {"current": {}}},
     permissions={"read": True},
     model="Test",
 )
@@ -103,7 +144,7 @@ TEST_MISSING_FIELD_DATA_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={"Temperature": None},
+    data={"data": {"current": {"Temperature": None}}},
     permissions={"read": True},
     model="Test",
 )
@@ -114,7 +155,13 @@ TEST_UNITS_OVERRIDE_SENSOR = Sensor(
     sensor_id="2",
     sensor_field_names=["Temperature"],
     location=Location(id="1", name="Test"),
-    data={"Temperature": {"values": [{"s": "2.1"}], "unit": "degrees_fahrenheit"}},
+    data={
+        "data": {
+            "current": {
+                "Temperature": {"spot": {"value": "2.1"}, "unit": "degrees_fahrenheit"}
+            }
+        }
+    },
     permissions={"read": True},
     model="Test",
 )
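
Every fixture above migrates from a flat per-field list ({"values": [{"s": ...}]}) to a payload nested data -> current -> field -> spot -> value. Contrasting the two access paths, using only shapes taken from the fixtures themselves:

old_payload = {"Temperature": {"values": [{"s": "2"}], "unit": "degrees_celsius"}}
new_payload = {
    "data": {
        "current": {
            "Temperature": {"spot": {"value": "2"}, "unit": "degrees_celsius"}
        }
    }
}

def read_old(payload: dict, field: str) -> str:
    return payload[field]["values"][0]["s"]

def read_new(payload: dict, field: str) -> str:
    return payload["data"]["current"][field]["spot"]["value"]

assert read_old(old_payload, "Temperature") == read_new(new_payload, "Temperature")

The unit key stays a sibling of the reading inside each field dict, which is why only the value lookup changes shape.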


@@ -4,7 +4,7 @@
   'coordinator_data': list([
     dict({
       '__type': "<class 'lacrosse_view.Sensor'>",
-      'repr': "Sensor(name='Test', device_id='1', type='Test', sensor_id='2', sensor_field_names=['Temperature'], location=Location(id='1', name='Test'), permissions={'read': True}, model='Test', data={'Temperature': {'values': [{'s': '2'}], 'unit': 'degrees_celsius'}})",
+      'repr': "Sensor(name='Test', device_id='1', type='Test', sensor_id='2', sensor_field_names=['Temperature'], location=Location(id='1', name='Test'), permissions={'read': True}, model='Test', data={'Temperature': {'spot': {'value': '2'}, 'unit': 'degrees_celsius'}})",
     }),
   ]),
   'entry': dict({


@@ -26,9 +26,14 @@ async def test_entry_diagnostics(
     )
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
-        patch("lacrosse_view.LaCrosse.get_sensors", return_value=[TEST_SENSOR]),
+        patch("lacrosse_view.LaCrosse.get_devices", return_value=[sensor]),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()


@@ -20,12 +20,17 @@ async def test_unload_entry(hass: HomeAssistant) -> None:
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -68,7 +73,7 @@ async def test_http_error(hass: HomeAssistant) -> None:
 
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
-        patch("lacrosse_view.LaCrosse.get_sensors", side_effect=HTTPError),
+        patch("lacrosse_view.LaCrosse.get_devices", side_effect=HTTPError),
     ):
         assert not await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -84,12 +89,17 @@ async def test_new_token(hass: HomeAssistant, freezer: FrozenDateTimeFactory) ->
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True) as login,
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -103,7 +113,7 @@ async def test_new_token(hass: HomeAssistant, freezer: FrozenDateTimeFactory) ->
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True) as login,
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
+            "lacrosse_view.LaCrosse.get_devices",
             return_value=[TEST_SENSOR],
         ),
     ):
@@ -121,12 +131,17 @@ async def test_failed_token(
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True) as login,
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()


@@ -32,9 +32,14 @@ async def test_entities_added(hass: HomeAssistant) -> None:
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
-        patch("lacrosse_view.LaCrosse.get_sensors", return_value=[TEST_SENSOR]),
+        patch("lacrosse_view.LaCrosse.get_devices", return_value=[sensor]),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -54,12 +59,17 @@ async def test_sensor_permission(
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_NO_PERMISSION_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_NO_PERMISSION_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert not await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -79,11 +89,14 @@ async def test_field_not_supported(
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_UNSUPPORTED_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
-        patch(
-            "lacrosse_view.LaCrosse.get_sensors", return_value=[TEST_UNSUPPORTED_SENSOR]
-        ),
+        patch("lacrosse_view.LaCrosse.get_devices", return_value=[sensor]),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -114,12 +127,17 @@ async def test_field_types(
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = test_input.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
+            "lacrosse_view.LaCrosse.get_devices",
             return_value=[test_input],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -137,12 +155,17 @@ async def test_no_field(hass: HomeAssistant, caplog: pytest.LogCaptureFixture) -
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_NO_FIELD_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_NO_FIELD_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
         ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()
@@ -160,12 +183,17 @@ async def test_field_data_missing(hass: HomeAssistant) -> None:
     config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_ENTRY_DATA)
     config_entry.add_to_hass(hass)
 
+    sensor = TEST_MISSING_FIELD_DATA_SENSOR.model_copy()
+    status = sensor.data
+    sensor.data = None
+
     with (
         patch("lacrosse_view.LaCrosse.login", return_value=True),
         patch(
-            "lacrosse_view.LaCrosse.get_sensors",
-            return_value=[TEST_MISSING_FIELD_DATA_SENSOR],
+            "lacrosse_view.LaCrosse.get_devices",
+            return_value=[sensor],
        ),
+        patch("lacrosse_view.LaCrosse.get_sensor_status", return_value=status),
     ):
         assert await hass.config_entries.async_setup(config_entry.entry_id)
         await hass.async_block_till_done()


@@ -89,80 +89,3 @@ async def test_auth(
     assert creds.client_id == CLIENT_ID
     assert creds.client_secret == CLIENT_SECRET
     assert creds.scopes == SDM_SCOPES
-
-
-# This tests needs to be adjusted to remove lingering tasks
-@pytest.mark.parametrize("expected_lingering_tasks", [True])
-@pytest.mark.parametrize(
-    "token_expiration_time",
-    [time.time() - 7 * 86400],
-    ids=["expires-in-past"],
-)
-async def test_auth_expired_token(
-    hass: HomeAssistant,
-    aioclient_mock: AiohttpClientMocker,
-    setup_platform: PlatformSetup,
-    token_expiration_time: float,
-) -> None:
-    """Verify behavior of an expired token."""
-    # Prepare a token refresh response
-    aioclient_mock.post(
-        OAUTH2_TOKEN,
-        json={
-            "access_token": FAKE_UPDATED_TOKEN,
-            "expires_at": time.time() + 86400,
-            "expires_in": 86400,
-        },
-    )
-
-    # Prepare to capture credentials in API request. Empty payloads just mean
-    # no devices or structures are loaded.
-    aioclient_mock.get(f"{API_URL}/enterprises/{PROJECT_ID}/structures", json={})
-    aioclient_mock.get(f"{API_URL}/enterprises/{PROJECT_ID}/devices", json={})
-
-    # Prepare to capture credentials for Subscriber
-    captured_creds = None
-
-    def async_new_subscriber(
-        credentials: Credentials,
-    ) -> Mock:
-        """Capture credentials for tests."""
-        nonlocal captured_creds
-        captured_creds = credentials
-        return AsyncMock()
-
-    with patch(
-        "google_nest_sdm.subscriber_client.pubsub_v1.SubscriberAsyncClient",
-        side_effect=async_new_subscriber,
-    ) as new_subscriber_mock:
-        await setup_platform()
-
-    calls = aioclient_mock.mock_calls
-    assert len(calls) == 3
-    # Verify refresh token call to get an updated token
-    (method, url, data, headers) = calls[0]
-    assert data == {
-        "client_id": CLIENT_ID,
-        "client_secret": CLIENT_SECRET,
-        "grant_type": "refresh_token",
-        "refresh_token": FAKE_REFRESH_TOKEN,
-    }
-    # Verify API requests are made with the new token
-    (method, url, data, headers) = calls[1]
-    assert headers == {"Authorization": f"Bearer {FAKE_UPDATED_TOKEN}"}
-    (method, url, data, headers) = calls[2]
-    assert headers == {"Authorization": f"Bearer {FAKE_UPDATED_TOKEN}"}
-    # The subscriber is created with a token that is expired. Verify that the
-    # credential is expired so the subscriber knows it needs to refresh it.
-    assert len(new_subscriber_mock.mock_calls) == 1
-    assert captured_creds
-    creds = captured_creds
-    assert creds.token == FAKE_TOKEN
-    assert creds.refresh_token == FAKE_REFRESH_TOKEN
-    assert int(dt_util.as_timestamp(creds.expiry)) == int(token_expiration_time)
-    assert not creds.valid
-    assert creds.expired
-    assert creds.token_uri == OAUTH2_TOKEN
-    assert creds.client_id == CLIENT_ID
-    assert creds.client_secret == CLIENT_SECRET
-    assert creds.scopes == SDM_SCOPES
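
The deleted test exercised setup with an access token that expired a week earlier and asserted that the subscriber received still-expired credentials. Its removal is consistent with the token now being refreshed during setup, before the subscriber is created. A loose sketch of that ordering; every name here is illustrative except async_ensure_token_valid, the standard Home Assistant OAuth2Session helper for forcing a refresh of an expired token:

async def async_setup_entry(hass, entry, session, subscriber) -> bool:
    """Sketch: refresh credentials before the pub/sub subscriber starts."""
    await session.async_ensure_token_valid()  # refreshes if expired
    await subscriber.start_async()  # subscriber starts with a valid token
    return True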


@@ -5,10 +5,10 @@ from json import dumps
 from onedrive_personal_sdk.models.items import (
     AppRoot,
-    Contributor,
     File,
     Folder,
     Hashes,
+    IdentitySet,
     ItemParentReference,
     User,
 )
@@ -31,7 +31,7 @@ BACKUP_METADATA = {
     "size": 34519040,
 }
 
-CONTRIBUTOR = Contributor(
+IDENTITY_SET = IdentitySet(
     user=User(
         display_name="John Doe",
         id="id",
@@ -47,7 +47,7 @@ MOCK_APPROOT = AppRoot(
     parent_reference=ItemParentReference(
         drive_id="mock_drive_id", id="id", path="path"
     ),
-    created_by=CONTRIBUTOR,
+    created_by=IDENTITY_SET,
 )
 
 MOCK_BACKUP_FOLDER = Folder(
@@ -58,7 +58,7 @@ MOCK_BACKUP_FOLDER = Folder(
     parent_reference=ItemParentReference(
         drive_id="mock_drive_id", id="id", path="path"
     ),
-    created_by=CONTRIBUTOR,
+    created_by=IDENTITY_SET,
 )
 
 MOCK_BACKUP_FILE = File(
@@ -73,7 +73,7 @@ MOCK_BACKUP_FILE = File(
     ),
     mime_type="application/x-tar",
     description="",
-    created_by=CONTRIBUTOR,
+    created_by=IDENTITY_SET,
 )
 
 MOCK_METADATA_FILE = File(
@@ -96,5 +96,5 @@ MOCK_METADATA_FILE = File(
             }
         )
     ),
-    created_by=CONTRIBUTOR,
+    created_by=IDENTITY_SET,
 )