Merge feature branch with backup changes to dev (#132954)

* Reapply "Make WS command backup/generate send events" (#131530)

This reverts commit 9b8316df3f78d136ae73c096168bd73ffebc4465.

* MVP implementation of Backup sync agents (#126122)

* init sync agent

* add syncing

* root import

* rename list to info and add sync state

* Add base backup class

* Revert unneeded change

* adjust tests

* move to kitchen_sink

* split

* move

* Adjustments

* Adjustment

* update

* Tests

* Test unknown agent

* adjust

* Adjust for different test environments

* Change /info WS to contain a dictionary

* reorder

* Add websocket command to trigger sync from the supervisor

* cleanup

* Make mypy happier

---------

Co-authored-by: Erik <erik@montnemery.com>

* Make BackupSyncMetadata model a dataclass (#130555)

Make backup BackupSyncMetadata model a dataclass

* Rename backup sync agent to backup agent (#130575)

* Rename sync agent module to agent

* Rename BackupSyncAgent to BackupAgent

* Fix test typo

* Rename async_get_backup_sync_agents to async_get_backup_agents

* Rename and clean up remaining sync things

* Update kitchen sink

* Apply suggestions from code review

* Update test_manager.py

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Add additional options to WS command backup/generate (#130530)

* Add additional options to WS command backup/generate

* Improve test

* Improve test

* Align parameter names in backup/agents/* WS commands (#130590)

* Allow setting password for backups (#110630)

* Allow setting password for backups

* use is_hassio from helpers

* move it

* Fix getting psw

* Fix restoring with psw

* Address review comments

* Improve docstring

* Adjust kitchen sink

* Adjust

---------

Co-authored-by: Erik <erik@montnemery.com>

* Export relevant names from backup integration (#130596)

* Tweak backup agent interface (#130613)

* Tweak backup agent interface

* Adjust kitchen_sink

* Test kitchen sink backup (#130609)

* Test agents_list_backups

* Test agents_info

* Test agents_download

* Export Backup from manager

* Test agents_upload

* Update tests after rebase

* Use backup domain

* Remove WS command backup/upload (#130588)

* Remove WS command backup/upload

* Disable failing kitchen_sink test

* Make local backup a backup agent (#130623)

* Make local backup a backup agent

* Adjust

* Adjust

* Adjust

* Adjust tests

* Adjust

* Adjust

* Adjust docstring

* Adjust

* Protect members of CoreLocalBackupAgent

* Remove redundant check for file

* Make the backup.create service use the first local agent

* Add BackupAgent.async_get_backup

* Fix some TODOs

* Add support for downloading backup from a remote agent

* Fix restore

* Fix test

* Adjust kitchen_sink test

* Remove unused method BackupManager.async_get_backup_path

* Re-enable kitchen sink test

* Remove BaseBackupManager.async_upload_backup

* Support restore from remote agent

* Fix review comments

* Include backup agent error in response to WS command backup/info (#130884)

* Adjust code related to WS command backup/info (#130890)

* Include backup agent error in response to WS command backup/details (#130892)

* Remove LOCAL_AGENT_ID constant from backup manager (#130895)

* Add backup config storage (#130871)

* Add base for backup config

* Allow updating backup config

* Test loading backup config

* Add backup config update method

* Add temporary check for BackupAgent.async_remove_backup (#130893)

* Rename backup slug to backup_id (#130902)

* Improve backup websocket API tests (#130912)

* Improve backup websocket API tests

* Add missing snapshot

* Fix tests leaving files behind

* Improve backup manager backup creation tests (#130916)

* Remove class backup.backup.LocalBackup (#130919)

* Add agent delete backup (#130921)

* Add backup agent delete backup

* Remove agents delete websocket command

* Update docstring

Co-authored-by: Erik Montnemery <erik@montnemery.com>

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Disable core local backup agent in hassio (#130933)

* Rename remove backup to delete backup (#130940)

* Rename remove backup to delete backup

* Revert "backup/delete"

* Refactor BackupManager (#130947)

* Refactor BackupManager

* Adjust

* Adjust backup creation

* Copy in executor

* Fix BackupManager.async_get_backup (#130975)

* Fix typo in backup tests (#130978)

* Adjust backup NewBackup class (#130976)

* Remove class backup.BackupUploadMetadata (#130977)

Remove class backup.BackupMetadata

* Report backup size in bytes instead of MB (#131028)

Co-authored-by: Robert Resch <robert@resch.dev>

* Speed up CI for feature branch (#131030)

* Speed up CI for feature branch

* adjust

* fix

* fix

* fix

* fix

* Rename remove to delete in backup websocket type (#131023)

* Revert "Speed up CI for feature branch" (#131074)

Revert "Speed up CI for feature branch (#131030)"

This reverts commit 791280506d1859b1a722f5064d75bcbe48acc1c3.

* Rename class BaseBackup to AgentBackup (#131083)

* Rename class BaseBackup to AgentBackup

* Update tests

* Speed up CI for backup feature branch (#131079)

* Add backup platform to the hassio integration (#130991)

* Add backup platform to the hassio integration

* Add hassio to after_dependencies of backup

* Address review comments

* Remove redundant hassio parametrization of tests

* Add tests

* Address review comments

* Bump CI cache version

* Revert "Bump CI cache version"

This reverts commit 2ab4d2b1795c953ccfc9b17c47f9df3faac83749.

* Extend backup info class AgentBackup (#131110)

* Extend backup info class AgentBackup

* Update kitchen sink

* Update kitchen sink test

* Update kitchen sink test

* Exclude cloud and hassio from core files (#131117)

* Remove unnecessary **kwargs from backup API (#131124)

* Fix backup tests (#131128)

* Freeze backup dataclasses (#131122)

* Protect CoreLocalBackupAgent.load_backups (#131126)

* Use backup metadata v2 in core/container backups (#131125)

* Extend backup creation API (#131121)

* Extend backup creation API

* Add tests

* Fix merge

* Fix merge

* Return agent errors when deleting a backup (#131142)

* Return agent errors when deleting a backup

* Remove redundant calls to dict.keys()

* Add enum type for backup folder (#131158)

* Add method AgentBackup.from_dict (#131164)

* Remove WS command backup/agents/list_backups (#131163)

* Handle backup schedule (#131127)

* Add backup schedule handling

* Fix unrelated incorrect type annotation in test

* Clarify delay save

* Make the backup time compatible with the recorder nightly job

* Update create backup parameters

* Use typed dict for create backup parameters

* Simplify schedule state

* Group create backup parameters

* Move parameter

* Fix typo

* Use Folder model

* Handle deserialization of folders better

* Fail on attempt to include addons or folders in core backup (#131204)

* Fix AgentBackup test (#131201)

* Add options to WS command backup/restore (#131194)

* Add options to WS command backup/restore

* Add tests

* Fix test

* Teach core backup to restore only database or only settings (#131225)

* Exclude tmp_backups/*.tar from backups (#131243)

* Add WS command backup/subscribe_events (#131250)

* Clean up temporary directory after restoring backup (#131263)

* Improve hassio backup agent list (#131268)

* Include `last_automatic_backup` in reply to backup/info (#131293)

Include last_automatic_backup in reply to backup/info

* Handle backup delete after config (#131259)

* Handle delete after copies

* Handle delete after days

* Add some test examples

* Test config_delete_after_logic

* Test config_delete_after_copies_logic

* Test more delete after days

* Add debug logs

* Always delete the oldest backup first

* Never remove the last backup

* Clean up words

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Fix after cleaning words

* Use utcnow

* Remove duplicate guard

* Simplify sorting

* Delete backups even if there are agent errors on get backups

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Rename backup delete after to backup retention (#131364)

* Rename backup delete after to backup retention

* Tweak

* Remove length limit on `agent_ids` when configuring backup (#132057)

Remove length limit on agent_ids when configuring backup

* Rename backup retention_config to retention (#132068)

* Modify backup agent API to be stream oriented (#132090)

* Modify backup agent API to be stream oriented

* Fix tests

* Adjust after code review

* Remove no longer needed pylint override

* Improve test coverage

* Change BackupAgent API to work with AsyncIterator objects

* Don't close files in the event loop

* Don't close files in the event loop

* Fix backup manager create backup log (#132174)

* Fix debug log level (#132186)

* Add cloud backup agent (#129621)

* Init cloud backup sync

* Add more metadata

* Fix typo

* Adjust to base changes

* Don't raise on list if more than one backup is available

* Adjust to base branch

* Fetch always and verify on download

* Update homeassistant/components/cloud/backup.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Adjust to base branch changes

* Not required anymore

* Workaround

* Fix blocking event loop

* Fix

* Add some tests

* some tests

* Add cloud backup delete functionality

* Enable check

* Fix ruff

* Use fixture

* Use iter_chunks instead

* Remove read

* Remove explicit export of read_backup

* Align with BackupAgent API changes

* Improve test coverage

* Improve error handling

* Adjust docstrings

* Catch aiohttp.ClientError bubbling up from hass_nabucasa

* Improve iteration

---------

Co-authored-by: Erik <erik@montnemery.com>
Co-authored-by: Robert Resch <robert@resch.dev>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Krisjanis Lejejs <krisjanis.lejejs@gmail.com>

* Extract file receiver from `BackupManager.async_receive_backup` to util (#132271)

* Extract file receiver from BackupManager.async_receive_backup to util

* Apply suggestions from code review

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Make sure backup directory exists (#132269)

* Make sure backup directory exists

* Hand off directory creation to executor

* Use mkdir's exist_ok feature

* Organize BackupManager instance attributes (#132277)

* Don't store received backups in a TempDir (#132272)

* Don't store received backups in a TempDir

* Fix tests

* Make sure backup directory exists

* Address review comments

* Fix tests

* Rewrite backup manager state handling (#132375)

* Rewrite backup manager state handling

* Address review comments

* Modify backup reader/writer API to be stream oriented (#132464)

* Internalize backup tasks (#132482)

* Internalize backup tasks

* Update test after rebase

* Handle backup error during automatic backup (#132511)

* Improve backup manager state logging (#132549)

* Fix backup manager state when restore completes (#132548)

* Remove WS command backup/agents/download (#132664)

* Add WS command backup/generate_with_stored_settings (#132671)

* Add WS command backup/generate_with_stored_settings

* Register the new command, add tests

* Refactor local agent backup tests (#132683)

* Refactor test_load_backups

* Refactor test loading agents

* Refactor test_delete_backup

* Refactor test_upload

* Clean up duplicate tests

* Refactor backup manager receive tests (#132701)

* Refactor backup manager receive tests

* Clean up

* Refactor pre and post platform tests (#132708)

* Refactor backup pre platform test

* Refactor backup post platform test

* Bump aiohasupervisor to version 0.2.2b0 (#132704)

* Bump aiohasupervisor to version 0.2.2b0

* Adjust tests

* Publish event when manager is idle after creating backup (#132724)

* Handle busy backup manager when uploading backup (#132736)

* Adjust hassio backup agent to supervisor changes (#132732)

* Adjust hassio backup agent to supervisor changes

* Fix typo

* Refactor test for create backup with wrong parameters (#132763)

* Refactor test not loading bad backup platforms (#132769)

* Improve receive backup coverage (#132758)

* Refactor initiate backup test (#132829)

* Rename Backup to ManagerBackup (#132841)

* Refactor backup config (#132845)

* Refactor backup config

* Remove unnecessary condition

* Adjust tests

* Improve initiate backup test (#132858)

* Store the time of automatic backup attempts (#132860)

* Store the time of automatic backup attempts

* Address review comments

* Update test

* Update cloud test

* Save agent failures when creating backups (#132850)

* Save agent failures when creating backups

* Update tests

* Store KnownBackups

* Add test

* Only clear known_backups on no error, add tests

* Address review comments

* Store known backups as a list

* Update tests

* Track all backups created with backup strategy settings (#132916)

* Track all backups created with saved settings

* Rename

* Add explicit call to save the store

* Don't register service backup.create in HassOS installations (#132932)

* Revert changes to action service backup.create (#132938)

* Fix logic for cleaning up temporary backup file (#132934)

* Fix logic for cleaning up temporary backup file

* Reduce scope of patch

* Fix with_strategy_settings info not sent over websocket (#132939)

* Fix with_strategy_settings info not sent over websocket

* Fix kitchen sink tests

* Fix cloud and hassio tests

* Revert backup ci changes (#132955)

Revert changes speeding up CI

* Fix revert of CI changes (#132960)

---------

Co-authored-by: Joakim Sørensen <joasoe@gmail.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Robert Resch <robert@resch.dev>
Co-authored-by: Paul Bottein <paul.bottein@gmail.com>
Co-authored-by: Krisjanis Lejejs <krisjanis.lejejs@gmail.com>
Erik Montnemery 2024-12-11 21:49:34 +01:00 committed by GitHub
parent a1e4b3b0af
commit 8e991fc92f
GPG Key ID: B5690EEEBB952194
38 changed files with 9977 additions and 773 deletions


@@ -1,6 +1,10 @@
"""Home Assistant module to handle restoring backups."""
from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
import hashlib
import json
import logging
from pathlib import Path
@@ -14,7 +18,12 @@ import securetar
from .const import __version__ as HA_VERSION
RESTORE_BACKUP_FILE = ".HA_RESTORE"
KEEP_PATHS = ("backups",)
KEEP_BACKUPS = ("backups",)
KEEP_DATABASE = (
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
)
_LOGGER = logging.getLogger(__name__)
@@ -24,6 +33,21 @@ class RestoreBackupFileContent:
"""Definition for restore backup file content."""
backup_file_path: Path
password: str | None
remove_after_restore: bool
restore_database: bool
restore_homeassistant: bool
def password_to_key(password: str) -> bytes:
"""Generate an AES key from password.
Matches the implementation in supervisor.backups.utils.password_to_key.
"""
key: bytes = password.encode()
for _ in range(100):
key = hashlib.sha256(key).digest()
return key[:16]
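The key derivation above can be exercised standalone; the following sketch simply restates the function shown in the diff (100 rounds of SHA-256, truncated to 16 bytes) and checks that it is deterministic:

```python
import hashlib


def password_to_key(password: str) -> bytes:
    """Stretch a password into a 16-byte AES key via 100 rounds of SHA-256.

    Mirrors the derivation shown in the diff above (and in
    supervisor.backups.utils.password_to_key).
    """
    key: bytes = password.encode()
    for _ in range(100):
        key = hashlib.sha256(key).digest()
    return key[:16]


key = password_to_key("hunter2")
# The key is always 16 bytes, and the same password yields the same key.
assert len(key) == 16
assert key == password_to_key("hunter2")
```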
def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent | None:
@@ -32,20 +56,24 @@ def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent |
try:
instruction_content = json.loads(instruction_path.read_text(encoding="utf-8"))
return RestoreBackupFileContent(
backup_file_path=Path(instruction_content["path"])
backup_file_path=Path(instruction_content["path"]),
password=instruction_content["password"],
remove_after_restore=instruction_content["remove_after_restore"],
restore_database=instruction_content["restore_database"],
restore_homeassistant=instruction_content["restore_homeassistant"],
)
except (FileNotFoundError, json.JSONDecodeError):
except (FileNotFoundError, KeyError, json.JSONDecodeError):
return None
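The restore instruction file parsed above is plain JSON. A standalone sketch of the round trip (the file name `.HA_RESTORE` and the keys come from the diff; the field values here are illustrative):

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

# Keys match what restore_backup_file_content reads above.
instruction = {
    "path": "/config/backups/abc123.tar",
    "password": None,
    "remove_after_restore": False,
    "restore_database": True,
    "restore_homeassistant": True,
}

with TemporaryDirectory() as tmp:
    restore_file = Path(tmp, ".HA_RESTORE")
    restore_file.write_text(json.dumps(instruction), encoding="utf-8")
    content = json.loads(restore_file.read_text(encoding="utf-8"))

assert Path(content["path"]).name == "abc123.tar"
assert content["restore_database"] is True
```

A missing key in the file maps to the `KeyError` branch above, which makes the whole restore request be ignored rather than partially applied.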
def _clear_configuration_directory(config_dir: Path) -> None:
"""Delete all files and directories in the config directory except for the backups directory."""
keep_paths = [config_dir.joinpath(path) for path in KEEP_PATHS]
config_contents = sorted(
[entry for entry in config_dir.iterdir() if entry not in keep_paths]
def _clear_configuration_directory(config_dir: Path, keep: Iterable[str]) -> None:
"""Delete all files and directories in the config directory except entries in the keep list."""
keep_paths = [config_dir.joinpath(path) for path in keep]
entries_to_remove = sorted(
entry for entry in config_dir.iterdir() if entry not in keep_paths
)
for entry in config_contents:
for entry in entries_to_remove:
entrypath = config_dir.joinpath(entry)
if entrypath.is_file():
@@ -54,12 +82,15 @@ def _clear_configuration_directory(config_dir: Path) -> None:
shutil.rmtree(entrypath)
def _extract_backup(config_dir: Path, backup_file_path: Path) -> None:
def _extract_backup(
config_dir: Path,
restore_content: RestoreBackupFileContent,
) -> None:
"""Extract the backup file to the config directory."""
with (
TemporaryDirectory() as tempdir,
securetar.SecureTarFile(
backup_file_path,
restore_content.backup_file_path,
gzip=False,
mode="r",
) as ostf,
@@ -88,22 +119,41 @@ def _extract_backup(config_dir: Path, backup_file_path: Path) -> None:
f"homeassistant.tar{'.gz' if backup_meta['compressed'] else ''}",
),
gzip=backup_meta["compressed"],
key=password_to_key(restore_content.password)
if restore_content.password is not None
else None,
mode="r",
) as istf:
for member in istf.getmembers():
if member.name == "data":
continue
member.name = member.name.replace("data/", "")
_clear_configuration_directory(config_dir)
istf.extractall(
path=config_dir,
members=[
member
for member in securetar.secure_path(istf)
if member.name != "data"
],
path=Path(tempdir, "homeassistant"),
members=securetar.secure_path(istf),
filter="fully_trusted",
)
if restore_content.restore_homeassistant:
keep = list(KEEP_BACKUPS)
if not restore_content.restore_database:
keep.extend(KEEP_DATABASE)
_clear_configuration_directory(config_dir, keep)
shutil.copytree(
Path(tempdir, "homeassistant", "data"),
config_dir,
dirs_exist_ok=True,
ignore=shutil.ignore_patterns(*(keep)),
)
elif restore_content.restore_database:
for entry in KEEP_DATABASE:
entrypath = config_dir / entry
if entrypath.is_file():
entrypath.unlink()
elif entrypath.is_dir():
shutil.rmtree(entrypath)
for entry in KEEP_DATABASE:
shutil.copy(
Path(tempdir, "homeassistant", "data", entry),
config_dir,
)
def restore_backup(config_dir_path: str) -> bool:
@@ -119,8 +169,13 @@ def restore_backup(config_dir_path: str) -> bool:
backup_file_path = restore_content.backup_file_path
_LOGGER.info("Restoring %s", backup_file_path)
try:
_extract_backup(config_dir, backup_file_path)
_extract_backup(
config_dir=config_dir,
restore_content=restore_content,
)
except FileNotFoundError as err:
raise ValueError(f"Backup file {backup_file_path} does not exist") from err
if restore_content.remove_after_restore:
backup_file_path.unlink(missing_ok=True)
_LOGGER.info("Restore complete, restarting")
return True


@@ -5,36 +5,81 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.hassio import is_hassio
from homeassistant.helpers.typing import ConfigType
from .const import DATA_MANAGER, DOMAIN, LOGGER
from .agent import (
BackupAgent,
BackupAgentError,
BackupAgentPlatformProtocol,
LocalBackupAgent,
)
from .const import DATA_MANAGER, DOMAIN
from .http import async_register_http_views
from .manager import BackupManager
from .manager import (
BackupManager,
BackupPlatformProtocol,
BackupReaderWriter,
CoreBackupReaderWriter,
CreateBackupEvent,
ManagerBackup,
NewBackup,
WrittenBackup,
)
from .models import AddonInfo, AgentBackup, Folder
from .websocket import async_register_websocket_handlers
__all__ = [
"AddonInfo",
"AgentBackup",
"ManagerBackup",
"BackupAgent",
"BackupAgentError",
"BackupAgentPlatformProtocol",
"BackupPlatformProtocol",
"BackupReaderWriter",
"CreateBackupEvent",
"Folder",
"LocalBackupAgent",
"NewBackup",
"WrittenBackup",
]
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Backup integration."""
backup_manager = BackupManager(hass)
hass.data[DATA_MANAGER] = backup_manager
with_hassio = is_hassio(hass)
reader_writer: BackupReaderWriter
if not with_hassio:
reader_writer = CoreBackupReaderWriter(hass)
else:
# pylint: disable-next=import-outside-toplevel, hass-component-root-import
from homeassistant.components.hassio.backup import SupervisorBackupReaderWriter
reader_writer = SupervisorBackupReaderWriter(hass)
backup_manager = BackupManager(hass, reader_writer)
hass.data[DATA_MANAGER] = backup_manager
await backup_manager.async_setup()
async_register_websocket_handlers(hass, with_hassio)
if with_hassio:
if DOMAIN in config:
LOGGER.error(
"The backup integration is not supported on this installation method, "
"please remove it from your configuration"
)
return True
async def async_handle_create_service(call: ServiceCall) -> None:
"""Service handler for creating backups."""
await backup_manager.async_create_backup()
agent_id = list(backup_manager.local_backup_agents)[0]
await backup_manager.async_create_backup(
agent_ids=[agent_id],
include_addons=None,
include_all_addons=False,
include_database=True,
include_folders=None,
include_homeassistant=True,
name=None,
password=None,
)
hass.services.async_register(DOMAIN, "create", async_handle_create_service)
if not with_hassio:
hass.services.async_register(DOMAIN, "create", async_handle_create_service)
async_register_http_views(hass)


@@ -0,0 +1,100 @@
"""Backup agents for the Backup integration."""
from __future__ import annotations
import abc
from collections.abc import AsyncIterator, Callable, Coroutine
from pathlib import Path
from typing import Any, Protocol
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from .models import AgentBackup
class BackupAgentError(HomeAssistantError):
"""Base class for backup agent errors."""
class BackupAgentUnreachableError(BackupAgentError):
"""Raised when the agent can't reach its API."""
_message = "The backup agent is unreachable."
class BackupAgent(abc.ABC):
"""Backup agent interface."""
name: str
@abc.abstractmethod
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
:return: An async iterator that yields bytes.
"""
@abc.abstractmethod
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup.
:param open_stream: A function returning an async iterator that yields bytes.
:param backup: Metadata about the backup that should be uploaded.
"""
@abc.abstractmethod
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
"""
@abc.abstractmethod
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
@abc.abstractmethod
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
class LocalBackupAgent(BackupAgent):
"""Local backup agent."""
@abc.abstractmethod
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup.
The method should return the path to the backup file with the specified id.
"""
class BackupAgentPlatformProtocol(Protocol):
"""Define the format of backup platforms which implement backup agents."""
async def async_get_backup_agents(
self,
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return a list of backup agents."""


@@ -0,0 +1,124 @@
"""Local backup support for Core and Container installations."""
from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
import json
from pathlib import Path
from tarfile import TarError
from typing import Any
from homeassistant.core import HomeAssistant
from homeassistant.helpers.hassio import is_hassio
from .agent import BackupAgent, LocalBackupAgent
from .const import LOGGER
from .models import AgentBackup
from .util import read_backup
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return the local backup agent."""
if is_hassio(hass):
return []
return [CoreLocalBackupAgent(hass)]
class CoreLocalBackupAgent(LocalBackupAgent):
"""Local backup agent for Core and Container installations."""
name = "local"
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
self._backup_dir = Path(hass.config.path("backups"))
self._backups: dict[str, AgentBackup] = {}
self._loaded_backups = False
async def _load_backups(self) -> None:
"""Load data of stored backup files."""
backups = await self._hass.async_add_executor_job(self._read_backups)
LOGGER.debug("Loaded %s local backups", len(backups))
self._backups = backups
self._loaded_backups = True
def _read_backups(self) -> dict[str, AgentBackup]:
"""Read backups from disk."""
backups: dict[str, AgentBackup] = {}
for backup_path in self._backup_dir.glob("*.tar"):
try:
backup = read_backup(backup_path)
backups[backup.backup_id] = backup
except (OSError, TarError, json.JSONDecodeError, KeyError) as err:
LOGGER.warning("Unable to read backup %s: %s", backup_path, err)
return backups
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
raise NotImplementedError
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
self._backups[backup.backup_id] = backup
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
if not self._loaded_backups:
await self._load_backups()
return list(self._backups.values())
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
if not self._loaded_backups:
await self._load_backups()
if not (backup := self._backups.get(backup_id)):
return None
backup_path = self.get_backup_path(backup_id)
if not await self._hass.async_add_executor_job(backup_path.exists):
LOGGER.debug(
(
"Removing tracked backup (%s) that does not exist at the expected"
" path %s"
),
backup.backup_id,
backup_path,
)
self._backups.pop(backup_id)
return None
return backup
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup."""
return self._backup_dir / f"{backup_id}.tar"
async def async_delete_backup(self, backup_id: str, **kwargs: Any) -> None:
"""Delete a backup file."""
if await self.async_get_backup(backup_id) is None:
return
backup_path = self.get_backup_path(backup_id)
await self._hass.async_add_executor_job(backup_path.unlink, True)
LOGGER.debug("Deleted backup located at %s", backup_path)
self._backups.pop(backup_id)


@@ -0,0 +1,444 @@
"""Provide persistent configuration for the backup integration."""
from __future__ import annotations
import asyncio
from collections.abc import Callable
from dataclasses import dataclass, field, replace
from datetime import datetime, timedelta
from enum import StrEnum
from typing import TYPE_CHECKING, Self, TypedDict
from cronsim import CronSim
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.event import async_call_later, async_track_point_in_time
from homeassistant.helpers.typing import UNDEFINED, UndefinedType
from homeassistant.util import dt as dt_util
from .const import LOGGER
from .models import Folder
if TYPE_CHECKING:
from .manager import BackupManager, ManagerBackup
# The time of the automatic backup event should be compatible with
# the time of the recorder's nightly job which runs at 04:12.
# Run the backup at 04:45.
CRON_PATTERN_DAILY = "45 4 * * *"
CRON_PATTERN_WEEKLY = "45 4 * * {}"
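The daily pattern above fires at 04:45, after the recorder's 04:12 nightly job. The config module hands these patterns to CronSim; as a dependency-free sketch, the next daily run can also be computed with plain datetime arithmetic:

```python
from datetime import datetime, timedelta


def next_daily_run(now: datetime, hour: int = 4, minute: int = 45) -> datetime:
    """Next occurrence of hour:minute, matching CRON_PATTERN_DAILY = "45 4 * * *"."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's slot has passed; schedule for tomorrow.
        candidate += timedelta(days=1)
    return candidate


# At 21:49 the 04:45 slot has passed, so the next run is the following day.
assert next_daily_run(datetime(2024, 12, 11, 21, 49)) == datetime(2024, 12, 12, 4, 45)
```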
class StoredBackupConfig(TypedDict):
"""Represent the stored backup config."""
create_backup: StoredCreateBackupConfig
last_attempted_strategy_backup: datetime | None
last_completed_strategy_backup: datetime | None
retention: StoredRetentionConfig
schedule: StoredBackupSchedule
@dataclass(kw_only=True)
class BackupConfigData:
"""Represent loaded backup config data."""
create_backup: CreateBackupConfig
last_attempted_strategy_backup: datetime | None = None
last_completed_strategy_backup: datetime | None = None
retention: RetentionConfig
schedule: BackupSchedule
@classmethod
def from_dict(cls, data: StoredBackupConfig) -> Self:
"""Initialize backup config data from a dict."""
include_folders_data = data["create_backup"]["include_folders"]
if include_folders_data:
include_folders = [Folder(folder) for folder in include_folders_data]
else:
include_folders = None
retention = data["retention"]
return cls(
create_backup=CreateBackupConfig(
agent_ids=data["create_backup"]["agent_ids"],
include_addons=data["create_backup"]["include_addons"],
include_all_addons=data["create_backup"]["include_all_addons"],
include_database=data["create_backup"]["include_database"],
include_folders=include_folders,
name=data["create_backup"]["name"],
password=data["create_backup"]["password"],
),
last_attempted_strategy_backup=data["last_attempted_strategy_backup"],
last_completed_strategy_backup=data["last_completed_strategy_backup"],
retention=RetentionConfig(
copies=retention["copies"],
days=retention["days"],
),
schedule=BackupSchedule(state=ScheduleState(data["schedule"]["state"])),
)
def to_dict(self) -> StoredBackupConfig:
"""Convert backup config data to a dict."""
return StoredBackupConfig(
create_backup=self.create_backup.to_dict(),
last_attempted_strategy_backup=self.last_attempted_strategy_backup,
last_completed_strategy_backup=self.last_completed_strategy_backup,
retention=self.retention.to_dict(),
schedule=self.schedule.to_dict(),
)
class BackupConfig:
"""Handle backup config."""
def __init__(self, hass: HomeAssistant, manager: BackupManager) -> None:
"""Initialize backup config."""
self.data = BackupConfigData(
create_backup=CreateBackupConfig(),
retention=RetentionConfig(),
schedule=BackupSchedule(),
)
self._manager = manager
def load(self, stored_config: StoredBackupConfig) -> None:
"""Load config."""
self.data = BackupConfigData.from_dict(stored_config)
self.data.schedule.apply(self._manager)
async def update(
self,
*,
create_backup: CreateBackupParametersDict | UndefinedType = UNDEFINED,
retention: RetentionParametersDict | UndefinedType = UNDEFINED,
schedule: ScheduleState | UndefinedType = UNDEFINED,
) -> None:
"""Update config."""
if create_backup is not UNDEFINED:
self.data.create_backup = replace(self.data.create_backup, **create_backup)
if retention is not UNDEFINED:
new_retention = RetentionConfig(**retention)
if new_retention != self.data.retention:
self.data.retention = new_retention
self.data.retention.apply(self._manager)
if schedule is not UNDEFINED:
new_schedule = BackupSchedule(state=schedule)
if new_schedule.to_dict() != self.data.schedule.to_dict():
self.data.schedule = new_schedule
self.data.schedule.apply(self._manager)
self._manager.store.save()
@dataclass(kw_only=True)
class RetentionConfig:
"""Represent the backup retention configuration."""
copies: int | None = None
days: int | None = None
def apply(self, manager: BackupManager) -> None:
"""Apply backup retention configuration."""
if self.days is not None:
self._schedule_next(manager)
else:
self._unschedule_next(manager)
def to_dict(self) -> StoredRetentionConfig:
"""Convert backup retention configuration to a dict."""
return StoredRetentionConfig(
copies=self.copies,
days=self.days,
)
@callback
def _schedule_next(
self,
manager: BackupManager,
) -> None:
"""Schedule the next delete after days."""
self._unschedule_next(manager)
async def _delete_backups(now: datetime) -> None:
"""Delete backups older than days."""
self._schedule_next(manager)
def _backups_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return backups older than days to delete."""
# we need to check here since we await before
# this filter is applied
if self.days is None:
return {}
now = dt_util.utcnow()
return {
backup_id: backup
for backup_id, backup in backups.items()
if dt_util.parse_datetime(backup.date, raise_on_error=True)
+ timedelta(days=self.days)
< now
}
await _delete_filtered_backups(manager, _backups_filter)
manager.remove_next_delete_event = async_call_later(
manager.hass, timedelta(days=1), _delete_backups
)
@callback
def _unschedule_next(self, manager: BackupManager) -> None:
"""Unschedule the next delete after days."""
if (remove_next_event := manager.remove_next_delete_event) is not None:
remove_next_event()
manager.remove_next_delete_event = None
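The retention filter above keys off each backup's ISO-formatted date string. In isolation, the same age test can be sketched like this (the backup mapping and `days` value are stand-ins for the manager's real `ManagerBackup` data):

```python
from datetime import datetime, timedelta, timezone

def backups_older_than(backups: dict[str, str], days: int, now: datetime) -> dict[str, str]:
    """Return the subset of backups (id -> ISO date) older than `days`."""
    cutoff = timedelta(days=days)
    return {
        backup_id: date
        for backup_id, date in backups.items()
        if datetime.fromisoformat(date) + cutoff < now
    }

now = datetime(2024, 6, 10, tzinfo=timezone.utc)
backups = {
    "old": "2024-06-01T00:00:00+00:00",  # 9 days old: selected for deletion
    "new": "2024-06-09T00:00:00+00:00",  # 1 day old: kept
}
```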
class StoredRetentionConfig(TypedDict):
"""Represent the stored backup retention configuration."""
copies: int | None
days: int | None
class RetentionParametersDict(TypedDict, total=False):
"""Represent the parameters for retention."""
copies: int | None
days: int | None
class StoredBackupSchedule(TypedDict):
"""Represent the stored backup schedule configuration."""
state: ScheduleState
class ScheduleState(StrEnum):
"""Represent the schedule state."""
NEVER = "never"
DAILY = "daily"
MONDAY = "mon"
TUESDAY = "tue"
WEDNESDAY = "wed"
THURSDAY = "thu"
FRIDAY = "fri"
SATURDAY = "sat"
SUNDAY = "sun"
@dataclass(kw_only=True)
class BackupSchedule:
"""Represent the backup schedule."""
state: ScheduleState = ScheduleState.NEVER
cron_event: CronSim | None = field(init=False, default=None)
@callback
def apply(
self,
manager: BackupManager,
) -> None:
"""Apply a new schedule.
There are only three possible state types: never, daily, or weekly.
"""
if self.state is ScheduleState.NEVER:
self._unschedule_next(manager)
return
if self.state is ScheduleState.DAILY:
self._schedule_next(CRON_PATTERN_DAILY, manager)
else:
self._schedule_next(
CRON_PATTERN_WEEKLY.format(self.state.value),
manager,
)
@callback
def _schedule_next(
self,
cron_pattern: str,
manager: BackupManager,
) -> None:
"""Schedule the next backup."""
self._unschedule_next(manager)
now = dt_util.now()
if (cron_event := self.cron_event) is None:
seed_time = manager.config.data.last_completed_strategy_backup or now
cron_event = self.cron_event = CronSim(cron_pattern, seed_time)
next_time = next(cron_event)
if next_time < now:
# schedule a backup at next daily time once
# if we missed the last scheduled backup
cron_event = CronSim(CRON_PATTERN_DAILY, now)
next_time = next(cron_event)
# reseed the cron event attribute
# add a day to the next time to avoid scheduling at the same time again
self.cron_event = CronSim(cron_pattern, now + timedelta(days=1))
async def _create_backup(now: datetime) -> None:
"""Create backup."""
manager.remove_next_backup_event = None
config_data = manager.config.data
self._schedule_next(cron_pattern, manager)
# create the backup
try:
await manager.async_create_backup(
agent_ids=config_data.create_backup.agent_ids,
include_addons=config_data.create_backup.include_addons,
include_all_addons=config_data.create_backup.include_all_addons,
include_database=config_data.create_backup.include_database,
include_folders=config_data.create_backup.include_folders,
include_homeassistant=True, # always include HA
name=config_data.create_backup.name,
password=config_data.create_backup.password,
with_strategy_settings=True,
)
except Exception: # noqa: BLE001
# another more specific exception will be added
# and handled in the future
LOGGER.exception("Unexpected error creating automatic backup")
# delete old backups more numerous than copies
def _backups_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return oldest backups more numerous than copies to delete."""
# we need to check here since we await before
# this filter is applied
if config_data.retention.copies is None:
return {}
return dict(
sorted(
backups.items(),
key=lambda backup_item: backup_item[1].date,
)[: len(backups) - config_data.retention.copies]
)
await _delete_filtered_backups(manager, _backups_filter)
manager.remove_next_backup_event = async_track_point_in_time(
manager.hass, _create_backup, next_time
)
def to_dict(self) -> StoredBackupSchedule:
"""Convert backup schedule to a dict."""
return StoredBackupSchedule(state=self.state)
@callback
def _unschedule_next(self, manager: BackupManager) -> None:
"""Unschedule the next backup."""
if (remove_next_event := manager.remove_next_backup_event) is not None:
remove_next_event()
manager.remove_next_backup_event = None
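`_schedule_next` seeds `CronSim` from the last completed automatic backup and, when that occurrence is already in the past, fires once at the next daily slot and reseeds a day ahead. A simplified stdlib-only approximation of that missed-run logic (the daily time and helper name are hypothetical, not the module's real cron constants):

```python
from datetime import datetime, timedelta

DAILY_AT = (4, 45)  # hypothetical HH:MM stand-in for the daily cron pattern

def next_daily_run(seed: datetime, now: datetime) -> datetime:
    """Approximate the schedule logic: derive the next run from the seed,
    but if that run was missed, fire once at the next daily slot instead."""
    hour, minute = DAILY_AT
    next_time = (seed + timedelta(days=1)).replace(
        hour=hour, minute=minute, second=0, microsecond=0
    )
    if next_time < now:
        # missed the last scheduled run: schedule once at the next daily time
        next_time = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if next_time < now:
            next_time += timedelta(days=1)
    return next_time
```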
@dataclass(kw_only=True)
class CreateBackupConfig:
"""Represent the config for async_create_backup."""
agent_ids: list[str] = field(default_factory=list)
include_addons: list[str] | None = None
include_all_addons: bool = False
include_database: bool = True
include_folders: list[Folder] | None = None
name: str | None = None
password: str | None = None
def to_dict(self) -> StoredCreateBackupConfig:
"""Convert create backup config to a dict."""
return {
"agent_ids": self.agent_ids,
"include_addons": self.include_addons,
"include_all_addons": self.include_all_addons,
"include_database": self.include_database,
"include_folders": self.include_folders,
"name": self.name,
"password": self.password,
}
class StoredCreateBackupConfig(TypedDict):
"""Represent the stored config for async_create_backup."""
agent_ids: list[str]
include_addons: list[str] | None
include_all_addons: bool
include_database: bool
include_folders: list[Folder] | None
name: str | None
password: str | None
class CreateBackupParametersDict(TypedDict, total=False):
"""Represent the parameters for async_create_backup."""
agent_ids: list[str]
include_addons: list[str] | None
include_all_addons: bool
include_database: bool
include_folders: list[Folder] | None
name: str | None
password: str | None
async def _delete_filtered_backups(
manager: BackupManager,
backup_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
) -> None:
"""Delete backups parsed with a filter.
:param manager: The backup manager.
:param backup_filter: A filter that should return the backups to delete.
"""
backups, get_agent_errors = await manager.async_get_backups()
if get_agent_errors:
LOGGER.debug(
"Error getting backups; continuing anyway: %s",
get_agent_errors,
)
LOGGER.debug("Total backups: %s", backups)
filtered_backups = backup_filter(backups)
if not filtered_backups:
return
# always delete oldest backup first
filtered_backups = dict(
sorted(
filtered_backups.items(),
key=lambda backup_item: backup_item[1].date,
)
)
if len(filtered_backups) >= len(backups):
# Never delete the last backup.
last_backup = filtered_backups.popitem()
LOGGER.debug("Keeping the last backup: %s", last_backup)
LOGGER.debug("Backups to delete: %s", filtered_backups)
if not filtered_backups:
return
backup_ids = list(filtered_backups)
delete_results = await asyncio.gather(
*(manager.async_delete_backup(backup_id) for backup_id in filtered_backups)
)
agent_errors = {
backup_id: error
for backup_id, error in zip(backup_ids, delete_results, strict=True)
if error
}
if agent_errors:
LOGGER.error(
"Error deleting old copies: %s",
agent_errors,
)
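The copies-based pruning used by the scheduled backup, combined with `_delete_filtered_backups`'s never-delete-the-last-backup guard, reduces to a small pure function (a sketch; the real code works on `ManagerBackup` objects rather than bare date strings):

```python
def backups_to_delete(backups: dict[str, str], copies: int) -> list[str]:
    """Return ids of the oldest backups above the `copies` quota,
    never selecting every backup (the newest is always kept)."""
    excess = len(backups) - copies
    if excess <= 0:
        return []
    by_age = sorted(backups.items(), key=lambda item: item[1])  # date ascending
    selected = [backup_id for backup_id, _ in by_age[:excess]]
    if len(selected) >= len(backups):
        selected.pop()  # guard: keep the newest backup even if over quota
    return selected
```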


@@ -10,6 +10,7 @@ from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from .manager import BackupManager
BUF_SIZE = 2**20 * 4 # 4MB
DOMAIN = "backup"
DATA_MANAGER: HassKey[BackupManager] = HassKey(DOMAIN)
LOGGER = getLogger(__package__)
@@ -22,6 +23,12 @@ EXCLUDE_FROM_BACKUP = [
"*.log.*",
"*.log",
"backups/*.tar",
"tmp_backups/*.tar",
"OZW_Log.txt",
"tts/*",
]
EXCLUDE_DATABASE_FROM_BACKUP = [
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
]
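These globs are matched against paths relative to the configuration directory. A minimal sketch of how such an exclusion check can be applied with `fnmatch` (the integration's actual tar filtering lives elsewhere; the pattern list here is abbreviated):

```python
from fnmatch import fnmatch

EXCLUDE = ["*.log", "*.log.*", "backups/*.tar", "tts/*", "home-assistant_v2.db"]

def is_excluded(relative_path: str) -> bool:
    """Return True if a config-dir relative path matches any exclusion glob."""
    return any(fnmatch(relative_path, pattern) for pattern in EXCLUDE)
```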

@@ -8,10 +8,11 @@ from typing import cast
from aiohttp import BodyPartReader
from aiohttp.hdrs import CONTENT_DISPOSITION
from aiohttp.web import FileResponse, Request, Response
from aiohttp.web import FileResponse, Request, Response, StreamResponse
from homeassistant.components.http import KEY_HASS, HomeAssistantView, require_admin
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import slugify
from .const import DATA_MANAGER
@@ -27,30 +28,47 @@ def async_register_http_views(hass: HomeAssistant) -> None:
class DownloadBackupView(HomeAssistantView):
"""Generate backup view."""
url = "/api/backup/download/{slug}"
url = "/api/backup/download/{backup_id}"
name = "api:backup:download"
async def get(
self,
request: Request,
slug: str,
) -> FileResponse | Response:
backup_id: str,
) -> StreamResponse | FileResponse | Response:
"""Download a backup file."""
if not request["hass_user"].is_admin:
return Response(status=HTTPStatus.UNAUTHORIZED)
try:
agent_id = request.query.getone("agent_id")
except KeyError:
return Response(status=HTTPStatus.BAD_REQUEST)
manager = request.app[KEY_HASS].data[DATA_MANAGER]
backup = await manager.async_get_backup(slug=slug)
if agent_id not in manager.backup_agents:
return Response(status=HTTPStatus.BAD_REQUEST)
agent = manager.backup_agents[agent_id]
backup = await agent.async_get_backup(backup_id)
if backup is None or not backup.path.exists():
# We don't need to check if the path exists, aiohttp.FileResponse will handle
# that
if backup is None:
return Response(status=HTTPStatus.NOT_FOUND)
return FileResponse(
path=backup.path.as_posix(),
headers={
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
},
)
headers = {
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
}
if agent_id in manager.local_backup_agents:
local_agent = manager.local_backup_agents[agent_id]
path = local_agent.get_backup_path(backup_id)
return FileResponse(path=path.as_posix(), headers=headers)
stream = await agent.async_download_backup(backup_id)
response = StreamResponse(status=HTTPStatus.OK, headers=headers)
await response.prepare(request)
async for chunk in stream:
await response.write(chunk)
return response
class UploadBackupView(HomeAssistantView):
@@ -62,15 +80,24 @@ class UploadBackupView(HomeAssistantView):
@require_admin
async def post(self, request: Request) -> Response:
"""Upload a backup file."""
try:
agent_ids = request.query.getall("agent_id")
except KeyError:
return Response(status=HTTPStatus.BAD_REQUEST)
manager = request.app[KEY_HASS].data[DATA_MANAGER]
reader = await request.multipart()
contents = cast(BodyPartReader, await reader.next())
try:
await manager.async_receive_backup(contents=contents)
await manager.async_receive_backup(contents=contents, agent_ids=agent_ids)
except OSError as err:
return Response(
body=f"Can't write backup file {err}",
body=f"Can't write backup file: {err}",
status=HTTPStatus.INTERNAL_SERVER_ERROR,
)
except HomeAssistantError as err:
return Response(
body=f"Can't upload backup file: {err}",
status=HTTPStatus.INTERNAL_SERVER_ERROR,
)
except asyncio.CancelledError:

File diff suppressed because it is too large

@@ -1,11 +1,12 @@
{
"domain": "backup",
"name": "Backup",
"after_dependencies": ["hassio"],
"codeowners": ["@home-assistant/core"],
"dependencies": ["http", "websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/backup",
"integration_type": "system",
"iot_class": "calculated",
"quality_scale": "internal",
"requirements": ["securetar==2024.11.0"]
"requirements": ["cronsim==2.6", "securetar==2024.11.0"]
}

@@ -0,0 +1,61 @@
"""Models for the backup integration."""
from __future__ import annotations
from dataclasses import asdict, dataclass
from enum import StrEnum
from typing import Any, Self
@dataclass(frozen=True, kw_only=True)
class AddonInfo:
"""Addon information."""
name: str
slug: str
version: str
class Folder(StrEnum):
"""Folder type."""
SHARE = "share"
ADDONS = "addons/local"
SSL = "ssl"
MEDIA = "media"
@dataclass(frozen=True, kw_only=True)
class AgentBackup:
"""Base backup class."""
addons: list[AddonInfo]
backup_id: str
date: str
database_included: bool
folders: list[Folder]
homeassistant_included: bool
homeassistant_version: str | None # None if homeassistant_included is False
name: str
protected: bool
size: int
def as_dict(self) -> dict:
"""Return a dict representation of this backup."""
return asdict(self)
@classmethod
def from_dict(cls, data: dict[str, Any]) -> Self:
"""Create an instance from a JSON serialization."""
return cls(
addons=[AddonInfo(**addon) for addon in data["addons"]],
backup_id=data["backup_id"],
date=data["date"],
database_included=data["database_included"],
folders=[Folder(folder) for folder in data["folders"]],
homeassistant_included=data["homeassistant_included"],
homeassistant_version=data["homeassistant_version"],
name=data["name"],
protected=data["protected"],
size=data["size"],
)

@@ -0,0 +1,52 @@
"""Store backup configuration."""
from __future__ import annotations
from typing import TYPE_CHECKING, TypedDict
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.storage import Store
from .const import DOMAIN
if TYPE_CHECKING:
from .config import StoredBackupConfig
from .manager import BackupManager, StoredKnownBackup
STORE_DELAY_SAVE = 30
STORAGE_KEY = DOMAIN
STORAGE_VERSION = 1
class StoredBackupData(TypedDict):
"""Represent the stored backup config."""
backups: list[StoredKnownBackup]
config: StoredBackupConfig
class BackupStore:
"""Store backup config."""
def __init__(self, hass: HomeAssistant, manager: BackupManager) -> None:
"""Initialize the backup manager."""
self._hass = hass
self._manager = manager
self._store: Store[StoredBackupData] = Store(hass, STORAGE_VERSION, STORAGE_KEY)
async def load(self) -> StoredBackupData | None:
"""Load the store."""
return await self._store.async_load()
@callback
def save(self) -> None:
"""Save config."""
self._store.async_delay_save(self._data_to_save, STORE_DELAY_SAVE)
@callback
def _data_to_save(self) -> StoredBackupData:
"""Return data to save."""
return {
"backups": self._manager.known_backups.to_list(),
"config": self._manager.config.data.to_dict(),
}
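`async_delay_save` defers the write so that a burst of `save()` calls produces a single store write after `STORE_DELAY_SAVE` seconds. A stdlib-only sketch of that debounce behavior (assumed semantics, not the helper's actual implementation):

```python
import asyncio

class DelayedStore:
    """Collapse rapid save() calls into a single deferred write."""

    def __init__(self, delay: float) -> None:
        self._delay = delay
        self._task: asyncio.Task | None = None
        self.writes = 0  # stands in for actual disk writes

    def save(self) -> None:
        """Schedule a write, restarting the debounce window on each call."""
        if self._task is not None:
            self._task.cancel()
        self._task = asyncio.get_running_loop().create_task(self._write_later())

    async def _write_later(self) -> None:
        await asyncio.sleep(self._delay)
        self.writes += 1

async def main() -> int:
    store = DelayedStore(delay=0.01)
    for _ in range(3):
        store.save()  # three calls in one burst
    await asyncio.sleep(0.05)
    return store.writes  # only the last scheduled write ran
```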

@@ -0,0 +1,111 @@
"""Local backup support for Core and Container installations."""
from __future__ import annotations
import asyncio
from pathlib import Path
from queue import SimpleQueue
import tarfile
from typing import cast
import aiohttp
from homeassistant.core import HomeAssistant
from homeassistant.util.json import JsonObjectType, json_loads_object
from .const import BUF_SIZE
from .models import AddonInfo, AgentBackup, Folder
def make_backup_dir(path: Path) -> None:
"""Create a backup directory if it does not exist."""
path.mkdir(exist_ok=True)
def read_backup(backup_path: Path) -> AgentBackup:
"""Read a backup from disk."""
with tarfile.open(backup_path, "r:", bufsize=BUF_SIZE) as backup_file:
if not (data_file := backup_file.extractfile("./backup.json")):
raise KeyError("backup.json not found in tar file")
data = json_loads_object(data_file.read())
addons = [
AddonInfo(
name=cast(str, addon["name"]),
slug=cast(str, addon["slug"]),
version=cast(str, addon["version"]),
)
for addon in cast(list[JsonObjectType], data.get("addons", []))
]
folders = [
Folder(folder)
for folder in cast(list[str], data.get("folders", []))
if folder != "homeassistant"
]
homeassistant_included = False
homeassistant_version: str | None = None
database_included = False
if (
homeassistant := cast(JsonObjectType, data.get("homeassistant"))
) and "version" in homeassistant:
homeassistant_included = True
homeassistant_version = cast(str, homeassistant["version"])
database_included = not cast(
bool, homeassistant.get("exclude_database", False)
)
return AgentBackup(
addons=addons,
backup_id=cast(str, data["slug"]),
database_included=database_included,
date=cast(str, data["date"]),
folders=folders,
homeassistant_included=homeassistant_included,
homeassistant_version=homeassistant_version,
name=cast(str, data["name"]),
protected=cast(bool, data.get("protected", False)),
size=backup_path.stat().st_size,
)
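`read_backup` relies on the metadata member being named exactly `./backup.json`. A self-contained sketch of writing and re-reading such an archive in memory:

```python
import io
import json
import tarfile

def build_backup_tar(metadata: dict) -> bytes:
    """Create a minimal tar containing only ./backup.json."""
    payload = json.dumps(metadata).encode()
    buffer = io.BytesIO()
    with tarfile.open(fileobj=buffer, mode="w:") as tar:
        info = tarfile.TarInfo(name="./backup.json")
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
    return buffer.getvalue()

def read_metadata(raw: bytes) -> dict:
    """Mirror read_backup's extraction of the metadata member."""
    with tarfile.open(fileobj=io.BytesIO(raw), mode="r:") as tar:
        if not (data_file := tar.extractfile("./backup.json")):
            raise KeyError("backup.json not found in tar file")
        return json.loads(data_file.read())
```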
async def receive_file(
hass: HomeAssistant, contents: aiohttp.BodyPartReader, path: Path
) -> None:
"""Receive a file from a stream and write it to a file."""
queue: SimpleQueue[tuple[bytes, asyncio.Future[None] | None] | None] = SimpleQueue()
def _sync_queue_consumer() -> None:
with path.open("wb") as file_handle:
while True:
if (_chunk_future := queue.get()) is None:
break
_chunk, _future = _chunk_future
if _future is not None:
hass.loop.call_soon_threadsafe(_future.set_result, None)
file_handle.write(_chunk)
fut: asyncio.Future[None] | None = None
try:
fut = hass.async_add_executor_job(_sync_queue_consumer)
megabytes_sending = 0
while chunk := await contents.read_chunk(BUF_SIZE):
megabytes_sending += 1
if megabytes_sending % 5 != 0:
queue.put_nowait((chunk, None))
continue
chunk_future = hass.loop.create_future()
queue.put_nowait((chunk, chunk_future))
await asyncio.wait(
(fut, chunk_future),
return_when=asyncio.FIRST_COMPLETED,
)
if fut.done():
# The executor job failed
break
queue.put_nowait(None) # terminate queue consumer
finally:
if fut is not None:
await fut

@@ -7,22 +7,31 @@ import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.core import HomeAssistant, callback
from .config import ScheduleState
from .const import DATA_MANAGER, LOGGER
from .manager import ManagerStateEvent
from .models import Folder
@callback
def async_register_websocket_handlers(hass: HomeAssistant, with_hassio: bool) -> None:
"""Register websocket commands."""
websocket_api.async_register_command(hass, backup_agents_info)
if with_hassio:
websocket_api.async_register_command(hass, handle_backup_end)
websocket_api.async_register_command(hass, handle_backup_start)
return
websocket_api.async_register_command(hass, handle_details)
websocket_api.async_register_command(hass, handle_info)
websocket_api.async_register_command(hass, handle_create)
websocket_api.async_register_command(hass, handle_remove)
websocket_api.async_register_command(hass, handle_create_with_strategy_settings)
websocket_api.async_register_command(hass, handle_delete)
websocket_api.async_register_command(hass, handle_restore)
websocket_api.async_register_command(hass, handle_subscribe_events)
websocket_api.async_register_command(hass, handle_config_info)
websocket_api.async_register_command(hass, handle_config_update)
@websocket_api.require_admin
@@ -35,12 +44,16 @@
) -> None:
"""List all stored backups."""
manager = hass.data[DATA_MANAGER]
backups = await manager.async_get_backups()
backups, agent_errors = await manager.async_get_backups()
connection.send_result(
msg["id"],
{
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backups": list(backups.values()),
"backing_up": manager.backing_up,
"last_attempted_strategy_backup": manager.config.data.last_attempted_strategy_backup,
"last_completed_strategy_backup": manager.config.data.last_completed_strategy_backup,
},
)
@@ -49,7 +62,7 @@
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/details",
vol.Required("slug"): str,
vol.Required("backup_id"): str,
}
)
@websocket_api.async_response
@@ -58,11 +71,16 @@
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Get backup details for a specific slug."""
backup = await hass.data[DATA_MANAGER].async_get_backup(slug=msg["slug"])
"""Get backup details for a specific backup."""
backup, agent_errors = await hass.data[DATA_MANAGER].async_get_backup(
msg["backup_id"]
)
connection.send_result(
msg["id"],
{
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backup": backup,
},
)
@@ -71,26 +89,39 @@
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/remove",
vol.Required("slug"): str,
vol.Required("type"): "backup/delete",
vol.Required("backup_id"): str,
}
)
@websocket_api.async_response
async def handle_remove(
async def handle_delete(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Remove a backup."""
await hass.data[DATA_MANAGER].async_remove_backup(slug=msg["slug"])
connection.send_result(msg["id"])
"""Delete a backup."""
agent_errors = await hass.data[DATA_MANAGER].async_delete_backup(msg["backup_id"])
connection.send_result(
msg["id"],
{
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
}
},
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/restore",
vol.Required("slug"): str,
vol.Required("backup_id"): str,
vol.Required("agent_id"): str,
vol.Optional("password"): str,
vol.Optional("restore_addons"): [str],
vol.Optional("restore_database", default=True): bool,
vol.Optional("restore_folders"): [vol.Coerce(Folder)],
vol.Optional("restore_homeassistant", default=True): bool,
}
)
@websocket_api.async_response
@@ -100,12 +131,32 @@
msg: dict[str, Any],
) -> None:
"""Restore a backup."""
await hass.data[DATA_MANAGER].async_restore_backup(msg["slug"])
await hass.data[DATA_MANAGER].async_restore_backup(
msg["backup_id"],
agent_id=msg["agent_id"],
password=msg.get("password"),
restore_addons=msg.get("restore_addons"),
restore_database=msg["restore_database"],
restore_folders=msg.get("restore_folders"),
restore_homeassistant=msg["restore_homeassistant"],
)
connection.send_result(msg["id"])
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/generate"})
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/generate",
vol.Required("agent_ids"): [str],
vol.Optional("include_addons"): [str],
vol.Optional("include_all_addons", default=False): bool,
vol.Optional("include_database", default=True): bool,
vol.Optional("include_folders"): [vol.Coerce(Folder)],
vol.Optional("include_homeassistant", default=True): bool,
vol.Optional("name"): str,
vol.Optional("password"): str,
}
)
@websocket_api.async_response
async def handle_create(
hass: HomeAssistant,
@@ -113,7 +164,46 @@
msg: dict[str, Any],
) -> None:
"""Generate a backup."""
backup = await hass.data[DATA_MANAGER].async_create_backup()
backup = await hass.data[DATA_MANAGER].async_initiate_backup(
agent_ids=msg["agent_ids"],
include_addons=msg.get("include_addons"),
include_all_addons=msg["include_all_addons"],
include_database=msg["include_database"],
include_folders=msg.get("include_folders"),
include_homeassistant=msg["include_homeassistant"],
name=msg.get("name"),
password=msg.get("password"),
)
connection.send_result(msg["id"], backup)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/generate_with_strategy_settings",
}
)
@websocket_api.async_response
async def handle_create_with_strategy_settings(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Generate a backup with stored settings."""
config_data = hass.data[DATA_MANAGER].config.data
backup = await hass.data[DATA_MANAGER].async_initiate_backup(
agent_ids=config_data.create_backup.agent_ids,
include_addons=config_data.create_backup.include_addons,
include_all_addons=config_data.create_backup.include_all_addons,
include_database=config_data.create_backup.include_database,
include_folders=config_data.create_backup.include_folders,
include_homeassistant=True, # always include HA
name=config_data.create_backup.name,
password=config_data.create_backup.password,
with_strategy_settings=True,
)
connection.send_result(msg["id"], backup)
@@ -127,7 +217,6 @@
) -> None:
"""Backup start notification."""
manager = hass.data[DATA_MANAGER]
manager.backing_up = True
LOGGER.debug("Backup start notification")
try:
@@ -149,7 +238,6 @@
) -> None:
"""Backup end notification."""
manager = hass.data[DATA_MANAGER]
manager.backing_up = False
LOGGER.debug("Backup end notification")
try:
@@ -159,3 +247,97 @@
return
connection.send_result(msg["id"])
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/agents/info"})
@websocket_api.async_response
async def backup_agents_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Return backup agents info."""
manager = hass.data[DATA_MANAGER]
connection.send_result(
msg["id"],
{
"agents": [{"agent_id": agent_id} for agent_id in manager.backup_agents],
},
)
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/config/info"})
@websocket_api.async_response
async def handle_config_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Send the stored backup config."""
manager = hass.data[DATA_MANAGER]
connection.send_result(
msg["id"],
{
"config": manager.config.data.to_dict(),
},
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/config/update",
vol.Optional("create_backup"): vol.Schema(
{
vol.Optional("agent_ids"): vol.All(list[str]),
vol.Optional("include_addons"): vol.Any(list[str], None),
vol.Optional("include_all_addons"): bool,
vol.Optional("include_database"): bool,
vol.Optional("include_folders"): vol.Any([vol.Coerce(Folder)], None),
vol.Optional("name"): vol.Any(str, None),
vol.Optional("password"): vol.Any(str, None),
},
),
vol.Optional("retention"): vol.Schema(
{
vol.Optional("copies"): vol.Any(int, None),
vol.Optional("days"): vol.Any(int, None),
},
),
vol.Optional("schedule"): vol.All(str, vol.Coerce(ScheduleState)),
}
)
@websocket_api.async_response
async def handle_config_update(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Update the stored backup config."""
manager = hass.data[DATA_MANAGER]
changes = dict(msg)
changes.pop("id")
changes.pop("type")
await manager.config.update(**changes)
connection.send_result(msg["id"])
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/subscribe_events"})
@websocket_api.async_response
async def handle_subscribe_events(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Subscribe to backup events."""
def on_event(event: ManagerStateEvent) -> None:
connection.send_message(websocket_api.event_message(msg["id"], event))
manager = hass.data[DATA_MANAGER]
on_event(manager.last_event)
connection.subscriptions[msg["id"]] = manager.async_subscribe_events(on_event)
connection.send_result(msg["id"])

@@ -0,0 +1,196 @@
"""Backup platform for the cloud integration."""
from __future__ import annotations
import base64
from collections.abc import AsyncIterator, Callable, Coroutine
import hashlib
from typing import Any, Self
from aiohttp import ClientError, StreamReader
from hass_nabucasa import Cloud, CloudError
from hass_nabucasa.cloud_api import (
async_files_delete_file,
async_files_download_details,
async_files_list,
async_files_upload_details,
)
from homeassistant.components.backup import AgentBackup, BackupAgent, BackupAgentError
from homeassistant.core import HomeAssistant, callback
from .client import CloudClient
from .const import DATA_CLOUD, DOMAIN
_STORAGE_BACKUP = "backup"
async def _b64md5(stream: AsyncIterator[bytes]) -> str:
"""Calculate the MD5 hash of a file."""
file_hash = hashlib.md5()
async for chunk in stream:
file_hash.update(chunk)
return base64.b64encode(file_hash.digest()).decode()
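`_b64md5` streams the payload through an MD5 digest and base64-encodes the result for the upload-details API. A synchronous equivalent that works on any chunk iterable:

```python
import base64
import hashlib
from collections.abc import Iterable

def b64md5(chunks: Iterable[bytes]) -> str:
    """Base64-encoded MD5 digest of a chunked payload (sync sketch)."""
    file_hash = hashlib.md5()
    for chunk in chunks:
        file_hash.update(chunk)  # incremental: never holds the full payload
    return base64.b64encode(file_hash.digest()).decode()
```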
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return the cloud backup agent."""
return [CloudBackupAgent(hass=hass, cloud=hass.data[DATA_CLOUD])]
class ChunkAsyncStreamIterator:
"""Async iterator for chunked streams.
Based on aiohttp.streams.ChunkTupleAsyncStreamIterator, but yields
bytes instead of tuple[bytes, bool].
"""
__slots__ = ("_stream",)
def __init__(self, stream: StreamReader) -> None:
"""Initialize."""
self._stream = stream
def __aiter__(self) -> Self:
"""Iterate."""
return self
async def __anext__(self) -> bytes:
"""Yield next chunk."""
rv = await self._stream.readchunk()
if rv == (b"", False):
raise StopAsyncIteration
return rv[0]
class CloudBackupAgent(BackupAgent):
"""Cloud backup agent."""
name = DOMAIN
def __init__(self, hass: HomeAssistant, cloud: Cloud[CloudClient]) -> None:
"""Initialize the cloud backup sync agent."""
super().__init__()
self._cloud = cloud
self._hass = hass
@callback
def _get_backup_filename(self) -> str:
"""Return the backup filename."""
return f"{self._cloud.client.prefs.instance_id}.tar"
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
:return: An async iterator that yields bytes.
"""
if not await self.async_get_backup(backup_id):
raise BackupAgentError("Backup not found")
try:
details = await async_files_download_details(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to get download details") from err
try:
resp = await self._cloud.websession.get(details["url"])
resp.raise_for_status()
except ClientError as err:
raise BackupAgentError("Failed to download backup") from err
return ChunkAsyncStreamIterator(resp.content)
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup.
:param open_stream: A function returning an async iterator that yields bytes.
:param backup: Metadata about the backup that should be uploaded.
"""
if not backup.protected:
raise BackupAgentError("Cloud backups must be protected")
base64md5hash = await _b64md5(await open_stream())
try:
details = await async_files_upload_details(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
metadata=backup.as_dict(),
size=backup.size,
base64md5hash=base64md5hash,
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to get upload details") from err
try:
upload_status = await self._cloud.websession.put(
details["url"],
data=await open_stream(),
headers=details["headers"] | {"content-length": str(backup.size)},
)
upload_status.raise_for_status()
except ClientError as err:
raise BackupAgentError("Failed to upload backup") from err
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
"""
if not await self.async_get_backup(backup_id):
raise BackupAgentError("Backup not found")
try:
await async_files_delete_file(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to delete backup") from err
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
try:
backups = await async_files_list(self._cloud, storage_type=_STORAGE_BACKUP)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to list backups") from err
return [AgentBackup.from_dict(backup["Metadata"]) for backup in backups]
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
backups = await self.async_list_backups()
for backup in backups:
if backup.backup_id == backup_id:
return backup
return None

@@ -1,7 +1,12 @@
{
"domain": "cloud",
"name": "Home Assistant Cloud",
"after_dependencies": ["assist_pipeline", "google_assistant", "alexa"],
"after_dependencies": [
"alexa",
"assist_pipeline",
"backup",
"google_assistant"
],
"codeowners": ["@home-assistant/cloud"],
"dependencies": ["auth", "http", "repairs", "webhook"],
"documentation": "https://www.home-assistant.io/integrations/cloud",


@@ -0,0 +1,365 @@
"""Backup functionality for supervised installations."""
from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine, Mapping
from pathlib import Path
from typing import Any, cast
from aiohasupervisor.exceptions import SupervisorBadRequestError
from aiohasupervisor.models import (
backups as supervisor_backups,
mounts as supervisor_mounts,
)
from homeassistant.components.backup import (
DATA_MANAGER,
AddonInfo,
AgentBackup,
BackupAgent,
BackupReaderWriter,
CreateBackupEvent,
Folder,
NewBackup,
WrittenBackup,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from .const import DOMAIN, EVENT_SUPERVISOR_EVENT
from .handler import get_supervisor_client
LOCATION_CLOUD_BACKUP = ".cloud_backup"
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return the hassio backup agents."""
client = get_supervisor_client(hass)
mounts = await client.mounts.info()
agents: list[BackupAgent] = [SupervisorBackupAgent(hass, "local", None)]
for mount in mounts.mounts:
if mount.usage is not supervisor_mounts.MountUsage.BACKUP:
continue
agents.append(SupervisorBackupAgent(hass, mount.name, mount.name))
return agents
def _backup_details_to_agent_backup(
details: supervisor_backups.BackupComplete,
) -> AgentBackup:
"""Convert a supervisor backup details object to an agent backup."""
homeassistant_included = details.homeassistant is not None
if not homeassistant_included:
database_included = False
else:
database_included = details.homeassistant_exclude_database is False
addons = [
AddonInfo(name=addon.name, slug=addon.slug, version=addon.version)
for addon in details.addons
]
return AgentBackup(
addons=addons,
backup_id=details.slug,
database_included=database_included,
date=details.date.isoformat(),
folders=[Folder(folder) for folder in details.folders],
homeassistant_included=homeassistant_included,
homeassistant_version=details.homeassistant,
name=details.name,
protected=details.protected,
size=details.size_bytes,
)
class SupervisorBackupAgent(BackupAgent):
"""Backup agent for supervised installations."""
def __init__(self, hass: HomeAssistant, name: str, location: str | None) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
self._backup_dir = Path("/backups")
self._client = get_supervisor_client(hass)
self.name = name
self.location = location
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
return await self._client.backups.download_backup(backup_id)
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup.
Not required for supervisor; the SupervisorBackupReaderWriter stores files.
"""
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
backup_list = await self._client.backups.list()
result = []
for backup in backup_list:
if not backup.locations or self.location not in backup.locations:
continue
details = await self._client.backups.backup_info(backup.slug)
result.append(_backup_details_to_agent_backup(details))
return result
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
details = await self._client.backups.backup_info(backup_id)
if self.location not in details.locations:
return None
return _backup_details_to_agent_backup(details)
async def async_delete_backup(self, backup_id: str, **kwargs: Any) -> None:
"""Remove a backup."""
try:
await self._client.backups.remove_backup(backup_id)
except SupervisorBadRequestError as err:
if err.args[0] != "Backup does not exist":
raise
class SupervisorBackupReaderWriter(BackupReaderWriter):
"""Class for reading and writing backups in supervised installations."""
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the backup reader/writer."""
self._hass = hass
self._client = get_supervisor_client(hass)
async def async_create_backup(
self,
*,
agent_ids: list[str],
backup_name: str,
include_addons: list[str] | None,
include_all_addons: bool,
include_database: bool,
include_folders: list[Folder] | None,
include_homeassistant: bool,
on_progress: Callable[[CreateBackupEvent], None],
password: str | None,
) -> tuple[NewBackup, asyncio.Task[WrittenBackup]]:
"""Create a backup."""
manager = self._hass.data[DATA_MANAGER]
include_addons_set = set(include_addons) if include_addons else None
include_folders_set = (
{supervisor_backups.Folder(folder) for folder in include_folders}
if include_folders
else None
)
hassio_agents: list[SupervisorBackupAgent] = [
cast(SupervisorBackupAgent, manager.backup_agents[agent_id])
for agent_id in agent_ids
if agent_id.startswith(DOMAIN)
]
locations = {agent.location for agent in hassio_agents}
backup = await self._client.backups.partial_backup(
supervisor_backups.PartialBackupOptions(
addons=include_addons_set,
folders=include_folders_set,
homeassistant=include_homeassistant,
name=backup_name,
password=password,
compressed=True,
location=locations or LOCATION_CLOUD_BACKUP,
homeassistant_exclude_database=not include_database,
background=True,
)
)
backup_task = self._hass.async_create_task(
self._async_wait_for_backup(
backup, remove_after_upload=not bool(locations)
),
name="backup_manager_create_backup",
eager_start=False, # To ensure the task is not started before we return
)
return (NewBackup(backup_job_id=backup.job_id), backup_task)
async def _async_wait_for_backup(
self, backup: supervisor_backups.NewBackup, *, remove_after_upload: bool
) -> WrittenBackup:
"""Wait for a backup to complete."""
backup_complete = asyncio.Event()
backup_id: str | None = None
@callback
def on_progress(data: Mapping[str, Any]) -> None:
"""Handle backup progress."""
nonlocal backup_id
if data.get("done") is True:
backup_id = data.get("reference")
backup_complete.set()
try:
unsub = self._async_listen_job_events(backup.job_id, on_progress)
await backup_complete.wait()
finally:
unsub()
if not backup_id:
raise HomeAssistantError("Backup failed")
async def open_backup() -> AsyncIterator[bytes]:
return await self._client.backups.download_backup(backup_id)
async def remove_backup() -> None:
if not remove_after_upload:
return
await self._client.backups.remove_backup(backup_id)
details = await self._client.backups.backup_info(backup_id)
return WrittenBackup(
backup=_backup_details_to_agent_backup(details),
open_stream=open_backup,
release_stream=remove_backup,
)
async def async_receive_backup(
self,
*,
agent_ids: list[str],
stream: AsyncIterator[bytes],
suggested_filename: str,
) -> WrittenBackup:
"""Receive a backup."""
manager = self._hass.data[DATA_MANAGER]
hassio_agents: list[SupervisorBackupAgent] = [
cast(SupervisorBackupAgent, manager.backup_agents[agent_id])
for agent_id in agent_ids
if agent_id.startswith(DOMAIN)
]
locations = {agent.location for agent in hassio_agents}
backup_id = await self._client.backups.upload_backup(
stream,
supervisor_backups.UploadBackupOptions(
location=locations or {LOCATION_CLOUD_BACKUP}
),
)
async def open_backup() -> AsyncIterator[bytes]:
return await self._client.backups.download_backup(backup_id)
async def remove_backup() -> None:
if locations:
return
await self._client.backups.remove_backup(backup_id)
details = await self._client.backups.backup_info(backup_id)
return WrittenBackup(
backup=_backup_details_to_agent_backup(details),
open_stream=open_backup,
release_stream=remove_backup,
)
async def async_restore_backup(
self,
backup_id: str,
*,
agent_id: str,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
password: str | None,
restore_addons: list[str] | None,
restore_database: bool,
restore_folders: list[Folder] | None,
restore_homeassistant: bool,
) -> None:
"""Restore a backup."""
if restore_homeassistant and not restore_database:
raise HomeAssistantError("Cannot restore Home Assistant without database")
if not restore_homeassistant and restore_database:
raise HomeAssistantError("Cannot restore database without Home Assistant")
restore_addons_set = set(restore_addons) if restore_addons else None
restore_folders_set = (
{supervisor_backups.Folder(folder) for folder in restore_folders}
if restore_folders
else None
)
if not agent_id.startswith(DOMAIN):
# Download the backup to the supervisor. Supervisor will clean up the backup
# two days after the restore is done.
await self.async_receive_backup(
agent_ids=[],
stream=await open_stream(),
suggested_filename=f"{backup_id}.tar",
)
job = await self._client.backups.partial_restore(
backup_id,
supervisor_backups.PartialRestoreOptions(
addons=restore_addons_set,
folders=restore_folders_set,
homeassistant=restore_homeassistant,
password=password,
background=True,
),
)
restore_complete = asyncio.Event()
@callback
def on_progress(data: Mapping[str, Any]) -> None:
"""Handle backup progress."""
if data.get("done") is True:
restore_complete.set()
try:
unsub = self._async_listen_job_events(job.job_id, on_progress)
await restore_complete.wait()
finally:
unsub()
@callback
def _async_listen_job_events(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
) -> Callable[[], None]:
"""Listen for job events."""
@callback
def unsub() -> None:
"""Unsubscribe from job events."""
unsub_signal()
@callback
def handle_signal(data: Mapping[str, Any]) -> None:
"""Handle a job signal."""
if (
data.get("event") != "job"
or not (event_data := data.get("data"))
or event_data.get("uuid") != job_id
):
return
on_event(event_data)
unsub_signal = async_dispatcher_connect(
self._hass, EVENT_SUPERVISOR_EVENT, handle_signal
)
return unsub
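`_async_listen_job_events` relies on a closure that drops every dispatcher signal except `job` events for the watched job ID. The same filter can be sketched without the dispatcher:

```python
from collections.abc import Callable, Mapping
from typing import Any

def make_job_filter(
    job_id: str, on_event: Callable[[Mapping[str, Any]], None]
) -> Callable[[Mapping[str, Any]], None]:
    # Mirrors handle_signal in _async_listen_job_events: forward only
    # "job" events whose payload uuid matches the job being watched.
    def handle_signal(data: Mapping[str, Any]) -> None:
        if (
            data.get("event") != "job"
            or not (event_data := data.get("data"))
            or event_data.get("uuid") != job_id
        ):
            return
        on_event(event_data)
    return handle_signal

seen: list[Mapping[str, Any]] = []
handler = make_job_filter("job-1", seen.append)
handler({"event": "job", "data": {"uuid": "job-1", "done": True}})
handler({"event": "job", "data": {"uuid": "other-job"}})
handler({"event": "unrelated"})
```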


@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/hassio",
"iot_class": "local_polling",
"quality_scale": "internal",
"requirements": ["aiohasupervisor==0.2.1"],
"requirements": ["aiohasupervisor==0.2.2b0"],
"single_config_entry": true
}


@@ -0,0 +1,92 @@
"""Backup platform for the kitchen_sink integration."""
from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine
import logging
from typing import Any
from homeassistant.components.backup import AddonInfo, AgentBackup, BackupAgent, Folder
from homeassistant.core import HomeAssistant
LOGGER = logging.getLogger(__name__)
async def async_get_backup_agents(
hass: HomeAssistant,
) -> list[BackupAgent]:
"""Return the backup agents."""
return [KitchenSinkBackupAgent("syncer")]
class KitchenSinkBackupAgent(BackupAgent):
"""Kitchen sink backup agent."""
def __init__(self, name: str) -> None:
"""Initialize the kitchen sink backup agent."""
super().__init__()
self.name = name
self._uploads = [
AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id="abc123",
database_included=False,
date="1970-01-01T00:00:00Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Kitchen sink syncer",
protected=False,
size=1234,
)
]
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
LOGGER.info("Downloading backup %s", backup_id)
reader = asyncio.StreamReader()
reader.feed_data(b"backup data")
reader.feed_eof()
return reader
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
LOGGER.info("Uploading backup %s %s", backup.backup_id, backup)
self._uploads.append(backup)
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file."""
self._uploads = [
upload for upload in self._uploads if upload.backup_id != backup_id
]
LOGGER.info("Deleted backup %s", backup_id)
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return self._uploads
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
for backup in self._uploads:
if backup.backup_id == backup_id:
return backup
return None
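`async_download_backup` above fakes a download by feeding an `asyncio.StreamReader`, which doubles as an async iterator of bytes once EOF is set. A runnable sketch of that trick:

```python
import asyncio

async def drain_fake_download() -> bytes:
    # Same trick the kitchen sink agent uses: an asyncio.StreamReader fed
    # in-memory bytes behaves as an AsyncIterator[bytes] once EOF is set.
    reader = asyncio.StreamReader()
    reader.feed_data(b"backup data")
    reader.feed_eof()
    chunks = bytearray()
    async for chunk in reader:
        chunks += chunk
    return bytes(chunks)

data = asyncio.run(drain_fake_download())
```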


@@ -3,7 +3,7 @@
aiodhcpwatcher==1.0.2
aiodiscover==2.1.0
aiodns==3.2.0
aiohasupervisor==0.2.1
aiohasupervisor==0.2.2b0
aiohttp-fast-zlib==0.2.0
aiohttp==3.11.10
aiohttp_cors==0.7.0


@@ -28,7 +28,7 @@ dependencies = [
# Integrations may depend on hassio integration without listing it to
# change behavior based on presence of supervisor. Deprecated with #127228
# Lib can be removed with 2025.11
"aiohasupervisor==0.2.1",
"aiohasupervisor==0.2.2b0",
"aiohttp==3.11.10",
"aiohttp_cors==0.7.0",
"aiohttp-fast-zlib==0.2.0",


@@ -4,7 +4,7 @@
# Home Assistant Core
aiodns==3.2.0
aiohasupervisor==0.2.1
aiohasupervisor==0.2.2b0
aiohttp==3.11.10
aiohttp_cors==0.7.0
aiohttp-fast-zlib==0.2.0


@@ -262,7 +262,7 @@ aioguardian==2022.07.0
aioharmony==0.2.10
# homeassistant.components.hassio
aiohasupervisor==0.2.1
aiohasupervisor==0.2.2b0
# homeassistant.components.homekit_controller
aiohomekit==3.2.7
@@ -704,6 +704,7 @@ connect-box==0.3.1
# homeassistant.components.xiaomi_miio
construct==2.10.68
# homeassistant.components.backup
# homeassistant.components.utility_meter
cronsim==2.6


@@ -247,7 +247,7 @@ aioguardian==2022.07.0
aioharmony==0.2.10
# homeassistant.components.hassio
aiohasupervisor==0.2.1
aiohasupervisor==0.2.2b0
# homeassistant.components.homekit_controller
aiohomekit==3.2.7
@@ -600,6 +600,7 @@ colorthief==0.2.1
# homeassistant.components.xiaomi_miio
construct==2.10.68
# homeassistant.components.backup
# homeassistant.components.utility_meter
cronsim==2.6


@@ -2,29 +2,162 @@
from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
from pathlib import Path
from unittest.mock import patch
from typing import Any
from unittest.mock import AsyncMock, Mock, patch
from homeassistant.components.backup import DOMAIN
from homeassistant.components.backup.manager import Backup
from homeassistant.components.backup import (
DOMAIN,
AddonInfo,
AgentBackup,
BackupAgent,
BackupAgentPlatformProtocol,
Folder,
)
from homeassistant.components.backup.const import DATA_MANAGER
from homeassistant.core import HomeAssistant
from homeassistant.helpers.typing import ConfigType
from homeassistant.setup import async_setup_component
TEST_BACKUP = Backup(
slug="abc123",
name="Test",
from tests.common import MockPlatform, mock_platform
LOCAL_AGENT_ID = f"{DOMAIN}.local"
TEST_BACKUP_ABC123 = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id="abc123",
database_included=True,
date="1970-01-01T00:00:00.000Z",
path=Path("abc123.tar"),
size=0.0,
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=False,
size=0,
)
TEST_BACKUP_PATH_ABC123 = Path("abc123.tar")
TEST_BACKUP_DEF456 = AgentBackup(
addons=[],
backup_id="def456",
database_included=False,
date="1980-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test 2",
protected=False,
size=1,
)
TEST_DOMAIN = "test"
class BackupAgentTest(BackupAgent):
"""Test backup agent."""
def __init__(self, name: str, backups: list[AgentBackup] | None = None) -> None:
"""Initialize the backup agent."""
self.name = name
if backups is None:
backups = [
AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id="abc123",
database_included=True,
date="1970-01-01T00:00:00Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=False,
size=13,
)
]
self._backup_data: bytearray | None = None
self._backups = {backup.backup_id: backup for backup in backups}
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
return AsyncMock(spec_set=["__aiter__"])
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
self._backups[backup.backup_id] = backup
backup_stream = await open_stream()
self._backup_data = bytearray()
async for chunk in backup_stream:
self._backup_data += chunk
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return list(self._backups.values())
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
return self._backups.get(backup_id)
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file."""
async def setup_backup_integration(
hass: HomeAssistant,
with_hassio: bool = False,
configuration: ConfigType | None = None,
*,
backups: dict[str, list[AgentBackup]] | None = None,
remote_agents: list[str] | None = None,
) -> bool:
"""Set up the Backup integration."""
with patch("homeassistant.components.backup.is_hassio", return_value=with_hassio):
return await async_setup_component(hass, DOMAIN, configuration or {})
with (
patch("homeassistant.components.backup.is_hassio", return_value=with_hassio),
patch(
"homeassistant.components.backup.backup.is_hassio", return_value=with_hassio
),
):
remote_agents = remote_agents or []
platform = Mock(
async_get_backup_agents=AsyncMock(
return_value=[BackupAgentTest(agent, []) for agent in remote_agents]
),
spec_set=BackupAgentPlatformProtocol,
)
mock_platform(hass, f"{TEST_DOMAIN}.backup", platform or MockPlatform())
assert await async_setup_component(hass, TEST_DOMAIN, {})
result = await async_setup_component(hass, DOMAIN, configuration or {})
await hass.async_block_till_done()
if not backups:
return result
for agent_id, agent_backups in backups.items():
if with_hassio and agent_id == LOCAL_AGENT_ID:
continue
agent = hass.data[DATA_MANAGER].backup_agents[agent_id]
agent._backups = {backup.backup_id: backup for backup in agent_backups}
if agent_id == LOCAL_AGENT_ID:
agent._loaded_backups = True
return result
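The seeding loop above stores each agent's backups in a dict keyed by `backup_id`. Reduced to its essentials (with a hypothetical `StubBackup` in place of `AgentBackup`):

```python
from dataclasses import dataclass

# Minimal sketch of how setup_backup_integration seeds an agent's _backups
# mapping: backups are stored in a dict keyed by backup_id.
@dataclass
class StubBackup:
    backup_id: str
    name: str

agent_backups = [StubBackup("abc123", "Test"), StubBackup("def456", "Test 2")]
by_id = {backup.backup_id: backup for backup in agent_backups}
```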


@@ -0,0 +1,97 @@
"""Test fixtures for the Backup integration."""
from __future__ import annotations
from collections.abc import Generator
from pathlib import Path
from unittest.mock import MagicMock, Mock, patch
import pytest
from homeassistant.core import HomeAssistant
from .common import TEST_BACKUP_PATH_ABC123
@pytest.fixture(name="mocked_json_bytes")
def mocked_json_bytes_fixture() -> Generator[Mock]:
"""Mock json_bytes."""
with patch(
"homeassistant.components.backup.manager.json_bytes",
return_value=b"{}", # Empty JSON
) as mocked_json_bytes:
yield mocked_json_bytes
@pytest.fixture(name="mocked_tarfile")
def mocked_tarfile_fixture() -> Generator[Mock]:
"""Mock tarfile."""
with patch(
"homeassistant.components.backup.manager.SecureTarFile"
) as mocked_tarfile:
yield mocked_tarfile
@pytest.fixture(name="path_glob")
def path_glob_fixture() -> Generator[MagicMock]:
"""Mock path glob."""
with patch(
"pathlib.Path.glob", return_value=[TEST_BACKUP_PATH_ABC123]
) as path_glob:
yield path_glob
CONFIG_DIR = {
"testing_config": [
Path("test.txt"),
Path(".DS_Store"),
Path(".storage"),
Path("backups"),
Path("tmp_backups"),
Path("home-assistant_v2.db"),
],
"backups": [
Path("backups/backup.tar"),
Path("backups/not_backup"),
],
"tmp_backups": [
Path("tmp_backups/forgotten_backup.tar"),
Path("tmp_backups/not_backup"),
],
}
CONFIG_DIR_DIRS = {Path(".storage"), Path("backups"), Path("tmp_backups")}
@pytest.fixture(name="mock_backup_generation")
def mock_backup_generation_fixture(
hass: HomeAssistant, mocked_json_bytes: Mock, mocked_tarfile: Mock
) -> Generator[None]:
"""Mock backup generator."""
with (
patch("pathlib.Path.iterdir", lambda x: CONFIG_DIR.get(x.name, [])),
patch("pathlib.Path.stat", return_value=MagicMock(st_size=123)),
patch("pathlib.Path.is_file", lambda x: x not in CONFIG_DIR_DIRS),
patch("pathlib.Path.is_dir", lambda x: x in CONFIG_DIR_DIRS),
patch(
"pathlib.Path.exists",
lambda x: x
not in (
Path(hass.config.path("backups")),
Path(hass.config.path("tmp_backups")),
),
),
patch(
"pathlib.Path.is_symlink",
lambda _: False,
),
patch(
"pathlib.Path.mkdir",
MagicMock(),
),
patch(
"homeassistant.components.backup.manager.HAVERSION",
"2025.1.0",
),
):
yield
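The fixture patches `pathlib.Path` predicates with lambdas so that membership in a fixed set decides the answer. A standalone sketch of that patching pattern:

```python
from pathlib import Path
from unittest.mock import patch

# Sketch of the lambda-based patching used in mock_backup_generation:
# membership in a fixed set of paths decides what counts as a directory.
FAKE_DIRS = {Path(".storage"), Path("backups"), Path("tmp_backups")}

with (
    patch("pathlib.Path.is_dir", lambda p: p in FAKE_DIRS),
    patch("pathlib.Path.is_file", lambda p: p not in FAKE_DIRS),
):
    checks = (
        Path("backups").is_dir(),
        Path("test.txt").is_file(),
        Path("test.txt").is_dir(),
    )
```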


@@ -0,0 +1,206 @@
# serializer version: 1
# name: test_delete_backup[found_backups0-True-1]
dict({
'id': 1,
'result': dict({
'agent_errors': dict({
}),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_delete_backup[found_backups1-False-0]
dict({
'id': 1,
'result': dict({
'agent_errors': dict({
}),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_delete_backup[found_backups2-True-0]
dict({
'id': 1,
'result': dict({
'agent_errors': dict({
}),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[None]
dict({
'id': 1,
'result': dict({
'agents': list([
dict({
'agent_id': 'backup.local',
}),
]),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[None].1
dict({
'id': 2,
'result': dict({
'agent_errors': dict({
}),
'backups': list([
dict({
'addons': list([
dict({
'name': 'Test',
'slug': 'test',
'version': '1.0.0',
}),
]),
'agent_ids': list([
'backup.local',
]),
'backup_id': 'abc123',
'database_included': True,
'date': '1970-01-01T00:00:00.000Z',
'failed_agent_ids': list([
]),
'folders': list([
'media',
'share',
]),
'homeassistant_included': True,
'homeassistant_version': '2024.12.0',
'name': 'Test',
'protected': False,
'size': 0,
'with_strategy_settings': False,
}),
]),
'last_attempted_strategy_backup': None,
'last_completed_strategy_backup': None,
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect1]
dict({
'id': 1,
'result': dict({
'agents': list([
dict({
'agent_id': 'backup.local',
}),
]),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect1].1
dict({
'id': 2,
'result': dict({
'agent_errors': dict({
}),
'backups': list([
]),
'last_attempted_strategy_backup': None,
'last_completed_strategy_backup': None,
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect2]
dict({
'id': 1,
'result': dict({
'agents': list([
dict({
'agent_id': 'backup.local',
}),
]),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect2].1
dict({
'id': 2,
'result': dict({
'agent_errors': dict({
}),
'backups': list([
]),
'last_attempted_strategy_backup': None,
'last_completed_strategy_backup': None,
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect3]
dict({
'id': 1,
'result': dict({
'agents': list([
dict({
'agent_id': 'backup.local',
}),
]),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect3].1
dict({
'id': 2,
'result': dict({
'agent_errors': dict({
}),
'backups': list([
]),
'last_attempted_strategy_backup': None,
'last_completed_strategy_backup': None,
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect4]
dict({
'id': 1,
'result': dict({
'agents': list([
dict({
'agent_id': 'backup.local',
}),
]),
}),
'success': True,
'type': 'result',
})
# ---
# name: test_load_backups[side_effect4].1
dict({
'id': 2,
'result': dict({
'agent_errors': dict({
}),
'backups': list([
]),
'last_attempted_strategy_backup': None,
'last_completed_strategy_backup': None,
}),
'success': True,
'type': 'result',
})
# ---

File diff suppressed because it is too large.


@@ -0,0 +1,129 @@
"""Test the builtin backup platform."""
from __future__ import annotations
from collections.abc import Generator
from io import StringIO
import json
from pathlib import Path
from tarfile import TarError
from unittest.mock import MagicMock, mock_open, patch
import pytest
from syrupy import SnapshotAssertion
from homeassistant.components.backup import DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from .common import TEST_BACKUP_ABC123, TEST_BACKUP_PATH_ABC123
from tests.typing import ClientSessionGenerator, WebSocketGenerator
@pytest.fixture(name="read_backup")
def read_backup_fixture(path_glob: MagicMock) -> Generator[MagicMock]:
"""Mock read backup."""
with patch(
"homeassistant.components.backup.backup.read_backup",
return_value=TEST_BACKUP_ABC123,
) as read_backup:
yield read_backup
@pytest.mark.parametrize(
"side_effect",
[
None,
OSError("Boom"),
TarError("Boom"),
json.JSONDecodeError("Boom", "test", 1),
KeyError("Boom"),
],
)
async def test_load_backups(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
snapshot: SnapshotAssertion,
read_backup: MagicMock,
side_effect: Exception | None,
) -> None:
"""Test load backups."""
assert await async_setup_component(hass, DOMAIN, {})
await hass.async_block_till_done()
client = await hass_ws_client(hass)
read_backup.side_effect = side_effect
# list agents
await client.send_json_auto_id({"type": "backup/agents/info"})
assert await client.receive_json() == snapshot
# load and list backups
await client.send_json_auto_id({"type": "backup/info"})
assert await client.receive_json() == snapshot
async def test_upload(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
) -> None:
"""Test upload backup."""
assert await async_setup_component(hass, DOMAIN, {})
await hass.async_block_till_done()
client = await hass_client()
open_mock = mock_open()
with (
patch("pathlib.Path.open", open_mock),
patch("shutil.move") as move_mock,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=TEST_BACKUP_ABC123,
),
):
resp = await client.post(
"/api/backup/upload?agent_id=backup.local",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert open_mock.call_count == 1
assert move_mock.call_count == 1
assert move_mock.mock_calls[0].args[1].name == "abc123.tar"
@pytest.mark.usefixtures("read_backup")
@pytest.mark.parametrize(
("found_backups", "backup_exists", "unlink_calls"),
[
([TEST_BACKUP_PATH_ABC123], True, 1),
([TEST_BACKUP_PATH_ABC123], False, 0),
([], True, 0),
],
)
async def test_delete_backup(
hass: HomeAssistant,
caplog: pytest.LogCaptureFixture,
hass_ws_client: WebSocketGenerator,
snapshot: SnapshotAssertion,
path_glob: MagicMock,
found_backups: list[Path],
backup_exists: bool,
unlink_calls: int,
) -> None:
"""Test delete backup."""
assert await async_setup_component(hass, DOMAIN, {})
await hass.async_block_till_done()
client = await hass_ws_client(hass)
path_glob.return_value = found_backups
with (
patch("pathlib.Path.exists", return_value=backup_exists),
patch("pathlib.Path.unlink") as unlink,
):
await client.send_json_auto_id(
{"type": "backup/delete", "backup_id": TEST_BACKUP_ABC123.backup_id}
)
assert await client.receive_json() == snapshot
assert unlink.call_count == unlink_calls


@@ -7,27 +7,28 @@ from unittest.mock import patch
from aiohttp import web
import pytest
from homeassistant.components.backup.const import DATA_MANAGER
from homeassistant.core import HomeAssistant
from .common import TEST_BACKUP, setup_backup_integration
from .common import TEST_BACKUP_ABC123, BackupAgentTest, setup_backup_integration
from tests.common import MockUser
from tests.typing import ClientSessionGenerator
async def test_downloading_backup(
async def test_downloading_local_backup(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
) -> None:
"""Test downloading a backup file."""
"""Test downloading a local backup file."""
await setup_backup_integration(hass)
client = await hass_client()
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
return_value=TEST_BACKUP,
"homeassistant.components.backup.backup.CoreLocalBackupAgent.async_get_backup",
return_value=TEST_BACKUP_ABC123,
),
patch("pathlib.Path.exists", return_value=True),
patch(
@@ -35,10 +36,29 @@ async def test_downloading_backup(
return_value=web.Response(text=""),
),
):
resp = await client.get("/api/backup/download/abc123")
resp = await client.get("/api/backup/download/abc123?agent_id=backup.local")
assert resp.status == 200
async def test_downloading_remote_backup(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
) -> None:
"""Test downloading a remote backup."""
await setup_backup_integration(hass)
hass.data[DATA_MANAGER].backup_agents["domain.test"] = BackupAgentTest("test")
client = await hass_client()
with (
patch.object(BackupAgentTest, "async_download_backup") as download_mock,
):
download_mock.return_value.__aiter__.return_value = iter((b"backup data",))
resp = await client.get("/api/backup/download/abc123?agent_id=domain.test")
assert resp.status == 200
assert await resp.content.read() == b"backup data"
async def test_downloading_backup_not_found(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
@@ -48,7 +68,7 @@ async def test_downloading_backup_not_found(
client = await hass_client()
resp = await client.get("/api/backup/download/abc123")
resp = await client.get("/api/backup/download/abc123?agent_id=backup.local")
assert resp.status == 404
@@ -63,7 +83,7 @@ async def test_downloading_as_non_admin(
client = await hass_client()
resp = await client.get("/api/backup/download/abc123")
resp = await client.get("/api/backup/download/abc123?agent_id=backup.local")
assert resp.status == 401
@@ -80,7 +100,7 @@ async def test_uploading_a_backup_file(
"homeassistant.components.backup.manager.BackupManager.async_receive_backup",
) as async_receive_backup_mock:
resp = await client.post(
"/api/backup/upload",
"/api/backup/upload?agent_id=backup.local",
data={"file": StringIO("test")},
)
assert resp.status == 201
@@ -90,7 +110,7 @@ async def test_uploading_a_backup_file(
@pytest.mark.parametrize(
("error", "message"),
[
(OSError("Boom!"), "Can't write backup file Boom!"),
(OSError("Boom!"), "Can't write backup file: Boom!"),
(asyncio.CancelledError("Boom!"), ""),
],
)
@@ -110,7 +130,7 @@ async def test_error_handling_uploading_a_backup_file(
side_effect=error,
):
resp = await client.post(
"/api/backup/upload",
"/api/backup/upload?agent_id=backup.local",
data={"file": StringIO("test")},
)
assert resp.status == 500


@@ -1,15 +1,18 @@
"""Tests for the Backup integration."""
from typing import Any
from unittest.mock import patch
import pytest
from homeassistant.components.backup.const import DOMAIN
from homeassistant.components.backup.const import DATA_MANAGER, DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceNotFound
from .common import setup_backup_integration
@pytest.mark.usefixtures("supervisor_client")
async def test_setup_with_hassio(
hass: HomeAssistant,
caplog: pytest.LogCaptureFixture,
@@ -20,14 +23,14 @@ async def test_setup_with_hassio(
with_hassio=True,
configuration={DOMAIN: {}},
)
assert (
"The backup integration is not supported on this installation method, please"
" remove it from your configuration"
) in caplog.text
manager = hass.data[DATA_MANAGER]
assert not manager.backup_agents
@pytest.mark.parametrize("service_data", [None, {}])
async def test_create_service(
hass: HomeAssistant,
service_data: dict[str, Any] | None,
) -> None:
"""Test generate backup."""
await setup_backup_integration(hass)
@@ -39,6 +42,15 @@ async def test_create_service(
DOMAIN,
"create",
blocking=True,
service_data=service_data,
)
assert generate_backup.called
async def test_create_service_with_hassio(hass: HomeAssistant) -> None:
"""Test action backup.create does not exist with hassio."""
await setup_backup_integration(hass, with_hassio=True)
with pytest.raises(ServiceNotFound):
await hass.services.async_call(DOMAIN, "create", blocking=True)

File diff suppressed because it is too large.


@@ -0,0 +1,11 @@
"""Tests for the Backup integration."""
from homeassistant.components.backup import AgentBackup
from .common import TEST_BACKUP_ABC123
async def test_agent_backup_serialization() -> None:
"""Test AgentBackup serialization."""
assert AgentBackup.from_dict(TEST_BACKUP_ABC123.as_dict()) == TEST_BACKUP_ABC123

File diff suppressed because it is too large


@@ -0,0 +1,568 @@
"""Test the cloud backup platform."""
from collections.abc import AsyncGenerator, AsyncIterator, Generator
from io import StringIO
from typing import Any
from unittest.mock import MagicMock, Mock, PropertyMock, patch
from aiohttp import ClientError
from hass_nabucasa import CloudError
import pytest
from yarl import URL
from homeassistant.components.backup import (
DOMAIN as BACKUP_DOMAIN,
AddonInfo,
AgentBackup,
Folder,
)
from homeassistant.components.cloud import DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from tests.test_util.aiohttp import AiohttpClientMocker
from tests.typing import ClientSessionGenerator, WebSocketGenerator
@pytest.fixture(autouse=True)
async def setup_integration(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker, cloud: MagicMock
) -> AsyncGenerator[None]:
"""Set up cloud integration."""
with patch("homeassistant.components.backup.is_hassio", return_value=False):
assert await async_setup_component(hass, BACKUP_DOMAIN, {BACKUP_DOMAIN: {}})
assert await async_setup_component(hass, DOMAIN, {DOMAIN: {}})
await hass.async_block_till_done()
yield
@pytest.fixture
def mock_delete_file() -> Generator[MagicMock]:
"""Mock delete file."""
with patch(
"homeassistant.components.cloud.backup.async_files_delete_file",
spec_set=True,
) as delete_file:
yield delete_file
@pytest.fixture
def mock_get_download_details() -> Generator[MagicMock]:
"""Mock get download details."""
with patch(
"homeassistant.components.cloud.backup.async_files_download_details",
spec_set=True,
) as download_details:
download_details.return_value = {
"url": (
"https://blabla.cloudflarestorage.com/blabla/backup/"
"462e16810d6841228828d9dd2f9e341e.tar?X-Amz-Algorithm=blah"
),
}
yield download_details
@pytest.fixture
def mock_get_upload_details() -> Generator[MagicMock]:
"""Mock get upload details."""
with patch(
"homeassistant.components.cloud.backup.async_files_upload_details",
spec_set=True,
) as download_details:
download_details.return_value = {
"url": (
"https://blabla.cloudflarestorage.com/blabla/backup/"
"ea5c969e492c49df89d432a1483b8dc3.tar?X-Amz-Algorithm=blah"
),
"headers": {
"content-md5": "HOhSM3WZkpHRYGiz4YRGIQ==",
"x-amz-meta-storage-type": "backup",
"x-amz-meta-b64json": (
"eyJhZGRvbnMiOltdLCJiYWNrdXBfaWQiOiJjNDNiNWU2MCIsImRhdGUiOiIyMDI0LT"
"EyLTAzVDA0OjI1OjUwLjMyMDcwMy0wNTowMCIsImRhdGFiYXNlX2luY2x1ZGVkIjpm"
"YWxzZSwiZm9sZGVycyI6W10sImhvbWVhc3Npc3RhbnRfaW5jbHVkZWQiOnRydWUsIm"
"hvbWVhc3Npc3RhbnRfdmVyc2lvbiI6IjIwMjQuMTIuMC5kZXYwIiwibmFtZSI6ImVy"
"aWsiLCJwcm90ZWN0ZWQiOnRydWUsInNpemUiOjM1NjI0OTYwfQ=="
),
},
}
yield download_details
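The `x-amz-meta-b64json` header in the fixture above is base64-encoded JSON backup metadata. A small sketch of how such a value is built and read back (the `metadata` dict here is illustrative, not the fixture's exact payload):

```python
import base64
import json

# Illustrative metadata dict, mirroring the shape stored by the cloud agent.
metadata = {"backup_id": "c43b5e60", "protected": True, "size": 35624960}

# Encoding, as done when upload headers are prepared: JSON -> base64 string.
b64json = base64.b64encode(json.dumps(metadata).encode()).decode()

# Decoding reverses it, recovering the stored backup metadata.
assert json.loads(base64.b64decode(b64json)) == metadata
```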
@pytest.fixture
def mock_list_files() -> Generator[MagicMock]:
"""Mock list files."""
with patch(
"homeassistant.components.cloud.backup.async_files_list", spec_set=True
) as list_files:
list_files.return_value = [
{
"Key": "462e16810d6841228828d9dd2f9e341e.tar",
"LastModified": "2024-11-22T10:49:01.182Z",
"Size": 34519040,
"Metadata": {
"addons": [],
"backup_id": "23e64aec",
"date": "2024-11-22T11:48:48.727189+01:00",
"database_included": True,
"folders": [],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0.dev0",
"name": "Core 2024.12.0.dev0",
"protected": False,
"size": 34519040,
"storage-type": "backup",
},
}
]
yield list_files
@pytest.fixture
def cloud_logged_in(cloud: MagicMock) -> None:
"""Mock cloud logged in."""
type(cloud).is_logged_in = PropertyMock(return_value=True)
async def test_agents_info(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test backup agent info."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/agents/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
"agents": [{"agent_id": "backup.local"}, {"agent_id": "cloud.cloud"}],
}
async def test_agents_list_backups(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
cloud: MagicMock,
mock_list_files: Mock,
) -> None:
"""Test agent list backups."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/info"})
response = await client.receive_json()
mock_list_files.assert_called_once_with(cloud, storage_type="backup")
assert response["success"]
assert response["result"]["agent_errors"] == {}
assert response["result"]["backups"] == [
{
"addons": [],
"backup_id": "23e64aec",
"date": "2024-11-22T11:48:48.727189+01:00",
"database_included": True,
"folders": [],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0.dev0",
"name": "Core 2024.12.0.dev0",
"protected": False,
"size": 34519040,
"agent_ids": ["cloud.cloud"],
"failed_agent_ids": [],
"with_strategy_settings": False,
}
]
@pytest.mark.parametrize("side_effect", [ClientError, CloudError])
async def test_agents_list_backups_fail_cloud(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
cloud: MagicMock,
mock_list_files: Mock,
side_effect: Exception,
) -> None:
"""Test agent list backups when the cloud request fails."""
client = await hass_ws_client(hass)
mock_list_files.side_effect = side_effect
await client.send_json_auto_id({"type": "backup/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
"agent_errors": {"cloud.cloud": "Failed to list backups"},
"backups": [],
"last_attempted_strategy_backup": None,
"last_completed_strategy_backup": None,
}
@pytest.mark.parametrize(
("backup_id", "expected_result"),
[
(
"23e64aec",
{
"addons": [],
"backup_id": "23e64aec",
"date": "2024-11-22T11:48:48.727189+01:00",
"database_included": True,
"folders": [],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0.dev0",
"name": "Core 2024.12.0.dev0",
"protected": False,
"size": 34519040,
"agent_ids": ["cloud.cloud"],
"failed_agent_ids": [],
"with_strategy_settings": False,
},
),
(
"12345",
None,
),
],
ids=["found", "not_found"],
)
async def test_agents_get_backup(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
cloud: MagicMock,
backup_id: str,
expected_result: dict[str, Any] | None,
mock_list_files: Mock,
) -> None:
"""Test agent get backup."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/details", "backup_id": backup_id})
response = await client.receive_json()
mock_list_files.assert_called_once_with(cloud, storage_type="backup")
assert response["success"]
assert response["result"]["agent_errors"] == {}
assert response["result"]["backup"] == expected_result
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_download(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
aioclient_mock: AiohttpClientMocker,
mock_get_download_details: Mock,
) -> None:
"""Test agent download backup."""
client = await hass_client()
backup_id = "23e64aec"
aioclient_mock.get(
mock_get_download_details.return_value["url"], content=b"backup data"
)
resp = await client.get(f"/api/backup/download/{backup_id}?agent_id=cloud.cloud")
assert resp.status == 200
assert await resp.content.read() == b"backup data"
@pytest.mark.parametrize("side_effect", [ClientError, CloudError])
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_download_fail_cloud(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
mock_get_download_details: Mock,
side_effect: Exception,
) -> None:
"""Test agent download backup, when fetching download details fails."""
client = await hass_client()
backup_id = "23e64aec"
mock_get_download_details.side_effect = side_effect
resp = await client.get(f"/api/backup/download/{backup_id}?agent_id=cloud.cloud")
assert resp.status == 500
content = await resp.content.read()
assert "Failed to get download details" in content.decode()
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_download_fail_get(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
aioclient_mock: AiohttpClientMocker,
mock_get_download_details: Mock,
) -> None:
"""Test agent download backup, when the download request fails."""
client = await hass_client()
backup_id = "23e64aec"
aioclient_mock.get(mock_get_download_details.return_value["url"], status=500)
resp = await client.get(f"/api/backup/download/{backup_id}?agent_id=cloud.cloud")
assert resp.status == 500
content = await resp.content.read()
assert "Failed to download backup" in content.decode()
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_download_not_found(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
) -> None:
"""Test agent download backup raises error if not found."""
client = await hass_client()
backup_id = "1234"
resp = await client.get(f"/api/backup/download/{backup_id}?agent_id=cloud.cloud")
assert resp.status == 404
assert await resp.content.read() == b""
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_upload(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
aioclient_mock: AiohttpClientMocker,
mock_get_upload_details: Mock,
) -> None:
"""Test agent upload backup."""
client = await hass_client()
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=True,
size=0.0,
)
aioclient_mock.put(mock_get_upload_details.return_value["url"])
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
mocked_open.return_value.read = Mock(side_effect=[b"test", b""])
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO("test")},
)
assert len(aioclient_mock.mock_calls) == 1
assert aioclient_mock.mock_calls[-1][0] == "PUT"
assert aioclient_mock.mock_calls[-1][1] == URL(
mock_get_upload_details.return_value["url"]
)
assert isinstance(aioclient_mock.mock_calls[-1][2], AsyncIterator)
assert resp.status == 201
assert f"Uploading backup {backup_id}" in caplog.text
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_upload_fail_put(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
aioclient_mock: AiohttpClientMocker,
mock_get_upload_details: Mock,
) -> None:
"""Test agent upload backup fails."""
client = await hass_client()
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=True,
size=0.0,
)
aioclient_mock.put(mock_get_upload_details.return_value["url"], status=500)
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
mocked_open.return_value.read = Mock(side_effect=[b"test", b""])
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert "Error during backup upload - Failed to upload backup" in caplog.text
@pytest.mark.parametrize("side_effect", [ClientError, CloudError])
@pytest.mark.usefixtures("cloud_logged_in")
async def test_agents_upload_fail_cloud(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
mock_get_upload_details: Mock,
side_effect: Exception,
caplog: pytest.LogCaptureFixture,
) -> None:
"""Test agent upload backup, when fetching upload details fails."""
client = await hass_client()
backup_id = "test-backup"
mock_get_upload_details.side_effect = side_effect
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=True,
size=0.0,
)
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
mocked_open.return_value.read = Mock(side_effect=[b"test", b""])
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert "Error during backup upload - Failed to get upload details" in caplog.text
async def test_agents_upload_not_protected(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
) -> None:
"""Test agent upload backup, when the backup is not protected."""
client = await hass_client()
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=False,
size=0.0,
)
with (
patch("pathlib.Path.open"),
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
):
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert "Error during backup upload - Cloud backups must be protected" in caplog.text
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_delete(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
mock_delete_file: Mock,
) -> None:
"""Test agent delete backup."""
client = await hass_ws_client(hass)
backup_id = "23e64aec"
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": backup_id,
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {"agent_errors": {}}
mock_delete_file.assert_called_once()
@pytest.mark.parametrize("side_effect", [ClientError, CloudError])
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_delete_fail_cloud(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
mock_delete_file: Mock,
side_effect: Exception,
) -> None:
"""Test agent delete backup, when the cloud request fails."""
client = await hass_ws_client(hass)
backup_id = "23e64aec"
mock_delete_file.side_effect = side_effect
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": backup_id,
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
"agent_errors": {"cloud.cloud": "Failed to delete backup"}
}
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_delete_not_found(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test agent delete backup raises error if not found."""
client = await hass_ws_client(hass)
backup_id = "1234"
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": backup_id,
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {"agent_errors": {"cloud.cloud": "Backup not found"}}


@@ -533,6 +533,10 @@ def supervisor_client() -> Generator[AsyncMock]:
"homeassistant.components.hassio.addon_manager.get_supervisor_client",
return_value=supervisor_client,
),
patch(
"homeassistant.components.hassio.backup.get_supervisor_client",
return_value=supervisor_client,
),
patch(
"homeassistant.components.hassio.discovery.get_supervisor_client",
return_value=supervisor_client,


@@ -0,0 +1,403 @@
"""Test supervisor backup functionality."""
from collections.abc import AsyncGenerator, Generator
from datetime import datetime
from io import StringIO
import os
from typing import Any
from unittest.mock import AsyncMock, patch
from aiohasupervisor.models import backups as supervisor_backups
import pytest
from homeassistant.components.backup import (
DOMAIN as BACKUP_DOMAIN,
AddonInfo,
AgentBackup,
Folder,
)
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from .test_init import MOCK_ENVIRON
from tests.typing import ClientSessionGenerator, WebSocketGenerator
TEST_BACKUP = supervisor_backups.Backup(
compressed=False,
content=supervisor_backups.BackupContent(
addons=["ssl"],
folders=["share"],
homeassistant=True,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location=None,
locations={None},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
TEST_BACKUP_DETAILS = supervisor_backups.BackupComplete(
addons=[
supervisor_backups.BackupAddon(
name="Terminal & SSH",
size=0.0,
slug="core_ssh",
version="9.14.0",
)
],
compressed=TEST_BACKUP.compressed,
date=TEST_BACKUP.date,
extra=None,
folders=["share"],
homeassistant_exclude_database=False,
homeassistant="2024.12.0",
location=TEST_BACKUP.location,
locations=TEST_BACKUP.locations,
name=TEST_BACKUP.name,
protected=TEST_BACKUP.protected,
repositories=[],
size=TEST_BACKUP.size,
size_bytes=TEST_BACKUP.size_bytes,
slug=TEST_BACKUP.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP.type,
)
@pytest.fixture(autouse=True)
def fixture_supervisor_environ() -> Generator[None]:
"""Mock os environ for supervisor."""
with patch.dict(os.environ, MOCK_ENVIRON):
yield
@pytest.fixture(autouse=True)
async def setup_integration(
hass: HomeAssistant, supervisor_client: AsyncMock
) -> AsyncGenerator[None]:
"""Set up Backup integration."""
with (
patch("homeassistant.components.backup.is_hassio", return_value=True),
patch("homeassistant.components.backup.backup.is_hassio", return_value=True),
):
assert await async_setup_component(hass, BACKUP_DOMAIN, {BACKUP_DOMAIN: {}})
await hass.async_block_till_done()
yield
@pytest.mark.usefixtures("hassio_client")
async def test_agent_info(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test backup agent info."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/agents/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
"agents": [{"agent_id": "hassio.local"}],
}
@pytest.mark.usefixtures("hassio_client")
async def test_agent_list_backups(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test agent list backups."""
client = await hass_ws_client(hass)
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
await client.send_json_auto_id({"type": "backup/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"]["backups"] == [
{
"addons": [
{"name": "Terminal & SSH", "slug": "core_ssh", "version": "9.14.0"}
],
"agent_ids": ["hassio.local"],
"backup_id": "abc123",
"database_included": True,
"date": "1970-01-01T00:00:00+00:00",
"failed_agent_ids": [],
"folders": ["share"],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0",
"name": "Test",
"protected": False,
"size": 1048576,
"with_strategy_settings": False,
}
]
@pytest.mark.usefixtures("hassio_client")
async def test_agent_download(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test agent download backup."""
client = await hass_client()
backup_id = "abc123"
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.backups.download_backup.return_value.__aiter__.return_value = (
iter((b"backup data",))
)
resp = await client.get(f"/api/backup/download/{backup_id}?agent_id=hassio.local")
assert resp.status == 200
assert await resp.content.read() == b"backup data"
@pytest.mark.usefixtures("hassio_client")
async def test_agent_upload(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test agent upload backup."""
client = await hass_client()
backup_id = "test-backup"
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=False,
size=0.0,
)
supervisor_client.backups.reload.assert_not_called()
with (
patch("pathlib.Path.mkdir"),
patch("pathlib.Path.open"),
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("shutil.copy"),
):
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=hassio.local",
data={"file": StringIO("test")},
)
assert resp.status == 201
supervisor_client.backups.reload.assert_not_called()
@pytest.mark.usefixtures("hassio_client")
async def test_agent_delete_backup(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test agent delete backup."""
client = await hass_ws_client(hass)
backup_id = "abc123"
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": backup_id,
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {"agent_errors": {}}
supervisor_client.backups.remove_backup.assert_called_once_with(backup_id)
@pytest.mark.usefixtures("hassio_client")
async def test_reader_writer_create(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test generating a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_backup.return_value.job_id = "abc123"
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
await client.send_json_auto_id({"type": "backup/subscribe_events"})
response = await client.receive_json()
assert response["event"] == {"manager_state": "idle"}
response = await client.receive_json()
assert response["success"]
await client.send_json_auto_id(
{"type": "backup/generate", "agent_ids": ["hassio.local"], "name": "Test"}
)
response = await client.receive_json()
assert response["event"] == {
"manager_state": "create_backup",
"stage": None,
"state": "in_progress",
}
response = await client.receive_json()
assert response["success"]
assert response["result"] == {"backup_job_id": "abc123"}
supervisor_client.backups.partial_backup.assert_called_once_with(
supervisor_backups.PartialBackupOptions(
addons=None,
background=True,
compressed=True,
folders=None,
homeassistant_exclude_database=False,
homeassistant=True,
location={None},
name="Test",
password=None,
)
)
await client.send_json_auto_id(
{
"type": "supervisor/event",
"data": {
"event": "job",
"data": {"done": True, "uuid": "abc123", "reference": "test_slug"},
},
}
)
response = await client.receive_json()
assert response["success"]
response = await client.receive_json()
assert response["event"] == {
"manager_state": "create_backup",
"stage": "upload_to_agents",
"state": "in_progress",
}
response = await client.receive_json()
assert response["event"] == {
"manager_state": "create_backup",
"stage": None,
"state": "completed",
}
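For readability, the event sequence asserted above can be condensed into a small trace check (a sketch; only the stage/state fields from the assertions are used, and `check_trace` is an illustrative helper, not part of the integration):

```python
# Ordered (stage, state) pairs the test expects over the WS subscription.
EXPECTED_TRACE = [
    (None, "in_progress"),                # partial backup started
    ("upload_to_agents", "in_progress"),  # supervisor job done, uploading
    (None, "completed"),                  # all agents finished
]


def check_trace(events: list[dict]) -> bool:
    """Return True if the received events match the expected trace in order."""
    return [(e["stage"], e["state"]) for e in events] == EXPECTED_TRACE


events = [
    {"stage": None, "state": "in_progress"},
    {"stage": "upload_to_agents", "state": "in_progress"},
    {"stage": None, "state": "completed"},
]
assert check_trace(events)
```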
@pytest.mark.usefixtures("hassio_client")
async def test_reader_writer_restore(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
supervisor_client: AsyncMock,
) -> None:
"""Test restoring a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_restore.return_value.job_id = "abc123"
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
await client.send_json_auto_id({"type": "backup/subscribe_events"})
response = await client.receive_json()
assert response["event"] == {"manager_state": "idle"}
response = await client.receive_json()
assert response["success"]
await client.send_json_auto_id(
{"type": "backup/restore", "agent_id": "hassio.local", "backup_id": "abc123"}
)
response = await client.receive_json()
assert response["event"] == {
"manager_state": "restore_backup",
"stage": None,
"state": "in_progress",
}
supervisor_client.backups.partial_restore.assert_called_once_with(
"abc123",
supervisor_backups.PartialRestoreOptions(
addons=None,
background=True,
folders=None,
homeassistant=True,
password=None,
),
)
await client.send_json_auto_id(
{
"type": "supervisor/event",
"data": {
"event": "job",
"data": {"done": True, "uuid": "abc123"},
},
}
)
response = await client.receive_json()
assert response["success"]
response = await client.receive_json()
assert response["event"] == {"manager_state": "idle"}
response = await client.receive_json()
assert response["success"]
assert response["result"] is None
@pytest.mark.parametrize(
("parameters", "expected_error"),
[
(
{"restore_database": False},
"Cannot restore Home Assistant without database",
),
(
{"restore_homeassistant": False},
"Cannot restore database without Home Assistant",
),
],
)
@pytest.mark.usefixtures("hassio_client")
async def test_reader_writer_restore_wrong_parameters(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
supervisor_client: AsyncMock,
parameters: dict[str, Any],
expected_error: str,
) -> None:
"""Test trigger restore."""
client = await hass_ws_client(hass)
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
default_parameters = {
"type": "backup/restore",
"agent_id": "hassio.local",
"backup_id": "abc123",
}
await client.send_json_auto_id(default_parameters | parameters)
response = await client.receive_json()
assert not response["success"]
assert response["error"] == {
"code": "home_assistant_error",
"message": expected_error,
}


@@ -0,0 +1,194 @@
"""Test the Kitchen Sink backup platform."""
from collections.abc import AsyncGenerator
from io import StringIO
from unittest.mock import patch
import pytest
from homeassistant.components.backup import (
DOMAIN as BACKUP_DOMAIN,
AddonInfo,
AgentBackup,
Folder,
)
from homeassistant.components.kitchen_sink import DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from tests.typing import ClientSessionGenerator, WebSocketGenerator
@pytest.fixture(autouse=True)
async def backup_only() -> AsyncGenerator[None]:
"""Enable only the backup platform.
The backup platform is not an entity platform.
"""
with patch(
"homeassistant.components.kitchen_sink.COMPONENTS_WITH_DEMO_PLATFORM",
[],
):
yield
@pytest.fixture(autouse=True)
async def setup_integration(hass: HomeAssistant) -> AsyncGenerator[None]:
"""Set up Kitchen Sink integration."""
with patch("homeassistant.components.backup.is_hassio", return_value=False):
assert await async_setup_component(hass, BACKUP_DOMAIN, {BACKUP_DOMAIN: {}})
assert await async_setup_component(hass, DOMAIN, {DOMAIN: {}})
await hass.async_block_till_done()
yield
async def test_agents_info(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test backup agent info."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/agents/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
"agents": [{"agent_id": "backup.local"}, {"agent_id": "kitchen_sink.syncer"}],
}
async def test_agents_list_backups(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test agent list backups."""
client = await hass_ws_client(hass)
await client.send_json_auto_id({"type": "backup/info"})
response = await client.receive_json()
assert response["success"]
assert response["result"]["backups"] == [
{
"addons": [{"name": "Test", "slug": "test", "version": "1.0.0"}],
"agent_ids": ["kitchen_sink.syncer"],
"backup_id": "abc123",
"database_included": False,
"date": "1970-01-01T00:00:00Z",
"failed_agent_ids": [],
"folders": ["media", "share"],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0",
"name": "Kitchen sink syncer",
"protected": False,
"size": 1234,
"with_strategy_settings": False,
}
]
async def test_agents_download(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
) -> None:
"""Test downloading a backup."""
client = await hass_client()
resp = await client.get("/api/backup/download/abc123?agent_id=kitchen_sink.syncer")
assert resp.status == 200
assert await resp.content.read() == b"backup data"
async def test_agents_upload(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
hass_ws_client: WebSocketGenerator,
caplog: pytest.LogCaptureFixture,
hass_supervisor_access_token: str,
) -> None:
"""Test agent upload backup."""
ws_client = await hass_ws_client(hass, hass_supervisor_access_token)
client = await hass_client()
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=False,
size=0.0,
)
with (
patch("pathlib.Path.open"),
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
):
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=kitchen_sink.syncer",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert f"Uploading backup {backup_id}" in caplog.text
await ws_client.send_json_auto_id({"type": "backup/info"})
response = await ws_client.receive_json()
assert response["success"]
backup_list = response["result"]["backups"]
assert len(backup_list) == 2
assert backup_list[1] == {
"addons": [{"name": "Test", "slug": "test", "version": "1.0.0"}],
"agent_ids": ["kitchen_sink.syncer"],
"backup_id": "test-backup",
"database_included": True,
"date": "1970-01-01T00:00:00.000Z",
"failed_agent_ids": [],
"folders": ["media", "share"],
"homeassistant_included": True,
"homeassistant_version": "2024.12.0",
"name": "Test",
"protected": False,
"size": 0.0,
"with_strategy_settings": False,
}
async def test_agent_delete_backup(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
caplog: pytest.LogCaptureFixture,
) -> None:
"""Test agent delete backup."""
client = await hass_ws_client(hass)
backup_id = "abc123"
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": backup_id,
}
)
response = await client.receive_json()
assert response["success"]
assert f"Deleted backup {backup_id}" in caplog.text
await client.send_json_auto_id({"type": "backup/info"})
response = await client.receive_json()
assert response["success"]
backup_list = response["result"]["backups"]
assert not backup_list


@@ -19,7 +19,29 @@ from .common import get_test_config_dir
(
None,
'{"path": "test"}',
-backup_restore.RestoreBackupFileContent(backup_file_path=Path("test")),
+None,
),
(
None,
'{"path": "test", "password": "psw", "remove_after_restore": false, "restore_database": false, "restore_homeassistant": true}',
backup_restore.RestoreBackupFileContent(
backup_file_path=Path("test"),
password="psw",
remove_after_restore=False,
restore_database=False,
restore_homeassistant=True,
),
),
(
None,
'{"path": "test", "password": null, "remove_after_restore": true, "restore_database": true, "restore_homeassistant": false}',
backup_restore.RestoreBackupFileContent(
backup_file_path=Path("test"),
password=None,
remove_after_restore=True,
restore_database=True,
restore_homeassistant=False,
),
),
],
)
@@ -49,7 +71,11 @@ def test_restoring_backup_that_does_not_exist() -> None:
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=backup_restore.RestoreBackupFileContent(
backup_file_path=backup_file_path,
password=None,
remove_after_restore=False,
restore_database=True,
restore_homeassistant=True,
),
),
mock.patch("pathlib.Path.read_text", side_effect=FileNotFoundError),
@@ -78,7 +104,11 @@ def test_restoring_backup_that_is_not_a_file() -> None:
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=backup_restore.RestoreBackupFileContent(
backup_file_path=backup_file_path,
password=None,
remove_after_restore=False,
restore_database=True,
restore_homeassistant=True,
),
),
mock.patch("pathlib.Path.exists", return_value=True),
@@ -102,7 +132,11 @@ def test_aborting_for_older_versions() -> None:
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=backup_restore.RestoreBackupFileContent(
backup_file_path=backup_file_path,
password=None,
remove_after_restore=False,
restore_database=True,
restore_homeassistant=True,
),
),
mock.patch("securetar.SecureTarFile"),
@@ -117,14 +151,78 @@
assert backup_restore.restore_backup(config_dir) is True
@pytest.mark.parametrize(
(
"restore_backup_content",
"expected_removed_files",
"expected_removed_directories",
"expected_copied_files",
"expected_copied_trees",
),
[
(
backup_restore.RestoreBackupFileContent(
backup_file_path=None,
password=None,
remove_after_restore=False,
restore_database=True,
restore_homeassistant=True,
),
(
".HA_RESTORE",
".HA_VERSION",
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
),
("tmp_backups", "www"),
(),
("data",),
),
(
backup_restore.RestoreBackupFileContent(
backup_file_path=None,
password=None,
restore_database=False,
remove_after_restore=False,
restore_homeassistant=True,
),
(".HA_RESTORE", ".HA_VERSION"),
("tmp_backups", "www"),
(),
("data",),
),
(
backup_restore.RestoreBackupFileContent(
backup_file_path=None,
password=None,
restore_database=True,
remove_after_restore=False,
restore_homeassistant=False,
),
("home-assistant_v2.db", "home-assistant_v2.db-wal"),
(),
("home-assistant_v2.db", "home-assistant_v2.db-wal"),
(),
),
],
)
def test_removal_of_current_configuration_when_restoring(
restore_backup_content: backup_restore.RestoreBackupFileContent,
expected_removed_files: tuple[str, ...],
expected_removed_directories: tuple[str, ...],
expected_copied_files: tuple[str, ...],
expected_copied_trees: tuple[str, ...],
) -> None:
"""Test that we are removing the current configuration directory."""
config_dir = Path(get_test_config_dir())
restore_backup_content.backup_file_path = Path(config_dir, "backups", "test.tar")
mock_config_dir = [
{"path": Path(config_dir, ".HA_RESTORE"), "is_file": True},
{"path": Path(config_dir, ".HA_VERSION"), "is_file": True},
{"path": Path(config_dir, "home-assistant_v2.db"), "is_file": True},
{"path": Path(config_dir, "home-assistant_v2.db-wal"), "is_file": True},
{"path": Path(config_dir, "backups"), "is_file": False},
{"path": Path(config_dir, "tmp_backups"), "is_file": False},
{"path": Path(config_dir, "www"), "is_file": False},
]
@@ -140,12 +238,10 @@ def test_removal_of_current_configuration_when_restoring() -> None:
with (
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=restore_backup_content,
),
mock.patch("securetar.SecureTarFile"),
mock.patch("homeassistant.backup_restore.TemporaryDirectory") as temp_dir_mock,
mock.patch("homeassistant.backup_restore.HA_VERSION", "2013.09.17"),
mock.patch("pathlib.Path.read_text", _patched_path_read_text),
mock.patch("pathlib.Path.is_file", _patched_path_is_file),
@@ -154,17 +250,33 @@ def test_removal_of_current_configuration_when_restoring() -> None:
"pathlib.Path.iterdir",
return_value=[x["path"] for x in mock_config_dir],
),
mock.patch("pathlib.Path.unlink", autospec=True) as unlink_mock,
mock.patch("shutil.copy") as copy_mock,
mock.patch("shutil.copytree") as copytree_mock,
mock.patch("shutil.rmtree") as rmtree_mock,
):
temp_dir_mock.return_value.__enter__.return_value = "tmp"
assert backup_restore.restore_backup(config_dir) is True
tmp_ha = Path("tmp", "homeassistant")
assert copy_mock.call_count == len(expected_copied_files)
copied_files = {Path(call.args[0]) for call in copy_mock.mock_calls}
assert copied_files == {Path(tmp_ha, "data", f) for f in expected_copied_files}
assert copytree_mock.call_count == len(expected_copied_trees)
copied_trees = {Path(call.args[0]) for call in copytree_mock.mock_calls}
assert copied_trees == {Path(tmp_ha, t) for t in expected_copied_trees}
assert unlink_mock.call_count == len(expected_removed_files)
removed_files = {Path(call.args[0]) for call in unlink_mock.mock_calls}
assert removed_files == {Path(config_dir, f) for f in expected_removed_files}
assert rmtree_mock.call_count == len(expected_removed_directories)
removed_directories = {Path(call.args[0]) for call in rmtree_mock.mock_calls}
assert removed_directories == {
Path(config_dir, d) for d in expected_removed_directories
}
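The parametrized cases above pin down exactly which paths `restore_backup` removes for each combination of `restore_database` and `restore_homeassistant`. As a rough illustration of the rule those cases encode (a hypothetical helper, not the real `homeassistant.backup_restore` code), the selection could be sketched as:

```python
# Hypothetical sketch of the removal rule the parametrized cases exercise.
# DB_FILES / KEEP_DIRS / files_to_remove are illustrative names only.
DB_FILES = {"home-assistant_v2.db", "home-assistant_v2.db-wal"}
KEEP_DIRS = {"backups"}  # the backups directory is never purged


def files_to_remove(
    entries: list[str],
    restore_database: bool,
    restore_homeassistant: bool,
) -> set[str]:
    """Return the config-dir entries a restore would delete."""
    removed: set[str] = set()
    for name in entries:
        if name in DB_FILES:
            # Database files are only touched when the database is restored.
            if restore_database:
                removed.add(name)
        elif restore_homeassistant and name not in KEEP_DIRS:
            # Everything else goes when Home Assistant itself is restored,
            # except the preserved backups directory.
            removed.add(name)
    return removed
```

Applied to the mocked config dir above, this reproduces the three expected outcomes: all four files plus `tmp_backups`/`www` for a full restore, only the non-database entries when `restore_database=False`, and only the two database files when `restore_homeassistant=False`.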
def test_extracting_the_contents_of_a_backup_file() -> None:
@@ -177,8 +289,8 @@ def test_extracting_the_contents_of_a_backup_file() -> None:
getmembers_mock = mock.MagicMock(
return_value=[
tarfile.TarInfo(name="data"),
tarfile.TarInfo(name="data/../test"),
tarfile.TarInfo(name="data/.HA_VERSION"),
tarfile.TarInfo(name="data/.storage"),
tarfile.TarInfo(name="data/www"),
@@ -190,7 +302,11 @@ def test_extracting_the_contents_of_a_backup_file() -> None:
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=backup_restore.RestoreBackupFileContent(
backup_file_path=backup_file_path
backup_file_path=backup_file_path,
password=None,
remove_after_restore=False,
restore_database=True,
restore_homeassistant=True,
),
),
mock.patch(
@@ -205,11 +321,59 @@
mock.patch("pathlib.Path.read_text", _patched_path_read_text),
mock.patch("pathlib.Path.is_file", return_value=False),
mock.patch("pathlib.Path.iterdir", return_value=[]),
mock.patch("shutil.copytree"),
):
assert backup_restore.restore_backup(config_dir) is True
assert getmembers_mock.call_count == 1
assert extractall_mock.call_count == 2
assert {
member.name for member in extractall_mock.mock_calls[-1].kwargs["members"]
} == {"data", "data/.HA_VERSION", "data/.storage", "data/www"}
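The member-name assertion reflects securetar's path sanitising: entries whose names would escape the extraction root (`../data/test`, `data/../test`) are dropped before `extractall`. A rough stand-in for that filter (assumed behaviour for illustration, not securetar's actual implementation):

```python
from pathlib import PurePosixPath
import tarfile


def safe_members(members: list[tarfile.TarInfo]) -> list[tarfile.TarInfo]:
    """Keep only members that cannot write outside the extraction root.

    Illustrative filter: drops absolute paths and any path containing
    a ".." component.
    """
    return [
        m
        for m in members
        if ".." not in PurePosixPath(m.name).parts
        and not PurePosixPath(m.name).is_absolute()
    ]


names = [
    "../data/test",
    "data",
    "data/../test",
    "data/.HA_VERSION",
    "data/.storage",
    "data/www",
]
filtered = {m.name for m in safe_members([tarfile.TarInfo(name=n) for n in names])}
```

For the member names mocked in the test, this yields the same set the assertion expects.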
@pytest.mark.parametrize(
("remove_after_restore", "unlink_calls"), [(True, 1), (False, 0)]
)
def test_remove_backup_file_after_restore(
remove_after_restore: bool, unlink_calls: int
) -> None:
"""Test removing a backup file after restore."""
config_dir = Path(get_test_config_dir())
backup_file_path = Path(config_dir, "backups", "test.tar")
with (
mock.patch(
"homeassistant.backup_restore.restore_backup_file_content",
return_value=backup_restore.RestoreBackupFileContent(
backup_file_path=backup_file_path,
password=None,
remove_after_restore=remove_after_restore,
restore_database=True,
restore_homeassistant=True,
),
),
mock.patch("homeassistant.backup_restore._extract_backup"),
mock.patch("pathlib.Path.unlink", autospec=True) as mock_unlink,
):
assert backup_restore.restore_backup(config_dir) is True
assert mock_unlink.call_count == unlink_calls
for call in mock_unlink.mock_calls:
assert call.args[0] == backup_file_path
@pytest.mark.parametrize(
("password", "expected"),
[
("test", b"\xf0\x9b\xb9\x1f\xdc,\xff\xd5x\xd6\xd6\x8fz\x19.\x0f"),
("lorem ipsum...", b"#\xe0\xfc\xe0\xdb?_\x1f,$\rQ\xf4\xf5\xd8\xfb"),
],
)
def test_pw_to_key(password: str | None, expected: bytes | None) -> None:
"""Test password to key conversion."""
assert backup_restore.password_to_key(password) == expected
def test_pw_to_key_none() -> None:
"""Test password to key conversion."""
with pytest.raises(AttributeError):
backup_restore.password_to_key(None)
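The two `test_pw_to_key` cases imply that `password_to_key` derives a deterministic 16-byte AES key from the UTF-8 password, and that it calls `.encode()` directly (hence `AttributeError` for `None`). A sketch of a Supervisor-style derivation consistent with those tests (assumed; verify against `homeassistant.backup_restore` before relying on the exact scheme):

```python
import hashlib


def password_to_key(password: str) -> bytes:
    """Derive a 16-byte key: 100 rounds of SHA-256 over the UTF-8 password.

    Sketch of the assumed Supervisor-compatible scheme; a None password
    fails on .encode() with AttributeError, matching test_pw_to_key_none.
    """
    key: bytes = password.encode()
    for _ in range(100):
        key = hashlib.sha256(key).digest()
    return key[:16]
```

Under this scheme the key length is always 16 bytes and distinct passwords yield distinct keys, which is what the parametrized expectations encode.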