Mirror of https://github.com/home-assistant/core.git (synced 2026-04-08 16:35:17 +00:00)

Compare commits: 1 commit (`knx-expose` ... `adjust_dev`)
| Author | SHA1 | Date |
|---|---|---|
|  | b4a117e6b1 |  |
@@ -195,7 +195,6 @@ GITHUB_USER=$(gh api user --jq .login 2>/dev/null || git remote get-url "$PUSH_R
# Create PR (gh pr create pushes the branch automatically)
gh pr create --repo home-assistant/core --base dev \
    --head "$GITHUB_USER:$BRANCH" \
    --draft \
    --title "TITLE_HERE" \
    --body "$(cat <<'EOF'
BODY_HERE
@@ -1,7 +1,6 @@
---
name: github-pr-reviewer
description: Reviews GitHub pull requests and provides feedback comments.
disallowedTools: Write, Edit
description: Review a GitHub pull request and provide feedback comments. Use when the user says "review the current PR" or asks to review a specific PR.
---

# Review GitHub Pull Request
@@ -3,27 +3,54 @@ name: Home Assistant Integration knowledge
description: Everything you need to know to build, test and review Home Assistant Integrations. If you're looking at an integration, you must use this as your primary reference.
---

## File Locations
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`

## General guidelines
## Integration Templates

- When looking for examples, prefer integrations with the platinum or gold quality scale level first.
- Polling intervals are NOT user-configurable. Never add scan_interval, update_interval, or polling frequency options to config flows or config entries.
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in the UI. Exception: helper integrations may allow custom names.
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py          # Entry point with async_setup_entry
├── manifest.json        # Integration metadata and dependencies
├── const.py             # Domain and constants
├── config_flow.py       # UI configuration flow
├── coordinator.py       # Data update coordinator (if needed)
├── entity.py            # Base entity class (if shared patterns)
├── sensor.py            # Sensor platform
├── strings.json         # User-facing text and translations
├── services.yaml        # Service definitions (if applicable)
└── quality_scale.yaml   # Quality scale rule status
```

The following platforms have extra guidelines:
An integration can have platforms as needed (e.g., `sensor.py`, `switch.py`, etc.). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues

### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry` (see the sketch after this list)
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
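As a reference point, here is a minimal sketch of the `__init__.py` entry points from the checklist. `MyClient` stands in for a hypothetical dependency client; adjust the platform list to the integration.

```python
"""Minimal integration entry point (sketch)."""

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant

from my_lib import MyClient  # hypothetical dependency client

PLATFORMS: list[Platform] = [Platform.SENSOR]

type MyConfigEntry = ConfigEntry[MyClient]


async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up the integration from a config entry."""
    client = MyClient(entry.data[CONF_HOST])
    entry.runtime_data = client  # Non-persistent runtime storage
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True


async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
```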
## Integration Quality Scale

- When validating the quality scale rules, check them at https://developers.home-assistant.io/docs/core/integration-quality-scale/rules
- When implementing or reviewing an integration, always consider the quality scale rules, since they promote best practices.
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:

Template scale file: `./script/scaffold/templates/integration/integration/quality_scale.yaml`
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards

### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection

### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
@@ -34,7 +61,726 @@ Template scale file: `./script/scaffold/templates/integration/integration/qualit
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation

### Example `quality_scale.yaml` Structure
```yaml
rules:
  # Bronze (mandatory)
  config-flow: done
  entity-unique-id: done
  action-setup:
    status: exempt
    comment: Integration does not register custom actions.

  # Silver (if targeting Silver+)
  entity-unavailable: done
  parallel-updates: done

  # Gold (if targeting Gold+)
  devices: done
  diagnostics: done

  # Platinum (if targeting Platinum)
  strict-typing: done
```

**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization

### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
  - `homeassistant/components/{domain}/const.py` - Constants
  - `homeassistant/components/{domain}/models.py` - Data models
  - `homeassistant/components/{domain}/coordinator.py` - Update coordinator
  - `homeassistant/components/{domain}/config_flow.py` - Configuration flow
  - `homeassistant/components/{domain}/{platform}.py` - Platform implementations

### Common Modules
- **coordinator.py**: Centralize data fetching logic
  ```python
  class MyCoordinator(DataUpdateCoordinator[MyData]):
      def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
          super().__init__(
              hass,
              logger=LOGGER,
              name=DOMAIN,
              update_interval=timedelta(minutes=1),
              config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
          )
  ```
- **entity.py**: Base entity definitions to reduce duplication
  ```python
  class MyEntity(CoordinatorEntity[MyCoordinator]):
      _attr_has_entity_name = True
  ```

### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
  ```python
  type MyIntegrationConfigEntry = ConfigEntry[MyClient]

  async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
      client = MyClient(entry.data[CONF_HOST])
      entry.runtime_data = client
  ```

### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify the connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)

### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
  ```python
  await self.async_set_unique_id(device_unique_id)
  self._abort_if_unique_id_configured()
  ```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)

### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
  ```json
  {
    "domain": "my_integration",
    "name": "My Integration",
    "codeowners": ["@me"]
  }
  ```

### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching; see the sketch below for bridging a sync library during migration
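When a required library is not yet fully async, blocking calls can be bridged through the executor while migrating; a minimal sketch, assuming a hypothetical synchronous `LegacyClient`:

```python
from homeassistant.core import HomeAssistant


async def async_fetch(hass: HomeAssistant, client: "LegacyClient") -> dict:
    """Run a blocking library call without stalling the event loop."""
    # LegacyClient.get_data is a hypothetical synchronous method.
    return await hass.async_add_executor_job(client.get_data)
```

For Platinum, the goal remains a natively async dependency; the executor is a stopgap, not the end state.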
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
  ```python
  async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
      """Set up integration from config entry."""
      client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
  ```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)

### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
  ```python
  class MyCoordinator(DataUpdateCoordinator):
      def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
          super().__init__(
              hass,
              logger=LOGGER,
              name=DOMAIN,
              update_interval=timedelta(minutes=5),
              config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
          )
          self.client = client

      async def _async_update_data(self):
          try:
              return await self.client.fetch_data()
          except ApiError as err:
              raise UpdateFailed(f"API communication error: {err}") from err
  ```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass the `config_entry` parameter to the coordinator - it's accepted and recommended
## Integration Guidelines

### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via the UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
  - Connection-critical config: Store in `ConfigEntry.data`
  - Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
  - ❌ Do NOT allow users to set config entry names in config flows
  - Names are automatically generated or can be customized later in the UI
  - ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test the device/service connection during the config flow:
  ```python
  try:
      await client.get_data()
  except MyException:
      errors["base"] = "cannot_connect"
  ```
- **Duplicate Prevention**: Prevent duplicate configurations:
  ```python
  # Using unique ID
  await self.async_set_unique_id(identifier)
  self._abort_if_unique_id_configured()

  # Using unique data
  self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
  ```

### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in the config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify the account matches the existing unique ID:
  ```python
  await self.async_set_unique_id(user_id)
  self._abort_if_unique_id_mismatch(reason="wrong_account")
  return self.async_update_reload_and_abort(
      self._get_reauth_entry(),
      data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]},
  )
  ```

### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing the device
- **Implementation**: Add an `async_step_reconfigure` method (a sketch follows below)
- **Validation**: Prevent changing the underlying account with `_abort_if_unique_id_mismatch`
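A minimal sketch of such a step, assuming a host-based flow; `fetch_device_id` is a hypothetical validation helper, while `_get_reconfigure_entry` and `async_update_reload_and_abort` are the standard flow helpers:

```python
async def async_step_reconfigure(
    self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
    """Handle reconfiguration of an existing entry."""
    if user_input is not None:
        # Re-validate the new connection details before saving them.
        device_id = await fetch_device_id(user_input[CONF_HOST])  # hypothetical helper
        await self.async_set_unique_id(device_id)
        self._abort_if_unique_id_mismatch(reason="wrong_device")
        return self.async_update_reload_and_abort(
            self._get_reconfigure_entry(),
            data_updates=user_input,
        )
    return self.async_show_form(
        step_id="reconfigure",
        data_schema=vol.Schema({vol.Required(CONF_HOST): str}),
    )
```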
### Device Discovery
- **Manifest Configuration**: Add the discovery method (zeroconf, dhcp, etc.)
  ```json
  {
    "zeroconf": ["_mydevice._tcp.local."]
  }
  ```
- **Discovery Handler**: Implement the appropriate `async_step_*` method:
  ```python
  async def async_step_zeroconf(self, discovery_info):
      """Handle zeroconf discovery."""
      await self.async_set_unique_id(discovery_info.properties["serialno"])
      self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
  ```
- **Network Updates**: Use discovery to update dynamic IP addresses

### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
  ```python
  aiozc = await zeroconf.async_get_async_instance(hass)
  ```
- **SSDP Discovery**: Register callbacks with cleanup
  ```python
  entry.async_on_unload(
      ssdp.async_register_callback(
          hass, _async_discovered_device,
          {"st": "urn:schemas-upnp-org:device:ZonePlayer:1"},
      )
  )
  ```

### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use the shared scanner instance
  ```python
  scanner = bluetooth.async_get_scanner(hass)
  entry.async_on_unload(
      bluetooth.async_register_callback(
          hass, _async_discovered_device,
          {"service_uuid": "example_uuid"},
          bluetooth.BluetoothScanningMode.ACTIVE,
      )
  )
  ```
- **Connection Handling**: Never reuse `BleakClient` instances; use 10+ second timeouts (see the sketch below)
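A minimal sketch of that connection rule, creating a fresh `BleakClient` per operation; the characteristic UUID is illustrative:

```python
from bleak import BleakClient


async def read_value(ble_device, char_uuid: str) -> bytes:
    """Connect with a fresh client and a generous timeout."""
    # New client per connection - never cache or reuse a BleakClient.
    async with BleakClient(ble_device, timeout=10.0) as client:
        return await client.read_gatt_char(char_uuid)
```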
### Setup Validation
- **Test Before Setup**: Verify the integration can be set up in `async_setup_entry`
- **Exception Handling** (see the sketch below):
  - `ConfigEntryNotReady`: Device offline or temporary failure
  - `ConfigEntryAuthFailed`: Authentication issues
  - `ConfigEntryError`: Unresolvable setup problems
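A minimal sketch of mapping library errors onto these exceptions during setup; `MyClient`, `MyAuthError`, and `MyConnectError` are hypothetical library names:

```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Validate the connection before finishing setup."""
    client = MyClient(entry.data[CONF_HOST])  # hypothetical client
    try:
        await client.get_data()
    except MyAuthError as err:
        # Starts the reauthentication flow.
        raise ConfigEntryAuthFailed("Invalid credentials") from err
    except MyConnectError as err:
        # Home Assistant retries setup automatically with backoff.
        raise ConfigEntryNotReady(f"Device unreachable: {err}") from err
    entry.runtime_data = client
    return True
```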
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
  ```python
  async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
      """Unload a config entry."""
      if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
          entry.runtime_data.listener()  # Clean up resources
      return unload_ok
  ```

### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
  ```python
  async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
      async def service_action(call: ServiceCall) -> ServiceResponse:
          if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
              raise ServiceValidationError("Entry not found")
          if entry.state is not ConfigEntryState.LOADED:
              raise ServiceValidationError("Entry not loaded")
  ```
- **Exception Handling**: Raise appropriate exceptions:
  ```python
  # For invalid input
  if end_date < start_date:
      raise ServiceValidationError("End date must be after start date")

  # For service errors
  try:
      await client.set_schedule(start_date, end_date)
  except MyConnectionError as err:
      raise HomeAssistantError("Could not connect to the schedule") from err
  ```

### Service Registration Patterns
- **Entity Services**: Register on platform setup
  ```python
  platform.async_register_entity_service(
      "my_entity_service",
      {vol.Required("parameter"): cv.string},
      "handle_service_method",
  )
  ```
- **Service Schema**: Always validate input
  ```python
  SERVICE_SCHEMA = vol.Schema({
      vol.Required("entity_id"): cv.entity_ids,
      vol.Required("parameter"): cv.string,
      vol.Optional("timeout", default=30): cv.positive_int,
  })
  ```
- **Services File**: Create `services.yaml` with descriptions and field definitions

### Polling
- Use the update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input (see the sketch below)
- **Minimum Intervals**:
  - Local network: 5 seconds
  - Cloud services: 60 seconds
- **Parallel Updates**: Specify the number of concurrent updates:
  ```python
  PARALLEL_UPDATES = 1  # Serialize updates to prevent overwhelming device
  # OR
  PARALLEL_UPDATES = 0  # Unlimited (for coordinator-based or read-only)
  ```
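A minimal sketch of an integration-chosen interval, slowing polling while a hypothetical device reports it is idle:

```python
async def _async_update_data(self) -> MyData:
    """Fetch data and let the integration pick the next interval."""
    data = await self.client.fetch_data()  # hypothetical client call
    # The integration, never the user, decides the cadence.
    self.update_interval = (
        timedelta(minutes=5) if data.idle else timedelta(seconds=30)
    )
    return data
```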
## Entity Development

### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include the integration domain or platform in the ID
- **Implementation**:
  ```python
  class MySensor(SensorEntity):
      def __init__(self, device_id: str) -> None:
          self._attr_unique_id = f"{device_id}_temperature"
  ```

**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from the device registry; see the sketch below)
- Physical identifiers (printed/EEPROM)
- Config entry ID as a last resort: `f"{entry.entry_id}-battery"`

**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
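A minimal sketch of the MAC-based variant in a discovery step; `discovery_info.macaddress` is whatever raw MAC the discovery source provides:

```python
from homeassistant.helpers.device_registry import format_mac

# format_mac normalizes any common MAC notation to a canonical form,
# so the same adapter always produces the same unique ID.
await self.async_set_unique_id(format_mac(discovery_info.macaddress))
self._abort_if_unique_id_configured()
```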
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When a lambda exceeds the line length, wrap it in parentheses for readability
- **Bad pattern**:
  ```python
  SensorEntityDescription(
      key="temperature",
      name="Temperature",
      value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None,  # ❌ Too long
  )
  ```
- **Good pattern**:
  ```python
  SensorEntityDescription(
      key="temperature",
      name="Temperature",
      value_fn=lambda data: (  # ✅ Parenthesis on same line as lambda
          round(data["temp_value"] * 1.8 + 32, 1)
          if data.get("temp_value") is not None
          else None
      ),
  )
  ```

### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
  ```python
  class MySensor(SensorEntity):
      _attr_has_entity_name = True

      def __init__(self, device: Device, field: str) -> None:
          self._attr_device_info = DeviceInfo(
              identifiers={(DOMAIN, device.id)},
              name=device.name,
          )
          self._attr_name = field  # e.g., "temperature", "humidity"
  ```
- **For the device itself**: Set `_attr_name = None` (see the sketch below)
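A minimal sketch of the device-name case, for the entity that represents the device's main feature:

```python
class MyMainSensor(SensorEntity):
    """Main entity of the device."""

    _attr_has_entity_name = True
    _attr_name = None  # The entity takes the device name verbatim.
```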
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
  ```python
  async def async_added_to_hass(self) -> None:
      """Subscribe to events."""
      self.async_on_remove(
          self.client.events.subscribe("my_event", self._handle_event)
      )
  ```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods

### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement the `available` property instead of using an "unavailable" state

### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from the device/service
- **Coordinator Pattern**:
  ```python
  @property
  def available(self) -> bool:
      """Return if entity is available."""
      return super().available and self.identifier in self.coordinator.data
  ```
- **Direct Update Pattern**:
  ```python
  async def async_update(self) -> None:
      """Update entity."""
      try:
          data = await self.client.get_data()
      except MyException:
          self._attr_available = False
      else:
          self._attr_available = True
          self._attr_native_value = data.value
  ```

### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes (a sketch follows below)
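A minimal sketch, assuming hypothetical `rssi` and `last_error` fields on the coordinator data; note that every key is present on every update:

```python
@property
def extra_state_attributes(self) -> dict[str, Any]:
    """Return additional details with a stable set of keys."""
    data = self.coordinator.data.get(self.identifier)
    return {
        "rssi": data.rssi if data else None,  # hypothetical field
        "last_error": data.last_error if data else None,  # hypothetical field
    }
```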
## Device Management

### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
  ```python
  _attr_device_info = DeviceInfo(
      connections={(CONNECTION_NETWORK_MAC, device.mac)},
      identifiers={(DOMAIN, device.id)},
      name=device.name,
      manufacturer="My Company",
      model="My Sensor",
      sw_version=device.version,
  )
  ```
- For services: Add `entry_type=DeviceEntryType.SERVICE`

### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
  ```python
  def _check_device() -> None:
      current_devices = set(coordinator.data)
      new_devices = current_devices - known_devices
      if new_devices:
          known_devices.update(new_devices)
          async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])

  entry.async_on_unload(coordinator.async_add_listener(_check_device))
  ```

### Stale Device Removal
- **Auto-remove**: When devices disappear from the hub/account
- **Device Registry Update**:
  ```python
  device_registry.async_update_device(
      device_id=device.id,
      remove_config_entry_id=self.config_entry.entry_id,
  )
  ```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed (see the sketch below)
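A minimal sketch of that hook, allowing manual deletion only for devices the coordinator no longer reports; the identifier layout is illustrative:

```python
async def async_remove_config_entry_device(
    hass: HomeAssistant, config_entry: MyConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
    """Allow removal of devices that are no longer reported."""
    coordinator = config_entry.runtime_data
    return not any(
        identifier
        for identifier in device_entry.identifiers
        if identifier[0] == DOMAIN and identifier[1] in coordinator.data
    )
```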
### Entity Categories
- **Required**: Assign an appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
  ```python
  class MySensor(SensorEntity):
      _attr_entity_category = EntityCategory.DIAGNOSTIC
  ```
- Categories include: `DIAGNOSTIC` for system/technical information

### Device Classes
- **Use When Available**: Set the appropriate device class for the entity type
  ```python
  class MyTemperatureSensor(SensorEntity):
      _attr_device_class = SensorDeviceClass.TEMPERATURE
  ```
- Provides context for: unit conversion, voice control, UI representation

### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
  ```python
  class MySignalStrengthSensor(SensorEntity):
      _attr_entity_registry_enabled_default = False
  ```
- Target: frequently changing states, technical diagnostics

### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
  ```python
  class MySensor(SensorEntity):
      _attr_has_entity_name = True
      _attr_translation_key = "phase_voltage"
  ```
- Create `strings.json` with translations:
  ```json
  {
    "entity": {
      "sensor": {
        "phase_voltage": {
          "name": "Phase voltage"
        }
      }
    }
  }
  ```

### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
  ```python
  raise ServiceValidationError(
      translation_domain=DOMAIN,
      translation_key="end_date_before_start_date",
  )
  ```
- Add to `strings.json`:
  ```json
  {
    "exceptions": {
      "end_date_before_start_date": {
        "message": "The end date cannot be before the start date."
      }
    }
  }
  ```

### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
  ```json
  {
    "entity": {
      "sensor": {
        "tree_pollen": {
          "default": "mdi:tree",
          "state": {
            "high": "mdi:tree-outline"
          }
        }
      }
    }
  }
  ```
- **Range-based Icons** (for numeric values):
  ```json
  {
    "entity": {
      "sensor": {
        "battery_level": {
          "default": "mdi:battery-unknown",
          "range": {
            "0": "mdi:battery-outline",
            "90": "mdi:battery-90",
            "100": "mdi:battery"
          }
        }
      }
    }
  }
  ```
## Testing Requirements

- Tests should avoid interacting with or mocking internal integration details. For more info, see https://developers.home-assistant.io/docs/development_testing/#writing-tests-for-integrations
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
  - Use pytest fixtures from `tests.common`
  - Mock all external dependencies
  - Use snapshots for complex data structures
  - Follow existing test patterns

### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Patch Boundaries**: Only patch library or client methods when testing config flows. Do not patch methods defined in `config_flow.py`; exercise the flow logic end-to-end.
- **Test Scenarios** (a duplicate-entry sketch follows below):
  - All flow initiation methods (user, discovery, import)
  - Successful configuration paths
  - Error recovery scenarios
  - Prevention of duplicate entries
  - Flow completion after errors
  - Reauthentication/reconfigure flows
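A minimal sketch of the duplicate-entry scenario, in the same style as the templates below; `TEST_USER_INPUT` and `mock_config_entry` are assumed fixtures/constants:

```python
async def test_duplicate_entry_aborts(hass, mock_api, mock_config_entry):
    """Test that configuring an already-known device aborts."""
    mock_config_entry.add_to_hass(hass)  # Existing entry with the same unique ID

    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.ABORT
    assert result["reason"] == "already_configured"
```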
### Testing
- **Integration-specific tests** (recommended):
  ```bash
  pytest ./tests/components/<integration_domain> \
      --cov=homeassistant.components.<integration_domain> \
      --cov-report term-missing \
      --durations-min=1 \
      --durations=0 \
      --numprocesses=auto
  ```

### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
    """Test successful user flow."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    assert result["type"] == FlowResultType.FORM
    assert result["step_id"] == "user"

    # Test form submission
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.CREATE_ENTRY
    assert result["title"] == "My Device"
    assert result["data"] == TEST_USER_INPUT


async def test_flow_connection_error(hass, mock_api_error):
    """Test connection error handling."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.FORM
    assert result["errors"] == {"base": "cannot_connect"}
```

### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
    """Overridden fixture to specify platforms to test."""
    return [Platform.SENSOR]  # Or another specific platform as needed.


@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
    hass: HomeAssistant,
    snapshot: SnapshotAssertion,
    entity_registry: er.EntityRegistry,
    device_registry: dr.DeviceRegistry,
    mock_config_entry: MockConfigEntry,
) -> None:
    """Test the sensor entities."""
    await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)

    # Ensure entities are correctly assigned to device
    device_entry = device_registry.async_get_device(
        identifiers={(DOMAIN, "device_unique_id")}
    )
    assert device_entry
    entity_entries = er.async_entries_for_config_entry(
        entity_registry, mock_config_entry.entry_id
    )
    for entity_entry in entity_entries:
        assert entity_entry.device_id == device_entry.id
```

### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
    """Return the default mocked config entry."""
    return MockConfigEntry(
        title="My Integration",
        domain=DOMAIN,
        data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
        unique_id="device_unique_id",
    )


@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
    """Return a mocked device API."""
    with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
        api = api_mock.return_value
        api.get_data.return_value = MyDeviceData.from_json(
            load_fixture("device_data.json", DOMAIN)
        )
        yield api


@pytest.fixture
def platforms() -> list[Platform]:
    """Fixture to specify platforms to test."""
    return PLATFORMS


@pytest.fixture
async def init_integration(
    hass: HomeAssistant,
    mock_config_entry: MockConfigEntry,
    mock_device_api: MagicMock,
    platforms: list[Platform],
) -> MockConfigEntry:
    """Set up the integration for testing."""
    mock_config_entry.add_to_hass(hass)

    with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
        await hass.config_entries.async_setup(mock_config_entry.entry_id)
        await hass.async_block_till_done()

    return mock_config_entry
```
## Debugging & Troubleshooting

### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context

### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")

# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data)  # Use lazy logging
```

### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration

# Validate quality scale
# Check quality_scale.yaml against current rules

# Run integration tests with coverage
pytest ./tests/components/my_integration \
    --cov=homeassistant.components.my_integration \
    --cov-report term-missing
```
@@ -3,4 +3,17 @@
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.

- **Required**: Implement diagnostic data collection
- **Implementation**:
  ```python
  TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]

  async def async_get_config_entry_diagnostics(
      hass: HomeAssistant, entry: MyConfigEntry
  ) -> dict[str, Any]:
      """Return diagnostics for a config entry."""
      return {
          "entry_data": async_redact_data(entry.data, TO_REDACT),
          "data": entry.runtime_data.data,
      }
  ```
- **Security**: Never expose passwords, tokens, or sensitive coordinates
@@ -8,6 +8,29 @@ Platform exists as `homeassistant/components/<domain>/repairs.py`.
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
  ```python
  ir.async_create_issue(
      hass,
      DOMAIN,
      "outdated_version",
      is_fixable=False,
      issue_domain=DOMAIN,
      severity=ir.IssueSeverity.ERROR,
      translation_key="outdated_version",
  )
  ```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
  ```json
  {
    "issues": {
      "outdated_version": {
        "title": "Device firmware is outdated",
        "description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
      }
    }
  }
  ```
- **String Content Must Include**:
  - What the problem is
  - Why it matters
@@ -18,4 +41,15 @@ Platform exists as `homeassistant/components/<domain>/repairs.py`.
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
  ```python
  ir.async_create_issue(
      hass, DOMAIN, "issue_id",
      breaks_in_ha_version="2024.1.0",
      is_fixable=True,
      is_persistent=True,
      severity=ir.IssueSeverity.ERROR,
      translation_key="issue_description",
  )
  ```
- Only create issues for problems users can potentially resolve
.github/copilot-instructions.md (vendored, 5 changes)
@@ -11,9 +11,10 @@

This repository contains the core of Home Assistant, a Python 3 based home automation application.

## Git Commit Guidelines
## Code Review Guidelines

- **Do NOT amend, squash, or rebase commits that have already been pushed to the PR branch after the PR is opened** - Reviewers need to follow the commit history, as well as see what changed since their last review
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review

## Development Commands
.github/workflows/builder.yml (vendored, 6 changes)
@@ -47,6 +47,10 @@ jobs:
        with:
          python-version-file: ".python-version"

      - name: Get information
        id: info
        uses: home-assistant/actions/helpers/info@master # zizmor: ignore[unpinned-uses]

      - name: Get version
        id: version
        uses: home-assistant/actions/helpers/version@master # zizmor: ignore[unpinned-uses]
@@ -338,7 +342,7 @@ jobs:
      registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
    steps:
      - name: Install Cosign
        uses: sigstore/cosign-installer@cad07c2e89fa2edd6e2d7bab4c1aa38e53f76003 # v4.1.1
        uses: sigstore/cosign-installer@ba7bc0a3fef59531c69a25acd34668d6d3fe6f22 # v4.1.0
        with:
          cosign-release: "v2.5.3"
.github/workflows/ci.yaml (vendored, 6 changes)
@@ -1392,7 +1392,7 @@ jobs:
          pattern: coverage-*
      - name: Upload coverage to Codecov
        if: needs.info.outputs.test_full_suite == 'true'
        uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v6.0.0
        uses: codecov/codecov-action@1af58845a975a7985b0beb0cbe6fbbb71a41dbad # v5.5.3
        with:
          fail_ci_if_error: true
          flags: full-suite
@@ -1563,7 +1563,7 @@ jobs:
          pattern: coverage-*
      - name: Upload coverage to Codecov
        if: needs.info.outputs.test_full_suite == 'false'
        uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v6.0.0
        uses: codecov/codecov-action@1af58845a975a7985b0beb0cbe6fbbb71a41dbad # v5.5.3
        with:
          fail_ci_if_error: true
          token: ${{ secrets.CODECOV_TOKEN }} # zizmor: ignore[secrets-outside-env]
@@ -1591,7 +1591,7 @@ jobs:
        with:
          pattern: test-results-*
      - name: Upload test results to Codecov
        uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v6.0.0
        uses: codecov/codecov-action@1af58845a975a7985b0beb0cbe6fbbb71a41dbad # v5.5.3
        with:
          report_type: test_results
          fail_ci_if_error: true
.github/workflows/codeql.yml (vendored, 4 changes)
@@ -28,11 +28,11 @@ jobs:
          persist-credentials: false

      - name: Initialize CodeQL
        uses: github/codeql-action/init@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
        uses: github/codeql-action/init@0d579ffd059c29b07949a3cce3983f0780820c98 # v4.32.6
        with:
          languages: python

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
        uses: github/codeql-action/analyze@0d579ffd059c29b07949a3cce3983f0780820c98 # v4.32.6
        with:
          category: "/language:python"

@@ -87,13 +87,6 @@ repos:
      language: script
      types: [text]
      files: ^(homeassistant/.+/manifest\.json|homeassistant/brands/.+\.json|pyproject\.toml|\.pre-commit-config\.yaml|script/gen_requirements_all\.py)$
    - id: gen_copilot_instructions
      name: gen_copilot_instructions
      entry: script/run-in-env.sh python3 -m script.gen_copilot_instructions
      pass_filenames: false
      language: script
      types: [text]
      files: ^(AGENTS\.md|\.claude/skills/(?!github-pr-reviewer/).+/SKILL\.md|\.github/copilot-instructions\.md|script/gen_copilot_instructions\.py)$
    - id: hassfest
      name: hassfest
      entry: script/run-in-env.sh python3 -m script.hassfest
@@ -174,7 +174,6 @@ homeassistant.components.dnsip.*
homeassistant.components.doorbird.*
homeassistant.components.dormakaba_dkey.*
homeassistant.components.downloader.*
homeassistant.components.dropbox.*
homeassistant.components.droplet.*
homeassistant.components.dsmr.*
homeassistant.components.duckdns.*
@@ -2,9 +2,10 @@

This repository contains the core of Home Assistant, a Python 3 based home automation application.

## Git Commit Guidelines
## Code Review Guidelines

- **Do NOT amend, squash, or rebase commits that have already been pushed to the PR branch after the PR is opened** - Reviewers need to follow the commit history, as well as see what changed since their last review
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review

## Development Commands
CODEOWNERS (generated, 17 changes)
@@ -37,13 +37,6 @@ build.json @home-assistant/supervisor
# Other code
/homeassistant/scripts/check_config.py @kellerza

# Agent Configurations
AGENTS.md @home-assistant/core
CLAUDE.md @home-assistant/core
/.agent/ @home-assistant/core
/.claude/ @home-assistant/core
/.gemini/ @home-assistant/core

# Integrations
/homeassistant/components/abode/ @shred86
/tests/components/abode/ @shred86
@@ -408,8 +401,6 @@ CLAUDE.md @home-assistant/core
/tests/components/dremel_3d_printer/ @tkdrob
/homeassistant/components/drop_connect/ @ChandlerSystems @pfrazer
/tests/components/drop_connect/ @ChandlerSystems @pfrazer
/homeassistant/components/dropbox/ @bdr99
/tests/components/dropbox/ @bdr99
/homeassistant/components/droplet/ @sarahseidman
/tests/components/droplet/ @sarahseidman
/homeassistant/components/dsmr/ @Robbie1221
@@ -1263,8 +1254,8 @@ CLAUDE.md @home-assistant/core
/tests/components/openuv/ @bachya
/homeassistant/components/openweathermap/ @fabaff @freekode @nzapponi @wittypluck
/tests/components/openweathermap/ @fabaff @freekode @nzapponi @wittypluck
/homeassistant/components/opnsense/ @HarlemSquirrel @Snuffy2
/tests/components/opnsense/ @HarlemSquirrel @Snuffy2
/homeassistant/components/opnsense/ @mtreinish
/tests/components/opnsense/ @mtreinish
/homeassistant/components/opower/ @tronikos
/tests/components/opower/ @tronikos
/homeassistant/components/oralb/ @bdraco @Lash-L
@@ -1308,8 +1299,6 @@ CLAUDE.md @home-assistant/core
/tests/components/pi_hole/ @shenxn
/homeassistant/components/picnic/ @corneyl @codesalatdev
/tests/components/picnic/ @corneyl @codesalatdev
/homeassistant/components/picotts/ @rooggiieerr
/tests/components/picotts/ @rooggiieerr
/homeassistant/components/ping/ @jpbede
/tests/components/ping/ @jpbede
/homeassistant/components/plaato/ @JohNan
@@ -1875,8 +1864,6 @@ CLAUDE.md @home-assistant/core
/tests/components/vicare/ @CFenner
/homeassistant/components/victron_ble/ @rajlaud
/tests/components/victron_ble/ @rajlaud
/homeassistant/components/victron_gx/ @tomer-w
/tests/components/victron_gx/ @tomer-w
/homeassistant/components/victron_remote_monitoring/ @AndyTempel
/tests/components/victron_remote_monitoring/ @AndyTempel
/homeassistant/components/vilfo/ @ManneW
@@ -1,5 +1,5 @@
{
  "domain": "victron",
  "name": "Victron",
  "integrations": ["victron_gx", "victron_ble", "victron_remote_monitoring"]
  "integrations": ["victron_ble", "victron_remote_monitoring"]
}
@@ -1,7 +1,11 @@
"""The Actron Air integration."""

from actron_neo_api import ActronAirAPI, ActronAirAPIError, ActronAirAuthError
from actron_neo_api.models.system import ActronAirSystemInfo
from actron_neo_api import (
    ActronAirACSystem,
    ActronAirAPI,
    ActronAirAPIError,
    ActronAirAuthError,
)

from homeassistant.const import CONF_API_TOKEN, Platform
from homeassistant.core import HomeAssistant
@@ -21,7 +25,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) ->
    """Set up Actron Air integration from a config entry."""

    api = ActronAirAPI(refresh_token=entry.data[CONF_API_TOKEN])
    systems: list[ActronAirSystemInfo] = []
    systems: list[ActronAirACSystem] = []

    try:
        systems = await api.get_ac_systems()
@@ -32,17 +36,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) ->
            translation_key="auth_error",
        ) from err
    except ActronAirAPIError as err:
        raise ConfigEntryNotReady(
            translation_domain=DOMAIN,
            translation_key="setup_connection_error",
        ) from err
        raise ConfigEntryNotReady from err

    system_coordinators: dict[str, ActronAirSystemCoordinator] = {}
    for system in systems:
        coordinator = ActronAirSystemCoordinator(hass, entry, api, system)
        _LOGGER.debug("Setting up coordinator for system: %s", system.serial)
        _LOGGER.debug("Setting up coordinator for system: %s", system["serial"])
        await coordinator.async_config_entry_first_refresh()
        system_coordinators[system.serial] = coordinator
        system_coordinators[system["serial"]] = coordinator

    entry.runtime_data = ActronAirRuntimeData(
        api=api,
@@ -18,7 +18,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity, ActronAirZoneEntity, actron_air_command
from .entity import ActronAirAcEntity, ActronAirZoneEntity, handle_actron_api_errors

PARALLEL_UPDATES = 0

@@ -136,19 +136,19 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
        """Return the target temperature."""
        return self._status.user_aircon_settings.temperature_setpoint_cool_c

    @actron_air_command
    @handle_actron_api_errors
    async def async_set_fan_mode(self, fan_mode: str) -> None:
        """Set a new fan mode."""
        api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode)
        await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)

    @actron_air_command
    @handle_actron_api_errors
    async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
        """Set the HVAC mode."""
        ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode)
        await self._status.ac_system.set_system_mode(ac_mode)

    @actron_air_command
    @handle_actron_api_errors
    async def async_set_temperature(self, **kwargs: Any) -> None:
        """Set the temperature."""
        temp = kwargs.get(ATTR_TEMPERATURE)
@@ -212,13 +212,13 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
        """Return the target temperature."""
        return self._zone.temperature_setpoint_cool_c

    @actron_air_command
    @handle_actron_api_errors
    async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
        """Set the HVAC mode."""
        is_enabled = hvac_mode != HVACMode.OFF
        await self._zone.enable(is_enabled)

    @actron_air_command
    @handle_actron_api_errors
    async def async_set_temperature(self, **kwargs: Any) -> None:
        """Set the temperature."""
        await self._zone.set_temperature(temperature=kwargs.get(ATTR_TEMPERATURE))
@@ -38,10 +38,10 @@ class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
            _LOGGER.error("OAuth2 flow failed: %s", err)
            return self.async_abort(reason="oauth2_error")

        self._device_code = device_code_response.device_code
        self._user_code = device_code_response.user_code
        self._verification_uri = device_code_response.verification_uri_complete
        self._expires_minutes = str(device_code_response.expires_in // 60)
        self._device_code = device_code_response["device_code"]
        self._user_code = device_code_response["user_code"]
        self._verification_uri = device_code_response["verification_uri_complete"]
        self._expires_minutes = str(device_code_response["expires_in"] // 60)

        async def _wait_for_authorization() -> None:
            """Wait for the user to authorize the device."""
@@ -6,12 +6,12 @@ from dataclasses import dataclass
from datetime import timedelta

from actron_neo_api import (
    ActronAirACSystem,
    ActronAirAPI,
    ActronAirAPIError,
    ActronAirAuthError,
    ActronAirStatus,
)
from actron_neo_api.models.system import ActronAirSystemInfo

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -38,7 +38,7 @@ class ActronAirRuntimeData:
type ActronAirConfigEntry = ConfigEntry[ActronAirRuntimeData]


class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirStatus]):
class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirACSystem]):
    """System coordinator for Actron Air integration."""

    def __init__(
@@ -46,7 +46,7 @@ class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirStatus]):
        hass: HomeAssistant,
        entry: ActronAirConfigEntry,
        api: ActronAirAPI,
        system: ActronAirSystemInfo,
        system: ActronAirACSystem,
    ) -> None:
        """Initialize the coordinator."""
        super().__init__(
@@ -57,7 +57,7 @@ class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirStatus]):
            config_entry=entry,
        )
        self.system = system
        self.serial_number = system.serial
        self.serial_number = system["serial"]
        self.api = api
        self.status = self.api.state_manager.get_status(self.serial_number)
        self.last_seen = dt_util.utcnow()
@@ -1,35 +0,0 @@
"""Diagnostics support for Actron Air."""

from __future__ import annotations

from typing import Any

from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_API_TOKEN
from homeassistant.core import HomeAssistant

from .coordinator import ActronAirConfigEntry

TO_REDACT = {CONF_API_TOKEN, "master_serial", "serial_number", "serial"}


async def async_get_config_entry_diagnostics(
    hass: HomeAssistant,
    entry: ActronAirConfigEntry,
) -> dict[str, Any]:
    """Return diagnostics for a config entry."""
    coordinators: dict[int, Any] = {}
    for idx, coordinator in enumerate(entry.runtime_data.system_coordinators.values()):
        coordinators[idx] = {
            "system": async_redact_data(
                coordinator.system.model_dump(mode="json"), TO_REDACT
            ),
            "status": async_redact_data(
                coordinator.data.model_dump(mode="json", exclude={"last_known_state"}),
                TO_REDACT,
            ),
        }
    return {
        "entry_data": async_redact_data(entry.data, TO_REDACT),
        "coordinators": coordinators,
    }
@@ -14,14 +14,10 @@ from .const import DOMAIN
from .coordinator import ActronAirSystemCoordinator


def actron_air_command[_EntityT: ActronAirEntity, **_P](
def handle_actron_api_errors[_EntityT: ActronAirEntity, **_P](
    func: Callable[Concatenate[_EntityT, _P], Coroutine[Any, Any, Any]],
) -> Callable[Concatenate[_EntityT, _P], Coroutine[Any, Any, None]]:
    """Decorator for Actron Air API calls.

    Handles ActronAirAPIError exceptions, and requests a coordinator update
    to update the status of the devices as soon as possible.
    """
    """Decorate Actron Air API calls to handle ActronAirAPIError exceptions."""

    @wraps(func)
    async def wrapper(self: _EntityT, *args: _P.args, **kwargs: _P.kwargs) -> None:
@@ -34,7 +30,6 @@ def actron_air_command[_EntityT: ActronAirEntity, **_P](
            translation_key="api_error",
            translation_placeholders={"error": str(err)},
        ) from err
        self.coordinator.async_set_updated_data(self.coordinator.data)

    return wrapper
@@ -13,5 +13,5 @@
  "integration_type": "hub",
  "iot_class": "cloud_polling",
  "quality_scale": "silver",
  "requirements": ["actron-neo-api==0.5.0"]
  "requirements": ["actron-neo-api==0.4.1"]
}
@@ -41,7 +41,7 @@ rules:

  # Gold
  devices: done
  diagnostics: done
  diagnostics: todo
  discovery-update-info:
    status: exempt
    comment: This integration uses DHCP discovery, however is cloud polling. Therefore there is no information to update.
@@ -64,7 +64,7 @@ rules:
    status: exempt
    comment: Not required for this integration at this stage.
  entity-translations: todo
  exception-translations: done
  exception-translations: todo
  icon-translations: todo
  reconfiguration-flow: todo
  repair-issues:
@@ -55,9 +55,6 @@
    "auth_error": {
      "message": "Authentication failed, please reauthenticate"
    },
    "setup_connection_error": {
      "message": "Failed to connect to the Actron Air API"
    },
    "update_error": {
      "message": "An error occurred while retrieving data from the Actron Air API: {error}"
    }
@@ -10,7 +10,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity, actron_air_command
from .entity import ActronAirAcEntity, handle_actron_api_errors

PARALLEL_UPDATES = 0

@@ -105,12 +105,12 @@ class ActronAirSwitch(ActronAirAcEntity, SwitchEntity):
        """Return true if the switch is on."""
        return self.entity_description.is_on_fn(self.coordinator)

    @actron_air_command
    @handle_actron_api_errors
    async def async_turn_on(self, **kwargs: Any) -> None:
        """Turn the switch on."""
        await self.entity_description.set_fn(self.coordinator, True)

    @actron_air_command
    @handle_actron_api_errors
    async def async_turn_off(self, **kwargs: Any) -> None:
        """Turn the switch off."""
        await self.entity_description.set_fn(self.coordinator, False)
@@ -74,8 +74,7 @@ async def _resolve_attachments(
        resolved_attachments.append(
            conversation.Attachment(
                media_content_id=media_content_id,
                mime_type=attachment.get("media_content_type")
                or image_data.content_type,
                mime_type=image_data.content_type,
                path=temp_filename,
            )
        )
@@ -90,7 +89,7 @@ async def _resolve_attachments(
        resolved_attachments.append(
            conversation.Attachment(
                media_content_id=media_content_id,
                mime_type=attachment.get("media_content_type") or media.mime_type,
                mime_type=media.mime_type,
                path=media.path,
            )
        )
@@ -33,21 +33,14 @@ from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .coordinator import (
    AirOSConfigEntry,
    AirOSDataUpdateCoordinator,
    AirOSFirmwareUpdateCoordinator,
    AirOSRuntimeData,
)
from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator

_PLATFORMS: list[Platform] = [
    Platform.BINARY_SENSOR,
    Platform.BUTTON,
    Platform.SENSOR,
    Platform.UPDATE,
]

_LOGGER = logging.getLogger(__name__)

@@ -93,20 +86,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo

    airos_device = airos_class(**conn_data)

    data_coordinator = AirOSDataUpdateCoordinator(
        hass, entry, device_data, airos_device
    )
    await data_coordinator.async_config_entry_first_refresh()
    coordinator = AirOSDataUpdateCoordinator(hass, entry, device_data, airos_device)
    await coordinator.async_config_entry_first_refresh()

    firmware_coordinator: AirOSFirmwareUpdateCoordinator | None = None
    if device_data["fw_major"] >= 8:
        firmware_coordinator = AirOSFirmwareUpdateCoordinator(hass, entry, airos_device)
        await firmware_coordinator.async_config_entry_first_refresh()

    entry.runtime_data = AirOSRuntimeData(
        status=data_coordinator,
        firmware=firmware_coordinator,
    )
    entry.runtime_data = coordinator

    await hass.config_entries.async_forward_entry_setups(entry, _PLATFORMS)
@@ -87,7 +87,7 @@ async def async_setup_entry(
|
||||
async_add_entities: AddConfigEntryEntitiesCallback,
|
||||
) -> None:
|
||||
"""Set up the AirOS binary sensors from a config entry."""
|
||||
coordinator = config_entry.runtime_data.status
|
||||
coordinator = config_entry.runtime_data
|
||||
|
||||
entities = [
|
||||
AirOSBinarySensor(coordinator, description)
|
||||
|
||||
@@ -31,9 +31,7 @@ async def async_setup_entry(
|
||||
async_add_entities: AddConfigEntryEntitiesCallback,
|
||||
) -> None:
|
||||
"""Set up the AirOS button from a config entry."""
|
||||
async_add_entities(
|
||||
[AirOSRebootButton(config_entry.runtime_data.status, REBOOT_BUTTON)]
|
||||
)
|
||||
async_add_entities([AirOSRebootButton(config_entry.runtime_data, REBOOT_BUTTON)])
|
||||
|
||||
|
||||
class AirOSRebootButton(AirOSEntity, ButtonEntity):
|
||||
|
||||
@@ -5,7 +5,6 @@ from datetime import timedelta
|
||||
DOMAIN = "airos"
|
||||
|
||||
SCAN_INTERVAL = timedelta(minutes=1)
|
||||
UPDATE_SCAN_INTERVAL = timedelta(days=1)
|
||||
|
||||
MANUFACTURER = "Ubiquiti"
|
||||
|
||||
|
||||
@@ -2,10 +2,7 @@
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from collections.abc import Awaitable, Callable
|
||||
from dataclasses import dataclass
|
||||
import logging
|
||||
from typing import Any, TypeVar
|
||||
|
||||
from airos.airos6 import AirOS6, AirOS6Data
|
||||
from airos.airos8 import AirOS8, AirOS8Data
|
||||
@@ -22,61 +19,20 @@ from homeassistant.core import HomeAssistant
|
||||
from homeassistant.exceptions import ConfigEntryAuthFailed
|
||||
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
|
||||
|
||||
from .const import DOMAIN, SCAN_INTERVAL, UPDATE_SCAN_INTERVAL
|
||||
from .const import DOMAIN, SCAN_INTERVAL
|
||||
|
||||
_LOGGER = logging.getLogger(__name__)
|
||||
|
||||
type AirOSDeviceDetect = AirOS8 | AirOS6
|
||||
type AirOSDataDetect = AirOS8Data | AirOS6Data
|
||||
type AirOSUpdateData = dict[str, Any]
|
||||
AirOSDeviceDetect = AirOS8 | AirOS6
|
||||
AirOSDataDetect = AirOS8Data | AirOS6Data
|
||||
|
||||
type AirOSConfigEntry = ConfigEntry[AirOSRuntimeData]
|
||||
|
||||
T = TypeVar("T", bound=AirOSDataDetect | AirOSUpdateData)
|
||||
|
||||
|
||||
@dataclass
|
||||
class AirOSRuntimeData:
|
||||
"""Data for AirOS config entry."""
|
||||
|
||||
status: AirOSDataUpdateCoordinator
|
||||
firmware: AirOSFirmwareUpdateCoordinator | None
|
||||
|
||||
|
||||
async def async_fetch_airos_data(
|
||||
airos_device: AirOSDeviceDetect,
|
||||
update_method: Callable[[], Awaitable[T]],
|
||||
) -> T:
|
||||
"""Fetch data from AirOS device."""
|
||||
try:
|
||||
await airos_device.login()
|
||||
return await update_method()
|
||||
except AirOSConnectionAuthenticationError as err:
|
||||
_LOGGER.exception("Error authenticating with airOS device")
|
||||
raise ConfigEntryAuthFailed(
|
||||
translation_domain=DOMAIN, translation_key="invalid_auth"
|
||||
) from err
|
||||
except (
|
||||
AirOSConnectionSetupError,
|
||||
AirOSDeviceConnectionError,
|
||||
TimeoutError,
|
||||
) as err:
|
||||
_LOGGER.error("Error connecting to airOS device: %s", err)
|
||||
raise UpdateFailed(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="cannot_connect",
|
||||
) from err
|
||||
except AirOSDataMissingError as err:
|
||||
_LOGGER.error("Expected data not returned by airOS device: %s", err)
|
||||
raise UpdateFailed(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="error_data_missing",
|
||||
) from err
|
||||
type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]
|
||||
|
||||
|
||||
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
|
||||
"""Class to manage fetching AirOS status data from single endpoint."""
|
||||
"""Class to manage fetching AirOS data from single endpoint."""
|
||||
|
||||
airos_device: AirOSDeviceDetect
|
||||
config_entry: AirOSConfigEntry
|
||||
|
||||
def __init__(
|
||||
@@ -98,33 +54,28 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
|
||||
)
|
||||
|
||||
async def _async_update_data(self) -> AirOSDataDetect:
|
||||
"""Fetch status data from AirOS."""
|
||||
return await async_fetch_airos_data(self.airos_device, self.airos_device.status)
|
||||
|
||||
|
||||
class AirOSFirmwareUpdateCoordinator(DataUpdateCoordinator[AirOSUpdateData]):
|
||||
"""Class to manage fetching AirOS firmware."""
|
||||
|
||||
config_entry: AirOSConfigEntry
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
hass: HomeAssistant,
|
||||
config_entry: AirOSConfigEntry,
|
||||
airos_device: AirOSDeviceDetect,
|
||||
) -> None:
|
||||
"""Initialize the coordinator."""
|
||||
self.airos_device = airos_device
|
||||
super().__init__(
|
||||
hass,
|
||||
_LOGGER,
|
||||
config_entry=config_entry,
|
||||
name=DOMAIN,
|
||||
update_interval=UPDATE_SCAN_INTERVAL,
|
||||
)
|
||||
|
||||
async def _async_update_data(self) -> AirOSUpdateData:
|
||||
"""Fetch firmware data from AirOS."""
|
||||
return await async_fetch_airos_data(
|
||||
self.airos_device, self.airos_device.update_check
|
||||
)
|
||||
"""Fetch data from AirOS."""
|
||||
try:
|
||||
await self.airos_device.login()
|
||||
return await self.airos_device.status()
|
||||
except AirOSConnectionAuthenticationError as err:
|
||||
_LOGGER.exception("Error authenticating with airOS device")
|
||||
raise ConfigEntryAuthFailed(
|
||||
translation_domain=DOMAIN, translation_key="invalid_auth"
|
||||
) from err
|
||||
except (
|
||||
AirOSConnectionSetupError,
|
||||
AirOSDeviceConnectionError,
|
||||
TimeoutError,
|
||||
) as err:
|
||||
_LOGGER.error("Error connecting to airOS device: %s", err)
|
||||
raise UpdateFailed(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="cannot_connect",
|
||||
) from err
|
||||
except AirOSDataMissingError as err:
|
||||
_LOGGER.error("Expected data not returned by airOS device: %s", err)
|
||||
raise UpdateFailed(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="error_data_missing",
|
||||
) from err
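
The coordinator hunk above factors the duplicated login-and-fetch error handling of both coordinators into a single `async_fetch_airos_data` helper whose return type follows the update method passed in. A minimal standalone sketch of that shape (the helper and argument names here are illustrative stand-ins, not the real `airos` client API):

```python
from collections.abc import Awaitable, Callable
from typing import TypeVar

T = TypeVar("T")


async def fetch_after_login(
    login: Callable[[], Awaitable[None]],
    update_method: Callable[[], Awaitable[T]],
) -> T:
    """Log in first, then run the given update call.

    The TypeVar preserves the concrete return type, so a status fetch and
    a firmware-check fetch can share one helper without losing typing.
    """
    await login()
    return await update_method()
```

Both `_async_update_data` implementations can then collapse to one-liners such as `return await fetch_after_login(device.login, device.status)`, with the exception-to-`UpdateFailed` translation living in one place.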

@@ -29,15 +29,5 @@ async def async_get_config_entry_diagnostics(
    """Return diagnostics for a config entry."""
    return {
        "entry_data": async_redact_data(entry.data, TO_REDACT_HA),
        "data": {
            "status_data": async_redact_data(
                entry.runtime_data.status.data.to_dict(), TO_REDACT_AIROS
            ),
            "firmware_data": async_redact_data(
                entry.runtime_data.firmware.data
                if entry.runtime_data.firmware is not None
                else {},
                TO_REDACT_AIROS,
            ),
        },
        "data": async_redact_data(entry.runtime_data.data.to_dict(), TO_REDACT_AIROS),
    }

@@ -180,7 +180,7 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the AirOS sensors from a config entry."""
    coordinator = config_entry.runtime_data.status
    coordinator = config_entry.runtime_data

    entities = [AirOSSensor(coordinator, description) for description in COMMON_SENSORS]

@@ -206,12 +206,6 @@
    },
    "reboot_failed": {
      "message": "The device did not accept the reboot request. Try again, or check your device web interface for errors."
    },
    "update_connection_authentication_error": {
      "message": "Authentication or connection failed during firmware update"
    },
    "update_error": {
      "message": "Connection failed during firmware update"
    }
  }
}

@@ -1,101 +0,0 @@
"""AirOS update component for Home Assistant."""

from __future__ import annotations

import logging
from typing import Any

from airos.exceptions import AirOSConnectionAuthenticationError, AirOSException

from homeassistant.components.update import (
    UpdateDeviceClass,
    UpdateEntity,
    UpdateEntityFeature,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .const import DOMAIN
from .coordinator import (
    AirOSConfigEntry,
    AirOSDataUpdateCoordinator,
    AirOSFirmwareUpdateCoordinator,
)
from .entity import AirOSEntity

PARALLEL_UPDATES = 0

_LOGGER = logging.getLogger(__name__)


async def async_setup_entry(
    hass: HomeAssistant,
    config_entry: AirOSConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the AirOS update entity from a config entry."""
    runtime_data = config_entry.runtime_data

    if runtime_data.firmware is None:  # Unsupported device
        return
    async_add_entities([AirOSUpdateEntity(runtime_data.status, runtime_data.firmware)])


class AirOSUpdateEntity(AirOSEntity, UpdateEntity):
    """Update entity for AirOS firmware updates."""

    _attr_device_class = UpdateDeviceClass.FIRMWARE
    _attr_supported_features = UpdateEntityFeature.INSTALL

    def __init__(
        self,
        status: AirOSDataUpdateCoordinator,
        firmware: AirOSFirmwareUpdateCoordinator,
    ) -> None:
        """Initialize the AirOS update entity."""
        super().__init__(status)
        self.status = status
        self.firmware = firmware

        self._attr_unique_id = f"{status.data.derived.mac}_firmware_update"

    @property
    def installed_version(self) -> str | None:
        """Return the installed firmware version."""
        return self.status.data.host.fwversion

    @property
    def latest_version(self) -> str | None:
        """Return the latest firmware version."""
        if not self.firmware.data.get("update", False):
            return self.status.data.host.fwversion
        return self.firmware.data.get("version")

    @property
    def release_url(self) -> str | None:
        """Return the release url of the latest firmware."""
        return self.firmware.data.get("changelog")

    async def async_install(
        self,
        version: str | None,
        backup: bool,
        **kwargs: Any,
    ) -> None:
        """Handle the firmware update installation."""
        _LOGGER.debug("Starting firmware update")
        try:
            await self.status.airos_device.login()
            await self.status.airos_device.download()
            await self.status.airos_device.install()
        except AirOSConnectionAuthenticationError as err:
            raise HomeAssistantError(
                translation_domain=DOMAIN,
                translation_key="update_connection_authentication_error",
            ) from err
        except AirOSException as err:
            raise HomeAssistantError(
                translation_domain=DOMAIN,
                translation_key="update_error",
            ) from err

@@ -11,12 +11,12 @@
    "user": {
      "data": {
        "tracked_apps": "Apps",
        "tracked_custom_integrations": "Community integrations",
        "tracked_custom_integrations": "Custom integrations",
        "tracked_integrations": "Integrations"
      },
      "data_description": {
        "tracked_apps": "Select the apps you want to track",
        "tracked_custom_integrations": "Select the community integrations you want to track",
        "tracked_custom_integrations": "Select the custom integrations you want to track",
        "tracked_integrations": "Select the integrations you want to track"
      }
    }
@@ -31,7 +31,7 @@
      "unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
    },
    "custom_integrations": {
      "name": "{custom_integration_domain} (community)",
      "name": "{custom_integration_domain} (custom)",
      "unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
    },
    "total_active_installations": {

@@ -92,7 +92,6 @@ class AnglianWaterUpdateCoordinator(DataUpdateCoordinator[None]):
            _LOGGER.debug("Updating statistics for the first time")
            usage_sum = 0.0
            last_stats_time = None
            allow_update_last_stored_hour = False
        else:
            if not meter.readings or len(meter.readings) == 0:
                _LOGGER.debug("No recent usage statistics found, skipping update")
@@ -108,7 +107,6 @@ class AnglianWaterUpdateCoordinator(DataUpdateCoordinator[None]):
                continue
            start = dt_util.as_local(parsed_read_at) - timedelta(hours=1)
            _LOGGER.debug("Getting statistics at %s", start)
            stats: dict[str, list[Any]] = {}
            for end in (start + timedelta(seconds=1), None):
                stats = await get_instance(self.hass).async_add_executor_job(
                    statistics_during_period,
@@ -129,28 +127,15 @@ class AnglianWaterUpdateCoordinator(DataUpdateCoordinator[None]):
                        "Not found, trying to find oldest statistic after %s",
                        start,
                    )
            assert stats

            if not stats or not stats.get(usage_statistic_id):
                _LOGGER.debug(
                    "Could not find existing statistics during period lookup for %s, "
                    "falling back to last stored statistic",
                    usage_statistic_id,
                )
                allow_update_last_stored_hour = True
                last_records = last_stat[usage_statistic_id]
                usage_sum = float(last_records[0].get("sum") or 0.0)
                last_stats_time = last_records[0]["start"]
            else:
                allow_update_last_stored_hour = False
                records = stats[usage_statistic_id]

                def _safe_get_sum(records: list[Any]) -> float:
                    if records and "sum" in records[0]:
                        return float(records[0]["sum"])
                    return 0.0

                usage_sum = _safe_get_sum(records)
                last_stats_time = records[0]["start"]
            usage_sum = _safe_get_sum(stats.get(usage_statistic_id, []))
            last_stats_time = stats[usage_statistic_id][0]["start"]

        usage_statistics = []

@@ -163,13 +148,7 @@ class AnglianWaterUpdateCoordinator(DataUpdateCoordinator[None]):
                )
                continue
            start = dt_util.as_local(parsed_read_at) - timedelta(hours=1)
            if last_stats_time is not None and (
                start.timestamp() < last_stats_time
                or (
                    start.timestamp() == last_stats_time
                    and not allow_update_last_stored_hour
                )
            ):
            if last_stats_time is not None and start.timestamp() <= last_stats_time:
                continue
            usage_state = max(0, read["consumption"] / 1000)
            usage_sum = max(0, read["read"])

@@ -2,15 +2,19 @@

from __future__ import annotations

from homeassistant.config_entries import ConfigSubentry
import anthropic

from homeassistant.config_entries import ConfigEntry, ConfigSubentry
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import (
    config_validation as cv,
    device_registry as dr,
    entity_registry as er,
    issue_registry as ir,
)
from homeassistant.helpers.httpx_client import get_async_client
from homeassistant.helpers.typing import ConfigType

from .const import (
@@ -20,11 +24,12 @@ from .const import (
    DOMAIN,
    LOGGER,
)
from .coordinator import AnthropicConfigEntry, AnthropicCoordinator

PLATFORMS = (Platform.AI_TASK, Platform.CONVERSATION)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)

type AnthropicConfigEntry = ConfigEntry[anthropic.AsyncClient]


async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up Anthropic."""
@@ -34,9 +39,29 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:

async def async_setup_entry(hass: HomeAssistant, entry: AnthropicConfigEntry) -> bool:
    """Set up Anthropic from a config entry."""
    coordinator = AnthropicCoordinator(hass, entry)
    await coordinator.async_config_entry_first_refresh()
    entry.runtime_data = coordinator
    client = anthropic.AsyncAnthropic(
        api_key=entry.data[CONF_API_KEY], http_client=get_async_client(hass)
    )
    try:
        await client.models.list(timeout=10.0)
    except anthropic.AuthenticationError as err:
        raise ConfigEntryAuthFailed(
            translation_domain=DOMAIN,
            translation_key="api_authentication_error",
            translation_placeholders={"message": err.message},
        ) from err
    except anthropic.AnthropicError as err:
        raise ConfigEntryNotReady(
            translation_domain=DOMAIN,
            translation_key="api_error",
            translation_placeholders={
                "message": err.message
                if isinstance(err, anthropic.APIError)
                else str(err)
            },
        ) from err

    entry.runtime_data = client

    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

@@ -48,12 +48,10 @@ from .const import (
    CONF_CODE_EXECUTION,
    CONF_MAX_TOKENS,
    CONF_PROMPT,
    CONF_PROMPT_CACHING,
    CONF_RECOMMENDED,
    CONF_TEMPERATURE,
    CONF_THINKING_BUDGET,
    CONF_THINKING_EFFORT,
    CONF_TOOL_SEARCH,
    CONF_WEB_SEARCH,
    CONF_WEB_SEARCH_CITY,
    CONF_WEB_SEARCH_COUNTRY,
@@ -67,9 +65,7 @@ from .const import (
    DOMAIN,
    NON_ADAPTIVE_THINKING_MODELS,
    NON_THINKING_MODELS,
    TOOL_SEARCH_UNSUPPORTED_MODELS,
    WEB_SEARCH_UNSUPPORTED_MODELS,
    PromptCaching,
)

if TYPE_CHECKING:
@@ -360,16 +356,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
                CONF_TEMPERATURE,
                default=DEFAULT[CONF_TEMPERATURE],
            ): NumberSelector(NumberSelectorConfig(min=0, max=1, step=0.05)),
            vol.Optional(
                CONF_PROMPT_CACHING,
                default=DEFAULT[CONF_PROMPT_CACHING],
            ): SelectSelector(
                SelectSelectorConfig(
                    options=[x.value for x in PromptCaching],
                    translation_key=CONF_PROMPT_CACHING,
                    mode=SelectSelectorMode.DROPDOWN,
                )
            ),
        }

        if user_input is not None:
@@ -468,16 +454,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
            self.options.pop(CONF_WEB_SEARCH_COUNTRY, None)
            self.options.pop(CONF_WEB_SEARCH_TIMEZONE, None)

        if not model.startswith(tuple(TOOL_SEARCH_UNSUPPORTED_MODELS)):
            step_schema[
                vol.Optional(
                    CONF_TOOL_SEARCH,
                    default=DEFAULT[CONF_TOOL_SEARCH],
                )
            ] = bool
        else:
            self.options.pop(CONF_TOOL_SEARCH, None)

        if not step_schema:
            user_input = {}

@@ -1,6 +1,5 @@
"""Constants for the Anthropic integration."""

from enum import StrEnum
import logging

DOMAIN = "anthropic"
@@ -14,11 +13,9 @@ CONF_PROMPT = "prompt"
CONF_CHAT_MODEL = "chat_model"
CONF_CODE_EXECUTION = "code_execution"
CONF_MAX_TOKENS = "max_tokens"
CONF_PROMPT_CACHING = "prompt_caching"
CONF_TEMPERATURE = "temperature"
CONF_THINKING_BUDGET = "thinking_budget"
CONF_THINKING_EFFORT = "thinking_effort"
CONF_TOOL_SEARCH = "tool_search"
CONF_WEB_SEARCH = "web_search"
CONF_WEB_SEARCH_USER_LOCATION = "user_location"
CONF_WEB_SEARCH_MAX_USES = "web_search_max_uses"
@@ -27,31 +24,20 @@ CONF_WEB_SEARCH_REGION = "region"
CONF_WEB_SEARCH_COUNTRY = "country"
CONF_WEB_SEARCH_TIMEZONE = "timezone"


class PromptCaching(StrEnum):
    """Prompt caching options."""

    OFF = "off"
    PROMPT = "prompt"
    AUTOMATIC = "automatic"


MIN_THINKING_BUDGET = 1024

DEFAULT = {
    CONF_CHAT_MODEL: "claude-haiku-4-5",
    CONF_CODE_EXECUTION: False,
    CONF_MAX_TOKENS: 3000,
    CONF_PROMPT_CACHING: PromptCaching.PROMPT.value,
    CONF_TEMPERATURE: 1.0,
    CONF_THINKING_BUDGET: MIN_THINKING_BUDGET,
    CONF_THINKING_BUDGET: 0,
    CONF_THINKING_EFFORT: "low",
    CONF_TOOL_SEARCH: False,
    CONF_WEB_SEARCH: False,
    CONF_WEB_SEARCH_USER_LOCATION: False,
    CONF_WEB_SEARCH_MAX_USES: 5,
}

MIN_THINKING_BUDGET = 1024

NON_THINKING_MODELS = [
    "claude-3-haiku",
]
@@ -95,11 +81,6 @@ PROGRAMMATIC_TOOL_CALLING_UNSUPPORTED_MODELS = [
    "claude-3-haiku",
]

TOOL_SEARCH_UNSUPPORTED_MODELS = [
    "claude-3",
    "claude-haiku",
]

DEPRECATED_MODELS = [
    "claude-3",
]

@@ -1,78 +0,0 @@
"""Coordinator for the Anthropic integration."""

from __future__ import annotations

from datetime import timedelta

import anthropic

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.httpx_client import get_async_client
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

from .const import DOMAIN, LOGGER

UPDATE_INTERVAL_CONNECTED = timedelta(hours=12)
UPDATE_INTERVAL_DISCONNECTED = timedelta(minutes=1)

type AnthropicConfigEntry = ConfigEntry[AnthropicCoordinator]


class AnthropicCoordinator(DataUpdateCoordinator[None]):
    """DataUpdateCoordinator which uses different intervals after successful and unsuccessful updates."""

    client: anthropic.AsyncAnthropic

    def __init__(self, hass: HomeAssistant, config_entry: AnthropicConfigEntry) -> None:
        """Initialize the coordinator."""
        super().__init__(
            hass,
            LOGGER,
            config_entry=config_entry,
            name=config_entry.title,
            update_interval=UPDATE_INTERVAL_CONNECTED,
            update_method=self.async_update_data,
            always_update=False,
        )
        self.client = anthropic.AsyncAnthropic(
            api_key=config_entry.data[CONF_API_KEY], http_client=get_async_client(hass)
        )

    @callback
    def async_set_updated_data(self, data: None) -> None:
        """Manually update data, notify listeners and update refresh interval."""
        self.update_interval = UPDATE_INTERVAL_CONNECTED
        super().async_set_updated_data(data)

    async def async_update_data(self) -> None:
        """Fetch data from the API."""
        try:
            self.update_interval = UPDATE_INTERVAL_DISCONNECTED
            await self.client.models.list(timeout=10.0)
            self.update_interval = UPDATE_INTERVAL_CONNECTED
        except anthropic.APITimeoutError as err:
            raise TimeoutError(err.message or str(err)) from err
        except anthropic.AuthenticationError as err:
            raise ConfigEntryAuthFailed(
                translation_domain=DOMAIN,
                translation_key="api_authentication_error",
                translation_placeholders={"message": err.message},
            ) from err
        except anthropic.APIError as err:
            raise UpdateFailed(
                translation_domain=DOMAIN,
                translation_key="api_error",
                translation_placeholders={"message": err.message},
            ) from err

    def mark_connection_error(self) -> None:
        """Mark the connection as having an error and reschedule background check."""
        self.update_interval = UPDATE_INTERVAL_DISCONNECTED
        if self.last_update_success:
            self.last_update_success = False
            self.async_update_listeners()
        if self._listeners and not self.hass.is_stopping:
            self._schedule_refresh()
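
The removed coordinator above embodies a simple health-check back-off: poll every 12 hours while the API answers, but retry every minute after a failure. A minimal sketch of that interval-switching loop outside Home Assistant (the `check` callable stands in for `client.models.list`; the real coordinator raises typed errors instead of catching `Exception`):

```python
import asyncio
from collections.abc import Awaitable, Callable
from datetime import timedelta

UPDATE_INTERVAL_CONNECTED = timedelta(hours=12)
UPDATE_INTERVAL_DISCONNECTED = timedelta(minutes=1)


async def poll_with_backoff(check: Callable[[], Awaitable[None]]) -> None:
    """Run `check` forever, polling fast only while it keeps failing."""
    while True:
        try:
            await check()
            interval = UPDATE_INTERVAL_CONNECTED  # healthy: slow polling
        except Exception:
            interval = UPDATE_INTERVAL_DISCONNECTED  # failed: retry soon
        await asyncio.sleep(interval.total_seconds())
```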

@@ -58,8 +58,6 @@ from anthropic.types import (
    ToolChoiceAutoParam,
    ToolChoiceToolParam,
    ToolParam,
    ToolSearchToolBm25_20251119Param,
    ToolSearchToolResultBlock,
    ToolUnionParam,
    ToolUseBlock,
    ToolUseBlockParam,
@@ -76,9 +74,6 @@ from anthropic.types.message_create_params import MessageCreateParamsStreaming
from anthropic.types.text_editor_code_execution_tool_result_block_param import (
    Content as TextEditorCodeExecutionToolResultBlockParamContentParam,
)
from anthropic.types.tool_search_tool_result_block_param import (
    Content as ToolSearchToolResultBlockParamContentParam,
)
import voluptuous as vol
from voluptuous_openapi import convert

@@ -87,20 +82,19 @@ from homeassistant.config_entries import ConfigSubentry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.json import json_dumps
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import slugify
from homeassistant.util.json import JsonObjectType

from . import AnthropicConfigEntry
from .const import (
    CONF_CHAT_MODEL,
    CONF_CODE_EXECUTION,
    CONF_MAX_TOKENS,
    CONF_PROMPT_CACHING,
    CONF_TEMPERATURE,
    CONF_THINKING_BUDGET,
    CONF_THINKING_EFFORT,
    CONF_TOOL_SEARCH,
    CONF_WEB_SEARCH,
    CONF_WEB_SEARCH_CITY,
    CONF_WEB_SEARCH_COUNTRY,
@@ -116,9 +110,7 @@ from .const import (
    NON_THINKING_MODELS,
    PROGRAMMATIC_TOOL_CALLING_UNSUPPORTED_MODELS,
    UNSUPPORTED_STRUCTURED_OUTPUT_MODELS,
    PromptCaching,
)
from .coordinator import AnthropicConfigEntry, AnthropicCoordinator

# Max number of back and forth with the LLM to generate a response
MAX_TOOL_ITERATIONS = 10
@@ -210,7 +202,7 @@ class ContentDetails:
    ]


def _convert_content(  # noqa: C901
def _convert_content(
    chat_content: Iterable[conversation.Content],
) -> tuple[list[MessageParam], str | None]:
    """Transform HA chat_log content into Anthropic API format."""
@@ -263,15 +255,6 @@ def _convert_content(  # noqa: C901
                        content.tool_result,
                    ),
                }
            elif content.tool_name == "tool_search":
                tool_result_block = {
                    "type": "tool_search_tool_result",
                    "tool_use_id": content.tool_call_id,
                    "content": cast(
                        ToolSearchToolResultBlockParamContentParam,
                        content.tool_result,
                    ),
                }
            else:
                tool_result_block = {
                    "type": "tool_result",
@@ -402,7 +385,6 @@ def _convert_content(  # noqa: C901
                        "code_execution",
                        "bash_code_execution",
                        "text_editor_code_execution",
                        "tool_search_tool_bm25",
                    ],
                    tool_call.tool_name,
                ),
@@ -415,7 +397,6 @@ def _convert_content(  # noqa: C901
                    "code_execution",
                    "bash_code_execution",
                    "text_editor_code_execution",
                    "tool_search_tool_bm25",
                ]
                else ToolUseBlockParam(
                    type="tool_use",
@@ -577,7 +558,6 @@ async def _transform_stream(  # noqa: C901 - This is complex, but better to have
                    CodeExecutionToolResultBlock,
                    BashCodeExecutionToolResultBlock,
                    TextEditorCodeExecutionToolResultBlock,
                    ToolSearchToolResultBlock,
                ),
            ):
                if content_details:
@@ -678,7 +658,7 @@ def _create_token_stats(
    }


class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
class AnthropicBaseLLMEntity(Entity):
    """Anthropic base LLM entity."""

    _attr_has_entity_name = True
@@ -686,7 +666,6 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):

    def __init__(self, entry: AnthropicConfigEntry, subentry: ConfigSubentry) -> None:
        """Initialize the entity."""
        super().__init__(entry.runtime_data)
        self.entry = entry
        self.subentry = subentry
        self._attr_unique_id = subentry.subentry_id
@@ -698,7 +677,7 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
            entry_type=dr.DeviceEntryType.SERVICE,
        )

    async def _async_handle_chat_log(  # noqa: C901
    async def _async_handle_chat_log(
        self,
        chat_log: conversation.ChatLog,
        structure_name: str | None = None,
@@ -708,20 +687,21 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
        """Generate an answer for the chat log."""
        options = self.subentry.data

        preloaded_tools = [
            "HassTurnOn",
            "HassTurnOff",
            "GetLiveContext",
            "code_execution",
            "web_search",
        ]

        system = chat_log.content[0]
        if not isinstance(system, conversation.SystemContent):
            raise HomeAssistantError(
                translation_domain=DOMAIN, translation_key="system_message_not_found"
            )

        # System prompt with caching enabled
        system_prompt: list[TextBlockParam] = [
            TextBlockParam(
                type="text",
                text=system.content,
                cache_control={"type": "ephemeral"},
            )
        ]

        messages, container_id = _convert_content(chat_log.content[1:])

        model = options.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
@@ -730,28 +710,11 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
            model=model,
            messages=messages,
            max_tokens=options.get(CONF_MAX_TOKENS, DEFAULT[CONF_MAX_TOKENS]),
            system=system.content,
            system=system_prompt,
            stream=True,
            container=container_id,
        )

        if (
            options.get(CONF_PROMPT_CACHING, DEFAULT[CONF_PROMPT_CACHING])
            == PromptCaching.PROMPT
        ):
            model_args["system"] = [
                {
                    "type": "text",
                    "text": system.content,
                    "cache_control": {"type": "ephemeral"},
                }
            ]
        elif (
            options.get(CONF_PROMPT_CACHING, DEFAULT[CONF_PROMPT_CACHING])
            == PromptCaching.AUTOMATIC
        ):
            model_args["cache_control"] = {"type": "ephemeral"}

        if not model.startswith(tuple(NON_ADAPTIVE_THINKING_MODELS)):
            thinking_effort = options.get(
                CONF_THINKING_EFFORT, DEFAULT[CONF_THINKING_EFFORT]
@@ -910,27 +873,11 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
                    ),
                )
            )
            preloaded_tools.append(structure_name)

        if tools:
            if (
                options.get(CONF_TOOL_SEARCH, DEFAULT[CONF_TOOL_SEARCH])
                and len(tools) > len(preloaded_tools) + 1
            ):
                for tool in tools:
                    if not tool["name"].endswith(tuple(preloaded_tools)):
                        tool["defer_loading"] = True
                tools.append(
                    ToolSearchToolBm25_20251119Param(
                        type="tool_search_tool_bm25_20251119",
                        name="tool_search_tool_bm25",
                    )
                )

            model_args["tools"] = tools

        coordinator = self.entry.runtime_data
        client = coordinator.client
        client = self.entry.runtime_data

        # To prevent infinite loops, we limit the number of iterations
        for _iteration in range(max_iterations):
@@ -952,25 +899,13 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
                )
                messages.extend(new_messages)
            except anthropic.AuthenticationError as err:
                # Trigger coordinator to confirm the auth failure and trigger the reauth flow.
                await coordinator.async_request_refresh()
                self.entry.async_start_reauth(self.hass)
                raise HomeAssistantError(
                    translation_domain=DOMAIN,
                    translation_key="api_authentication_error",
                    translation_placeholders={"message": err.message},
                ) from err
            except anthropic.APIConnectionError as err:
                LOGGER.info("Connection error while talking to Anthropic: %s", err)
                coordinator.mark_connection_error()
                raise HomeAssistantError(
                    translation_domain=DOMAIN,
                    translation_key="api_error",
                    translation_placeholders={"message": err.message},
                ) from err
            except anthropic.AnthropicError as err:
                # Non-connection error, mark connection as healthy
                coordinator.async_set_updated_data(None)
                LOGGER.error("Error while talking to Anthropic: %s", err)
                raise HomeAssistantError(
                    translation_domain=DOMAIN,
                    translation_key="api_error",
@@ -982,7 +917,6 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
                ) from err

            if not chat_log.unresponded_tool_results:
                coordinator.async_set_updated_data(None)
                break

@@ -8,6 +8,6 @@
  "documentation": "https://www.home-assistant.io/integrations/anthropic",
  "integration_type": "service",
  "iot_class": "cloud_polling",
  "quality_scale": "silver",
  "quality_scale": "bronze",
  "requirements": ["anthropic==0.83.0"]
}
@@ -35,9 +35,9 @@ rules:
  config-entry-unloading: done
  docs-configuration-parameters: done
  docs-installation-parameters: done
  entity-unavailable: done
  entity-unavailable: todo
  integration-owner: done
  log-when-unavailable: done
  log-when-unavailable: todo
  parallel-updates:
    status: exempt
    comment: |

@@ -58,7 +58,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
        if entry.entry_id in self._model_list_cache:
            model_list = self._model_list_cache[entry.entry_id]
        else:
            client = entry.runtime_data.client
            client = entry.runtime_data
            model_list = [
                model_option
                for model_option in await get_model_list(client)

@@ -47,13 +47,11 @@
        "data": {
          "chat_model": "[%key:common::generic::model%]",
          "max_tokens": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data::max_tokens%]",
          "prompt_caching": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data::prompt_caching%]",
          "temperature": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data::temperature%]"
        },
        "data_description": {
          "chat_model": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data_description::chat_model%]",
          "max_tokens": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data_description::max_tokens%]",
          "prompt_caching": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data_description::prompt_caching%]",
          "temperature": "[%key:component::anthropic::config_subentries::conversation::step::advanced::data_description::temperature%]"
        },
        "title": "[%key:component::anthropic::config_subentries::conversation::step::advanced::title%]"
@@ -74,7 +72,6 @@
          "code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data::code_execution%]",
          "thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_budget%]",
          "thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_effort%]",
          "tool_search": "[%key:component::anthropic::config_subentries::conversation::step::model::data::tool_search%]",
          "user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data::user_location%]",
          "web_search": "[%key:component::anthropic::config_subentries::conversation::step::model::data::web_search%]",
          "web_search_max_uses": "[%key:component::anthropic::config_subentries::conversation::step::model::data::web_search_max_uses%]"
@@ -83,7 +80,6 @@
          "code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::code_execution%]",
          "thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_budget%]",
          "thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_effort%]",
          "tool_search": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::tool_search%]",
          "user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::user_location%]",
          "web_search": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::web_search%]",
          "web_search_max_uses": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::web_search_max_uses%]"
@@ -107,13 +103,11 @@
        "data": {
          "chat_model": "[%key:common::generic::model%]",
          "max_tokens": "Maximum tokens to return in response",
          "prompt_caching": "Caching strategy",
          "temperature": "Temperature"
        },
        "data_description": {
          "chat_model": "The model to serve the responses.",
          "max_tokens": "Limit the number of response tokens.",
          "prompt_caching": "Optimize your API cost and response times based on your usage.",
          "temperature": "Control the randomness of the response, trading off between creativity and coherence."
        },
        "title": "Advanced settings"
@@ -138,7 +132,6 @@
          "code_execution": "Code execution",
          "thinking_budget": "Thinking budget",
          "thinking_effort": "Thinking effort",
          "tool_search": "Enable tool search tool",
          "user_location": "Include home location",
          "web_search": "Enable web search",
          "web_search_max_uses": "Maximum web searches"
@@ -147,7 +140,6 @@
          "code_execution": "Allow the model to execute code in a secure sandbox environment, enabling it to analyze data and perform complex calculations.",
          "thinking_budget": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking.",
          "thinking_effort": "Control how many tokens Claude uses when responding, trading off between response thoroughness and token efficiency",
          "tool_search": "Enable dynamic tool discovery instead of preloading all tools into the context",
          "user_location": "Localize search results based on home location",
          "web_search": "The web search tool gives Claude direct access to real-time web content, allowing it to answer questions with up-to-date information beyond its knowledge cutoff",
          "web_search_max_uses": "Limit the number of searches performed per response"
@@ -218,13 +210,6 @@
      }
    },
    "selector": {
      "prompt_caching": {
        "options": {
          "automatic": "Full",
          "off": "Disabled",
          "prompt": "System prompt"
        }
      },
      "thinking_effort": {
        "options": {
          "high": "[%key:common::state::high%]",

@@ -7,7 +7,7 @@
  "integration_type": "device",
  "iot_class": "local_polling",
  "loggers": ["arcam"],
  "requirements": ["arcam-fmj==1.8.3"],
  "requirements": ["arcam-fmj==1.8.2"],
  "ssdp": [
    {
      "deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",

@@ -91,7 +91,6 @@ SENSORS: tuple[ArcamFmjSensorEntityDescription, ...] = (
        value_fn=lambda state: (
            vp.colorspace.name.lower()
            if (vp := state.get_incoming_video_parameters()) is not None
            and vp.colorspace is not None
            else None
        ),
    ),

@@ -75,7 +75,7 @@
    },
    "services": {
      "announce": {
        "description": "Lets an Assist satellite announce a message.",
        "description": "Lets a satellite announce a message.",
        "fields": {
          "media_id": {
            "description": "The media ID to announce instead of using text-to-speech.",
@@ -94,10 +94,10 @@
            "name": "Preannounce media ID"
          }
        },
        "name": "Announce on satellite"
        "name": "Announce"
      },
      "ask_question": {
        "description": "Lets an Assist satellite ask a question and get the user's response.",
        "description": "Asks a question and gets the user's response.",
        "fields": {
          "answers": {
            "description": "Possible answers to the question.",
@@ -124,10 +124,10 @@
            "name": "Question media ID"
          }
        },
        "name": "Ask question on satellite"
        "name": "Ask question"
      },
      "start_conversation": {
        "description": "Starts a conversation from an Assist satellite.",
        "description": "Starts a conversation from a satellite.",
        "fields": {
          "extra_system_prompt": {
            "description": "Provide background information to the AI about the request.",
@@ -150,13 +150,13 @@
            "name": "Message"
          }
        },
        "name": "Start conversation on satellite"
        "name": "Start conversation"
      }
    },
    "title": "Assist satellite",
    "triggers": {
      "idle": {
        "description": "Triggers after one or more Assist satellites become idle after having processed a command.",
        "description": "Triggers after one or more voice assistant satellites become idle after having processed a command.",
        "fields": {
          "behavior": {
            "name": "[%key:component::assist_satellite::common::trigger_behavior_name%]"
@@ -165,7 +165,7 @@
        "name": "Satellite became idle"
      },
      "listening": {
        "description": "Triggers after one or more Assist satellites start listening for a command from someone.",
        "description": "Triggers after one or more voice assistant satellites start listening for a command from someone.",
        "fields": {
          "behavior": {
            "name": "[%key:component::assist_satellite::common::trigger_behavior_name%]"
@@ -174,7 +174,7 @@
        "name": "Satellite started listening"
      },
      "processing": {
        "description": "Triggers after one or more Assist satellites start processing a command after having heard it.",
        "description": "Triggers after one or more voice assistant satellites start processing a command after having heard it.",
        "fields": {
          "behavior": {
            "name": "[%key:component::assist_satellite::common::trigger_behavior_name%]"
@@ -183,7 +183,7 @@
        "name": "Satellite started processing"
      },
      "responding": {
        "description": "Triggers after one or more Assist satellites start responding to a command after having processed it, or start announcing something.",
        "description": "Triggers after one or more voice assistant satellites start responding to a command after having processed it, or start announcing something.",
        "fields": {
          "behavior": {
            "name": "[%key:component::assist_satellite::common::trigger_behavior_name%]"
@@ -29,7 +29,7 @@
  "integration_type": "device",
  "iot_class": "local_push",
  "loggers": ["axis"],
  "requirements": ["axis==68"],
  "requirements": ["axis==67"],
  "ssdp": [
    {
      "manufacturer": "AXIS"

@@ -54,7 +54,7 @@
      "message": "Storage account {account_name} not found"
    },
    "cannot_connect": {
      "message": "Cannot connect to storage account {account_name}"
      "message": "Can not connect to storage account {account_name}"
    },
    "container_not_found": {
      "message": "Storage container {container_name} not found"

@@ -74,12 +74,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: BackblazeConfigEntry) ->
            translation_domain=DOMAIN,
            translation_key="invalid_bucket_name",
        ) from err
    except exception.BadRequest as err:
        raise ConfigEntryNotReady(
            translation_domain=DOMAIN,
            translation_key="bad_request",
            translation_placeholders={"error_message": str(err)},
        ) from err
    except (
        exception.B2ConnectionError,
        exception.B2RequestTimeout,

@@ -174,14 +174,6 @@ class BackblazeConfigFlow(ConfigFlow, domain=DOMAIN):
                "Backblaze B2 bucket '%s' does not exist", user_input[CONF_BUCKET]
            )
            errors[CONF_BUCKET] = "invalid_bucket_name"
        except exception.BadRequest as err:
            _LOGGER.error(
                "Backblaze B2 API rejected the request for Key ID '%s': %s",
                user_input[CONF_KEY_ID],
                err,
            )
            errors["base"] = "bad_request"
            placeholders["error_message"] = str(err)
        except (
            exception.B2ConnectionError,
            exception.B2RequestTimeout,

@@ -8,5 +8,5 @@
  "iot_class": "cloud_push",
  "loggers": ["b2sdk"],
  "quality_scale": "bronze",
  "requirements": ["b2sdk==2.10.4"]
  "requirements": ["b2sdk==2.10.1"]
}

@@ -6,7 +6,6 @@
    "reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
  },
  "error": {
    "bad_request": "The Backblaze B2 API rejected the request: {error_message}",
    "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
    "invalid_bucket_name": "[%key:component::backblaze_b2::exceptions::invalid_bucket_name::message%]",
    "invalid_capability": "[%key:component::backblaze_b2::exceptions::invalid_capability::message%]",
@@ -61,9 +60,6 @@
    }
  },
  "exceptions": {
    "bad_request": {
      "message": "The Backblaze B2 API rejected the request: {error_message}"
    },
    "cannot_connect": {
      "message": "Cannot connect to endpoint"
    },

@@ -23,7 +23,7 @@ from . import util
from .agent import BackupAgent
from .const import DATA_MANAGER
from .manager import BackupManager
from .models import AgentBackup, BackupNotFound, InvalidBackupFilename
from .models import AgentBackup, BackupNotFound


@callback
@@ -195,11 +195,6 @@ class UploadBackupView(HomeAssistantView):
            backup_id = await manager.async_receive_backup(
                contents=contents, agent_ids=agent_ids
            )
        except InvalidBackupFilename as err:
            return Response(
                body=str(err),
                status=HTTPStatus.BAD_REQUEST,
            )
        except OSError as err:
            return Response(
                body=f"Can't write backup file: {err}",

@@ -68,7 +68,6 @@ from .models import (
    BackupReaderWriterError,
    BaseBackup,
    Folder,
    InvalidBackupFilename,
)
from .store import BackupStore
from .util import (
@@ -1007,14 +1006,6 @@ class BackupManager:
    ) -> str:
        """Receive and store a backup file from upload."""
        contents.chunk_size = BUF_SIZE
        suggested_filename = contents.filename or "backup.tar"
        safe_filename = PureWindowsPath(suggested_filename).name
        if (
            not safe_filename
            or safe_filename != suggested_filename
            or safe_filename == ".."
        ):
            raise InvalidBackupFilename(f"Invalid filename: {suggested_filename}")
        self.async_on_backup_event(
            ReceiveBackupEvent(
                reason=None,
@@ -1025,7 +1016,7 @@ class BackupManager:
        written_backup = await self._reader_writer.async_receive_backup(
            agent_ids=agent_ids,
            stream=contents,
            suggested_filename=suggested_filename,
            suggested_filename=contents.filename or "backup.tar",
        )
        self.async_on_backup_event(
            ReceiveBackupEvent(
@@ -1966,7 +1957,10 @@ class CoreBackupReaderWriter(BackupReaderWriter):
        suggested_filename: str,
    ) -> WrittenBackup:
        """Receive a backup."""
        temp_file = Path(self.temp_backup_dir, suggested_filename)
        safe_filename = PureWindowsPath(suggested_filename).name
        if not safe_filename or safe_filename == "..":
            safe_filename = "backup.tar"
        temp_file = Path(self.temp_backup_dir, safe_filename)

        async_add_executor_job = self._hass.async_add_executor_job
        await async_add_executor_job(make_backup_dir, self.temp_backup_dir)
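
Both receive-backup hunks above derive a safe name with `PureWindowsPath(...).name`, which strips directory components for `/` and `\` separators alike; the manager variant rejects mismatches while the reader-writer variant falls back to a default name. A standalone sketch of the fallback behaviour (function name is illustrative):

```python
from pathlib import PureWindowsPath


def safe_backup_filename(suggested: str, fallback: str = "backup.tar") -> str:
    """Return just the final path component; fall back on traversal attempts."""
    name = PureWindowsPath(suggested).name  # handles both "/" and "\\"
    if not name or name == "..":
        return fallback
    return name


assert safe_backup_filename("backup.tar") == "backup.tar"
assert safe_backup_filename("../../etc/passwd") == "passwd"
assert safe_backup_filename("..") == "backup.tar"
```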

@@ -8,6 +8,6 @@
  "integration_type": "service",
  "iot_class": "calculated",
  "quality_scale": "internal",
  "requirements": ["cronsim==2.7", "securetar==2026.4.1"],
  "requirements": ["cronsim==2.7", "securetar==2026.2.0"],
  "single_config_entry": true
}

@@ -95,12 +95,6 @@ class BackupReaderWriterError(BackupError):
    error_code = "backup_reader_writer_error"


class InvalidBackupFilename(BackupManagerError):
    """Raised when a backup filename is invalid."""

    error_code = "invalid_backup_filename"


class BackupNotFound(BackupAgentError, BackupManagerError):
    """Raised when a backup is not found."""


@@ -22,7 +22,6 @@ from securetar import (
    SecureTarFile,
    SecureTarReadError,
    SecureTarRootKeyContext,
    get_archive_max_ciphertext_size,
)

from homeassistant.core import HomeAssistant
@@ -384,12 +383,9 @@ def _encrypt_backup(
        if prefix not in expected_archives:
            LOGGER.debug("Unknown inner tar file %s will not be encrypted", obj.name)
            continue
        if (fileobj := input_tar.extractfile(obj)) is None:
            LOGGER.debug(
                "Non regular inner tar file %s will not be encrypted", obj.name
            )
            continue
        output_archive.import_tar(fileobj, obj, derived_key_id=inner_tar_idx)
        output_archive.import_tar(
            input_tar.extractfile(obj), obj, derived_key_id=inner_tar_idx
        )
        inner_tar_idx += 1
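
The hunk above adds a guard around `TarFile.extractfile`, which returns `None` for members that are not regular files (directories, symlinks, device nodes); passing that `None` straight into the import call would fail later. A stdlib-only sketch of the same guard when copying members between archives (paths and function name are illustrative):

```python
import tarfile


def copy_regular_members(src_path: str, dst_path: str) -> None:
    """Copy only regular-file members; skip entries extractfile() cannot open."""
    with tarfile.open(src_path) as src, tarfile.open(dst_path, "w") as dst:
        for member in src.getmembers():
            if (fileobj := src.extractfile(member)) is None:
                continue  # directory, link or special file: nothing to stream
            dst.addfile(member, fileobj)
```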

@@ -423,7 +419,7 @@ class _CipherBackupStreamer:
        hass: HomeAssistant,
        backup: AgentBackup,
        open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
        password: str,
        password: str | None,
    ) -> None:
        """Initialize."""
        self._workers: list[_CipherWorkerStatus] = []
@@ -435,9 +431,7 @@ class _CipherBackupStreamer:

    def size(self) -> int:
        """Return the maximum size of the decrypted or encrypted backup."""
        return get_archive_max_ciphertext_size(
            self._backup.size, SECURETAR_CREATE_VERSION, self._num_tar_files()
        )
        return self._backup.size + self._num_tar_files() * tarfile.RECORDSIZE

    def _num_tar_files(self) -> int:
        """Return the number of inner tar files."""

@@ -20,7 +20,7 @@
    "bluetooth-adapters==2.1.0",
    "bluetooth-auto-recovery==1.5.3",
    "bluetooth-data-tools==1.28.4",
    "dbus-fast==4.0.4",
    "habluetooth==6.0.0"
    "dbus-fast==3.1.2",
    "habluetooth==5.11.1"
  ]
}

@@ -1,7 +1,7 @@
{
  "issues": {
    "integration_removed": {
      "description": "The BMW Connected Drive integration has been removed from Home Assistant.\n\nIn September 2025, BMW blocked third-party access to their servers by adding additional security measures. For EU-registered cars, a [community integration]({custom_component_url}) using BMW's CarData API is available as an alternative.\n\nTo resolve this issue, please remove the (now defunct) integration entries from your Home Assistant setup. [Click here to see your existing BMW Connected Drive integration entries]({entries}).",
      "description": "The BMW Connected Drive integration has been removed from Home Assistant.\n\nIn September 2025, BMW blocked third-party access to their servers by adding additional security measures. For EU-registered cars, a community-developed [custom component]({custom_component_url}) using BMW's CarData API is available as an alternative.\n\nTo resolve this issue, please remove the (now defunct) integration entries from your Home Assistant setup. [Click here to see your existing BMW Connected Drive integration entries]({entries}).",
      "title": "The BMW Connected Drive integration has been removed"
    }
  }

@@ -10,7 +10,6 @@ from bsblan import (
    BSBLAN,
    BSBLANAuthError,
    BSBLANConnectionError,
    BSBLANError,
    HotWaterConfig,
    HotWaterSchedule,
    HotWaterState,
@@ -51,7 +50,7 @@ class BSBLanFastData:

    state: State
    sensor: Sensor
    dhw: HotWaterState | None = None
    dhw: HotWaterState


@dataclass
@@ -112,6 +111,7 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
            # This reduces response time significantly (~0.2s per parameter)
            state = await self.client.state(include=STATE_INCLUDE)
            sensor = await self.client.sensor(include=SENSOR_INCLUDE)
            dhw = await self.client.hot_water_state(include=DHW_STATE_INCLUDE)

        except BSBLANAuthError as err:
            raise ConfigEntryAuthFailed(
@@ -126,19 +126,6 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
                translation_placeholders={"host": host},
            ) from err

        # Fetch DHW state separately - device may not support hot water
        dhw: HotWaterState | None = None
        try:
            dhw = await self.client.hot_water_state(include=DHW_STATE_INCLUDE)
        except BSBLANError:
            # Preserve last known DHW state if available (entity may depend on it)
            if self.data:
                dhw = self.data.dhw
            LOGGER.debug(
                "DHW (Domestic Hot Water) state not available on device at %s",
                self.config_entry.data[CONF_HOST],
            )

        return BSBLanFastData(
            state=state,
            sensor=sensor,
@@ -172,6 +159,13 @@ class BSBLanSlowCoordinator(BSBLanCoordinator[BSBLanSlowData]):
            dhw_config = await self.client.hot_water_config(include=DHW_CONFIG_INCLUDE)
            dhw_schedule = await self.client.hot_water_schedule()

        except AttributeError:
            # Device does not support DHW functionality
            LOGGER.debug(
                "DHW (Domestic Hot Water) not available on device at %s",
                self.config_entry.data[CONF_HOST],
            )
            return BSBLanSlowData()
        except (BSBLANConnectionError, BSBLANAuthError) as err:
            # If config update fails, keep existing data
            LOGGER.debug(
@@ -183,13 +177,6 @@ class BSBLanSlowCoordinator(BSBLanCoordinator[BSBLanSlowData]):
                return self.data
            # First fetch failed, return empty data
            return BSBLanSlowData()
        except (BSBLANError, AttributeError):
# Device does not support DHW functionality
|
||||
LOGGER.debug(
|
||||
"DHW (Domestic Hot Water) not available on device at %s",
|
||||
self.config_entry.data[CONF_HOST],
|
||||
)
|
||||
return BSBLanSlowData()
|
||||
|
||||
return BSBLanSlowData(
|
||||
dhw_config=dhw_config,
|
||||
|
||||
@@ -22,9 +22,7 @@ async def async_get_config_entry_diagnostics(
"fast_coordinator_data": {
"state": data.fast_coordinator.data.state.model_dump(),
"sensor": data.fast_coordinator.data.sensor.model_dump(),
"dhw": data.fast_coordinator.data.dhw.model_dump()
if data.fast_coordinator.data.dhw
else None,
"dhw": data.fast_coordinator.data.dhw.model_dump(),
},
"static": data.static.model_dump() if data.static is not None else None,
}

@@ -2,9 +2,6 @@

from __future__ import annotations

from yarl import URL

from homeassistant.const import CONF_HOST, CONF_PORT
from homeassistant.helpers.device_registry import (
CONNECTION_NETWORK_MAC,
DeviceInfo,
@@ -13,7 +10,7 @@ from homeassistant.helpers.device_registry import (
from homeassistant.helpers.update_coordinator import CoordinatorEntity

from . import BSBLanData
from .const import DEFAULT_PORT, DOMAIN
from .const import DOMAIN
from .coordinator import BSBLanCoordinator, BSBLanFastCoordinator, BSBLanSlowCoordinator


@@ -25,8 +22,7 @@ class BSBLanEntityBase[_T: BSBLanCoordinator](CoordinatorEntity[_T]):
def __init__(self, coordinator: _T, data: BSBLanData) -> None:
"""Initialize BSBLan entity with device info."""
super().__init__(coordinator)
host = coordinator.config_entry.data[CONF_HOST]
port = coordinator.config_entry.data.get(CONF_PORT, DEFAULT_PORT)
host = coordinator.config_entry.data["host"]
mac = data.device.MAC
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, mac)},
@@ -48,7 +44,7 @@ class BSBLanEntityBase[_T: BSBLanCoordinator](CoordinatorEntity[_T]):
else None
),
sw_version=data.device.version,
configuration_url=str(URL.build(scheme="http", host=host, port=port)),
configuration_url=f"http://{host}",
)

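Aside: the entity diff above trades a bare f-string for `yarl.URL.build` when composing `configuration_url`. A small sketch of what that buys, assuming only that `yarl` is available (it ships with Home Assistant); `DEFAULT_PORT` here is a stand-in for the integration's constant:

```python
from yarl import URL

DEFAULT_PORT = 80  # stand-in for the integration's const.py value


def build_configuration_url(host: str, port: int | None = None) -> str:
    """Build an http URL that keeps a non-default port in the link."""
    # yarl normalizes the URL and drops the port when it matches the
    # scheme default, so port 80 yields "http://host" while 8080 yields
    # "http://host:8080" - something a plain f"http://{host}" cannot express.
    return str(URL.build(scheme="http", host=host, port=port or DEFAULT_PORT))
```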
@@ -4,7 +4,7 @@ from __future__ import annotations

from typing import Any

from bsblan import BSBLANError, HotWaterState, SetHotWaterParam
from bsblan import BSBLANError, SetHotWaterParam

from homeassistant.components.water_heater import (
STATE_ECO,
@@ -46,10 +46,8 @@ async def async_setup_entry(
data = entry.runtime_data

# Only create water heater entity if DHW (Domestic Hot Water) is available
# Check if we have any DHW-related data indicating water heater support
dhw_data = data.fast_coordinator.data.dhw
if dhw_data is None:
# Device does not support DHW, skip water heater setup
return
if (
dhw_data.operating_mode is None
and dhw_data.nominal_setpoint is None
@@ -109,21 +107,11 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
else:
self._attr_max_temp = 65.0  # Default maximum

@property
def _dhw(self) -> HotWaterState:
"""Return DHW state data.

This entity is only created when DHW data is available.
"""
dhw = self.coordinator.data.dhw
assert dhw is not None
return dhw

@property
def current_operation(self) -> str | None:
"""Return current operation."""
if (
operating_mode := self._dhw.operating_mode
operating_mode := self.coordinator.data.dhw.operating_mode
) is None or operating_mode.value is None:
return None
return BSBLAN_TO_HA_OPERATION_MODE.get(operating_mode.value)
@@ -131,14 +119,16 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
if (current_temp := self._dhw.dhw_actual_value_top_temperature) is None:
if (
current_temp := self.coordinator.data.dhw.dhw_actual_value_top_temperature
) is None:
return None
return current_temp.value

@property
def target_temperature(self) -> float | None:
"""Return the temperature we try to reach."""
if (target_temp := self._dhw.nominal_setpoint) is None:
if (target_temp := self.coordinator.data.dhw.nominal_setpoint) is None:
return None
return target_temp.value


@@ -17,7 +17,6 @@ import voluptuous as vol

from homeassistant.components import frontend, http, websocket_api
from homeassistant.components.websocket_api import (
ERR_INVALID_FORMAT,
ERR_NOT_FOUND,
ERR_NOT_SUPPORTED,
ActiveConnection,
@@ -34,7 +33,6 @@ from homeassistant.core import (
)
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.debounce import Debouncer
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.event import async_track_point_in_time
@@ -78,7 +76,6 @@ ENTITY_ID_FORMAT = DOMAIN + ".{}"
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA
PLATFORM_SCHEMA_BASE = cv.PLATFORM_SCHEMA_BASE
SCAN_INTERVAL = datetime.timedelta(seconds=60)
EVENT_LISTENER_DEBOUNCE_COOLDOWN = 1.0  # seconds

# Don't support rrules more often than daily
VALID_FREQS = {"DAILY", "WEEKLY", "MONTHLY", "YEARLY"}
@@ -323,7 +320,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
websocket_api.async_register_command(hass, handle_calendar_event_create)
websocket_api.async_register_command(hass, handle_calendar_event_delete)
websocket_api.async_register_command(hass, handle_calendar_event_update)
websocket_api.async_register_command(hass, handle_calendar_event_subscribe)

component.async_register_entity_service(
CREATE_EVENT_SERVICE,
@@ -521,17 +517,6 @@ class CalendarEntity(Entity):
_entity_component_unrecorded_attributes = frozenset({"description"})

_alarm_unsubs: list[CALLBACK_TYPE] | None = None
_event_listeners: (
list[
tuple[
datetime.datetime,
datetime.datetime,
Callable[[list[JsonValueType] | None], None],
]
]
| None
) = None
_event_listener_debouncer: Debouncer[None] | None = None

_attr_initial_color: str | None

@@ -600,10 +585,6 @@ class CalendarEntity(Entity):
the current or upcoming event.
"""
super()._async_write_ha_state()

# Notify websocket subscribers of event changes (debounced)
if self._event_listeners and self._event_listener_debouncer:
self._event_listener_debouncer.async_schedule_call()
if self._alarm_unsubs is None:
self._alarm_unsubs = []
_LOGGER.debug(
@@ -644,13 +625,6 @@ class CalendarEntity(Entity):
event.end_datetime_local,
)

@callback
def _async_cancel_event_listener_debouncer(self) -> None:
"""Cancel and clear the event listener debouncer."""
if self._event_listener_debouncer:
self._event_listener_debouncer.async_cancel()
self._event_listener_debouncer = None

async def async_will_remove_from_hass(self) -> None:
"""Run when entity will be removed from hass.

@@ -659,90 +633,6 @@ class CalendarEntity(Entity):
for unsub in self._alarm_unsubs or ():
unsub()
self._alarm_unsubs = None
self._async_cancel_event_listener_debouncer()

@final
@callback
def async_subscribe_events(
self,
start_date: datetime.datetime,
end_date: datetime.datetime,
event_listener: Callable[[list[JsonValueType] | None], None],
) -> CALLBACK_TYPE:
"""Subscribe to calendar event updates.

Called by websocket API.
"""
if self._event_listeners is None:
self._event_listeners = []

if self._event_listener_debouncer is None:
self._event_listener_debouncer = Debouncer(
self.hass,
_LOGGER,
cooldown=EVENT_LISTENER_DEBOUNCE_COOLDOWN,
immediate=True,
function=self.async_update_event_listeners,
)

listener_data = (start_date, end_date, event_listener)
self._event_listeners.append(listener_data)

@callback
def unsubscribe() -> None:
if self._event_listeners:
self._event_listeners.remove(listener_data)
if not self._event_listeners:
self._async_cancel_event_listener_debouncer()

return unsubscribe

@final
@callback
def async_update_event_listeners(self) -> None:
"""Push updated calendar events to all listeners."""
if not self._event_listeners:
return

for start_date, end_date, listener in self._event_listeners:
self.async_update_single_event_listener(start_date, end_date, listener)

@final
@callback
def async_update_single_event_listener(
self,
start_date: datetime.datetime,
end_date: datetime.datetime,
listener: Callable[[list[JsonValueType] | None], None],
) -> None:
"""Schedule an event fetch and push to a single listener."""
self.hass.async_create_task(
self._async_update_listener(start_date, end_date, listener)
)

async def _async_update_listener(
self,
start_date: datetime.datetime,
end_date: datetime.datetime,
listener: Callable[[list[JsonValueType] | None], None],
) -> None:
"""Fetch events and push to a single listener."""
try:
events = await self.async_get_events(self.hass, start_date, end_date)
except HomeAssistantError as err:
_LOGGER.debug(
"Error fetching calendar events for %s: %s",
self.entity_id,
err,
)
listener(None)
return

event_list: list[JsonValueType] = [
dataclasses.asdict(event, dict_factory=_list_events_dict_factory)
for event in events
]
listener(event_list)

async def async_get_events(
self,
@@ -977,65 +867,6 @@ async def handle_calendar_event_update(
connection.send_result(msg["id"])


@websocket_api.websocket_command(
{
vol.Required("type"): "calendar/event/subscribe",
vol.Required("entity_id"): cv.entity_domain(DOMAIN),
vol.Required("start"): cv.datetime,
vol.Required("end"): cv.datetime,
}
)
@websocket_api.async_response
async def handle_calendar_event_subscribe(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Subscribe to calendar event updates."""
entity_id: str = msg["entity_id"]

if not (entity := hass.data[DATA_COMPONENT].get_entity(entity_id)):
connection.send_error(
msg["id"],
ERR_NOT_FOUND,
f"Calendar entity not found: {entity_id}",
)
return

start_date = dt_util.as_local(msg["start"])
end_date = dt_util.as_local(msg["end"])

if start_date >= end_date:
connection.send_error(
msg["id"],
ERR_INVALID_FORMAT,
"Start must be before end",
)
return

subscription_id = msg["id"]

@callback
def event_listener(events: list[JsonValueType] | None) -> None:
"""Push updated calendar events to websocket."""
if subscription_id not in connection.subscriptions:
return
connection.send_message(
websocket_api.event_message(
subscription_id,
{
"events": events,
},
)
)

connection.subscriptions[subscription_id] = entity.async_subscribe_events(
start_date, end_date, event_listener
)
connection.send_result(subscription_id)

# Push initial events only to the new subscriber
entity.async_update_single_event_listener(start_date, end_date, event_listener)


def _validate_timespan(
values: dict[str, Any],
) -> tuple[datetime.datetime | datetime.date, datetime.datetime | datetime.date]:

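Aside: the calendar hunks above coalesce bursty entity updates into at most one websocket push per cooldown via `Debouncer`. A condensed, hedged sketch of that subscribe/notify shape (the mixin and its names are illustrative; the `Debouncer` keyword arguments mirror the diff):

```python
import logging
from collections.abc import Callable

from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.debounce import Debouncer

_LOGGER = logging.getLogger(__name__)


class SubscriptionMixin:
    """Coalesce bursty state writes into one push per cooldown."""

    hass: HomeAssistant
    _listeners: list[Callable[[], None]] | None = None
    _debouncer: Debouncer[None] | None = None

    @callback
    def subscribe(self, listener: Callable[[], None]) -> Callable[[], None]:
        """Register a listener and return its unsubscribe callback."""
        if self._listeners is None:
            self._listeners = []
        if self._debouncer is None:
            # Lazily created: entities with no subscribers pay nothing.
            self._debouncer = Debouncer(
                self.hass,
                _LOGGER,
                cooldown=1.0,  # EVENT_LISTENER_DEBOUNCE_COOLDOWN above
                immediate=True,
                function=self._async_push,
            )
        self._listeners.append(listener)

        @callback
        def unsubscribe() -> None:
            if self._listeners:
                self._listeners.remove(listener)
            if not self._listeners and self._debouncer:
                # Last subscriber gone: cancel any pending push.
                self._debouncer.async_cancel()
                self._debouncer = None

        return unsubscribe

    async def _async_push(self) -> None:
        for listener in self._listeners or ():
            listener()

    @callback
    def notify(self) -> None:
        """Call on every state write; fires at most once per cooldown."""
        if self._listeners and self._debouncer:
            self._debouncer.async_schedule_call()
```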
@@ -16,7 +16,6 @@ PLATFORMS: list[Platform] = [
Platform.BUTTON,
Platform.LIGHT,
Platform.SELECT,
Platform.SENSOR,
]


@@ -4,11 +4,7 @@ from __future__ import annotations

from pycasperglow import GlowState

from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.const import EntityCategory
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -25,12 +21,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the binary sensor platform for Casper Glow."""
async_add_entities(
[
CasperGlowPausedBinarySensor(entry.runtime_data),
CasperGlowChargingBinarySensor(entry.runtime_data),
]
)
async_add_entities([CasperGlowPausedBinarySensor(entry.runtime_data)])


class CasperGlowPausedBinarySensor(CasperGlowEntity, BinarySensorEntity):
@@ -55,34 +46,6 @@ class CasperGlowPausedBinarySensor(CasperGlowEntity, BinarySensorEntity):
@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.is_paused is not None and state.is_paused != self._attr_is_on:
if state.is_paused is not None:
self._attr_is_on = state.is_paused
self.async_write_ha_state()


class CasperGlowChargingBinarySensor(CasperGlowEntity, BinarySensorEntity):
"""Binary sensor indicating whether the Casper Glow is charging."""

_attr_device_class = BinarySensorDeviceClass.BATTERY_CHARGING
_attr_entity_category = EntityCategory.DIAGNOSTIC

def __init__(self, coordinator: CasperGlowCoordinator) -> None:
"""Initialize the charging binary sensor."""
super().__init__(coordinator)
self._attr_unique_id = f"{format_mac(coordinator.device.address)}_charging"
if coordinator.device.state.is_charging is not None:
self._attr_is_on = coordinator.device.state.is_charging

async def async_added_to_hass(self) -> None:
"""Register state update callback when entity is added."""
await super().async_added_to_hass()
self.async_on_remove(
self._device.register_callback(self._async_handle_state_update)
)

@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.is_charging is not None and state.is_charging != self._attr_is_on:
self._attr_is_on = state.is_charging
self.async_write_ha_state()
self.async_write_ha_state()

@@ -51,24 +51,18 @@ rules:
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: Each config entry represents a single device.
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-device-class:
status: exempt
comment: No applicable device classes for binary_sensor, button, light, or select entities.
entity-disabled-by-default: todo
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow:
status: exempt
comment: No user-configurable settings in the configuration flow.
repair-issues:
status: exempt
comment: Integration does not register repair issues.
stale-devices:
status: exempt
comment: Each config entry represents a single device.
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo

# Platinum
async-dependency: done

@@ -1,134 +0,0 @@
"""Casper Glow integration sensor platform."""

from __future__ import annotations

from datetime import datetime, timedelta

from pycasperglow import GlowState

from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorStateClass,
)
from homeassistant.const import PERCENTAGE, EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util.dt import utcnow
from homeassistant.util.variance import ignore_variance

from .coordinator import CasperGlowConfigEntry, CasperGlowCoordinator
from .entity import CasperGlowEntity

PARALLEL_UPDATES = 0


async def async_setup_entry(
hass: HomeAssistant,
entry: CasperGlowConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the sensor platform for Casper Glow."""
async_add_entities(
[
CasperGlowBatterySensor(entry.runtime_data),
CasperGlowDimmingEndTimeSensor(entry.runtime_data),
]
)


class CasperGlowBatterySensor(CasperGlowEntity, SensorEntity):
"""Sensor entity for Casper Glow battery level."""

_attr_device_class = SensorDeviceClass.BATTERY
_attr_native_unit_of_measurement = PERCENTAGE
_attr_state_class = SensorStateClass.MEASUREMENT
_attr_entity_category = EntityCategory.DIAGNOSTIC

def __init__(self, coordinator: CasperGlowCoordinator) -> None:
"""Initialize the battery sensor."""
super().__init__(coordinator)
self._attr_unique_id = f"{format_mac(coordinator.device.address)}_battery"
if coordinator.device.state.battery_level is not None:
self._attr_native_value = coordinator.device.state.battery_level.percentage

async def async_added_to_hass(self) -> None:
"""Register state update callback when entity is added."""
await super().async_added_to_hass()
self.async_on_remove(
self._device.register_callback(self._async_handle_state_update)
)

@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.battery_level is not None:
new_value = state.battery_level.percentage
if new_value != self._attr_native_value:
self._attr_native_value = new_value
self.async_write_ha_state()


class CasperGlowDimmingEndTimeSensor(CasperGlowEntity, SensorEntity):
"""Sensor entity for Casper Glow dimming end time."""

_attr_translation_key = "dimming_end_time"
_attr_device_class = SensorDeviceClass.TIMESTAMP
_attr_entity_registry_enabled_default = False

def __init__(self, coordinator: CasperGlowCoordinator) -> None:
"""Initialize the dimming end time sensor."""
super().__init__(coordinator)
self._attr_unique_id = (
f"{format_mac(coordinator.device.address)}_dimming_end_time"
)
self._is_paused = False
self._projected_end_time = ignore_variance(
self._calculate_end_time,
timedelta(minutes=1, seconds=30),
)
self._update_from_state(coordinator.device.state)

@staticmethod
def _calculate_end_time(remaining_ms: int) -> datetime:
"""Calculate projected dimming end time from remaining milliseconds."""
return utcnow() + timedelta(milliseconds=remaining_ms)

async def async_added_to_hass(self) -> None:
"""Register state update callback when entity is added."""
await super().async_added_to_hass()
self.async_on_remove(
self._device.register_callback(self._async_handle_state_update)
)

def _reset_projected_end_time(self) -> None:
"""Clear the projected end time and reset the variance filter."""
self._attr_native_value = None
self._projected_end_time = ignore_variance(
self._calculate_end_time,
timedelta(minutes=1, seconds=30),
)

@callback
def _update_from_state(self, state: GlowState) -> None:
"""Update entity attributes from device state."""
if state.is_paused is not None:
self._is_paused = state.is_paused

if self._is_paused:
self._reset_projected_end_time()
return

remaining_ms = state.dimming_time_remaining_ms
if not remaining_ms:
if remaining_ms == 0 or state.is_on is False:
self._reset_projected_end_time()
return
self._attr_native_value = self._projected_end_time(remaining_ms)

@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
self._update_from_state(state)
self.async_write_ha_state()
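Aside: the sensor platform above guards a TIMESTAMP sensor with `homeassistant.util.variance.ignore_variance` so a jittery countdown does not rewrite the state on every push. In isolation the helper behaves roughly like this (the 90-second window mirrors the code above):

```python
from datetime import datetime, timedelta

from homeassistant.util.dt import utcnow
from homeassistant.util.variance import ignore_variance


def _calculate_end_time(remaining_ms: int) -> datetime:
    """Project an end time from a remaining-milliseconds counter."""
    return utcnow() + timedelta(milliseconds=remaining_ms)


# ignore_variance wraps the function: a new result that lands within the
# given window of the previous one is replaced by the previous value, so
# small jitter in remaining_ms does not change the sensor state.
projected_end_time = ignore_variance(
    _calculate_end_time, timedelta(minutes=1, seconds=30)
)

end = projected_end_time(90_000)  # first call establishes the value
end = projected_end_time(89_400)  # within the window: unchanged result
```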
@@ -44,11 +44,6 @@
"dimming_time": {
"name": "Dimming time"
}
},
"sensor": {
"dimming_end_time": {
"name": "Dimming end time"
}
}
},
"exceptions": {

@@ -44,10 +44,10 @@
},
"services": {
"show_lovelace_view": {
"description": "Shows a dashboard view on a Google Cast device.",
"description": "Shows a dashboard view on a Chromecast device.",
"fields": {
"dashboard_path": {
"description": "The URL path of the dashboard to show, defaults to `lovelace` if not specified.",
"description": "The URL path of the dashboard to show, defaults to lovelace if not specified.",
"name": "Dashboard path"
},
"entity_id": {
@@ -59,7 +59,7 @@
"name": "View path"
}
},
"name": "Show dashboard view via Google Cast"
"name": "Show dashboard view"
}
}
}

@@ -239,7 +239,7 @@
"message": "Provided humidity {humidity} is not valid. Accepted range is {min_humidity} to {max_humidity}."
},
"low_temp_higher_than_high_temp": {
"message": "'Lower target temperature' cannot be higher than 'Upper target temperature'."
"message": "'Lower target temperature' can not be higher than 'Upper target temperature'."
},
"missing_target_temperature_entity_feature": {
"message": "Set temperature action was used with the 'Target temperature' parameter but the entity does not support it."

@@ -8,5 +8,5 @@
"iot_class": "local_polling",
"loggers": ["aiocomelit"],
"quality_scale": "platinum",
"requirements": ["aiocomelit==2.0.2"]
"requirements": ["aiocomelit==2.0.1"]
}

@@ -21,7 +21,6 @@ from .const import ( # noqa: F401
ATTR_DEV_ID,
ATTR_GPS,
ATTR_HOST_NAME,
ATTR_IN_ZONES,
ATTR_IP,
ATTR_LOCATION_NAME,
ATTR_MAC,

@@ -3,7 +3,7 @@
from __future__ import annotations

import asyncio
from typing import Any, final
from typing import final

from propcache.api import cached_property

@@ -18,7 +18,7 @@ from homeassistant.const import (
STATE_NOT_HOME,
EntityCategory,
)
from homeassistant.core import Event, HomeAssistant, State, callback
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.device_registry import (
DeviceInfo,
@@ -33,7 +33,6 @@ from homeassistant.util.hass_dict import HassKey

from .const import (
ATTR_HOST_NAME,
ATTR_IN_ZONES,
ATTR_IP,
ATTR_MAC,
ATTR_SOURCE_TYPE,
@@ -224,9 +223,6 @@ class TrackerEntity(
_attr_longitude: float | None = None
_attr_source_type: SourceType = SourceType.GPS

__active_zone: State | None = None
__in_zones: list[str] | None = None

@cached_property
def should_poll(self) -> bool:
"""No polling for entities that have location pushed."""
@@ -260,18 +256,6 @@ class TrackerEntity(
"""Return longitude value of the device."""
return self._attr_longitude

@callback
def _async_write_ha_state(self) -> None:
"""Calculate active zones."""
if self.available and self.latitude is not None and self.longitude is not None:
self.__active_zone, self.__in_zones = zone.async_in_zones(
self.hass, self.latitude, self.longitude, self.location_accuracy
)
else:
self.__active_zone = None
self.__in_zones = None
super()._async_write_ha_state()

@property
def state(self) -> str | None:
"""Return the state of the device."""
@@ -279,7 +263,9 @@ class TrackerEntity(
return self.location_name

if self.latitude is not None and self.longitude is not None:
zone_state = self.__active_zone
zone_state = zone.async_active_zone(
self.hass, self.latitude, self.longitude, self.location_accuracy
)
if zone_state is None:
state = STATE_NOT_HOME
elif zone_state.entity_id == zone.ENTITY_ID_HOME:
@@ -292,13 +278,12 @@ class TrackerEntity(

@final
@property
def state_attributes(self) -> dict[str, Any]:
def state_attributes(self) -> dict[str, StateType]:
"""Return the device state attributes."""
attr: dict[str, Any] = {ATTR_IN_ZONES: []}
attr: dict[str, StateType] = {}
attr.update(super().state_attributes)

if self.latitude is not None and self.longitude is not None:
attr[ATTR_IN_ZONES] = self.__in_zones or []
attr[ATTR_LATITUDE] = self.latitude
attr[ATTR_LONGITUDE] = self.longitude
attr[ATTR_GPS_ACCURACY] = self.location_accuracy

@@ -43,7 +43,6 @@ ATTR_BATTERY: Final = "battery"
ATTR_DEV_ID: Final = "dev_id"
ATTR_GPS: Final = "gps"
ATTR_HOST_NAME: Final = "host_name"
ATTR_IN_ZONES: Final = "in_zones"
ATTR_LOCATION_NAME: Final = "location_name"
ATTR_MAC: Final = "mac"
ATTR_SOURCE_TYPE: Final = "source_type"

@@ -1,64 +0,0 @@
"""The Dropbox integration."""

from __future__ import annotations

from python_dropbox_api import (
DropboxAPIClient,
DropboxAuthException,
DropboxUnknownException,
)

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import aiohttp_client
from homeassistant.helpers.config_entry_oauth2_flow import (
ImplementationUnavailableError,
OAuth2Session,
async_get_config_entry_implementation,
)

from .auth import DropboxConfigEntryAuth
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN

type DropboxConfigEntry = ConfigEntry[DropboxAPIClient]


async def async_setup_entry(hass: HomeAssistant, entry: DropboxConfigEntry) -> bool:
"""Set up Dropbox from a config entry."""
try:
oauth2_implementation = await async_get_config_entry_implementation(hass, entry)
except ImplementationUnavailableError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="oauth2_implementation_unavailable",
) from err
oauth2_session = OAuth2Session(hass, entry, oauth2_implementation)

auth = DropboxConfigEntryAuth(
aiohttp_client.async_get_clientsession(hass), oauth2_session
)

client = DropboxAPIClient(auth)

try:
await client.get_account_info()
except DropboxAuthException as err:
raise ConfigEntryAuthFailed from err
except (DropboxUnknownException, TimeoutError) as err:
raise ConfigEntryNotReady from err

entry.runtime_data = client

def async_notify_backup_listeners() -> None:
for listener in hass.data.get(DATA_BACKUP_AGENT_LISTENERS, []):
listener()

entry.async_on_unload(entry.async_on_state_change(async_notify_backup_listeners))

return True


async def async_unload_entry(hass: HomeAssistant, entry: DropboxConfigEntry) -> bool:
"""Unload a config entry."""
return True
@@ -1,38 +0,0 @@
"""Application credentials platform for the Dropbox integration."""

from homeassistant.components.application_credentials import ClientCredential
from homeassistant.core import HomeAssistant
from homeassistant.helpers.config_entry_oauth2_flow import (
AbstractOAuth2Implementation,
LocalOAuth2ImplementationWithPkce,
)

from .const import OAUTH2_AUTHORIZE, OAUTH2_SCOPES, OAUTH2_TOKEN


async def async_get_auth_implementation(
hass: HomeAssistant, auth_domain: str, credential: ClientCredential
) -> AbstractOAuth2Implementation:
"""Return custom auth implementation."""
return DropboxOAuth2Implementation(
hass,
auth_domain,
credential.client_id,
OAUTH2_AUTHORIZE,
OAUTH2_TOKEN,
credential.client_secret,
)


class DropboxOAuth2Implementation(LocalOAuth2ImplementationWithPkce):
"""Custom Dropbox OAuth2 implementation to add the necessary authorize url parameters."""

@property
def extra_authorize_data(self) -> dict:
"""Extra data that needs to be appended to the authorize url."""
data: dict = {
"token_access_type": "offline",
"scope": " ".join(OAUTH2_SCOPES),
}
data.update(super().extra_authorize_data)
return data
@@ -1,44 +0,0 @@
"""Authentication for Dropbox."""

from typing import cast

from aiohttp import ClientSession
from python_dropbox_api import Auth

from homeassistant.helpers.config_entry_oauth2_flow import OAuth2Session


class DropboxConfigEntryAuth(Auth):
"""Provide Dropbox authentication tied to an OAuth2 based config entry."""

def __init__(
self,
websession: ClientSession,
oauth_session: OAuth2Session,
) -> None:
"""Initialize DropboxConfigEntryAuth."""
super().__init__(websession)
self._oauth_session = oauth_session

async def async_get_access_token(self) -> str:
"""Return a valid access token."""
await self._oauth_session.async_ensure_token_valid()

return cast(str, self._oauth_session.token["access_token"])


class DropboxConfigFlowAuth(Auth):
"""Provide authentication tied to a fixed token for the config flow."""

def __init__(
self,
websession: ClientSession,
token: str,
) -> None:
"""Initialize DropboxConfigFlowAuth."""
super().__init__(websession)
self._token = token

async def async_get_access_token(self) -> str:
"""Return the fixed access token."""
return self._token
@@ -1,230 +0,0 @@
"""Backup platform for the Dropbox integration."""

from collections.abc import AsyncIterator, Callable, Coroutine
from functools import wraps
import json
import logging
from typing import Any, Concatenate

from python_dropbox_api import (
DropboxAPIClient,
DropboxAuthException,
DropboxFileOrFolderNotFoundException,
DropboxUnknownException,
)

from homeassistant.components.backup import (
AgentBackup,
BackupAgent,
BackupAgentError,
BackupNotFound,
suggested_filename,
)
from homeassistant.core import HomeAssistant, callback

from . import DropboxConfigEntry
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN

_LOGGER = logging.getLogger(__name__)


def _suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata."""
base_name = suggested_filename(backup).rsplit(".", 1)[0]
return f"{base_name}.tar", f"{base_name}.metadata.json"


async def _async_string_iterator(content: str) -> AsyncIterator[bytes]:
"""Yield a string as a single bytes chunk."""
yield content.encode()


def handle_backup_errors[_R, **P](
func: Callable[Concatenate[DropboxBackupAgent, P], Coroutine[Any, Any, _R]],
) -> Callable[Concatenate[DropboxBackupAgent, P], Coroutine[Any, Any, _R]]:
"""Handle backup errors."""

@wraps(func)
async def wrapper(
self: DropboxBackupAgent, *args: P.args, **kwargs: P.kwargs
) -> _R:
try:
return await func(self, *args, **kwargs)
except DropboxFileOrFolderNotFoundException as err:
raise BackupNotFound(
f"Failed to {func.__name__.removeprefix('async_').replace('_', ' ')}"
) from err
except DropboxAuthException as err:
self._entry.async_start_reauth(self._hass)
raise BackupAgentError("Authentication error") from err
except DropboxUnknownException as err:
_LOGGER.error(
"Error during %s: %s",
func.__name__,
err,
)
_LOGGER.debug("Full error: %s", err, exc_info=True)
raise BackupAgentError(
f"Failed to {func.__name__.removeprefix('async_').replace('_', ' ')}"
) from err

return wrapper


async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return a list of backup agents."""
entries = hass.config_entries.async_loaded_entries(DOMAIN)
return [DropboxBackupAgent(hass, entry) for entry in entries]


@callback
def async_register_backup_agents_listener(
hass: HomeAssistant,
*,
listener: Callable[[], None],
**kwargs: Any,
) -> Callable[[], None]:
"""Register a listener to be called when agents are added or removed.

:return: A function to unregister the listener.
"""
hass.data.setdefault(DATA_BACKUP_AGENT_LISTENERS, []).append(listener)

@callback
def remove_listener() -> None:
"""Remove the listener."""
hass.data[DATA_BACKUP_AGENT_LISTENERS].remove(listener)
if not hass.data[DATA_BACKUP_AGENT_LISTENERS]:
del hass.data[DATA_BACKUP_AGENT_LISTENERS]

return remove_listener


class DropboxBackupAgent(BackupAgent):
"""Backup agent for the Dropbox integration."""

domain = DOMAIN

def __init__(self, hass: HomeAssistant, entry: DropboxConfigEntry) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
self._entry = entry
self.name = entry.title
assert entry.unique_id
self.unique_id = entry.unique_id
self._api: DropboxAPIClient = entry.runtime_data

async def _async_get_backups(self) -> list[tuple[AgentBackup, str]]:
"""Get backups and their corresponding file names."""
files = await self._api.list_folder("")

tar_files = {f.name for f in files if f.name.endswith(".tar")}
metadata_files = [f for f in files if f.name.endswith(".metadata.json")]

backups: list[tuple[AgentBackup, str]] = []
for metadata_file in metadata_files:
tar_name = metadata_file.name.removesuffix(".metadata.json") + ".tar"
if tar_name not in tar_files:
_LOGGER.warning(
"Found metadata file '%s' without matching backup file",
metadata_file.name,
)
continue

metadata_stream = self._api.download_file(f"/{metadata_file.name}")
raw = b"".join([chunk async for chunk in metadata_stream])
try:
data = json.loads(raw)
backup = AgentBackup.from_dict(data)
except (json.JSONDecodeError, ValueError, TypeError, KeyError) as err:
_LOGGER.warning(
"Skipping invalid metadata file '%s': %s",
metadata_file.name,
err,
)
continue
backups.append((backup, tar_name))

return backups

@handle_backup_errors
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
backup_filename, metadata_filename = _suggested_filenames(backup)
backup_path = f"/{backup_filename}"
metadata_path = f"/{metadata_filename}"

file_stream = await open_stream()
await self._api.upload_file(backup_path, file_stream)

metadata_stream = _async_string_iterator(json.dumps(backup.as_dict()))

try:
await self._api.upload_file(metadata_path, metadata_stream)
except (
DropboxAuthException,
DropboxUnknownException,
):
await self._api.delete_file(backup_path)
raise

@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return [backup for backup, _ in await self._async_get_backups()]

@handle_backup_errors
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
backups = await self._async_get_backups()
for backup, filename in backups:
if backup.backup_id == backup_id:
return self._api.download_file(f"/{filename}")

raise BackupNotFound(f"Backup {backup_id} not found")

@handle_backup_errors
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup:
"""Return a backup."""
backups = await self._async_get_backups()

for backup, _ in backups:
if backup.backup_id == backup_id:
return backup

raise BackupNotFound(f"Backup {backup_id} not found")

@handle_backup_errors
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file."""
backups = await self._async_get_backups()
for backup, tar_filename in backups:
if backup.backup_id == backup_id:
metadata_filename = tar_filename.removesuffix(".tar") + ".metadata.json"
await self._api.delete_file(f"/{tar_filename}")
await self._api.delete_file(f"/{metadata_filename}")
return

raise BackupNotFound(f"Backup {backup_id} not found")
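Aside: the backup platform above centralizes exception mapping in a `handle_backup_errors` decorator typed with `Concatenate` and PEP 695 generics. A stripped-down sketch of that shape, with stand-in exception names (only the typing and control flow are the point; this is not the Dropbox code):

```python
from collections.abc import Callable, Coroutine
from functools import wraps
from typing import Any, Concatenate


class UpstreamError(Exception):
    """Stand-in for a client library error (assumed name)."""


class AgentError(Exception):
    """Stand-in for the Home Assistant-facing error (assumed name)."""


def handle_errors[_R, **P](
    func: Callable[Concatenate[Any, P], Coroutine[Any, Any, _R]],
) -> Callable[Concatenate[Any, P], Coroutine[Any, Any, _R]]:
    """Map library exceptions to agent errors; message derived from method name."""

    @wraps(func)
    async def wrapper(self: Any, *args: P.args, **kwargs: P.kwargs) -> _R:
        try:
            return await func(self, *args, **kwargs)
        except UpstreamError as err:
            # "async_delete_backup" -> "delete backup"
            action = func.__name__.removeprefix("async_").replace("_", " ")
            raise AgentError(f"Failed to {action}") from err

    return wrapper
```

`Concatenate[Self, P]` keeps the decorated method's signature intact for type checkers while the wrapper injects the error mapping around every call.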
@@ -1,60 +0,0 @@
"""Config flow for Dropbox."""

from collections.abc import Mapping
import logging
from typing import Any

from python_dropbox_api import DropboxAPIClient

from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_TOKEN
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.config_entry_oauth2_flow import AbstractOAuth2FlowHandler

from .auth import DropboxConfigFlowAuth
from .const import DOMAIN


class DropboxConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
"""Config flow to handle Dropbox OAuth2 authentication."""

DOMAIN = DOMAIN

@property
def logger(self) -> logging.Logger:
"""Return logger."""
return logging.getLogger(__name__)

async def async_oauth_create_entry(self, data: dict[str, Any]) -> ConfigFlowResult:
"""Create an entry for the flow, or update existing entry."""
access_token = data[CONF_TOKEN][CONF_ACCESS_TOKEN]

auth = DropboxConfigFlowAuth(async_get_clientsession(self.hass), access_token)

client = DropboxAPIClient(auth)
account_info = await client.get_account_info()

await self.async_set_unique_id(account_info.account_id)
if self.source == SOURCE_REAUTH:
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data=data
)

self._abort_if_unique_id_configured()

return self.async_create_entry(title=account_info.email, data=data)

async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauth upon an API authentication error."""
return await self.async_step_reauth_confirm()

async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
if user_input is None:
return self.async_show_form(step_id="reauth_confirm")
return await self.async_step_user()
@@ -1,19 +0,0 @@
"""Constants for the Dropbox integration."""

from collections.abc import Callable

from homeassistant.util.hass_dict import HassKey

DOMAIN = "dropbox"

OAUTH2_AUTHORIZE = "https://www.dropbox.com/oauth2/authorize"
OAUTH2_TOKEN = "https://api.dropboxapi.com/oauth2/token"
OAUTH2_SCOPES = [
"account_info.read",
"files.content.read",
"files.content.write",
]

DATA_BACKUP_AGENT_LISTENERS: HassKey[list[Callable[[], None]]] = HassKey(
f"{DOMAIN}.backup_agent_listeners"
)
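Aside: `DATA_BACKUP_AGENT_LISTENERS` above uses `HassKey` so that access to `hass.data` is statically typed. A minimal sketch of the pattern (`my_domain` is a placeholder):

```python
from collections.abc import Callable

from homeassistant.core import HomeAssistant
from homeassistant.util.hass_dict import HassKey

# A HassKey ties a hass.data slot to a static type, so mypy checks every
# read and write; at runtime the key behaves like its string value.
MY_LISTENERS: HassKey[list[Callable[[], None]]] = HassKey("my_domain.listeners")


def notify_all(hass: HomeAssistant) -> None:
    """Invoke every registered listener (empty default keeps this total)."""
    for listener in hass.data.get(MY_LISTENERS, []):
        listener()
```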
@@ -1,13 +0,0 @@
|
||||
{
|
||||
"domain": "dropbox",
|
||||
"name": "Dropbox",
|
||||
"after_dependencies": ["backup"],
|
||||
"codeowners": ["@bdr99"],
|
||||
"config_flow": true,
|
||||
"dependencies": ["application_credentials"],
|
||||
"documentation": "https://www.home-assistant.io/integrations/dropbox",
|
||||
"integration_type": "service",
|
||||
"iot_class": "cloud_polling",
|
||||
"quality_scale": "bronze",
|
||||
"requirements": ["python-dropbox-api==0.1.3"]
|
||||
}
|
||||
@@ -1,112 +0,0 @@
|
||||
rules:
|
||||
# Bronze
|
||||
action-setup:
|
||||
status: exempt
|
||||
comment: Integration does not register any actions.
|
||||
appropriate-polling:
|
||||
status: exempt
|
||||
comment: Integration does not poll.
|
||||
brands: done
|
||||
common-modules:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities or coordinators.
|
||||
config-flow-test-coverage: done
|
||||
config-flow: done
|
||||
dependency-transparency: done
|
||||
docs-actions:
|
||||
status: exempt
|
||||
comment: Integration does not register any actions.
|
||||
docs-high-level-description: done
|
||||
docs-installation-instructions: done
|
||||
docs-removal-instructions: done
|
||||
entity-event-setup:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
entity-unique-id:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
has-entity-name:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
runtime-data: done
|
||||
test-before-configure: done
|
||||
test-before-setup: done
|
||||
unique-config-entry: done
|
||||
|
||||
# Silver
|
||||
action-exceptions:
|
||||
status: exempt
|
||||
comment: Integration does not register any actions.
|
||||
config-entry-unloading: done
|
||||
docs-configuration-parameters:
|
||||
status: exempt
|
||||
comment: Integration does not have any configuration parameters.
|
||||
docs-installation-parameters: done
|
||||
entity-unavailable:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
integration-owner: done
|
||||
log-when-unavailable: todo
|
||||
parallel-updates:
|
||||
status: exempt
|
||||
comment: Integration does not make any entity updates.
|
||||
reauthentication-flow: done
|
||||
test-coverage: done
|
||||
|
||||
# Gold
|
||||
devices:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
diagnostics:
|
||||
status: exempt
|
||||
comment: Integration does not have any data to diagnose.
|
||||
discovery-update-info:
|
||||
status: exempt
|
||||
comment: Integration is a service.
|
||||
discovery:
|
||||
status: exempt
|
||||
comment: Integration is a service.
|
||||
docs-data-update:
|
||||
status: exempt
|
||||
comment: Integration does not update any data.
|
||||
docs-examples:
|
||||
status: exempt
|
||||
comment: Integration only provides backup functionality.
|
||||
docs-known-limitations: todo
|
||||
docs-supported-devices:
|
||||
status: exempt
|
||||
comment: Integration does not support any devices.
|
||||
docs-supported-functions: done
|
||||
docs-troubleshooting: todo
|
||||
docs-use-cases: done
|
||||
dynamic-devices:
|
||||
status: exempt
|
||||
comment: Integration does not use any devices.
|
||||
entity-category:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
entity-device-class:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
entity-disabled-by-default:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
entity-translations:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
exception-translations: todo
|
||||
icon-translations:
|
||||
status: exempt
|
||||
comment: Integration does not have any entities.
|
||||
reconfiguration-flow: todo
|
||||
repair-issues:
|
||||
status: exempt
|
||||
comment: Integration does not have any repairs.
|
||||
stale-devices:
|
||||
status: exempt
|
||||
comment: Integration does not have any devices.
|
||||
|
||||
# Platinum
|
||||
async-dependency: done
|
||||
inject-websession: done
|
||||
strict-typing: done
|
||||
@@ -1,35 +0,0 @@
|
||||
{
|
||||
"config": {
|
||||
"abort": {
|
||||
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
|
||||
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
|
||||
"authorize_url_timeout": "[%key:common::config_flow::abort::oauth2_authorize_url_timeout%]",
|
||||
"missing_configuration": "[%key:common::config_flow::abort::oauth2_missing_configuration%]",
|
||||
"no_url_available": "[%key:common::config_flow::abort::oauth2_no_url_available%]",
|
||||
"oauth_error": "[%key:common::config_flow::abort::oauth2_error%]",
|
||||
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
|
||||
"oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
|
||||
"oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
|
||||
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
|
||||
"user_rejected_authorize": "[%key:common::config_flow::abort::oauth2_user_rejected_authorize%]",
|
||||
"wrong_account": "Wrong account: Please authenticate with the correct account."
|
||||
},
|
||||
"create_entry": {
|
||||
"default": "[%key:common::config_flow::create_entry::authenticated%]"
|
||||
},
|
||||
"step": {
|
||||
"pick_implementation": {
|
||||
"title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
|
||||
},
|
||||
"reauth_confirm": {
|
||||
"description": "The Dropbox integration needs to re-authenticate your account.",
|
||||
"title": "[%key:common::config_flow::title::reauth%]"
|
||||
}
|
||||
}
|
||||
},
|
||||
"exceptions": {
|
||||
"oauth2_implementation_unavailable": {
|
||||
"message": "[%key:common::exceptions::oauth2_implementation_unavailable::message%]"
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -5,7 +5,7 @@
|
||||
"invalid_identifier": "[%key:component::dwd_weather_warnings::config::error::invalid_identifier%]"
|
||||
},
|
||||
"error": {
|
||||
"ambiguous_identifier": "The region identifier and device tracker cannot be specified together.",
|
||||
"ambiguous_identifier": "The region identifier and device tracker can not be specified together.",
|
||||
"attribute_not_found": "The required attributes 'Latitude' and 'Longitude' were not found in the specified device tracker.",
|
||||
"entity_not_found": "The specified device tracker entity was not found.",
|
||||
"invalid_identifier": "The specified region identifier / device tracker is invalid.",
|
||||
|
||||
@@ -24,10 +24,11 @@ class EcowittEntity(Entity):
|
||||
|
||||
self._attr_unique_id = f"{sensor.station.key}-{sensor.key}"
|
||||
self._attr_device_info = DeviceInfo(
|
||||
identifiers={(DOMAIN, sensor.station.key)},
|
||||
identifiers={
|
||||
(DOMAIN, sensor.station.key),
|
||||
},
|
||||
name=sensor.station.model,
|
||||
model=sensor.station.model,
|
||||
manufacturer="Ecowitt",
|
||||
sw_version=sensor.station.version,
|
||||
)
|
||||
|
||||
|
||||
@@ -29,11 +29,9 @@ VALID_NAME_PATTERN = re.compile(r"^(?![\d\s])[\w\d \.]*[\w\d]$")
|
||||
|
||||
|
||||
class ConfigFlowEkeyApi(ekey_bionyxpy.AbstractAuth):
|
||||
"""Authentication implementation used during config flow, without refresh.
|
||||
"""ekey bionyx authentication before a ConfigEntry exists.
|
||||
|
||||
This exists to allow the config flow to use the API before it has fully
|
||||
created a config entry required by OAuth2Session. This does not support
|
||||
refreshing tokens, which is fine since it should have been just created.
|
||||
This implementation directly provides the token without supporting refresh.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
|
||||
@@ -6,17 +6,25 @@ rules:
|
||||
appropriate-polling:
|
||||
status: exempt
|
||||
comment: The integration uses a push-based mechanism with a background sync task, not polling.
|
||||
brands: done
|
||||
common-modules: done
|
||||
config-flow-test-coverage: done
|
||||
config-flow: done
|
||||
dependency-transparency: done
|
||||
brands:
|
||||
status: done
|
||||
common-modules:
|
||||
status: done
|
||||
config-flow-test-coverage:
|
||||
status: done
|
||||
config-flow:
|
||||
status: done
|
||||
dependency-transparency:
|
||||
status: done
|
||||
docs-actions:
|
||||
status: exempt
|
||||
comment: The integration does not expose any custom service actions.
|
||||
docs-high-level-description: done
|
||||
docs-installation-instructions: done
|
||||
docs-removal-instructions: done
|
||||
docs-high-level-description:
|
||||
status: done
|
||||
docs-installation-instructions:
|
||||
status: done
|
||||
docs-removal-instructions:
|
||||
status: done
|
||||
entity-event-setup:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
@@ -26,30 +34,40 @@ rules:
|
||||
has-entity-name:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
runtime-data: done
|
||||
test-before-configure: done
|
||||
test-before-setup: done
|
||||
unique-config-entry: done
|
||||
runtime-data:
|
||||
status: done
|
||||
test-before-configure:
|
||||
status: done
|
||||
test-before-setup:
|
||||
status: done
|
||||
unique-config-entry:
|
||||
status: done
|
||||
|
||||
# Silver
|
||||
action-exceptions:
|
||||
status: exempt
|
||||
comment: The integration does not expose any custom service actions.
|
||||
config-entry-unloading: done
|
||||
docs-configuration-parameters: done
|
||||
docs-installation-parameters: done
|
||||
config-entry-unloading:
|
||||
status: done
|
||||
docs-configuration-parameters:
|
||||
status: done
|
||||
docs-installation-parameters:
|
||||
status: done
|
||||
entity-unavailable:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
integration-owner: done
|
||||
integration-owner:
|
||||
status: done
|
||||
log-when-unavailable:
|
||||
status: done
|
||||
comment: The integration logs a single message when the EnergyID service is unavailable.
|
||||
parallel-updates:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
reauthentication-flow: done
|
||||
test-coverage: done
|
||||
reauthentication-flow:
|
||||
status: done
|
||||
test-coverage:
|
||||
status: done
|
||||
|
||||
# Gold
|
||||
devices:
|
||||
@@ -64,15 +82,21 @@ rules:
|
||||
discovery-update-info:
|
||||
status: exempt
|
||||
comment: No discovery mechanism is used.
|
||||
docs-data-update: done
|
||||
docs-examples: done
|
||||
docs-known-limitations: done
|
||||
docs-data-update:
|
||||
status: done
|
||||
docs-examples:
|
||||
status: done
|
||||
docs-known-limitations:
|
||||
status: done
|
||||
docs-supported-devices:
|
||||
status: exempt
|
||||
comment: This is a service integration not tied to specific device models.
|
||||
docs-supported-functions: done
|
||||
docs-troubleshooting: done
|
||||
docs-use-cases: done
|
||||
docs-supported-functions:
|
||||
status: done
|
||||
docs-troubleshooting:
|
||||
status: done
|
||||
docs-use-cases:
|
||||
status: done
|
||||
dynamic-devices:
|
||||
status: exempt
|
||||
comment: The integration creates a single device entry for the service connection.
|
||||
@@ -88,7 +112,8 @@ rules:
|
||||
entity-translations:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
exception-translations: done
|
||||
exception-translations:
|
||||
status: done
|
||||
icon-translations:
|
||||
status: exempt
|
||||
comment: This integration does not create its own entities.
|
||||
@@ -103,8 +128,10 @@ rules:
|
||||
comment: Creates a single service device entry tied to the config entry.
|
||||
|
||||
# Platinum
|
||||
async-dependency: done
|
||||
inject-websession: done
|
||||
async-dependency:
|
||||
status: done
|
||||
inject-websession:
|
||||
status: done
|
||||
strict-typing:
|
||||
status: todo
|
||||
comment: Full strict typing compliance will be addressed in a future update.
|
||||
|
||||
@@ -19,7 +19,7 @@ from pyfirefly.models import Account, Bill, Budget, Category, Currency
|
||||
from homeassistant.config_entries import ConfigEntry
|
||||
from homeassistant.const import CONF_API_KEY, CONF_URL, CONF_VERIFY_SSL
|
||||
from homeassistant.core import HomeAssistant
|
||||
from homeassistant.exceptions import ConfigEntryAuthFailed
|
||||
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
|
||||
from homeassistant.helpers.aiohttp_client import async_create_clientsession
|
||||
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
|
||||
|
||||
@@ -79,13 +79,13 @@ class FireflyDataUpdateCoordinator(DataUpdateCoordinator[FireflyCoordinatorData]
|
||||
translation_placeholders={"error": repr(err)},
|
||||
) from err
|
||||
except FireflyConnectionError as err:
|
||||
raise UpdateFailed(
|
||||
raise ConfigEntryNotReady(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="cannot_connect",
|
||||
translation_placeholders={"error": repr(err)},
|
||||
) from err
|
||||
except FireflyTimeoutError as err:
|
||||
raise UpdateFailed(
|
||||
raise ConfigEntryNotReady(
|
||||
translation_domain=DOMAIN,
|
||||
translation_key="timeout_connect",
|
||||
translation_placeholders={"error": repr(err)},
|
||||
|
||||
@@ -7,14 +7,13 @@ from typing import Any, cast

 from fitbit import Fitbit
 from fitbit.exceptions import HTTPException, HTTPUnauthorized
-from fitbit_web_api import ApiClient, Configuration, DevicesApi, UserApi
+from fitbit_web_api import ApiClient, Configuration, DevicesApi
 from fitbit_web_api.exceptions import (
     ApiException,
     OpenApiException,
     UnauthorizedException,
 )
 from fitbit_web_api.models.device import Device
-from fitbit_web_api.models.user import User
 from requests.exceptions import ConnectionError as RequestsConnectionError

 from homeassistant.const import CONF_ACCESS_TOKEN

@@ -25,6 +24,7 @@ from homeassistant.util.unit_system import METRIC_SYSTEM

 from .const import FitbitUnitSystem
 from .exceptions import FitbitApiException, FitbitAuthException
+from .model import FitbitProfile

 _LOGGER = logging.getLogger(__name__)

@@ -46,7 +46,7 @@ class FitbitApi(ABC):
     ) -> None:
         """Initialize Fitbit auth."""
         self._hass = hass
-        self._profile: User | None = None
+        self._profile: FitbitProfile | None = None
         self._unit_system = unit_system

     @abstractmethod

@@ -74,16 +74,18 @@ class FitbitApi(ABC):
         configuration.access_token = token[CONF_ACCESS_TOKEN]
         return await self._hass.async_add_executor_job(ApiClient, configuration)

-    async def async_get_user_profile(self) -> User:
+    async def async_get_user_profile(self) -> FitbitProfile:
         """Return the user profile from the API."""
         if self._profile is None:
-            client = await self._async_get_fitbit_web_api()
-            api = UserApi(client)
-            api_response = await self._run_async(api.get_profile)
-            if not api_response.user:
-                raise FitbitApiException("No user profile returned from fitbit API")
-            _LOGGER.debug("user_profile_get=%s", api_response.to_dict())
-            self._profile = api_response.user
+            client = await self._async_get_client()
+            response: dict[str, Any] = await self._run(client.user_profile_get)
+            _LOGGER.debug("user_profile_get=%s", response)
+            profile = response["user"]
+            self._profile = FitbitProfile(
+                encoded_id=profile["encodedId"],
+                display_name=profile["displayName"],
+                locale=profile.get("locale"),
+            )
         return self._profile

     async def async_get_unit_system(self) -> FitbitUnitSystem:
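In both the removed and the restored paths, the synchronous Fitbit client is kept off the event loop: the hunk builds `ApiClient` via `async_add_executor_job`, and `_run`/`_run_async` wrap the actual calls. A generic sketch of that offloading pattern; `run_sync` is an invented name, not the integration's helper:

```python
from collections.abc import Callable
from typing import TypeVar

from homeassistant.core import HomeAssistant

_T = TypeVar("_T")


async def run_sync(hass: HomeAssistant, func: Callable[[], _T]) -> _T:
    """Run a blocking client call in the executor pool."""
    return await hass.async_add_executor_job(func)
```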
@@ -85,6 +85,4 @@ class OAuth2FlowHandler(
         )

         self._abort_if_unique_id_configured()
-        return self.async_create_entry(
-            title=profile.display_name or "Fitbit", data=data
-        )
+        return self.async_create_entry(title=profile.display_name, data=data)
@@ -7,6 +7,20 @@ from typing import Any

 from .const import CONF_CLOCK_FORMAT, CONF_MONITORED_RESOURCES, FitbitScope


+@dataclass
+class FitbitProfile:
+    """User profile from the Fitbit API response."""
+
+    encoded_id: str
+    """The ID representing the Fitbit user."""
+
+    display_name: str
+    """The name shown when the user's friends look at their Fitbit profile."""
+
+    locale: str | None
+    """The locale defined in the user's Fitbit account settings."""
+
+
 @dataclass
 class FitbitConfig:
     """Information from the fitbit ConfigEntry data."""
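The dataclass added above pairs with the `api.py` hunk: the raw `{"user": {...}}` payload is narrowed to the three fields the integration needs, with `locale` treated as optional. The same mapping, shown standalone with a stand-in class name:

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class Profile:
    """Stand-in mirroring FitbitProfile's fields."""

    encoded_id: str
    display_name: str
    locale: str | None


def profile_from_response(response: dict[str, Any]) -> Profile:
    """Map the raw user payload onto the dataclass."""
    user = response["user"]
    return Profile(
        encoded_id=user["encodedId"],
        display_name=user["displayName"],
        locale=user.get("locale"),  # absent key becomes None
    )
```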
@@ -25,7 +25,6 @@ from homeassistant.const import (
     UnitOfVolume,
 )
 from homeassistant.core import HomeAssistant, callback
-from homeassistant.exceptions import ConfigEntryNotReady
 from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
 from homeassistant.helpers.icon import icon_for_battery_level

@@ -537,8 +536,6 @@ async def async_setup_entry(
     # These are run serially to reuse the cached user profile, not gathered
     # to avoid two racing requests.
     user_profile = await api.async_get_user_profile()
-    if user_profile.encoded_id is None:
-        raise ConfigEntryNotReady("Could not get user profile")
     unit_system = await api.async_get_unit_system()

     fitbit_config = config_from_entry_data(entry.data)
@@ -2,14 +2,18 @@

 from __future__ import annotations

+from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_API_KEY, Platform
 from homeassistant.core import HomeAssistant

-from .coordinator import FlussConfigEntry, FlussDataUpdateCoordinator
+from .coordinator import FlussDataUpdateCoordinator

 PLATFORMS: list[Platform] = [Platform.BUTTON]


+type FlussConfigEntry = ConfigEntry[FlussDataUpdateCoordinator]
+
+
 async def async_setup_entry(
     hass: HomeAssistant,
     entry: FlussConfigEntry,
@@ -1,13 +1,16 @@
 """Support for Fluss Devices."""

 from homeassistant.components.button import ButtonEntity
+from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.exceptions import HomeAssistantError
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

-from .coordinator import FlussApiClientError, FlussConfigEntry
+from .coordinator import FlussApiClientError, FlussDataUpdateCoordinator
 from .entity import FlussEntity

+type FlussConfigEntry = ConfigEntry[FlussDataUpdateCoordinator]
+
+
 async def async_setup_entry(
     hass: HomeAssistant,
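Both Fluss hunks move `FlussConfigEntry` out of `coordinator.py` and define it locally as a `type` alias over the generic `ConfigEntry`. The alias is what types `entry.runtime_data`; a minimal sketch of the pattern with stub names (the real coordinator's constructor arguments are omitted):

```python
import logging

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator


class StubCoordinator(DataUpdateCoordinator[dict]):
    """Stand-in for FlussDataUpdateCoordinator."""


type StubConfigEntry = ConfigEntry[StubCoordinator]


async def async_setup_entry(hass: HomeAssistant, entry: StubConfigEntry) -> bool:
    """Store the coordinator on the entry so platforms get typed access."""
    coordinator = StubCoordinator(hass, logging.getLogger(__name__), name="stub")
    await coordinator.async_config_entry_first_refresh()
    entry.runtime_data = coordinator  # typed via the alias, no casts needed
    return True
```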
@@ -2,26 +2,14 @@

 from __future__ import annotations

-from types import MappingProxyType
-
-from homeassistant.config_entries import ConfigSubentry
-from homeassistant.const import CONF_API_KEY, Platform
+from homeassistant.const import Platform
 from homeassistant.core import HomeAssistant
-from homeassistant.exceptions import ConfigEntryError

 from .const import (
-    CONF_AZIMUTH,
     CONF_DAMPING,
     CONF_DAMPING_EVENING,
     CONF_DAMPING_MORNING,
-    CONF_DECLINATION,
     CONF_MODULES_POWER,
-    DEFAULT_AZIMUTH,
-    DEFAULT_DAMPING,
-    DEFAULT_DECLINATION,
-    DEFAULT_MODULES_POWER,
-    DOMAIN,
-    SUBENTRY_TYPE_PLANE,
 )
 from .coordinator import ForecastSolarConfigEntry, ForecastSolarDataUpdateCoordinator

@@ -37,41 +25,14 @@ async def async_migrate_entry(
     new_options = entry.options.copy()
     new_options |= {
         CONF_MODULES_POWER: new_options.pop("modules power"),
-        CONF_DAMPING_MORNING: new_options.get(CONF_DAMPING, DEFAULT_DAMPING),
-        CONF_DAMPING_EVENING: new_options.pop(CONF_DAMPING, DEFAULT_DAMPING),
+        CONF_DAMPING_MORNING: new_options.get(CONF_DAMPING, 0.0),
+        CONF_DAMPING_EVENING: new_options.pop(CONF_DAMPING, 0.0),
     }

     hass.config_entries.async_update_entry(
         entry, data=entry.data, options=new_options, version=2
     )

-    if entry.version == 2:
-        # Migrate the main plane from options to a subentry
-        declination = entry.options.get(CONF_DECLINATION, DEFAULT_DECLINATION)
-        azimuth = entry.options.get(CONF_AZIMUTH, DEFAULT_AZIMUTH)
-        modules_power = entry.options.get(CONF_MODULES_POWER, DEFAULT_MODULES_POWER)
-
-        subentry = ConfigSubentry(
-            data=MappingProxyType(
-                {
-                    CONF_DECLINATION: declination,
-                    CONF_AZIMUTH: azimuth,
-                    CONF_MODULES_POWER: modules_power,
-                }
-            ),
-            subentry_type=SUBENTRY_TYPE_PLANE,
-            title=f"{declination}° / {azimuth}° / {modules_power}W",
-            unique_id=None,
-        )
-        hass.config_entries.async_add_subentry(entry, subentry)
-
-        new_options = dict(entry.options)
-        new_options.pop(CONF_DECLINATION, None)
-        new_options.pop(CONF_AZIMUTH, None)
-        new_options.pop(CONF_MODULES_POWER, None)
-
-        hass.config_entries.async_update_entry(entry, options=new_options, version=3)
-
     return True
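After the revert, `async_migrate_entry` keeps only the version-1 step: renaming the legacy `"modules power"` option, seeding the two damping options, and stamping the entry as version 2. A stripped-down sketch of this version-gated migration shape, with the option keys inlined:

```python
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant


async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Upgrade old config entries one version step at a time."""
    if entry.version == 1:
        new_options = dict(entry.options)
        new_options["modules_power"] = new_options.pop("modules power")
        # Read "damping" for the morning value, pop it for the evening value,
        # so the legacy key is consumed exactly once.
        new_options["damping_morning"] = new_options.get("damping", 0.0)
        new_options["damping_evening"] = new_options.pop("damping", 0.0)
        hass.config_entries.async_update_entry(
            entry, data=entry.data, options=new_options, version=2
        )
    return True
```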
@@ -79,19 +40,6 @@ async def async_setup_entry(
     hass: HomeAssistant, entry: ForecastSolarConfigEntry
 ) -> bool:
     """Set up Forecast.Solar from a config entry."""
-    plane_subentries = entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE)
-    if not plane_subentries:
-        raise ConfigEntryError(
-            translation_domain=DOMAIN,
-            translation_key="no_plane",
-        )
-
-    if len(plane_subentries) > 1 and not entry.options.get(CONF_API_KEY):
-        raise ConfigEntryError(
-            translation_domain=DOMAIN,
-            translation_key="api_key_required",
-        )
-
     coordinator = ForecastSolarDataUpdateCoordinator(hass, entry)
     await coordinator.async_config_entry_first_refresh()

@@ -99,18 +47,9 @@ async def async_setup_entry(

     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

-    entry.async_on_unload(entry.add_update_listener(_async_update_listener))
-
     return True


-async def _async_update_listener(
-    hass: HomeAssistant, entry: ForecastSolarConfigEntry
-) -> None:
-    """Handle config entry updates (options or subentry changes)."""
-    hass.config_entries.async_schedule_reload(entry.entry_id)
-
-
 async def async_unload_entry(
     hass: HomeAssistant, entry: ForecastSolarConfigEntry
 ) -> bool:
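Removing `_async_update_listener` does not lose the reload-on-change behavior: in the config-flow hunks below, the options handler's base class switches from `OptionsFlow` to `OptionsFlowWithReload`, which reloads the config entry itself once the flow finishes. A minimal sketch, with an invented one-field schema:

```python
from typing import Any

import voluptuous as vol

from homeassistant.config_entries import ConfigFlowResult, OptionsFlowWithReload

OPTIONS_SCHEMA = vol.Schema({vol.Optional("api_key"): str})  # illustrative


class SketchOptionsFlow(OptionsFlowWithReload):
    """No add_update_listener hook needed; the entry reloads after saving."""

    async def async_step_init(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        if user_input is not None:
            return self.async_create_entry(title="", data=user_input)
        return self.async_show_form(step_id="init", data_schema=OPTIONS_SCHEMA)
```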
@@ -11,13 +11,11 @@ from homeassistant.config_entries import (
     ConfigEntry,
     ConfigFlow,
     ConfigFlowResult,
-    ConfigSubentryFlow,
-    OptionsFlow,
-    SubentryFlowResult,
+    OptionsFlowWithReload,
 )
 from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
 from homeassistant.core import callback
-from homeassistant.helpers import config_validation as cv, selector
+from homeassistant.helpers import config_validation as cv

 from .const import (
     CONF_AZIMUTH,
@@ -26,51 +24,16 @@ from .const import (
     CONF_DECLINATION,
     CONF_INVERTER_SIZE,
     CONF_MODULES_POWER,
-    DEFAULT_AZIMUTH,
-    DEFAULT_DAMPING,
-    DEFAULT_DECLINATION,
-    DEFAULT_MODULES_POWER,
     DOMAIN,
-    MAX_PLANES,
-    SUBENTRY_TYPE_PLANE,
 )

 RE_API_KEY = re.compile(r"^[a-zA-Z0-9]{16}$")

-PLANE_SCHEMA = vol.Schema(
-    {
-        vol.Required(CONF_DECLINATION): vol.All(
-            selector.NumberSelector(
-                selector.NumberSelectorConfig(
-                    min=0, max=90, step=1, mode=selector.NumberSelectorMode.BOX
-                ),
-            ),
-            vol.Coerce(int),
-        ),
-        vol.Required(CONF_AZIMUTH): vol.All(
-            selector.NumberSelector(
-                selector.NumberSelectorConfig(
-                    min=0, max=360, step=1, mode=selector.NumberSelectorMode.BOX
-                ),
-            ),
-            vol.Coerce(int),
-        ),
-        vol.Required(CONF_MODULES_POWER): vol.All(
-            selector.NumberSelector(
-                selector.NumberSelectorConfig(
-                    min=1, step=1, mode=selector.NumberSelectorMode.BOX
-                ),
-            ),
-            vol.Coerce(int),
-        ),
-    }
-)
-
-
 class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
     """Handle a config flow for Forecast.Solar."""

-    VERSION = 3
+    VERSION = 2

     @staticmethod
     @callback

@@ -80,14 +43,6 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
         """Get the options flow for this handler."""
         return ForecastSolarOptionFlowHandler()

-    @classmethod
-    @callback
-    def async_get_supported_subentry_types(
-        cls, config_entry: ConfigEntry
-    ) -> dict[str, type[ConfigSubentryFlow]]:
-        """Return subentries supported by this handler."""
-        return {SUBENTRY_TYPE_PLANE: PlaneSubentryFlowHandler}
-
     async def async_step_user(
         self, user_input: dict[str, Any] | None = None
     ) -> ConfigFlowResult:
@@ -99,112 +54,94 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
                     CONF_LATITUDE: user_input[CONF_LATITUDE],
                     CONF_LONGITUDE: user_input[CONF_LONGITUDE],
                 },
-                subentries=[
-                    {
-                        "subentry_type": SUBENTRY_TYPE_PLANE,
-                        "data": {
-                            CONF_DECLINATION: user_input[CONF_DECLINATION],
-                            CONF_AZIMUTH: user_input[CONF_AZIMUTH],
-                            CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
-                        },
-                        "title": f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
-                        "unique_id": None,
-                    },
-                ],
+                options={
+                    CONF_AZIMUTH: user_input[CONF_AZIMUTH],
+                    CONF_DECLINATION: user_input[CONF_DECLINATION],
+                    CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
+                },
             )

         return self.async_show_form(
             step_id="user",
-            data_schema=self.add_suggested_values_to_schema(
-                vol.Schema(
-                    {
-                        vol.Required(CONF_NAME): str,
-                        vol.Required(CONF_LATITUDE): cv.latitude,
-                        vol.Required(CONF_LONGITUDE): cv.longitude,
-                    }
-                ).extend(PLANE_SCHEMA.schema),
+            data_schema=vol.Schema(
                 {
-                    CONF_NAME: self.hass.config.location_name,
-                    CONF_LATITUDE: self.hass.config.latitude,
-                    CONF_LONGITUDE: self.hass.config.longitude,
-                    CONF_DECLINATION: DEFAULT_DECLINATION,
-                    CONF_AZIMUTH: DEFAULT_AZIMUTH,
-                    CONF_MODULES_POWER: DEFAULT_MODULES_POWER,
-                },
+                    vol.Required(
+                        CONF_NAME, default=self.hass.config.location_name
+                    ): str,
+                    vol.Required(
+                        CONF_LATITUDE, default=self.hass.config.latitude
+                    ): cv.latitude,
+                    vol.Required(
+                        CONF_LONGITUDE, default=self.hass.config.longitude
+                    ): cv.longitude,
+                    vol.Required(CONF_DECLINATION, default=25): vol.All(
+                        vol.Coerce(int), vol.Range(min=0, max=90)
+                    ),
+                    vol.Required(CONF_AZIMUTH, default=180): vol.All(
+                        vol.Coerce(int), vol.Range(min=0, max=360)
+                    ),
+                    vol.Required(CONF_MODULES_POWER): vol.All(
+                        vol.Coerce(int), vol.Range(min=1)
+                    ),
                 }
             ),
         )


-class ForecastSolarOptionFlowHandler(OptionsFlow):
+class ForecastSolarOptionFlowHandler(OptionsFlowWithReload):
     """Handle options."""

     async def async_step_init(
         self, user_input: dict[str, Any] | None = None
     ) -> ConfigFlowResult:
         """Manage the options."""
-        errors: dict[str, str] = {}
-        planes_count = len(
-            self.config_entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE)
-        )
-
+        errors = {}
         if user_input is not None:
-            api_key = user_input.get(CONF_API_KEY)
-            if planes_count > 1 and not api_key:
-                errors[CONF_API_KEY] = "api_key_required"
-            elif api_key and RE_API_KEY.match(api_key) is None:
+            if (api_key := user_input.get(CONF_API_KEY)) and RE_API_KEY.match(
+                api_key
+            ) is None:
                 errors[CONF_API_KEY] = "invalid_api_key"
             else:
                 return self.async_create_entry(
                     title="", data=user_input | {CONF_API_KEY: api_key or None}
                 )

-        suggested_api_key = self.config_entry.options.get(CONF_API_KEY, "")
-
         return self.async_show_form(
             step_id="init",
             data_schema=vol.Schema(
                 {
-                    vol.Required(
-                        CONF_API_KEY,
-                        default=suggested_api_key,
-                    )
-                    if planes_count > 1
-                    else vol.Optional(
-                        CONF_API_KEY,
-                        description={"suggested_value": suggested_api_key},
-                    ): str,
+                    vol.Optional(
+                        CONF_API_KEY,
+                        description={
+                            "suggested_value": self.config_entry.options.get(
+                                CONF_API_KEY, ""
+                            )
+                        },
+                    ): str,
+                    vol.Required(
+                        CONF_DECLINATION,
+                        default=self.config_entry.options[CONF_DECLINATION],
+                    ): vol.All(vol.Coerce(int), vol.Range(min=0, max=90)),
+                    vol.Required(
+                        CONF_AZIMUTH,
+                        default=self.config_entry.options.get(CONF_AZIMUTH),
+                    ): vol.All(vol.Coerce(int), vol.Range(min=-0, max=360)),
+                    vol.Required(
+                        CONF_MODULES_POWER,
+                        default=self.config_entry.options[CONF_MODULES_POWER],
+                    ): vol.All(vol.Coerce(int), vol.Range(min=1)),
                     vol.Optional(
                         CONF_DAMPING_MORNING,
                         default=self.config_entry.options.get(
-                            CONF_DAMPING_MORNING, DEFAULT_DAMPING
+                            CONF_DAMPING_MORNING, 0.0
                         ),
-                    ): vol.All(
-                        selector.NumberSelector(
-                            selector.NumberSelectorConfig(
-                                min=0,
-                                max=1,
-                                step=0.01,
-                                mode=selector.NumberSelectorMode.BOX,
-                            ),
-                        ),
-                        vol.Coerce(float),
-                    ),
+                    ): vol.Coerce(float),
                     vol.Optional(
                         CONF_DAMPING_EVENING,
                         default=self.config_entry.options.get(
-                            CONF_DAMPING_EVENING, DEFAULT_DAMPING
+                            CONF_DAMPING_EVENING, 0.0
                         ),
-                    ): vol.All(
-                        selector.NumberSelector(
-                            selector.NumberSelectorConfig(
-                                min=0,
-                                max=1,
-                                step=0.01,
-                                mode=selector.NumberSelectorMode.BOX,
-                            ),
-                        ),
-                        vol.Coerce(float),
-                    ),
+                    ): vol.Coerce(float),
                     vol.Optional(
                         CONF_INVERTER_SIZE,
                         description={

@@ -212,89 +149,8 @@ class ForecastSolarOptionFlowHandler(OptionsFlow):
                             CONF_INVERTER_SIZE
                         )
                     },
-                    ): vol.All(
-                        selector.NumberSelector(
-                            selector.NumberSelectorConfig(
-                                min=1,
-                                step=1,
-                                mode=selector.NumberSelectorMode.BOX,
-                            ),
-                        ),
-                        vol.Coerce(int),
-                    ),
+                    ): vol.Coerce(int),
                 }
             ),
             errors=errors,
         )


-class PlaneSubentryFlowHandler(ConfigSubentryFlow):
-    """Handle a subentry flow for adding/editing a plane."""
-
-    async def async_step_user(
-        self, user_input: dict[str, Any] | None = None
-    ) -> SubentryFlowResult:
-        """Handle the user step to add a new plane."""
-        entry = self._get_entry()
-        planes_count = len(entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE))
-        if planes_count >= MAX_PLANES:
-            return self.async_abort(reason="max_planes")
-        if planes_count >= 1 and not entry.options.get(CONF_API_KEY):
-            return self.async_abort(reason="api_key_required")
-
-        if user_input is not None:
-            return self.async_create_entry(
-                title=f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
-                data={
-                    CONF_DECLINATION: user_input[CONF_DECLINATION],
-                    CONF_AZIMUTH: user_input[CONF_AZIMUTH],
-                    CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
-                },
-            )
-
-        return self.async_show_form(
-            step_id="user",
-            data_schema=self.add_suggested_values_to_schema(
-                PLANE_SCHEMA,
-                {
-                    CONF_DECLINATION: DEFAULT_DECLINATION,
-                    CONF_AZIMUTH: DEFAULT_AZIMUTH,
-                    CONF_MODULES_POWER: DEFAULT_MODULES_POWER,
-                },
-            ),
-        )
-
-    async def async_step_reconfigure(
-        self, user_input: dict[str, Any] | None = None
-    ) -> SubentryFlowResult:
-        """Handle reconfiguration of an existing plane."""
-        subentry = self._get_reconfigure_subentry()
-
-        if user_input is not None:
-            entry = self._get_entry()
-            if self._async_update(
-                entry,
-                subentry,
-                data={
-                    CONF_DECLINATION: user_input[CONF_DECLINATION],
-                    CONF_AZIMUTH: user_input[CONF_AZIMUTH],
-                    CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
-                },
-                title=f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
-            ):
-                if not entry.update_listeners:
-                    self.hass.config_entries.async_schedule_reload(entry.entry_id)
-
-            return self.async_abort(reason="reconfigure_successful")
-
-        return self.async_show_form(
-            step_id="reconfigure",
-            data_schema=self.add_suggested_values_to_schema(
-                PLANE_SCHEMA,
-                {
-                    CONF_DECLINATION: subentry.data[CONF_DECLINATION],
-                    CONF_AZIMUTH: subentry.data[CONF_AZIMUTH],
-                    CONF_MODULES_POWER: subentry.data[CONF_MODULES_POWER],
-                },
-            ),
-        )
@@ -14,9 +14,3 @@ CONF_DAMPING = "damping"
 CONF_DAMPING_MORNING = "damping_morning"
 CONF_DAMPING_EVENING = "damping_evening"
 CONF_INVERTER_SIZE = "inverter_size"
-DEFAULT_DECLINATION = 25
-DEFAULT_AZIMUTH = 180
-DEFAULT_MODULES_POWER = 10000
-DEFAULT_DAMPING = 0.0
-MAX_PLANES = 4
-SUBENTRY_TYPE_PLANE = "plane"
||||
Some files were not shown because too many files have changed in this diff.