Compare commits


2 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Erik | `e702d077f4` | Adjust | 2026-04-01 18:32:41 +02:00 |
| Erik | `35314fb9ed` | Remove `device_registry.async_setup` | 2026-04-01 17:43:53 +02:00 |

157 changed files with 2346 additions and 9049 deletions


@@ -195,7 +195,6 @@ GITHUB_USER=$(gh api user --jq .login 2>/dev/null || git remote get-url "$PUSH_R
# Create PR (gh pr create pushes the branch automatically)
gh pr create --repo home-assistant/core --base dev \
--head "$GITHUB_USER:$BRANCH" \
--draft \
--title "TITLE_HERE" \
--body "$(cat <<'EOF'
BODY_HERE


@@ -1,7 +1,6 @@
---
name: github-pr-reviewer
description: Reviews GitHub pull requests and provides feedback comments.
disallowedTools: Write, Edit
description: Review a GitHub pull request and provide feedback comments. Use when the user says "review the current PR" or asks to review a specific PR.
---
# Review GitHub Pull Request


@@ -3,27 +3,54 @@ name: Home Assistant Integration knowledge
description: Everything you need to know to build, test and review Home Assistant Integrations. If you're looking at an integration, you must use this as your primary reference.
---
## File Locations
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## General guidelines
## Integration Templates
- When looking for examples, prefer integrations with the platinum or gold quality scale level first.
- Polling intervals are NOT user-configurable. Never add scan_interval, update_interval, or polling frequency options to config flows or config entries.
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in UI. Exception: helper integrations may allow custom names.
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
The following platforms have extra guidelines:
An integration can have platforms as needed (e.g., `sensor.py`, `switch.py`, etc.). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
## Integration Quality Scale
- When validating the quality scale rules, check them at https://developers.home-assistant.io/docs/core/integration-quality-scale/rules
- When implementing or reviewing an integration, always consider the quality scale rules, since they promote best practices.
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
Template scale file: `./script/scaffold/templates/integration/integration/quality_scale.yaml`
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
@@ -34,7 +61,726 @@ Template scale file: `./script/scaffold/templates/integration/integration/qualit
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
# Bronze (mandatory)
config-flow: done
entity-unique-id: done
action-setup:
status: exempt
comment: Integration does not register custom actions.
# Silver (if targeting Silver+)
entity-unavailable: done
parallel-updates: done
# Gold (if targeting Gold+)
devices: done
diagnostics: done
# Platinum (if targeting Platinum)
strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=1),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
client = MyClient(entry.data[CONF_HOST])
entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
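As a reference point, a minimal `manifest.json` combining the fields above might look like the following (hypothetical domain, codeowner, and requirement; not a real integration):

```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/my_integration",
  "integration_type": "device",
  "iot_class": "local_polling",
  "requirements": ["my-client-lib==1.2.3"]
}
```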
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
"domain": "my_integration",
"name": "My Integration",
"codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Set up integration from config entry."""
client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=5),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
self.client = client
    async def _async_update_data(self):
        try:
            return await self.client.fetch_data()
        except ApiError as err:
            raise UpdateFailed(f"API communication error: {err}") from err
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
await client.get_data()
except MyException:
errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
"zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
"""Handle zeroconf discovery."""
await self.async_set_unique_id(discovery_info.properties["serialno"])
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
ssdp.async_register_callback(
hass, _async_discovered_device,
{"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
)
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
bluetooth.async_register_callback(
hass, _async_discovered_device,
{"service_uuid": "example_uuid"},
bluetooth.BluetoothScanningMode.ACTIVE
)
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
entry.runtime_data.listener() # Clean up resources
return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def service_action(call: ServiceCall) -> ServiceResponse:
if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
raise ServiceValidationError("Entry not found")
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
raise ServiceValidationError("End date must be after start date")
# For service errors
try:
await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
"my_entity_service",
{vol.Required("parameter"): cv.string},
"handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
vol.Required("entity_id"): cv.entity_ids,
vol.Required("parameter"): cv.string,
vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
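As a sketch, a `services.yaml` entry for the hypothetical `my_entity_service` above might look like this (field names assumed; descriptions live in `strings.json` under current conventions):

```yaml
my_entity_service:
  target:
    entity:
      domain: sensor
  fields:
    parameter:
      required: true
      example: "some value"
      selector:
        text:
    timeout:
      selector:
        number:
          min: 1
          max: 300
          unit_of_measurement: seconds
```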
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
def __init__(self, device_id: str) -> None:
self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
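The real MAC normalization helper is `homeassistant.helpers.device_registry.format_mac`; as a rough standalone illustration of the normalization it performs (a sketch that only mimics the common case, not the actual helper):

```python
import re


def normalize_mac(mac: str) -> str:
    """Illustrative MAC normalization: lowercase, colon-separated hex pairs.

    This is a hypothetical re-implementation for illustration only; use
    homeassistant.helpers.device_registry.format_mac in real code.
    """
    hex_digits = re.sub(r"[^0-9a-fA-F]", "", mac).lower()
    if len(hex_digits) != 12:
        # Not a recognizable MAC; return the input unchanged.
        return mac
    return ":".join(hex_digits[i : i + 2] for i in range(0, 12, 2))


# A unique ID built from the normalized MAC is stable regardless of
# how the source formatted the address (dashes, dots, case):
unique_id = f"{normalize_mac('AA-BB-CC-DD-EE-FF')}_temperature"
```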
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None, # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: ( # ✅ Parenthesis on same line as lambda
round(data["temp_value"] * 1.8 + 32, 1)
if data.get("temp_value") is not None
else None
),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
def __init__(self, device: Device, field: str) -> None:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
)
self._attr_name = field # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
"""Subscribe to events."""
self.async_on_remove(
self.client.events.subscribe("my_event", self._handle_event)
)
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement `available()` property instead of using "unavailable" state
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
"""Update entity."""
try:
data = await self.client.get_data()
except MyException:
self._attr_available = False
else:
self._attr_available = True
self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
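A plain-Python sketch of the "all keys always present" rule (hypothetical attribute names; on a real entity, `extra_state_attributes` would return this dict):

```python
from typing import Any

# Stable set of attribute keys; every key appears on every update.
ATTRIBUTE_KEYS = ("last_error", "signal_strength", "firmware_version")


def build_attributes(raw: dict[str, Any]) -> dict[str, Any]:
    """Return a dict with a fixed key set; unknown values become None."""
    return {key: raw.get(key) for key in ATTRIBUTE_KEYS}


# Keys stay identical whether or not the device reported a value:
attrs = build_attributes({"signal_strength": -61})
```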
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, device.mac)},
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer="My Company",
model="My Sensor",
sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
current_devices = set(coordinator.data)
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])
entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
_attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
_attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
_attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
"entity": {
"sensor": {
"phase_voltage": {
"name": "Phase voltage"
}
}
}
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
"entity": {
"sensor": {
"tree_pollen": {
"default": "mdi:tree",
"state": {
"high": "mdi:tree-outline"
}
}
}
}
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- Tests should avoid interacting or mocking internal integration details. For more info, see https://developers.home-assistant.io/docs/development_testing/#writing-tests-for-integrations
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Patch Boundaries**: Only patch library or client methods when testing config flows. Do not patch methods defined in `config_flow.py`; exercise the flow logic end-to-end.
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
- Reauthentication/reconfigure flows
### Testing
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
"""Test successful user flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == FlowResultType.FORM
assert result["step_id"] == "user"
# Test form submission
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.CREATE_ENTRY
assert result["title"] == "My Device"
assert result["data"] == TEST_USER_INPUT
async def test_flow_connection_error(hass, mock_api_error):
"""Test connection error handling."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
"""Overridden fixture to specify platforms to test."""
return [Platform.SENSOR] # Or another specific platform as needed.
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
entity_registry: er.EntityRegistry,
device_registry: dr.DeviceRegistry,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the sensor entities."""
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
# Ensure entities are correctly assigned to device
device_entry = device_registry.async_get_device(
identifiers={(DOMAIN, "device_unique_id")}
)
assert device_entry
entity_entries = er.async_entries_for_config_entry(
entity_registry, mock_config_entry.entry_id
)
for entity_entry in entity_entries:
assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return the default mocked config entry."""
return MockConfigEntry(
title="My Integration",
domain=DOMAIN,
data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
unique_id="device_unique_id",
)
@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
"""Return a mocked device API."""
with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
api = api_mock.return_value
api.get_data.return_value = MyDeviceData.from_json(
load_fixture("device_data.json", DOMAIN)
)
yield api
@pytest.fixture
def platforms() -> list[Platform]:
"""Fixture to specify platforms to test."""
return PLATFORMS
@pytest.fixture
async def init_integration(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_device_api: MagicMock,
platforms: list[Platform],
) -> MockConfigEntry:
"""Set up the integration for testing."""
mock_config_entry.add_to_hass(hass)
with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```


@@ -3,4 +3,17 @@
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": entry.runtime_data.data,
}
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates


@@ -8,6 +8,29 @@ Platform exists as `homeassistant/components/<domain>/repairs.py`.
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
hass,
DOMAIN,
"outdated_version",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
"issues": {
"outdated_version": {
"title": "Device firmware is outdated",
"description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
}
}
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
@@ -18,4 +41,15 @@ Platform exists as `homeassistant/components/<domain>/repairs.py`.
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
hass, DOMAIN, "issue_id",
breaks_in_ha_version="2024.1.0",
is_fixable=True,
is_persistent=True,
severity=ir.IssueSeverity.ERROR,
translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve


@@ -174,7 +174,6 @@ homeassistant.components.dnsip.*
homeassistant.components.doorbird.*
homeassistant.components.dormakaba_dkey.*
homeassistant.components.downloader.*
homeassistant.components.dropbox.*
homeassistant.components.droplet.*
homeassistant.components.dsmr.*
homeassistant.components.duckdns.*

CODEOWNERS (generated)

@@ -401,8 +401,6 @@ build.json @home-assistant/supervisor
/tests/components/dremel_3d_printer/ @tkdrob
/homeassistant/components/drop_connect/ @ChandlerSystems @pfrazer
/tests/components/drop_connect/ @ChandlerSystems @pfrazer
/homeassistant/components/dropbox/ @bdr99
/tests/components/dropbox/ @bdr99
/homeassistant/components/droplet/ @sarahseidman
/tests/components/droplet/ @sarahseidman
/homeassistant/components/dsmr/ @Robbie1221


@@ -470,7 +470,7 @@ async def async_load_base_functionality(hass: core.HomeAssistant) -> bool:
translation.async_setup(hass)
recovery = hass.config.recovery_mode
device_registry.async_setup(hass)
device_registry.async_get(hass)
try:
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),


@@ -33,21 +33,14 @@ from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .coordinator import (
AirOSConfigEntry,
AirOSDataUpdateCoordinator,
AirOSFirmwareUpdateCoordinator,
AirOSRuntimeData,
)
from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator
_PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.BUTTON,
Platform.SENSOR,
Platform.UPDATE,
]
_LOGGER = logging.getLogger(__name__)
@@ -93,20 +86,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
airos_device = airos_class(**conn_data)
data_coordinator = AirOSDataUpdateCoordinator(
hass, entry, device_data, airos_device
)
await data_coordinator.async_config_entry_first_refresh()
coordinator = AirOSDataUpdateCoordinator(hass, entry, device_data, airos_device)
await coordinator.async_config_entry_first_refresh()
firmware_coordinator: AirOSFirmwareUpdateCoordinator | None = None
if device_data["fw_major"] >= 8:
firmware_coordinator = AirOSFirmwareUpdateCoordinator(hass, entry, airos_device)
await firmware_coordinator.async_config_entry_first_refresh()
entry.runtime_data = AirOSRuntimeData(
status=data_coordinator,
firmware=firmware_coordinator,
)
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, _PLATFORMS)


@@ -87,7 +87,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the AirOS binary sensors from a config entry."""
coordinator = config_entry.runtime_data.status
coordinator = config_entry.runtime_data
entities = [
AirOSBinarySensor(coordinator, description)

@@ -31,9 +31,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the AirOS button from a config entry."""
async_add_entities(
[AirOSRebootButton(config_entry.runtime_data.status, REBOOT_BUTTON)]
)
async_add_entities([AirOSRebootButton(config_entry.runtime_data, REBOOT_BUTTON)])
class AirOSRebootButton(AirOSEntity, ButtonEntity):

@@ -5,7 +5,6 @@ from datetime import timedelta
DOMAIN = "airos"
SCAN_INTERVAL = timedelta(minutes=1)
UPDATE_SCAN_INTERVAL = timedelta(days=1)
MANUFACTURER = "Ubiquiti"

@@ -2,10 +2,7 @@
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
import logging
from typing import Any, TypeVar
from airos.airos6 import AirOS6, AirOS6Data
from airos.airos8 import AirOS8, AirOS8Data
@@ -22,61 +19,20 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, SCAN_INTERVAL, UPDATE_SCAN_INTERVAL
from .const import DOMAIN, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
type AirOSDeviceDetect = AirOS8 | AirOS6
type AirOSDataDetect = AirOS8Data | AirOS6Data
type AirOSUpdateData = dict[str, Any]
AirOSDeviceDetect = AirOS8 | AirOS6
AirOSDataDetect = AirOS8Data | AirOS6Data
type AirOSConfigEntry = ConfigEntry[AirOSRuntimeData]
T = TypeVar("T", bound=AirOSDataDetect | AirOSUpdateData)
@dataclass
class AirOSRuntimeData:
"""Data for AirOS config entry."""
status: AirOSDataUpdateCoordinator
firmware: AirOSFirmwareUpdateCoordinator | None
async def async_fetch_airos_data(
airos_device: AirOSDeviceDetect,
update_method: Callable[[], Awaitable[T]],
) -> T:
"""Fetch data from AirOS device."""
try:
await airos_device.login()
return await update_method()
except AirOSConnectionAuthenticationError as err:
_LOGGER.exception("Error authenticating with airOS device")
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="invalid_auth"
) from err
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
TimeoutError,
) as err:
_LOGGER.error("Error connecting to airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
except AirOSDataMissingError as err:
_LOGGER.error("Expected data not returned by airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="error_data_missing",
) from err
type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
"""Class to manage fetching AirOS status data from single endpoint."""
"""Class to manage fetching AirOS data from single endpoint."""
airos_device: AirOSDeviceDetect
config_entry: AirOSConfigEntry
def __init__(
@@ -98,33 +54,28 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
)
async def _async_update_data(self) -> AirOSDataDetect:
"""Fetch status data from AirOS."""
return await async_fetch_airos_data(self.airos_device, self.airos_device.status)
class AirOSFirmwareUpdateCoordinator(DataUpdateCoordinator[AirOSUpdateData]):
"""Class to manage fetching AirOS firmware."""
config_entry: AirOSConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
airos_device: AirOSDeviceDetect,
) -> None:
"""Initialize the coordinator."""
self.airos_device = airos_device
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=DOMAIN,
update_interval=UPDATE_SCAN_INTERVAL,
)
async def _async_update_data(self) -> AirOSUpdateData:
"""Fetch firmware data from AirOS."""
return await async_fetch_airos_data(
self.airos_device, self.airos_device.update_check
)
"""Fetch data from AirOS."""
try:
await self.airos_device.login()
return await self.airos_device.status()
except AirOSConnectionAuthenticationError as err:
_LOGGER.exception("Error authenticating with airOS device")
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN, translation_key="invalid_auth"
) from err
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
TimeoutError,
) as err:
_LOGGER.error("Error connecting to airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
except AirOSDataMissingError as err:
_LOGGER.error("Expected data not returned by airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="error_data_missing",
) from err

@@ -29,15 +29,5 @@ async def async_get_config_entry_diagnostics(
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT_HA),
"data": {
"status_data": async_redact_data(
entry.runtime_data.status.data.to_dict(), TO_REDACT_AIROS
),
"firmware_data": async_redact_data(
entry.runtime_data.firmware.data
if entry.runtime_data.firmware is not None
else {},
TO_REDACT_AIROS,
),
},
"data": async_redact_data(entry.runtime_data.data.to_dict(), TO_REDACT_AIROS),
}

@@ -180,7 +180,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the AirOS sensors from a config entry."""
coordinator = config_entry.runtime_data.status
coordinator = config_entry.runtime_data
entities = [AirOSSensor(coordinator, description) for description in COMMON_SENSORS]

@@ -206,12 +206,6 @@
},
"reboot_failed": {
"message": "The device did not accept the reboot request. Try again, or check your device web interface for errors."
},
"update_connection_authentication_error": {
"message": "Authentication or connection failed during firmware update"
},
"update_error": {
"message": "Connection failed during firmware update"
}
}
}

@@ -1,101 +0,0 @@
"""AirOS update component for Home Assistant."""
from __future__ import annotations
import logging
from typing import Any
from airos.exceptions import AirOSConnectionAuthenticationError, AirOSException
from homeassistant.components.update import (
UpdateDeviceClass,
UpdateEntity,
UpdateEntityFeature,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import (
AirOSConfigEntry,
AirOSDataUpdateCoordinator,
AirOSFirmwareUpdateCoordinator,
)
from .entity import AirOSEntity
PARALLEL_UPDATES = 0
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the AirOS update entity from a config entry."""
runtime_data = config_entry.runtime_data
if runtime_data.firmware is None: # Unsupported device
return
async_add_entities([AirOSUpdateEntity(runtime_data.status, runtime_data.firmware)])
class AirOSUpdateEntity(AirOSEntity, UpdateEntity):
"""Update entity for AirOS firmware updates."""
_attr_device_class = UpdateDeviceClass.FIRMWARE
_attr_supported_features = UpdateEntityFeature.INSTALL
def __init__(
self,
status: AirOSDataUpdateCoordinator,
firmware: AirOSFirmwareUpdateCoordinator,
) -> None:
"""Initialize the AirOS update entity."""
super().__init__(status)
self.status = status
self.firmware = firmware
self._attr_unique_id = f"{status.data.derived.mac}_firmware_update"
@property
def installed_version(self) -> str | None:
"""Return the installed firmware version."""
return self.status.data.host.fwversion
@property
def latest_version(self) -> str | None:
"""Return the latest firmware version."""
if not self.firmware.data.get("update", False):
return self.status.data.host.fwversion
return self.firmware.data.get("version")
@property
def release_url(self) -> str | None:
"""Return the release url of the latest firmware."""
return self.firmware.data.get("changelog")
async def async_install(
self,
version: str | None,
backup: bool,
**kwargs: Any,
) -> None:
"""Handle the firmware update installation."""
_LOGGER.debug("Starting firmware update")
try:
await self.status.airos_device.login()
await self.status.airos_device.download()
await self.status.airos_device.install()
except AirOSConnectionAuthenticationError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_connection_authentication_error",
) from err
except AirOSException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_error",
) from err

@@ -2,15 +2,19 @@
from __future__ import annotations
from homeassistant.config_entries import ConfigSubentry
import anthropic
from homeassistant.config_entries import ConfigEntry, ConfigSubentry
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
issue_registry as ir,
)
from homeassistant.helpers.httpx_client import get_async_client
from homeassistant.helpers.typing import ConfigType
from .const import (
@@ -20,11 +24,12 @@ from .const import (
DOMAIN,
LOGGER,
)
from .coordinator import AnthropicConfigEntry, AnthropicCoordinator
PLATFORMS = (Platform.AI_TASK, Platform.CONVERSATION)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
type AnthropicConfigEntry = ConfigEntry[anthropic.AsyncClient]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up Anthropic."""
@@ -34,9 +39,29 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: AnthropicConfigEntry) -> bool:
"""Set up Anthropic from a config entry."""
coordinator = AnthropicCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
client = anthropic.AsyncAnthropic(
api_key=entry.data[CONF_API_KEY], http_client=get_async_client(hass)
)
try:
await client.models.list(timeout=10.0)
except anthropic.AuthenticationError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="api_authentication_error",
translation_placeholders={"message": err.message},
) from err
except anthropic.AnthropicError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={
"message": err.message
if isinstance(err, anthropic.APIError)
else str(err)
},
) from err
entry.runtime_data = client
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

@@ -1,78 +0,0 @@
"""Coordinator for the Anthropic integration."""
from __future__ import annotations
from datetime import timedelta
import anthropic
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.httpx_client import get_async_client
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, LOGGER
UPDATE_INTERVAL_CONNECTED = timedelta(hours=12)
UPDATE_INTERVAL_DISCONNECTED = timedelta(minutes=1)
type AnthropicConfigEntry = ConfigEntry[AnthropicCoordinator]
class AnthropicCoordinator(DataUpdateCoordinator[None]):
"""DataUpdateCoordinator which uses different intervals after successful and unsuccessful updates."""
client: anthropic.AsyncAnthropic
def __init__(self, hass: HomeAssistant, config_entry: AnthropicConfigEntry) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
LOGGER,
config_entry=config_entry,
name=config_entry.title,
update_interval=UPDATE_INTERVAL_CONNECTED,
update_method=self.async_update_data,
always_update=False,
)
self.client = anthropic.AsyncAnthropic(
api_key=config_entry.data[CONF_API_KEY], http_client=get_async_client(hass)
)
@callback
def async_set_updated_data(self, data: None) -> None:
"""Manually update data, notify listeners and update refresh interval."""
self.update_interval = UPDATE_INTERVAL_CONNECTED
super().async_set_updated_data(data)
async def async_update_data(self) -> None:
"""Fetch data from the API."""
try:
self.update_interval = UPDATE_INTERVAL_DISCONNECTED
await self.client.models.list(timeout=10.0)
self.update_interval = UPDATE_INTERVAL_CONNECTED
except anthropic.APITimeoutError as err:
raise TimeoutError(err.message or str(err)) from err
except anthropic.AuthenticationError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="api_authentication_error",
translation_placeholders={"message": err.message},
) from err
except anthropic.APIError as err:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"message": err.message},
) from err
def mark_connection_error(self) -> None:
"""Mark the connection as having an error and reschedule background check."""
self.update_interval = UPDATE_INTERVAL_DISCONNECTED
if self.last_update_success:
self.last_update_success = False
self.async_update_listeners()
if self._listeners and not self.hass.is_stopping:
self._schedule_refresh()

@@ -82,11 +82,12 @@ from homeassistant.config_entries import ConfigSubentry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.json import json_dumps
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util import slugify
from homeassistant.util.json import JsonObjectType
from . import AnthropicConfigEntry
from .const import (
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
@@ -110,7 +111,6 @@ from .const import (
PROGRAMMATIC_TOOL_CALLING_UNSUPPORTED_MODELS,
UNSUPPORTED_STRUCTURED_OUTPUT_MODELS,
)
from .coordinator import AnthropicConfigEntry, AnthropicCoordinator
# Max number of back and forth with the LLM to generate a response
MAX_TOOL_ITERATIONS = 10
@@ -658,7 +658,7 @@ def _create_token_stats(
}
class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
class AnthropicBaseLLMEntity(Entity):
"""Anthropic base LLM entity."""
_attr_has_entity_name = True
@@ -666,7 +666,6 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
def __init__(self, entry: AnthropicConfigEntry, subentry: ConfigSubentry) -> None:
"""Initialize the entity."""
super().__init__(entry.runtime_data)
self.entry = entry
self.subentry = subentry
self._attr_unique_id = subentry.subentry_id
@@ -878,8 +877,7 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
if tools:
model_args["tools"] = tools
coordinator = self.entry.runtime_data
client = coordinator.client
client = self.entry.runtime_data
# To prevent infinite loops, we limit the number of iterations
for _iteration in range(max_iterations):
@@ -901,24 +899,13 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
)
messages.extend(new_messages)
except anthropic.AuthenticationError as err:
# Trigger coordinator to confirm the auth failure and trigger the reauth flow.
await coordinator.async_request_refresh()
self.entry.async_start_reauth(self.hass)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_authentication_error",
translation_placeholders={"message": err.message},
) from err
except anthropic.APIConnectionError as err:
LOGGER.info("Connection error while talking to Anthropic: %s", err)
coordinator.mark_connection_error()
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"message": err.message},
) from err
except anthropic.AnthropicError as err:
# Non-connection error, mark connection as healthy
coordinator.async_set_updated_data(None)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
@@ -930,7 +917,6 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
) from err
if not chat_log.unresponded_tool_results:
coordinator.async_set_updated_data(None)
break

@@ -35,9 +35,9 @@ rules:
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
entity-unavailable: todo
integration-owner: done
log-when-unavailable: done
log-when-unavailable: todo
parallel-updates:
status: exempt
comment: |

@@ -58,7 +58,7 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
if entry.entry_id in self._model_list_cache:
model_list = self._model_list_cache[entry.entry_id]
else:
client = entry.runtime_data.client
client = entry.runtime_data
model_list = [
model_option
for model_option in await get_model_list(client)

@@ -54,7 +54,7 @@
"message": "Storage account {account_name} not found"
},
"cannot_connect": {
"message": "Cannot connect to storage account {account_name}"
"message": "Can not connect to storage account {account_name}"
},
"container_not_found": {
"message": "Storage container {container_name} not found"

@@ -16,7 +16,6 @@ PLATFORMS: list[Platform] = [
Platform.BUTTON,
Platform.LIGHT,
Platform.SELECT,
Platform.SENSOR,
]

@@ -4,11 +4,7 @@ from __future__ import annotations
from pycasperglow import GlowState
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.const import EntityCategory
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -25,12 +21,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the binary sensor platform for Casper Glow."""
async_add_entities(
[
CasperGlowPausedBinarySensor(entry.runtime_data),
CasperGlowChargingBinarySensor(entry.runtime_data),
]
)
async_add_entities([CasperGlowPausedBinarySensor(entry.runtime_data)])
class CasperGlowPausedBinarySensor(CasperGlowEntity, BinarySensorEntity):
@@ -55,34 +46,6 @@ class CasperGlowPausedBinarySensor(CasperGlowEntity, BinarySensorEntity):
@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.is_paused is not None and state.is_paused != self._attr_is_on:
if state.is_paused is not None:
self._attr_is_on = state.is_paused
self.async_write_ha_state()
class CasperGlowChargingBinarySensor(CasperGlowEntity, BinarySensorEntity):
"""Binary sensor indicating whether the Casper Glow is charging."""
_attr_device_class = BinarySensorDeviceClass.BATTERY_CHARGING
_attr_entity_category = EntityCategory.DIAGNOSTIC
def __init__(self, coordinator: CasperGlowCoordinator) -> None:
"""Initialize the charging binary sensor."""
super().__init__(coordinator)
self._attr_unique_id = f"{format_mac(coordinator.device.address)}_charging"
if coordinator.device.state.is_charging is not None:
self._attr_is_on = coordinator.device.state.is_charging
async def async_added_to_hass(self) -> None:
"""Register state update callback when entity is added."""
await super().async_added_to_hass()
self.async_on_remove(
self._device.register_callback(self._async_handle_state_update)
)
@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.is_charging is not None and state.is_charging != self._attr_is_on:
self._attr_is_on = state.is_charging
self.async_write_ha_state()
self.async_write_ha_state()

@@ -53,15 +53,15 @@ rules:
docs-use-cases: todo
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-device-class:
status: exempt
comment: No applicable device classes for binary_sensor, button, light, or select entities.
entity-disabled-by-default: todo
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: Integration does not register repair issues.
repair-issues: todo
stale-devices: todo
# Platinum

@@ -1,61 +0,0 @@
"""Casper Glow integration sensor platform."""
from __future__ import annotations
from pycasperglow import GlowState
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorStateClass,
)
from homeassistant.const import PERCENTAGE, EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import CasperGlowConfigEntry, CasperGlowCoordinator
from .entity import CasperGlowEntity
PARALLEL_UPDATES = 0
async def async_setup_entry(
hass: HomeAssistant,
entry: CasperGlowConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the sensor platform for Casper Glow."""
async_add_entities([CasperGlowBatterySensor(entry.runtime_data)])
class CasperGlowBatterySensor(CasperGlowEntity, SensorEntity):
"""Sensor entity for Casper Glow battery level."""
_attr_device_class = SensorDeviceClass.BATTERY
_attr_native_unit_of_measurement = PERCENTAGE
_attr_state_class = SensorStateClass.MEASUREMENT
_attr_entity_category = EntityCategory.DIAGNOSTIC
def __init__(self, coordinator: CasperGlowCoordinator) -> None:
"""Initialize the battery sensor."""
super().__init__(coordinator)
self._attr_unique_id = f"{format_mac(coordinator.device.address)}_battery"
if coordinator.device.state.battery_level is not None:
self._attr_native_value = coordinator.device.state.battery_level.percentage
async def async_added_to_hass(self) -> None:
"""Register state update callback when entity is added."""
await super().async_added_to_hass()
self.async_on_remove(
self._device.register_callback(self._async_handle_state_update)
)
@callback
def _async_handle_state_update(self, state: GlowState) -> None:
"""Handle a state update from the device."""
if state.battery_level is not None:
new_value = state.battery_level.percentage
if new_value != self._attr_native_value:
self._attr_native_value = new_value
self.async_write_ha_state()

@@ -1,64 +0,0 @@
"""The Dropbox integration."""
from __future__ import annotations
from python_dropbox_api import (
DropboxAPIClient,
DropboxAuthException,
DropboxUnknownException,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import aiohttp_client
from homeassistant.helpers.config_entry_oauth2_flow import (
ImplementationUnavailableError,
OAuth2Session,
async_get_config_entry_implementation,
)
from .auth import DropboxConfigEntryAuth
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
type DropboxConfigEntry = ConfigEntry[DropboxAPIClient]
async def async_setup_entry(hass: HomeAssistant, entry: DropboxConfigEntry) -> bool:
"""Set up Dropbox from a config entry."""
try:
oauth2_implementation = await async_get_config_entry_implementation(hass, entry)
except ImplementationUnavailableError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="oauth2_implementation_unavailable",
) from err
oauth2_session = OAuth2Session(hass, entry, oauth2_implementation)
auth = DropboxConfigEntryAuth(
aiohttp_client.async_get_clientsession(hass), oauth2_session
)
client = DropboxAPIClient(auth)
try:
await client.get_account_info()
except DropboxAuthException as err:
raise ConfigEntryAuthFailed from err
except (DropboxUnknownException, TimeoutError) as err:
raise ConfigEntryNotReady from err
entry.runtime_data = client
def async_notify_backup_listeners() -> None:
for listener in hass.data.get(DATA_BACKUP_AGENT_LISTENERS, []):
listener()
entry.async_on_unload(entry.async_on_state_change(async_notify_backup_listeners))
return True
async def async_unload_entry(hass: HomeAssistant, entry: DropboxConfigEntry) -> bool:
"""Unload a config entry."""
return True

@@ -1,38 +0,0 @@
"""Application credentials platform for the Dropbox integration."""
from homeassistant.components.application_credentials import ClientCredential
from homeassistant.core import HomeAssistant
from homeassistant.helpers.config_entry_oauth2_flow import (
AbstractOAuth2Implementation,
LocalOAuth2ImplementationWithPkce,
)
from .const import OAUTH2_AUTHORIZE, OAUTH2_SCOPES, OAUTH2_TOKEN
async def async_get_auth_implementation(
hass: HomeAssistant, auth_domain: str, credential: ClientCredential
) -> AbstractOAuth2Implementation:
"""Return custom auth implementation."""
return DropboxOAuth2Implementation(
hass,
auth_domain,
credential.client_id,
OAUTH2_AUTHORIZE,
OAUTH2_TOKEN,
credential.client_secret,
)
class DropboxOAuth2Implementation(LocalOAuth2ImplementationWithPkce):
"""Custom Dropbox OAuth2 implementation to add the necessary authorize url parameters."""
@property
def extra_authorize_data(self) -> dict:
"""Extra data that needs to be appended to the authorize url."""
data: dict = {
"token_access_type": "offline",
"scope": " ".join(OAUTH2_SCOPES),
}
data.update(super().extra_authorize_data)
return data

@@ -1,44 +0,0 @@
"""Authentication for Dropbox."""
from typing import cast
from aiohttp import ClientSession
from python_dropbox_api import Auth
from homeassistant.helpers.config_entry_oauth2_flow import OAuth2Session
class DropboxConfigEntryAuth(Auth):
"""Provide Dropbox authentication tied to an OAuth2 based config entry."""
def __init__(
self,
websession: ClientSession,
oauth_session: OAuth2Session,
) -> None:
"""Initialize DropboxConfigEntryAuth."""
super().__init__(websession)
self._oauth_session = oauth_session
async def async_get_access_token(self) -> str:
"""Return a valid access token."""
await self._oauth_session.async_ensure_token_valid()
return cast(str, self._oauth_session.token["access_token"])
class DropboxConfigFlowAuth(Auth):
"""Provide authentication tied to a fixed token for the config flow."""
def __init__(
self,
websession: ClientSession,
token: str,
) -> None:
"""Initialize DropboxConfigFlowAuth."""
super().__init__(websession)
self._token = token
async def async_get_access_token(self) -> str:
"""Return the fixed access token."""
return self._token

@@ -1,230 +0,0 @@
"""Backup platform for the Dropbox integration."""
from collections.abc import AsyncIterator, Callable, Coroutine
from functools import wraps
import json
import logging
from typing import Any, Concatenate
from python_dropbox_api import (
DropboxAPIClient,
DropboxAuthException,
DropboxFileOrFolderNotFoundException,
DropboxUnknownException,
)
from homeassistant.components.backup import (
AgentBackup,
BackupAgent,
BackupAgentError,
BackupNotFound,
suggested_filename,
)
from homeassistant.core import HomeAssistant, callback
from . import DropboxConfigEntry
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
_LOGGER = logging.getLogger(__name__)
def _suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata."""
base_name = suggested_filename(backup).rsplit(".", 1)[0]
return f"{base_name}.tar", f"{base_name}.metadata.json"
async def _async_string_iterator(content: str) -> AsyncIterator[bytes]:
"""Yield a string as a single bytes chunk."""
yield content.encode()
def handle_backup_errors[_R, **P](
func: Callable[Concatenate[DropboxBackupAgent, P], Coroutine[Any, Any, _R]],
) -> Callable[Concatenate[DropboxBackupAgent, P], Coroutine[Any, Any, _R]]:
"""Handle backup errors."""
@wraps(func)
async def wrapper(
self: DropboxBackupAgent, *args: P.args, **kwargs: P.kwargs
) -> _R:
try:
return await func(self, *args, **kwargs)
except DropboxFileOrFolderNotFoundException as err:
raise BackupNotFound(
f"Failed to {func.__name__.removeprefix('async_').replace('_', ' ')}"
) from err
except DropboxAuthException as err:
self._entry.async_start_reauth(self._hass)
raise BackupAgentError("Authentication error") from err
except DropboxUnknownException as err:
_LOGGER.error(
"Error during %s: %s",
func.__name__,
err,
)
_LOGGER.debug("Full error: %s", err, exc_info=True)
raise BackupAgentError(
f"Failed to {func.__name__.removeprefix('async_').replace('_', ' ')}"
) from err
return wrapper
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return a list of backup agents."""
entries = hass.config_entries.async_loaded_entries(DOMAIN)
return [DropboxBackupAgent(hass, entry) for entry in entries]
@callback
def async_register_backup_agents_listener(
hass: HomeAssistant,
*,
listener: Callable[[], None],
**kwargs: Any,
) -> Callable[[], None]:
"""Register a listener to be called when agents are added or removed.
:return: A function to unregister the listener.
"""
hass.data.setdefault(DATA_BACKUP_AGENT_LISTENERS, []).append(listener)
@callback
def remove_listener() -> None:
"""Remove the listener."""
hass.data[DATA_BACKUP_AGENT_LISTENERS].remove(listener)
if not hass.data[DATA_BACKUP_AGENT_LISTENERS]:
del hass.data[DATA_BACKUP_AGENT_LISTENERS]
return remove_listener
class DropboxBackupAgent(BackupAgent):
"""Backup agent for the Dropbox integration."""
domain = DOMAIN
def __init__(self, hass: HomeAssistant, entry: DropboxConfigEntry) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
self._entry = entry
self.name = entry.title
assert entry.unique_id
self.unique_id = entry.unique_id
self._api: DropboxAPIClient = entry.runtime_data
async def _async_get_backups(self) -> list[tuple[AgentBackup, str]]:
"""Get backups and their corresponding file names."""
files = await self._api.list_folder("")
tar_files = {f.name for f in files if f.name.endswith(".tar")}
metadata_files = [f for f in files if f.name.endswith(".metadata.json")]
backups: list[tuple[AgentBackup, str]] = []
for metadata_file in metadata_files:
tar_name = metadata_file.name.removesuffix(".metadata.json") + ".tar"
if tar_name not in tar_files:
_LOGGER.warning(
"Found metadata file '%s' without matching backup file",
metadata_file.name,
)
continue
metadata_stream = self._api.download_file(f"/{metadata_file.name}")
raw = b"".join([chunk async for chunk in metadata_stream])
try:
data = json.loads(raw)
backup = AgentBackup.from_dict(data)
except (json.JSONDecodeError, ValueError, TypeError, KeyError) as err:
_LOGGER.warning(
"Skipping invalid metadata file '%s': %s",
metadata_file.name,
err,
)
continue
backups.append((backup, tar_name))
return backups
@handle_backup_errors
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
backup_filename, metadata_filename = _suggested_filenames(backup)
backup_path = f"/{backup_filename}"
metadata_path = f"/{metadata_filename}"
file_stream = await open_stream()
await self._api.upload_file(backup_path, file_stream)
metadata_stream = _async_string_iterator(json.dumps(backup.as_dict()))
try:
await self._api.upload_file(metadata_path, metadata_stream)
except (
DropboxAuthException,
DropboxUnknownException,
):
await self._api.delete_file(backup_path)
raise
@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return [backup for backup, _ in await self._async_get_backups()]
@handle_backup_errors
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
backups = await self._async_get_backups()
for backup, filename in backups:
if backup.backup_id == backup_id:
return self._api.download_file(f"/{filename}")
raise BackupNotFound(f"Backup {backup_id} not found")
@handle_backup_errors
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup:
"""Return a backup."""
backups = await self._async_get_backups()
for backup, _ in backups:
if backup.backup_id == backup_id:
return backup
raise BackupNotFound(f"Backup {backup_id} not found")
@handle_backup_errors
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file."""
backups = await self._async_get_backups()
for backup, tar_filename in backups:
if backup.backup_id == backup_id:
metadata_filename = tar_filename.removesuffix(".tar") + ".metadata.json"
await self._api.delete_file(f"/{tar_filename}")
await self._api.delete_file(f"/{metadata_filename}")
return
raise BackupNotFound(f"Backup {backup_id} not found")
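
The agent above pairs each `<name>.tar` backup with a `<name>.metadata.json` sidecar file by suffix matching. A minimal standalone sketch of that pairing convention (the helper name is illustrative, not part of the integration):

```python
def pair_backups(filenames: list[str]) -> list[tuple[str, str]]:
    """Return (metadata, tar) pairs; metadata files without a tarball are skipped."""
    names = set(filenames)
    pairs: list[tuple[str, str]] = []
    for name in filenames:
        if not name.endswith(".metadata.json"):
            continue
        # Derive the expected tarball name from the metadata filename
        tar_name = name.removesuffix(".metadata.json") + ".tar"
        if tar_name in names:
            pairs.append((name, tar_name))
    return pairs
```

This mirrors why `async_list_backups` can discard the tar name with `_`: the metadata file alone carries the `AgentBackup` payload, while the tar name is only needed for download and delete.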


@@ -1,60 +0,0 @@
"""Config flow for Dropbox."""
from collections.abc import Mapping
import logging
from typing import Any
from python_dropbox_api import DropboxAPIClient
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_TOKEN
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.config_entry_oauth2_flow import AbstractOAuth2FlowHandler
from .auth import DropboxConfigFlowAuth
from .const import DOMAIN
class DropboxConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
"""Config flow to handle Dropbox OAuth2 authentication."""
DOMAIN = DOMAIN
@property
def logger(self) -> logging.Logger:
"""Return logger."""
return logging.getLogger(__name__)
async def async_oauth_create_entry(self, data: dict[str, Any]) -> ConfigFlowResult:
"""Create an entry for the flow, or update existing entry."""
access_token = data[CONF_TOKEN][CONF_ACCESS_TOKEN]
auth = DropboxConfigFlowAuth(async_get_clientsession(self.hass), access_token)
client = DropboxAPIClient(auth)
account_info = await client.get_account_info()
await self.async_set_unique_id(account_info.account_id)
if self.source == SOURCE_REAUTH:
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data=data
)
self._abort_if_unique_id_configured()
return self.async_create_entry(title=account_info.email, data=data)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauth upon an API authentication error."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
if user_input is None:
return self.async_show_form(step_id="reauth_confirm")
return await self.async_step_user()


@@ -1,19 +0,0 @@
"""Constants for the Dropbox integration."""
from collections.abc import Callable
from homeassistant.util.hass_dict import HassKey
DOMAIN = "dropbox"
OAUTH2_AUTHORIZE = "https://www.dropbox.com/oauth2/authorize"
OAUTH2_TOKEN = "https://api.dropboxapi.com/oauth2/token"
OAUTH2_SCOPES = [
"account_info.read",
"files.content.read",
"files.content.write",
]
DATA_BACKUP_AGENT_LISTENERS: HassKey[list[Callable[[], None]]] = HassKey(
f"{DOMAIN}.backup_agent_listeners"
)


@@ -1,13 +0,0 @@
{
"domain": "dropbox",
"name": "Dropbox",
"after_dependencies": ["backup"],
"codeowners": ["@bdr99"],
"config_flow": true,
"dependencies": ["application_credentials"],
"documentation": "https://www.home-assistant.io/integrations/dropbox",
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["python-dropbox-api==0.1.3"]
}


@@ -1,112 +0,0 @@
rules:
# Bronze
action-setup:
status: exempt
comment: Integration does not register any actions.
appropriate-polling:
status: exempt
comment: Integration does not poll.
brands: done
common-modules:
status: exempt
comment: Integration does not have any entities or coordinators.
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: Integration does not register any actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: Integration does not have any entities.
entity-unique-id:
status: exempt
comment: Integration does not have any entities.
has-entity-name:
status: exempt
comment: Integration does not have any entities.
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: Integration does not register any actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration does not have any configuration parameters.
docs-installation-parameters: done
entity-unavailable:
status: exempt
comment: Integration does not have any entities.
integration-owner: done
log-when-unavailable: todo
parallel-updates:
status: exempt
comment: Integration does not make any entity updates.
reauthentication-flow: done
test-coverage: done
# Gold
devices:
status: exempt
comment: Integration does not have any entities.
diagnostics:
status: exempt
comment: Integration does not have any data to diagnose.
discovery-update-info:
status: exempt
comment: Integration is a service.
discovery:
status: exempt
comment: Integration is a service.
docs-data-update:
status: exempt
comment: Integration does not update any data.
docs-examples:
status: exempt
comment: Integration only provides backup functionality.
docs-known-limitations: todo
docs-supported-devices:
status: exempt
comment: Integration does not support any devices.
docs-supported-functions: done
docs-troubleshooting: todo
docs-use-cases: done
dynamic-devices:
status: exempt
comment: Integration does not use any devices.
entity-category:
status: exempt
comment: Integration does not have any entities.
entity-device-class:
status: exempt
comment: Integration does not have any entities.
entity-disabled-by-default:
status: exempt
comment: Integration does not have any entities.
entity-translations:
status: exempt
comment: Integration does not have any entities.
exception-translations: todo
icon-translations:
status: exempt
comment: Integration does not have any entities.
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: Integration does not have any repairs.
stale-devices:
status: exempt
comment: Integration does not have any devices.
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done


@@ -1,35 +0,0 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"authorize_url_timeout": "[%key:common::config_flow::abort::oauth2_authorize_url_timeout%]",
"missing_configuration": "[%key:common::config_flow::abort::oauth2_missing_configuration%]",
"no_url_available": "[%key:common::config_flow::abort::oauth2_no_url_available%]",
"oauth_error": "[%key:common::config_flow::abort::oauth2_error%]",
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
"oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
"oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"user_rejected_authorize": "[%key:common::config_flow::abort::oauth2_user_rejected_authorize%]",
"wrong_account": "Wrong account: Please authenticate with the correct account."
},
"create_entry": {
"default": "[%key:common::config_flow::create_entry::authenticated%]"
},
"step": {
"pick_implementation": {
"title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
},
"reauth_confirm": {
"description": "The Dropbox integration needs to re-authenticate your account.",
"title": "[%key:common::config_flow::title::reauth%]"
}
}
},
"exceptions": {
"oauth2_implementation_unavailable": {
"message": "[%key:common::exceptions::oauth2_implementation_unavailable::message%]"
}
}
}


@@ -2,26 +2,14 @@
from __future__ import annotations
from types import MappingProxyType
from homeassistant.config_entries import ConfigSubentry
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError
from .const import (
CONF_AZIMUTH,
CONF_DAMPING,
CONF_DAMPING_EVENING,
CONF_DAMPING_MORNING,
CONF_DECLINATION,
CONF_MODULES_POWER,
DEFAULT_AZIMUTH,
DEFAULT_DAMPING,
DEFAULT_DECLINATION,
DEFAULT_MODULES_POWER,
DOMAIN,
SUBENTRY_TYPE_PLANE,
)
from .coordinator import ForecastSolarConfigEntry, ForecastSolarDataUpdateCoordinator
@@ -37,41 +25,14 @@ async def async_migrate_entry(
new_options = entry.options.copy()
new_options |= {
CONF_MODULES_POWER: new_options.pop("modules power"),
CONF_DAMPING_MORNING: new_options.get(CONF_DAMPING, DEFAULT_DAMPING),
CONF_DAMPING_EVENING: new_options.pop(CONF_DAMPING, DEFAULT_DAMPING),
CONF_DAMPING_MORNING: new_options.get(CONF_DAMPING, 0.0),
CONF_DAMPING_EVENING: new_options.pop(CONF_DAMPING, 0.0),
}
hass.config_entries.async_update_entry(
entry, data=entry.data, options=new_options, version=2
)
if entry.version == 2:
# Migrate the main plane from options to a subentry
declination = entry.options.get(CONF_DECLINATION, DEFAULT_DECLINATION)
azimuth = entry.options.get(CONF_AZIMUTH, DEFAULT_AZIMUTH)
modules_power = entry.options.get(CONF_MODULES_POWER, DEFAULT_MODULES_POWER)
subentry = ConfigSubentry(
data=MappingProxyType(
{
CONF_DECLINATION: declination,
CONF_AZIMUTH: azimuth,
CONF_MODULES_POWER: modules_power,
}
),
subentry_type=SUBENTRY_TYPE_PLANE,
title=f"{declination}° / {azimuth}° / {modules_power}W",
unique_id=None,
)
hass.config_entries.async_add_subentry(entry, subentry)
new_options = dict(entry.options)
new_options.pop(CONF_DECLINATION, None)
new_options.pop(CONF_AZIMUTH, None)
new_options.pop(CONF_MODULES_POWER, None)
hass.config_entries.async_update_entry(entry, options=new_options, version=3)
return True
@@ -79,19 +40,6 @@ async def async_setup_entry(
hass: HomeAssistant, entry: ForecastSolarConfigEntry
) -> bool:
"""Set up Forecast.Solar from a config entry."""
plane_subentries = entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE)
if not plane_subentries:
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="no_plane",
)
if len(plane_subentries) > 1 and not entry.options.get(CONF_API_KEY):
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="api_key_required",
)
coordinator = ForecastSolarDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
@@ -99,18 +47,9 @@ async def async_setup_entry(
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
entry.async_on_unload(entry.add_update_listener(_async_update_listener))
return True
async def _async_update_listener(
hass: HomeAssistant, entry: ForecastSolarConfigEntry
) -> None:
"""Handle config entry updates (options or subentry changes)."""
hass.config_entries.async_schedule_reload(entry.entry_id)
async def async_unload_entry(
hass: HomeAssistant, entry: ForecastSolarConfigEntry
) -> bool:


@@ -11,13 +11,11 @@ from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryFlow,
OptionsFlow,
SubentryFlowResult,
OptionsFlowWithReload,
)
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
from homeassistant.core import callback
from homeassistant.helpers import config_validation as cv, selector
from homeassistant.helpers import config_validation as cv
from .const import (
CONF_AZIMUTH,
@@ -26,51 +24,16 @@ from .const import (
CONF_DECLINATION,
CONF_INVERTER_SIZE,
CONF_MODULES_POWER,
DEFAULT_AZIMUTH,
DEFAULT_DAMPING,
DEFAULT_DECLINATION,
DEFAULT_MODULES_POWER,
DOMAIN,
MAX_PLANES,
SUBENTRY_TYPE_PLANE,
)
RE_API_KEY = re.compile(r"^[a-zA-Z0-9]{16}$")
PLANE_SCHEMA = vol.Schema(
{
vol.Required(CONF_DECLINATION): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=0, max=90, step=1, mode=selector.NumberSelectorMode.BOX
),
),
vol.Coerce(int),
),
vol.Required(CONF_AZIMUTH): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=0, max=360, step=1, mode=selector.NumberSelectorMode.BOX
),
),
vol.Coerce(int),
),
vol.Required(CONF_MODULES_POWER): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=1, step=1, mode=selector.NumberSelectorMode.BOX
),
),
vol.Coerce(int),
),
}
)
class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Forecast.Solar."""
VERSION = 3
VERSION = 2
@staticmethod
@callback
@@ -80,14 +43,6 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
"""Get the options flow for this handler."""
return ForecastSolarOptionFlowHandler()
@classmethod
@callback
def async_get_supported_subentry_types(
cls, config_entry: ConfigEntry
) -> dict[str, type[ConfigSubentryFlow]]:
"""Return subentries supported by this handler."""
return {SUBENTRY_TYPE_PLANE: PlaneSubentryFlowHandler}
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -99,112 +54,94 @@ class ForecastSolarFlowHandler(ConfigFlow, domain=DOMAIN):
CONF_LATITUDE: user_input[CONF_LATITUDE],
CONF_LONGITUDE: user_input[CONF_LONGITUDE],
},
subentries=[
{
"subentry_type": SUBENTRY_TYPE_PLANE,
"data": {
CONF_DECLINATION: user_input[CONF_DECLINATION],
CONF_AZIMUTH: user_input[CONF_AZIMUTH],
CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
},
"title": f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
"unique_id": None,
},
],
options={
CONF_AZIMUTH: user_input[CONF_AZIMUTH],
CONF_DECLINATION: user_input[CONF_DECLINATION],
CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
},
)
return self.async_show_form(
step_id="user",
data_schema=self.add_suggested_values_to_schema(
vol.Schema(
{
vol.Required(CONF_NAME): str,
vol.Required(CONF_LATITUDE): cv.latitude,
vol.Required(CONF_LONGITUDE): cv.longitude,
}
).extend(PLANE_SCHEMA.schema),
data_schema=vol.Schema(
{
CONF_NAME: self.hass.config.location_name,
CONF_LATITUDE: self.hass.config.latitude,
CONF_LONGITUDE: self.hass.config.longitude,
CONF_DECLINATION: DEFAULT_DECLINATION,
CONF_AZIMUTH: DEFAULT_AZIMUTH,
CONF_MODULES_POWER: DEFAULT_MODULES_POWER,
},
vol.Required(
CONF_NAME, default=self.hass.config.location_name
): str,
vol.Required(
CONF_LATITUDE, default=self.hass.config.latitude
): cv.latitude,
vol.Required(
CONF_LONGITUDE, default=self.hass.config.longitude
): cv.longitude,
vol.Required(CONF_DECLINATION, default=25): vol.All(
vol.Coerce(int), vol.Range(min=0, max=90)
),
vol.Required(CONF_AZIMUTH, default=180): vol.All(
vol.Coerce(int), vol.Range(min=0, max=360)
),
vol.Required(CONF_MODULES_POWER): vol.All(
vol.Coerce(int), vol.Range(min=1)
),
}
),
)
class ForecastSolarOptionFlowHandler(OptionsFlow):
class ForecastSolarOptionFlowHandler(OptionsFlowWithReload):
"""Handle options."""
async def async_step_init(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Manage the options."""
errors: dict[str, str] = {}
planes_count = len(
self.config_entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE)
)
errors = {}
if user_input is not None:
api_key = user_input.get(CONF_API_KEY)
if planes_count > 1 and not api_key:
errors[CONF_API_KEY] = "api_key_required"
elif api_key and RE_API_KEY.match(api_key) is None:
if (api_key := user_input.get(CONF_API_KEY)) and RE_API_KEY.match(
api_key
) is None:
errors[CONF_API_KEY] = "invalid_api_key"
else:
return self.async_create_entry(
title="", data=user_input | {CONF_API_KEY: api_key or None}
)
suggested_api_key = self.config_entry.options.get(CONF_API_KEY, "")
return self.async_show_form(
step_id="init",
data_schema=vol.Schema(
{
vol.Required(
vol.Optional(
CONF_API_KEY,
default=suggested_api_key,
)
if planes_count > 1
else vol.Optional(
CONF_API_KEY,
description={"suggested_value": suggested_api_key},
description={
"suggested_value": self.config_entry.options.get(
CONF_API_KEY, ""
)
},
): str,
vol.Required(
CONF_DECLINATION,
default=self.config_entry.options[CONF_DECLINATION],
): vol.All(vol.Coerce(int), vol.Range(min=0, max=90)),
vol.Required(
CONF_AZIMUTH,
default=self.config_entry.options.get(CONF_AZIMUTH),
                    ): vol.All(vol.Coerce(int), vol.Range(min=0, max=360)),
vol.Required(
CONF_MODULES_POWER,
default=self.config_entry.options[CONF_MODULES_POWER],
): vol.All(vol.Coerce(int), vol.Range(min=1)),
vol.Optional(
CONF_DAMPING_MORNING,
default=self.config_entry.options.get(
CONF_DAMPING_MORNING, DEFAULT_DAMPING
CONF_DAMPING_MORNING, 0.0
),
): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=0,
max=1,
step=0.01,
mode=selector.NumberSelectorMode.BOX,
),
),
vol.Coerce(float),
),
): vol.Coerce(float),
vol.Optional(
CONF_DAMPING_EVENING,
default=self.config_entry.options.get(
CONF_DAMPING_EVENING, DEFAULT_DAMPING
CONF_DAMPING_EVENING, 0.0
),
): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=0,
max=1,
step=0.01,
mode=selector.NumberSelectorMode.BOX,
),
),
vol.Coerce(float),
),
): vol.Coerce(float),
vol.Optional(
CONF_INVERTER_SIZE,
description={
@@ -212,89 +149,8 @@ class ForecastSolarOptionFlowHandler(OptionsFlow):
CONF_INVERTER_SIZE
)
},
): vol.All(
selector.NumberSelector(
selector.NumberSelectorConfig(
min=1,
step=1,
mode=selector.NumberSelectorMode.BOX,
),
),
vol.Coerce(int),
),
): vol.Coerce(int),
}
),
errors=errors,
)
class PlaneSubentryFlowHandler(ConfigSubentryFlow):
"""Handle a subentry flow for adding/editing a plane."""
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle the user step to add a new plane."""
entry = self._get_entry()
planes_count = len(entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE))
if planes_count >= MAX_PLANES:
return self.async_abort(reason="max_planes")
if planes_count >= 1 and not entry.options.get(CONF_API_KEY):
return self.async_abort(reason="api_key_required")
if user_input is not None:
return self.async_create_entry(
title=f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
data={
CONF_DECLINATION: user_input[CONF_DECLINATION],
CONF_AZIMUTH: user_input[CONF_AZIMUTH],
CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
},
)
return self.async_show_form(
step_id="user",
data_schema=self.add_suggested_values_to_schema(
PLANE_SCHEMA,
{
CONF_DECLINATION: DEFAULT_DECLINATION,
CONF_AZIMUTH: DEFAULT_AZIMUTH,
CONF_MODULES_POWER: DEFAULT_MODULES_POWER,
},
),
)
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle reconfiguration of an existing plane."""
subentry = self._get_reconfigure_subentry()
if user_input is not None:
entry = self._get_entry()
if self._async_update(
entry,
subentry,
data={
CONF_DECLINATION: user_input[CONF_DECLINATION],
CONF_AZIMUTH: user_input[CONF_AZIMUTH],
CONF_MODULES_POWER: user_input[CONF_MODULES_POWER],
},
title=f"{user_input[CONF_DECLINATION]}° / {user_input[CONF_AZIMUTH]}° / {user_input[CONF_MODULES_POWER]}W",
):
if not entry.update_listeners:
self.hass.config_entries.async_schedule_reload(entry.entry_id)
return self.async_abort(reason="reconfigure_successful")
return self.async_show_form(
step_id="reconfigure",
data_schema=self.add_suggested_values_to_schema(
PLANE_SCHEMA,
{
CONF_DECLINATION: subentry.data[CONF_DECLINATION],
CONF_AZIMUTH: subentry.data[CONF_AZIMUTH],
CONF_MODULES_POWER: subentry.data[CONF_MODULES_POWER],
},
),
)


@@ -14,9 +14,3 @@ CONF_DAMPING = "damping"
CONF_DAMPING_MORNING = "damping_morning"
CONF_DAMPING_EVENING = "damping_evening"
CONF_INVERTER_SIZE = "inverter_size"
DEFAULT_DECLINATION = 25
DEFAULT_AZIMUTH = 180
DEFAULT_MODULES_POWER = 10000
DEFAULT_DAMPING = 0.0
MAX_PLANES = 4
SUBENTRY_TYPE_PLANE = "plane"


@@ -4,7 +4,7 @@ from __future__ import annotations
from datetime import timedelta
from forecast_solar import Estimate, ForecastSolar, ForecastSolarConnectionError, Plane
from forecast_solar import Estimate, ForecastSolar, ForecastSolarConnectionError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE
@@ -19,10 +19,8 @@ from .const import (
CONF_DECLINATION,
CONF_INVERTER_SIZE,
CONF_MODULES_POWER,
DEFAULT_DAMPING,
DOMAIN,
LOGGER,
SUBENTRY_TYPE_PLANE,
)
type ForecastSolarConfigEntry = ConfigEntry[ForecastSolarDataUpdateCoordinator]
@@ -32,7 +30,6 @@ class ForecastSolarDataUpdateCoordinator(DataUpdateCoordinator[Estimate]):
"""The Forecast.Solar Data Update Coordinator."""
config_entry: ForecastSolarConfigEntry
forecast: ForecastSolar
def __init__(self, hass: HomeAssistant, entry: ForecastSolarConfigEntry) -> None:
"""Initialize the Forecast.Solar coordinator."""
@@ -46,34 +43,17 @@ class ForecastSolarDataUpdateCoordinator(DataUpdateCoordinator[Estimate]):
) is not None and inverter_size > 0:
inverter_size = inverter_size / 1000
# Build the list of planes from subentries.
plane_subentries = entry.get_subentries_of_type(SUBENTRY_TYPE_PLANE)
# The first plane subentry is the main plane
main_plane = plane_subentries[0]
# Additional planes
planes: list[Plane] = [
Plane(
declination=subentry.data[CONF_DECLINATION],
azimuth=(subentry.data[CONF_AZIMUTH] - 180),
kwp=(subentry.data[CONF_MODULES_POWER] / 1000),
)
for subentry in plane_subentries[1:]
]
self.forecast = ForecastSolar(
api_key=api_key,
session=async_get_clientsession(hass),
latitude=entry.data[CONF_LATITUDE],
longitude=entry.data[CONF_LONGITUDE],
declination=main_plane.data[CONF_DECLINATION],
azimuth=(main_plane.data[CONF_AZIMUTH] - 180),
kwp=(main_plane.data[CONF_MODULES_POWER] / 1000),
damping_morning=entry.options.get(CONF_DAMPING_MORNING, DEFAULT_DAMPING),
damping_evening=entry.options.get(CONF_DAMPING_EVENING, DEFAULT_DAMPING),
declination=entry.options[CONF_DECLINATION],
azimuth=(entry.options[CONF_AZIMUTH] - 180),
kwp=(entry.options[CONF_MODULES_POWER] / 1000),
damping_morning=entry.options.get(CONF_DAMPING_MORNING, 0.0),
damping_evening=entry.options.get(CONF_DAMPING_EVENING, 0.0),
inverter=inverter_size,
planes=planes,
)
        # Free accounts have a resolution of 1 hour, using that as the default
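
The coordinator above converts azimuth with `- 180` before passing it to the client. A hedged sketch of that conversion, assuming Home Assistant's 0–360° convention (180 = south) and forecast.solar's -180..180° convention (0 = south):

```python
def ha_to_forecast_solar_azimuth(azimuth: int) -> int:
    """Map a Home Assistant azimuth (0-360, 180 = south) to the
    forecast.solar range (-180..180, 0 = south)."""
    return azimuth - 180
```

The same offset is applied both to the main plane and to each additional `Plane` subentry, so all planes share one convention on the wire.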


@@ -28,13 +28,6 @@ async def async_get_config_entry_diagnostics(
"title": entry.title,
"data": async_redact_data(entry.data, TO_REDACT),
"options": async_redact_data(entry.options, TO_REDACT),
"subentries": [
{
"data": dict(subentry.data),
"title": subentry.title,
}
for subentry in entry.subentries.values()
],
},
"data": {
"energy_production_today": coordinator.data.energy_production_today,


@@ -14,37 +14,6 @@
}
}
},
"config_subentries": {
"plane": {
"abort": {
"api_key_required": "An API key is required to add more than one plane. You can configure it in the integration options.",
"max_planes": "You can add a maximum of 4 planes.",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
},
"entry_type": "Plane",
"initiate_flow": {
"user": "Add plane"
},
"step": {
"reconfigure": {
"data": {
"azimuth": "[%key:component::forecast_solar::config::step::user::data::azimuth%]",
"declination": "[%key:component::forecast_solar::config::step::user::data::declination%]",
"modules_power": "[%key:component::forecast_solar::config::step::user::data::modules_power%]"
},
"description": "Edit the solar plane configuration."
},
"user": {
"data": {
"azimuth": "[%key:component::forecast_solar::config::step::user::data::azimuth%]",
"declination": "[%key:component::forecast_solar::config::step::user::data::declination%]",
"modules_power": "[%key:component::forecast_solar::config::step::user::data::modules_power%]"
},
"description": "Add a solar plane. Multiple planes are supported with a Forecast.Solar API subscription."
}
}
}
},
"entity": {
"sensor": {
"energy_current_hour": {
@@ -82,26 +51,20 @@
}
}
},
"exceptions": {
"api_key_required": {
"message": "An API key is required when more than one plane is configured"
},
"no_plane": {
"message": "No plane configured, cannot set up Forecast.Solar"
}
},
"options": {
"error": {
"api_key_required": "An API key is required to add more than one plane. You can configure it in the integration options.",
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]"
},
"step": {
"init": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"azimuth": "[%key:component::forecast_solar::config::step::user::data::azimuth%]",
"damping_evening": "Damping factor: adjusts the results in the evening",
"damping_morning": "Damping factor: adjusts the results in the morning",
"inverter_size": "Inverter size (Watt)"
"declination": "[%key:component::forecast_solar::config::step::user::data::declination%]",
"inverter_size": "Inverter size (Watt)",
"modules_power": "[%key:component::forecast_solar::config::step::user::data::modules_power%]"
},
"description": "These values allow tweaking the Forecast.Solar result. Please refer to the documentation if a field is unclear."
}


@@ -68,7 +68,7 @@
"name": "Worksheet"
}
},
"name": "Append data to Google sheet"
"name": "Append to sheet"
},
"get_sheet": {
"description": "Gets data from a worksheet in Google Sheets.",
@@ -86,7 +86,7 @@
"name": "[%key:component::google_sheets::services::append_sheet::fields::worksheet::name%]"
}
},
"name": "Get data from Google sheet"
"name": "Get data from sheet"
}
}
}


@@ -8,7 +8,7 @@ import growattServer
import requests
import voluptuous as vol
from homeassistant.config_entries import ConfigEntry, ConfigFlow, ConfigFlowResult
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import (
CONF_NAME,
CONF_PASSWORD,
@@ -64,16 +64,6 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
menu_options=["password_auth", "token_auth"],
)
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reconfiguration."""
return await self._async_step_credentials(
step_id="reconfigure",
entry=self._get_reconfigure_entry(),
user_input=user_input,
)
async def async_step_reauth(self, _: Mapping[str, Any]) -> ConfigFlowResult:
"""Handle reauth."""
return await self.async_step_reauth_confirm()
@@ -82,23 +72,11 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reauth confirmation."""
return await self._async_step_credentials(
step_id="reauth_confirm",
entry=self._get_reauth_entry(),
user_input=user_input,
)
async def _async_step_credentials(
self,
step_id: str,
entry: ConfigEntry,
user_input: dict[str, Any] | None,
) -> ConfigFlowResult:
"""Handle credential update for both reauth and reconfigure."""
errors: dict[str, str] = {}
reauth_entry = self._get_reauth_entry()
if user_input is not None:
auth_type = entry.data.get(CONF_AUTH_TYPE)
auth_type = reauth_entry.data.get(CONF_AUTH_TYPE)
if auth_type == AUTH_PASSWORD:
server_url = SERVER_URLS_NAMES[user_input[CONF_REGION]]
@@ -113,19 +91,17 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
api.login, user_input[CONF_USERNAME], user_input[CONF_PASSWORD]
)
except requests.exceptions.RequestException as ex:
_LOGGER.debug("Network error during credential update: %s", ex)
_LOGGER.debug("Network error during reauth login: %s", ex)
errors["base"] = ERROR_CANNOT_CONNECT
except (ValueError, KeyError, TypeError, AttributeError) as ex:
_LOGGER.debug(
"Invalid response format during credential update: %s", ex
)
_LOGGER.debug("Invalid response format during reauth login: %s", ex)
errors["base"] = ERROR_CANNOT_CONNECT
else:
if not isinstance(login_response, dict):
errors["base"] = ERROR_CANNOT_CONNECT
elif login_response.get("success"):
return self.async_update_reload_and_abort(
entry,
reauth_entry,
data_updates={
CONF_USERNAME: user_input[CONF_USERNAME],
CONF_PASSWORD: user_input[CONF_PASSWORD],
@@ -145,26 +121,28 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
try:
await self.hass.async_add_executor_job(api.plant_list)
except requests.exceptions.RequestException as ex:
_LOGGER.debug("Network error during credential update: %s", ex)
_LOGGER.debug(
"Network error during reauth token validation: %s", ex
)
errors["base"] = ERROR_CANNOT_CONNECT
except growattServer.GrowattV1ApiError as err:
if err.error_code == V1_API_ERROR_NO_PRIVILEGE:
errors["base"] = ERROR_INVALID_AUTH
else:
_LOGGER.debug(
"Growatt V1 API error during credential update: %s (Code: %s)",
"Growatt V1 API error during reauth: %s (Code: %s)",
err.error_msg or str(err),
err.error_code,
)
errors["base"] = ERROR_CANNOT_CONNECT
except (ValueError, KeyError, TypeError, AttributeError) as ex:
_LOGGER.debug(
"Invalid response format during credential update: %s", ex
"Invalid response format during reauth token validation: %s", ex
)
errors["base"] = ERROR_CANNOT_CONNECT
else:
return self.async_update_reload_and_abort(
entry,
reauth_entry,
data_updates={
CONF_TOKEN: user_input[CONF_TOKEN],
CONF_URL: server_url,
@@ -173,19 +151,19 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
# Determine the current region key from the stored config value.
# Legacy entries may store the region key directly; newer entries store the URL.
stored_url = entry.data.get(CONF_URL, "")
stored_url = reauth_entry.data.get(CONF_URL, "")
if stored_url in SERVER_URLS_NAMES:
current_region = stored_url
else:
current_region = _URL_TO_REGION.get(stored_url, DEFAULT_URL)
auth_type = entry.data.get(CONF_AUTH_TYPE)
auth_type = reauth_entry.data.get(CONF_AUTH_TYPE)
if auth_type == AUTH_PASSWORD:
data_schema = vol.Schema(
{
vol.Required(
CONF_USERNAME,
default=entry.data.get(CONF_USERNAME),
default=reauth_entry.data.get(CONF_USERNAME),
): str,
vol.Required(CONF_PASSWORD): str,
vol.Required(CONF_REGION, default=current_region): SelectSelector(
@@ -211,18 +189,8 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
else:
return self.async_abort(reason=ERROR_CANNOT_CONNECT)
if user_input is not None:
data_schema = self.add_suggested_values_to_schema(
data_schema,
{
key: value
for key, value in user_input.items()
if key not in (CONF_PASSWORD, CONF_TOKEN)
},
)
return self.async_show_form(
step_id=step_id,
step_id="reauth_confirm",
data_schema=data_schema,
errors=errors,
)

View File

@@ -50,7 +50,7 @@ rules:
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: Integration does not raise repairable issues.


@@ -4,8 +4,7 @@
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"no_plants": "No plants have been found on this account",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "Cannot connect to Growatt servers. Please check your internet connection and try again.",
@@ -50,22 +49,6 @@
"description": "Re-enter your credentials to continue using this integration.",
"title": "Re-authenticate with Growatt"
},
"reconfigure": {
"data": {
"password": "[%key:common::config_flow::data::password%]",
"region": "[%key:component::growatt_server::config::step::password_auth::data::region%]",
"token": "[%key:component::growatt_server::config::step::token_auth::data::token%]",
"username": "[%key:common::config_flow::data::username%]"
},
"data_description": {
"password": "[%key:component::growatt_server::config::step::password_auth::data_description::password%]",
"region": "[%key:component::growatt_server::config::step::password_auth::data_description::region%]",
"token": "[%key:component::growatt_server::config::step::token_auth::data_description::token%]",
"username": "[%key:component::growatt_server::config::step::password_auth::data_description::username%]"
},
"description": "Update your credentials to continue using this integration.",
"title": "Reconfigure Growatt"
},
"token_auth": {
"data": {
"region": "[%key:component::growatt_server::config::step::password_auth::data::region%]",

View File

@@ -49,21 +49,14 @@ from homeassistant.components.climate import (
HVACMode,
)
from homeassistant.components.water_heater import (
ATTR_OPERATION_LIST,
ATTR_OPERATION_MODE,
DOMAIN as WATER_HEATER_DOMAIN,
SERVICE_SET_OPERATION_MODE,
SERVICE_SET_TEMPERATURE as SERVICE_SET_TEMPERATURE_WATER_HEATER,
WaterHeaterEntityFeature,
)
from homeassistant.const import (
ATTR_ENTITY_ID,
ATTR_SUPPORTED_FEATURES,
ATTR_TEMPERATURE,
PERCENTAGE,
SERVICE_TURN_OFF,
SERVICE_TURN_ON,
STATE_OFF,
STATE_UNAVAILABLE,
STATE_UNKNOWN,
UnitOfTemperature,
@@ -752,7 +745,6 @@ class WaterHeater(HomeAccessory):
(
ATTR_MAX_TEMP,
ATTR_MIN_TEMP,
ATTR_OPERATION_LIST,
)
)
self._unit = self.hass.config.units.temperature_unit
@@ -760,20 +752,6 @@ class WaterHeater(HomeAccessory):
assert state
min_temp, max_temp = self.get_temperature_range(state)
features = state.attributes.get(ATTR_SUPPORTED_FEATURES, 0)
operation_list = state.attributes.get(ATTR_OPERATION_LIST) or []
self._supports_on_off = bool(features & WaterHeaterEntityFeature.ON_OFF)
self._supports_operation_mode = bool(
features & WaterHeaterEntityFeature.OPERATION_MODE
)
self._off_mode_available = self._supports_on_off or (
self._supports_operation_mode and STATE_OFF in operation_list
)
valid_modes = dict(HC_HOMEKIT_VALID_MODES_WATER_HEATER)
if self._off_mode_available:
valid_modes["Off"] = HC_HEAT_COOL_OFF
serv_thermostat = self.add_preload_service(SERV_THERMOSTAT)
self.char_current_heat_cool = serv_thermostat.configure_char(
@@ -783,7 +761,7 @@ class WaterHeater(HomeAccessory):
CHAR_TARGET_HEATING_COOLING,
value=1,
setter_callback=self.set_heat_cool,
valid_values=valid_modes,
valid_values=HC_HOMEKIT_VALID_MODES_WATER_HEATER,
)
self.char_current_temp = serv_thermostat.configure_char(
@@ -817,48 +795,8 @@ class WaterHeater(HomeAccessory):
def set_heat_cool(self, value: int) -> None:
"""Change operation mode to value if call came from HomeKit."""
_LOGGER.debug("%s: Set heat-cool to %d", self.entity_id, value)
params: dict[str, Any] = {ATTR_ENTITY_ID: self.entity_id}
if value == HC_HEAT_COOL_OFF:
if self._supports_on_off:
self.async_call_service(
WATER_HEATER_DOMAIN, SERVICE_TURN_OFF, params, "off"
)
elif self._off_mode_available and self._supports_operation_mode:
params[ATTR_OPERATION_MODE] = STATE_OFF
self.async_call_service(
WATER_HEATER_DOMAIN,
SERVICE_SET_OPERATION_MODE,
params,
STATE_OFF,
)
else:
self.char_target_heat_cool.set_value(HC_HEAT_COOL_HEAT)
elif value == HC_HEAT_COOL_HEAT:
if self._supports_on_off:
self.async_call_service(
WATER_HEATER_DOMAIN, SERVICE_TURN_ON, params, "on"
)
elif self._off_mode_available and self._supports_operation_mode:
state = self.hass.states.get(self.entity_id)
if not state:
return
current_operation_mode = state.attributes.get(ATTR_OPERATION_MODE)
if current_operation_mode and current_operation_mode != STATE_OFF:
# Already in a non-off operation mode; do not change it.
return
operation_list = state.attributes.get(ATTR_OPERATION_LIST) or []
for mode in operation_list:
if mode != STATE_OFF:
params[ATTR_OPERATION_MODE] = mode
self.async_call_service(
WATER_HEATER_DOMAIN,
SERVICE_SET_OPERATION_MODE,
params,
mode,
)
break
else:
self.char_target_heat_cool.set_value(HC_HEAT_COOL_HEAT)
if HC_HOMEKIT_TO_HASS[value] != HVACMode.HEAT:
self.char_target_heat_cool.set_value(1) # Heat
def set_target_temperature(self, value: float) -> None:
"""Set target temperature to value if call came from HomeKit."""
@@ -891,12 +829,7 @@ class WaterHeater(HomeAccessory):
# Update target operation mode
if new_state.state:
if new_state.state == STATE_OFF and self._off_mode_available:
self.char_target_heat_cool.set_value(HC_HEAT_COOL_OFF)
self.char_current_heat_cool.set_value(HC_HEAT_COOL_OFF)
else:
self.char_target_heat_cool.set_value(HC_HEAT_COOL_HEAT)
self.char_current_heat_cool.set_value(HC_HEAT_COOL_HEAT)
self.char_target_heat_cool.set_value(1) # Heat
def _get_temperature_range_from_state(

View File

@@ -72,7 +72,7 @@
"cold_tea": {
"fix_flow": {
"abort": {
"not_tea_time": "Cannot reheat the tea at this time"
"not_tea_time": "Can not re-heat the tea at this time"
},
"step": {}
},

View File

@@ -4,7 +4,7 @@
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
},
"error": {
"not_readable_path": "The provided path to the file cannot be read"
"not_readable_path": "The provided path to the file can not be read"
},
"step": {
"user": {

View File

@@ -34,7 +34,6 @@ from .const import (
EVENT_TYPE_OFF,
EVENT_TYPE_ON,
MANUFACTURER,
NETATMO_ALIM_STATUS_ONLINE,
NETATMO_CREATE_CAMERA,
SERVICE_SET_CAMERA_LIGHT,
SERVICE_SET_PERSON_AWAY,
@@ -175,16 +174,18 @@ class NetatmoCamera(NetatmoModuleEntity, Camera):
self._monitoring = False
elif event_type in [EVENT_TYPE_CONNECTION, EVENT_TYPE_ON]:
_LOGGER.debug(
"Camera %s has received %s event, turning on and enabling streaming if applicable",
"Camera %s has received %s event, turning on and enabling streaming",
data["camera_id"],
event_type,
)
if self.device_type != "NDB":
self._attr_is_streaming = True
self._attr_is_streaming = True
self._monitoring = True
elif event_type == EVENT_TYPE_LIGHT_MODE:
if data.get("sub_type"):
self._light_state = data["sub_type"]
self._attr_extra_state_attributes.update(
{"light_state": self._light_state}
)
else:
_LOGGER.debug(
"Camera %s has received light mode event without sub_type",
@@ -224,20 +225,6 @@ class NetatmoCamera(NetatmoModuleEntity, Camera):
supported_features |= CameraEntityFeature.STREAM
return supported_features
@property
def extra_state_attributes(self) -> dict[str, Any]:
"""Return entity specific state attributes."""
return {
"id": self.device.entity_id,
"monitoring": self._monitoring,
"sd_status": self.device.sd_status,
"alim_status": self.device.alim_status,
"is_local": self.device.is_local,
"vpn_url": self.device.vpn_url,
"local_url": self.device.local_url,
"light_state": self._light_state,
}
async def async_turn_off(self) -> None:
"""Turn off camera."""
await self.device.async_monitoring_off()
@@ -261,10 +248,7 @@ class NetatmoCamera(NetatmoModuleEntity, Camera):
self._attr_is_on = self.device.alim_status is not None
self._attr_available = self.device.alim_status is not None
if self.device_type == "NDB":
self._monitoring = self.device.alim_status == NETATMO_ALIM_STATUS_ONLINE
elif self.device.monitoring is not None:
self._monitoring = self.device.monitoring
if self.device.monitoring is not None:
self._attr_is_streaming = self.device.monitoring
self._attr_motion_detection_enabled = self.device.monitoring
@@ -272,6 +256,19 @@ class NetatmoCamera(NetatmoModuleEntity, Camera):
self.process_events(self.device.events)
)
self._attr_extra_state_attributes.update(
{
"id": self.device.entity_id,
"monitoring": self._monitoring,
"sd_status": self.device.sd_status,
"alim_status": self.device.alim_status,
"is_local": self.device.is_local,
"vpn_url": self.device.vpn_url,
"local_url": self.device.local_url,
"light_state": self._light_state,
}
)
def process_events(self, event_list: list[NaEvent]) -> dict:
"""Add meta data to events."""
events = {}
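The camera change above replaces a computed `extra_state_attributes` property with explicit `_attr_extra_state_attributes` updates. Both patterns appear in Home Assistant entities; a toy illustration of the property form, which recomputes on every access and has nothing to keep in sync (class and attribute names are illustrative only):

```python
class CameraLike:
    """Toy entity: attributes are recomputed on every access."""

    def __init__(self) -> None:
        self.monitoring = False
        self.light_state: str | None = None

    @property
    def extra_state_attributes(self) -> dict:
        # Always reflects current instance state, unlike a cached dict
        # that must be updated whenever the underlying fields change.
        return {
            "monitoring": self.monitoring,
            "light_state": self.light_state,
        }
```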

View File

@@ -215,15 +215,5 @@ WEBHOOK_ACTIVATION = "webhook_activation"
WEBHOOK_DEACTIVATION = "webhook_deactivation"
WEBHOOK_NACAMERA_CONNECTION = "NACamera-connection"
WEBHOOK_NOCAMERA_CONNECTION = "NOC-connection"
WEBHOOK_NDB_CONNECTION = "NDB-connection"
WEBHOOK_PUSH_TYPE = "push_type"
CAMERA_CONNECTION_WEBHOOKS = [
WEBHOOK_NACAMERA_CONNECTION,
WEBHOOK_NOCAMERA_CONNECTION,
WEBHOOK_NDB_CONNECTION,
]
# Alimentation status (alim_status) for cameras and door bells (NDB).
# For NDB there is no monitoring attribute in status but only alim_status.
# 2 = Full power/online for NDB (and also Correct power adapter for NACamera).
NETATMO_ALIM_STATUS_ONLINE = 2
CAMERA_CONNECTION_WEBHOOKS = [WEBHOOK_NACAMERA_CONNECTION, WEBHOOK_NOCAMERA_CONNECTION]

View File

@@ -18,7 +18,7 @@ from .const import (
)
from .coordinator import NinaConfigEntry, NINADataUpdateCoordinator
PLATFORMS: list[Platform] = [Platform.BINARY_SENSOR, Platform.SENSOR]
PLATFORMS: list[str] = [Platform.BINARY_SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: NinaConfigEntry) -> bool:

View File

@@ -1,4 +1,4 @@
"""NINA binary sensor platform."""
"""NINA sensor platform."""
from __future__ import annotations
@@ -88,19 +88,15 @@ class NINAMessage(NinaEntity, BinarySensorEntity):
data = self._get_warning_data()
return {
ATTR_HEADLINE: data.headline, # Deprecated, remove in 2026.11
ATTR_DESCRIPTION: data.description, # Deprecated, remove in 2026.11
ATTR_SENDER: data.sender, # Deprecated, remove in 2026.11
ATTR_SEVERITY: data.severity or "Unknown", # Deprecated, remove in 2026.11
ATTR_RECOMMENDED_ACTIONS: data.recommended_actions, # Deprecated, remove in 2026.11
ATTR_AFFECTED_AREAS: data.affected_areas, # Deprecated, remove in 2026.11
ATTR_WEB: data.more_info_url, # Deprecated, remove in 2026.11
ATTR_HEADLINE: data.headline,
ATTR_DESCRIPTION: data.description,
ATTR_SENDER: data.sender,
ATTR_SEVERITY: data.severity,
ATTR_RECOMMENDED_ACTIONS: data.recommended_actions,
ATTR_AFFECTED_AREAS: data.affected_areas,
ATTR_WEB: data.web,
ATTR_ID: data.id,
ATTR_SENT: data.sent.isoformat(), # Deprecated, remove in 2026.11
ATTR_START: data.start.isoformat()
if data.start
else "", # Deprecated, remove in 2026.11
ATTR_EXPIRES: data.expires.isoformat()
if data.expires
else "", # Deprecated, remove in 2026.11
ATTR_SENT: data.sent,
ATTR_START: data.start,
ATTR_EXPIRES: data.expires,
}

View File

@@ -31,7 +31,6 @@ from .const import (
CONST_REGIONS,
DOMAIN,
NO_MATCH_REGEX,
SENSOR_SUFFIXES,
)
@@ -244,7 +243,32 @@ class OptionsFlowHandler(OptionsFlowWithReload):
user_input, self._all_region_codes_sorted
)
await self.remove_unused_entities(user_input)
entity_registry = er.async_get(self.hass)
entries = er.async_entries_for_config_entry(
entity_registry, self.config_entry.entry_id
)
removed_entities_slots = [
f"{region}-{slot_id}"
for region in self.data[CONF_REGIONS]
for slot_id in range(self.data[CONF_MESSAGE_SLOTS] + 1)
if slot_id > user_input[CONF_MESSAGE_SLOTS]
]
removed_entities_area = [
f"{cfg_region}-{slot_id}"
for slot_id in range(1, self.data[CONF_MESSAGE_SLOTS] + 1)
for cfg_region in self.data[CONF_REGIONS]
if cfg_region not in user_input[CONF_REGIONS]
]
for entry in entries:
for entity_uid in list(
set(removed_entities_slots + removed_entities_area)
):
if entry.unique_id == entity_uid:
entity_registry.async_remove(entry.entity_id)
self.hass.config_entries.async_update_entry(
self.config_entry, data=user_input
@@ -263,35 +287,3 @@ class OptionsFlowHandler(OptionsFlowWithReload):
data_schema=schema_with_suggested,
errors=errors,
)
async def remove_unused_entities(self, user_input: dict[str, Any]) -> None:
"""Remove entities which are not used anymore."""
entity_registry = er.async_get(self.hass)
entries = er.async_entries_for_config_entry(
entity_registry, self.config_entry.entry_id
)
id_type_suffix = [f"-{sensor_id}" for sensor_id in SENSOR_SUFFIXES] + [""]
removed_entities_slots = [
f"{region}-{slot_id}{suffix}"
for region in self.data[CONF_REGIONS]
for slot_id in range(self.data[CONF_MESSAGE_SLOTS] + 1)
for suffix in id_type_suffix
if slot_id > user_input[CONF_MESSAGE_SLOTS]
]
removed_entities_area = [
f"{cfg_region}-{slot_id}{suffix}"
for slot_id in range(1, self.data[CONF_MESSAGE_SLOTS] + 1)
for cfg_region in self.data[CONF_REGIONS]
for suffix in id_type_suffix
if cfg_region not in user_input[CONF_REGIONS]
]
removed_uids = set(removed_entities_slots + removed_entities_area)
for entry in entries:
if entry.unique_id in removed_uids:
entity_registry.async_remove(entry.entity_id)
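The removed `remove_unused_entities` helper computes which unique IDs fall out of use when the slot count shrinks or a region is deselected, then removes the matching registry entries. The ID arithmetic on its own, as a stdlib-only sketch (ID formats taken from the hunk above):

```python
def stale_unique_ids(
    old_regions: list[str],
    old_slots: int,
    new_regions: list[str],
    new_slots: int,
    suffixes: list[str],
) -> set[str]:
    """Unique IDs that no longer correspond to a configured slot/region."""
    # Each slot produces one ID per sensor suffix plus one bare ID ("").
    id_type_suffix = [f"-{s}" for s in suffixes] + [""]
    dropped_slots = {
        f"{region}-{slot_id}{suffix}"
        for region in old_regions
        for slot_id in range(old_slots + 1)
        for suffix in id_type_suffix
        if slot_id > new_slots
    }
    dropped_regions = {
        f"{region}-{slot_id}{suffix}"
        for slot_id in range(1, old_slots + 1)
        for region in old_regions
        for suffix in id_type_suffix
        if region not in new_regions
    }
    return dropped_slots | dropped_regions
```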

View File

@@ -15,8 +15,6 @@ DOMAIN: str = "nina"
NO_MATCH_REGEX: str = "/(?!)/"
ALL_MATCH_REGEX: str = ".*"
SEVERITY_VALUES: list[str] = ["extreme", "severe", "moderate", "minor", "unknown"]
CONF_REGIONS: str = "regions"
CONF_MESSAGE_SLOTS: str = "slots"
CONF_FILTERS: str = "filters"
@@ -36,17 +34,6 @@ ATTR_SENT: str = "sent"
ATTR_START: str = "start"
ATTR_EXPIRES: str = "expires"
SENSOR_SUFFIXES: list[str] = [
"headline",
"sender",
"severity",
"affected_areas",
"more_info_url",
"sent",
"start",
"expires",
]
CONST_LIST_A_TO_D: list[str] = ["A", "Ä", "B", "C", "D"]
CONST_LIST_E_TO_H: list[str] = ["E", "F", "G", "H"]
CONST_LIST_I_TO_L: list[str] = ["I", "J", "K", "L"]

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import asyncio
from dataclasses import dataclass
from datetime import datetime
import re
from typing import Any
@@ -13,6 +12,7 @@ from pynina import ApiError, Nina
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import (
@@ -36,14 +36,13 @@ class NinaWarningData:
headline: str
description: str
sender: str
severity: str | None
severity: str
recommended_actions: str
affected_areas_short: str
affected_areas: str
more_info_url: str
sent: datetime
start: datetime | None
expires: datetime | None
web: str
sent: str
start: str
expires: str
is_valid: bool
@@ -66,6 +65,12 @@ class NINADataUpdateCoordinator(
]
self.area_filter: str = config_entry.data[CONF_FILTERS][CONF_AREA_FILTER]
self.device_info = DeviceInfo(
identifiers={(DOMAIN, config_entry.entry_id)},
manufacturer="NINA",
entry_type=DeviceEntryType.SERVICE,
)
regions: dict[str, str] = config_entry.data[CONF_REGIONS]
for region in regions:
self._nina.add_region(region)
@@ -141,33 +146,18 @@ class NINADataUpdateCoordinator(
)
continue
shortened_affected_areas: str = (
affected_areas_string[0:250] + "..."
if len(affected_areas_string) > 250
else affected_areas_string
)
severity = (
None
if raw_warn.severity.lower() == "unknown"
else raw_warn.severity
)
warning_data: NinaWarningData = NinaWarningData(
raw_warn.id,
raw_warn.headline,
raw_warn.description,
raw_warn.sender or "",
severity,
raw_warn.sender,
raw_warn.severity,
" ".join([str(action) for action in raw_warn.recommended_actions]),
shortened_affected_areas,
affected_areas_string,
raw_warn.web or "",
datetime.fromisoformat(raw_warn.sent),
datetime.fromisoformat(raw_warn.start) if raw_warn.start else None,
datetime.fromisoformat(raw_warn.expires)
if raw_warn.expires
else None,
raw_warn.sent or "",
raw_warn.start or "",
raw_warn.expires or "",
raw_warn.is_valid,
)
warnings_for_regions.append(warning_data)

View File

@@ -1,9 +1,7 @@
"""NINA common entity."""
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import NINADataUpdateCoordinator, NinaWarningData
@@ -22,18 +20,12 @@ class NinaEntity(CoordinatorEntity[NINADataUpdateCoordinator]):
self._region = region
self._warning_index = slot_id - 1
self._region_name = region_name
self._attr_translation_placeholders = {
"region_name": region_name,
"slot_id": str(slot_id),
}
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, self._region)},
manufacturer="NINA",
name=self._region_name,
entry_type=DeviceEntryType.SERVICE,
)
self._attr_device_info = coordinator.device_info
def _get_active_warnings_count(self) -> int:
"""Return the number of active warnings for the region."""

View File

@@ -62,17 +62,23 @@ rules:
docs-supported-devices:
status: exempt
comment: |
This integration exposes Home Assistant devices only for logical grouping and does not integrate specific physical devices that need to be documented as supported hardware.
docs-supported-functions: done
This integration does not use devices.
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: done
entity-category: done
entity-device-class: done
entity-category: todo
entity-device-class:
status: todo
comment: |
Extract attributes into own entities.
entity-disabled-by-default: done
entity-translations: done
entity-translations: todo
exception-translations: todo
icon-translations: todo
icon-translations:
status: exempt
comment: |
This integration does not use custom icons.
reconfiguration-flow: todo
repair-issues:
status: exempt

View File

@@ -1,159 +0,0 @@
"""NINA sensor platform."""
from __future__ import annotations
from collections.abc import Callable, Sequence
from dataclasses import dataclass
from datetime import datetime
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import CONF_MESSAGE_SLOTS, CONF_REGIONS, SENSOR_SUFFIXES, SEVERITY_VALUES
from .coordinator import NinaConfigEntry, NINADataUpdateCoordinator, NinaWarningData
from .entity import NinaEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class NinaSensorEntityDescription(SensorEntityDescription):
"""Describes NINA sensor entity."""
value_fn: Callable[[NinaWarningData], str | datetime | None]
SENSOR_TYPES: tuple[NinaSensorEntityDescription, ...] = (
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[0],
translation_key="headline",
value_fn=lambda data: data.headline,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[1],
translation_key="sender",
value_fn=lambda data: data.sender,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[2],
options=SEVERITY_VALUES,
device_class=SensorDeviceClass.ENUM,
translation_key="severity",
value_fn=lambda data: (
data.severity.lower() if data.severity is not None else None
),
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[3],
translation_key="affected_areas",
value_fn=lambda data: data.affected_areas_short,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[4],
translation_key="more_info_url",
value_fn=lambda data: data.more_info_url,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[5],
translation_key="sent",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda data: data.sent,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[6],
translation_key="start",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda data: data.start,
),
NinaSensorEntityDescription(
key=SENSOR_SUFFIXES[7],
translation_key="expires",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=lambda data: data.expires,
),
)
def create_sensors_for_warning(
coordinator: NINADataUpdateCoordinator, region: str, region_name: str, slot_id: int
) -> Sequence[NinaSensor]:
"""Create sensors for a warning."""
return [
NinaSensor(
coordinator,
region,
region_name,
slot_id,
description,
)
for description in SENSOR_TYPES
]
async def async_setup_entry(
_: HomeAssistant,
config_entry: NinaConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the NINA sensor platform."""
coordinator = config_entry.runtime_data
regions: dict[str, str] = config_entry.data[CONF_REGIONS]
message_slots: int = config_entry.data[CONF_MESSAGE_SLOTS]
entities = [
create_sensors_for_warning(coordinator, ent, regions[ent], i + 1)
for ent in coordinator.data
for i in range(message_slots)
]
async_add_entities(
[entity for slot_entities in entities for entity in slot_entities]
)
class NinaSensor(NinaEntity, SensorEntity):
"""Representation of a NINA sensor."""
_attr_has_entity_name = True
_attr_entity_category = EntityCategory.DIAGNOSTIC
entity_description: NinaSensorEntityDescription
def __init__(
self,
coordinator: NINADataUpdateCoordinator,
region: str,
region_name: str,
slot_id: int,
description: NinaSensorEntityDescription,
) -> None:
"""Initialize."""
super().__init__(coordinator, region, region_name, slot_id)
self.entity_description = description
self._attr_unique_id = f"{region}-{slot_id}-{self.entity_description.key}"
@property
def available(self) -> bool:
"""Return if entity is available."""
if self._get_active_warnings_count() <= self._warning_index:
return False
return self._get_warning_data().is_valid and super().available
@property
def native_value(self) -> str | datetime | None:
"""Return the state of the sensor."""
return self.entity_description.value_fn(self._get_warning_data())

View File

@@ -48,39 +48,7 @@
"entity": {
"binary_sensor": {
"warning": {
"name": "Warning {slot_id}"
}
},
"sensor": {
"affected_areas": {
"name": "Affected areas {slot_id}"
},
"expires": {
"name": "Expires {slot_id}"
},
"headline": {
"name": "Headline {slot_id}"
},
"more_info_url": {
"name": "More information URL {slot_id}"
},
"sender": {
"name": "Sender {slot_id}"
},
"sent": {
"name": "Sent {slot_id}"
},
"severity": {
"name": "Severity {slot_id}",
"state": {
"extreme": "Extreme",
"minor": "Minor",
"moderate": "Moderate",
"severe": "Severe"
}
},
"start": {
"name": "Start {slot_id}"
"name": "Warning: {region_name} {slot_id}"
}
}
},

View File

@@ -17,7 +17,6 @@ from opendisplay import (
from homeassistant.components.bluetooth import async_ble_device_from_address
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, device_registry as dr
@@ -28,20 +27,15 @@ if TYPE_CHECKING:
from opendisplay.models import FirmwareVersion
from .const import DOMAIN
from .coordinator import OpenDisplayCoordinator
from .services import async_setup_services
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
_BASE_PLATFORMS: list[Platform] = []
_FLEX_PLATFORMS = [Platform.SENSOR]
@dataclass
class OpenDisplayRuntimeData:
"""Runtime data for an OpenDisplay config entry."""
coordinator: OpenDisplayCoordinator
firmware: FirmwareVersion
device_config: GlobalConfig
is_flex: bool
@@ -83,8 +77,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: OpenDisplayConfigEntry)
if TYPE_CHECKING:
assert device_config is not None
coordinator = OpenDisplayCoordinator(hass, address)
entry.runtime_data = OpenDisplayRuntimeData(
firmware=fw,
device_config=device_config,
is_flex=is_flex,
)
# Will be moved to DeviceInfo object in entity.py once entities are added
manufacturer = device_config.manufacturer
display = device_config.displays[0]
color_scheme_enum = display.color_scheme_enum
@@ -98,16 +97,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: OpenDisplayConfigEntry)
if display.screen_diagonal_inches is not None
else f"{display.pixel_width}x{display.pixel_height}"
)
dr.async_get(hass).async_get_or_create(
config_entry_id=entry.entry_id,
connections={(CONNECTION_BLUETOOTH, address)},
manufacturer=manufacturer.manufacturer_name,
model=f"{size} {color_scheme}",
sw_version=f"{fw['major']}.{fw['minor']}",
hw_version=(
f"{manufacturer.board_type_name or manufacturer.board_type}"
f" rev. {manufacturer.board_revision}"
)
hw_version=f"{manufacturer.board_type_name or manufacturer.board_type} rev. {manufacturer.board_revision}"
if is_flex
else None,
configuration_url="https://opendisplay.org/firmware/config/"
@@ -115,18 +112,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: OpenDisplayConfigEntry)
else None,
)
entry.runtime_data = OpenDisplayRuntimeData(
coordinator=coordinator,
firmware=fw,
device_config=device_config,
is_flex=is_flex,
)
await hass.config_entries.async_forward_entry_setups(
entry, _FLEX_PLATFORMS if is_flex else _BASE_PLATFORMS
)
entry.async_on_unload(coordinator.async_start())
return True
@@ -139,6 +124,4 @@ async def async_unload_entry(
with contextlib.suppress(asyncio.CancelledError):
await task
return await hass.config_entries.async_unload_platforms(
entry, _FLEX_PLATFORMS if entry.runtime_data.is_flex else _BASE_PLATFORMS
)
return True

View File

@@ -1,86 +0,0 @@
"""Passive BLE coordinator for OpenDisplay devices."""
from __future__ import annotations
from dataclasses import dataclass
import logging
from opendisplay import MANUFACTURER_ID, parse_advertisement
from opendisplay.models.advertisement import AdvertisementData
from homeassistant.components.bluetooth import (
BluetoothChange,
BluetoothScanningMode,
BluetoothServiceInfoBleak,
)
from homeassistant.components.bluetooth.passive_update_coordinator import (
PassiveBluetoothDataUpdateCoordinator,
)
from homeassistant.core import HomeAssistant, callback
_LOGGER: logging.Logger = logging.getLogger(__package__)
@dataclass
class OpenDisplayUpdate:
"""Parsed advertisement data for one OpenDisplay device."""
address: str
advertisement: AdvertisementData
class OpenDisplayCoordinator(PassiveBluetoothDataUpdateCoordinator):
"""Coordinator for passive BLE advertisement updates from an OpenDisplay device."""
def __init__(self, hass: HomeAssistant, address: str) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
address,
BluetoothScanningMode.PASSIVE,
connectable=True,
)
self.data: OpenDisplayUpdate | None = None
@callback
def _async_handle_unavailable(
self, service_info: BluetoothServiceInfoBleak
) -> None:
"""Handle the device going unavailable."""
if self._available:
_LOGGER.info("%s: Device is unavailable", service_info.address)
super()._async_handle_unavailable(service_info)
@callback
def _async_handle_bluetooth_event(
self,
service_info: BluetoothServiceInfoBleak,
change: BluetoothChange,
) -> None:
"""Handle a Bluetooth advertisement event."""
if not self._available:
_LOGGER.info("%s: Device is available again", service_info.address)
if MANUFACTURER_ID not in service_info.manufacturer_data:
super()._async_handle_bluetooth_event(service_info, change)
return
try:
advertisement = parse_advertisement(
service_info.manufacturer_data[MANUFACTURER_ID]
)
except ValueError as err:
_LOGGER.debug(
"%s: Failed to parse advertisement data: %s",
service_info.address,
err,
exc_info=True,
)
else:
self.data = OpenDisplayUpdate(
address=service_info.address,
advertisement=advertisement,
)
super()._async_handle_bluetooth_event(service_info, change)

View File

@@ -1,31 +0,0 @@
"""Base entity for OpenDisplay devices."""
from __future__ import annotations
from homeassistant.components.bluetooth.passive_update_coordinator import (
PassiveBluetoothCoordinatorEntity,
)
from homeassistant.helpers.device_registry import CONNECTION_BLUETOOTH, DeviceInfo
from homeassistant.helpers.entity import EntityDescription
from .coordinator import OpenDisplayCoordinator
class OpenDisplayEntity(PassiveBluetoothCoordinatorEntity[OpenDisplayCoordinator]):
"""Base class for all OpenDisplay entities."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: OpenDisplayCoordinator,
description: EntityDescription,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.address}-{description.key}"
self._attr_device_info = DeviceInfo(
connections={(CONNECTION_BLUETOOTH, coordinator.address)},
)

View File

@@ -6,7 +6,9 @@ rules:
comment: |
The `opendisplay` integration is a `local_push` integration that does not perform periodic polling.
brands: done
common-modules: done
common-modules:
status: exempt
comment: Integration does not currently use entities or a DataUpdateCoordinator.
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
@@ -14,9 +16,15 @@ rules:
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
entity-event-setup:
status: exempt
comment: Integration does not currently provide any entities.
entity-unique-id:
status: exempt
comment: Integration does not currently provide any entities.
has-entity-name:
status: exempt
comment: Integration does not currently provide any entities.
runtime-data: done
test-before-configure: done
test-before-setup: done
@@ -29,10 +37,16 @@ rules:
status: exempt
comment: Integration has no options flow.
docs-installation-parameters: done
entity-unavailable: done
entity-unavailable:
status: exempt
comment: Integration does not currently provide any entities.
integration-owner: done
log-when-unavailable: done
parallel-updates: done
log-when-unavailable:
status: exempt
comment: Integration does not currently implement any entities or background polling.
parallel-updates:
status: exempt
comment: Integration does not provide any entities.
reauthentication-flow:
status: exempt
comment: Devices do not require authentication.
@@ -45,7 +59,9 @@ rules:
status: exempt
comment: The device's BLE MAC address is both its unique identifier and does not change.
discovery: done
docs-data-update: todo
docs-data-update:
status: exempt
comment: Integration does not poll or push data to entities.
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
@@ -55,10 +71,18 @@ rules:
dynamic-devices:
status: exempt
comment: Only one device per config entry. New devices are set up as new entries.
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
entity-category:
status: exempt
comment: Integration does not provide any entities.
entity-device-class:
status: exempt
comment: Integration does not provide any entities.
entity-disabled-by-default:
status: exempt
comment: Integration does not provide any entities.
entity-translations:
status: exempt
comment: Integration does not provide any entities.
exception-translations: done
icon-translations: done
reconfiguration-flow:

View File

@@ -1,106 +0,0 @@
"""Sensor platform for OpenDisplay devices."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from opendisplay import voltage_to_percent
from opendisplay.models.advertisement import AdvertisementData
"""Sensor platform for OpenDisplay."""

from __future__ import annotations

from collections.abc import Callable
from dataclasses import dataclass

# Import path assumed; the opendisplay imports for these two names are not
# visible in this diff view.
from opendisplay import AdvertisementData, voltage_to_percent
from opendisplay.models.enums import CapacityEstimator, PowerMode

from homeassistant.components.sensor import (
    SensorDeviceClass,
    SensorEntity,
    SensorEntityDescription,
    SensorStateClass,
)
from homeassistant.const import (
    PERCENTAGE,
    EntityCategory,
    UnitOfElectricPotential,
    UnitOfTemperature,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from . import OpenDisplayConfigEntry
from .entity import OpenDisplayEntity

PARALLEL_UPDATES = 0


@dataclass(frozen=True, kw_only=True)
class OpenDisplaySensorEntityDescription(SensorEntityDescription):
    """Describes an OpenDisplay sensor entity."""

    value_fn: Callable[[AdvertisementData], float | int | None]


_TEMPERATURE_DESCRIPTION = OpenDisplaySensorEntityDescription(
    key="temperature",
    device_class=SensorDeviceClass.TEMPERATURE,
    native_unit_of_measurement=UnitOfTemperature.CELSIUS,
    state_class=SensorStateClass.MEASUREMENT,
    entity_category=EntityCategory.DIAGNOSTIC,
    entity_registry_enabled_default=False,
    value_fn=lambda adv: adv.temperature_c,
)

_BATTERY_POWER_MODES = {PowerMode.BATTERY, PowerMode.SOLAR}

_BATTERY_VOLTAGE_DESCRIPTION = OpenDisplaySensorEntityDescription(
    key="battery_voltage",
    translation_key="battery_voltage",
    device_class=SensorDeviceClass.VOLTAGE,
    native_unit_of_measurement=UnitOfElectricPotential.MILLIVOLT,
    state_class=SensorStateClass.MEASUREMENT,
    entity_category=EntityCategory.DIAGNOSTIC,
    entity_registry_enabled_default=False,
    value_fn=lambda adv: adv.battery_mv,
)


async def async_setup_entry(
    hass: HomeAssistant,
    entry: OpenDisplayConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up OpenDisplay sensor entities."""
    coordinator = entry.runtime_data.coordinator
    power_config = entry.runtime_data.device_config.power
    descriptions: list[OpenDisplaySensorEntityDescription] = [_TEMPERATURE_DESCRIPTION]
    if power_config.power_mode_enum in _BATTERY_POWER_MODES:
        capacity_estimator = power_config.capacity_estimator or CapacityEstimator.LI_ION
        descriptions += [
            _BATTERY_VOLTAGE_DESCRIPTION,
            OpenDisplaySensorEntityDescription(
                key="battery",
                device_class=SensorDeviceClass.BATTERY,
                native_unit_of_measurement=PERCENTAGE,
                state_class=SensorStateClass.MEASUREMENT,
                entity_category=EntityCategory.DIAGNOSTIC,
                value_fn=lambda adv: voltage_to_percent(
                    adv.battery_mv, capacity_estimator
                ),
            ),
        ]
    async_add_entities(
        OpenDisplaySensorEntity(coordinator, description)
        for description in descriptions
    )


class OpenDisplaySensorEntity(OpenDisplayEntity, SensorEntity):
    """A sensor entity for an OpenDisplay device."""

    entity_description: OpenDisplaySensorEntityDescription

    @property
    def native_value(self) -> float | int | None:
        """Return the sensor value."""
        if self.coordinator.data is None:
            return None
        return self.entity_description.value_fn(self.coordinator.data.advertisement)

View File

@@ -27,13 +27,6 @@
}
}
},
"entity": {
"sensor": {
"battery_voltage": {
"name": "Battery voltage"
}
}
},
"exceptions": {
"device_not_found": {
"message": "Could not find Bluetooth device with address `{address}`."

View File

@@ -346,7 +346,7 @@
},
"exceptions": {
"cannot_connect": {
"message": "Value cannot be set because the device is not connected"
"message": "Value can not be set because the device is not connected"
},
"write_rejected": {
"message": "The device rejected the value for {entity}: {value}"

View File

@@ -21,7 +21,7 @@ VM_CONTAINER_RUNNING = "running"
STORAGE_ACTIVE = 1
STORAGE_SHARED = 1
STORAGE_ENABLED = 1
STATUS_OK = "OK"
STATUS_OK = "ok"
AUTH_PAM = "pam"
AUTH_PVE = "pve"

View File

@@ -24,7 +24,6 @@ from .coordinator import (
InfoUpdateCoordinator,
JobUpdateCoordinator,
LegacyStatusCoordinator,
PrusaLinkConfigEntry,
PrusaLinkUpdateCoordinator,
StatusCoordinator,
)
@@ -37,7 +36,7 @@ PLATFORMS: list[Platform] = [
]
async def async_setup_entry(hass: HomeAssistant, entry: PrusaLinkConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up PrusaLink from a config entry."""
if entry.version == 1 and entry.minor_version < 2:
raise ConfigEntryError("Please upgrade your printer's firmware.")
@@ -58,7 +57,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: PrusaLinkConfigEntry) ->
for coordinator in coordinators.values():
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinators
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinators
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -121,6 +120,9 @@ async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) ->
return True
async def async_unload_entry(hass: HomeAssistant, entry: PrusaLinkConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok

View File

@@ -13,10 +13,12 @@ from homeassistant.components.binary_sensor import (
BinarySensorEntity,
BinarySensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import PrusaLinkConfigEntry, PrusaLinkUpdateCoordinator
from .const import DOMAIN
from .coordinator import PrusaLinkUpdateCoordinator
from .entity import PrusaLinkEntity
T = TypeVar("T", PrinterStatus, LegacyPrinterStatus, JobInfo, PrinterInfo)
@@ -54,11 +56,13 @@ BINARY_SENSORS: dict[str, tuple[PrusaLinkBinarySensorEntityDescription, ...]] =
async def async_setup_entry(
hass: HomeAssistant,
entry: PrusaLinkConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PrusaLink sensor based on a config entry."""
coordinators = entry.runtime_data
coordinators: dict[str, PrusaLinkUpdateCoordinator] = hass.data[DOMAIN][
entry.entry_id
]
entities: list[PrusaLinkEntity] = []
for coordinator_type, binary_sensors in BINARY_SENSORS.items():

View File

@@ -10,11 +10,13 @@ from pyprusalink import JobInfo, LegacyPrinterStatus, PrinterStatus, PrusaLink
from pyprusalink.types import Conflict, PrinterState
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import PrusaLinkConfigEntry, PrusaLinkUpdateCoordinator
from .const import DOMAIN
from .coordinator import PrusaLinkUpdateCoordinator
from .entity import PrusaLinkEntity
T = TypeVar("T", PrinterStatus, LegacyPrinterStatus, JobInfo)
@@ -69,11 +71,13 @@ BUTTONS: dict[str, tuple[PrusaLinkButtonEntityDescription, ...]] = {
async def async_setup_entry(
hass: HomeAssistant,
entry: PrusaLinkConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PrusaLink buttons based on a config entry."""
coordinators = entry.runtime_data
coordinators: dict[str, PrusaLinkUpdateCoordinator] = hass.data[DOMAIN][
entry.entry_id
]
entities: list[PrusaLinkEntity] = []
@@ -120,7 +124,9 @@ class PrusaLinkButtonEntity(PrusaLinkEntity, ButtonEntity):
"Action conflicts with current printer state"
) from err
coordinators = self.coordinator.config_entry.runtime_data
coordinators: dict[str, PrusaLinkUpdateCoordinator] = self.hass.data[DOMAIN][
self.coordinator.config_entry.entry_id
]
for coordinator in coordinators.values():
coordinator.expect_change()

View File

@@ -5,20 +5,22 @@ from __future__ import annotations
from pyprusalink.types import PrinterState
from homeassistant.components.camera import Camera
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import PrusaLinkConfigEntry, PrusaLinkUpdateCoordinator
from .const import DOMAIN
from .coordinator import JobUpdateCoordinator
from .entity import PrusaLinkEntity
async def async_setup_entry(
hass: HomeAssistant,
entry: PrusaLinkConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PrusaLink camera."""
coordinator = entry.runtime_data["job"]
coordinator: JobUpdateCoordinator = hass.data[DOMAIN][entry.entry_id]["job"]
async_add_entities([PrusaLinkJobPreviewEntity(coordinator)])
@@ -29,7 +31,7 @@ class PrusaLinkJobPreviewEntity(PrusaLinkEntity, Camera):
last_image: bytes
_attr_translation_key = "job_preview"
def __init__(self, coordinator: PrusaLinkUpdateCoordinator) -> None:
def __init__(self, coordinator: JobUpdateCoordinator) -> None:
"""Initialize a PrusaLink camera entity."""
super().__init__(coordinator)
Camera.__init__(self)

View File

@@ -35,17 +35,14 @@ _MINIMUM_REFRESH_INTERVAL = 1.0
T = TypeVar("T", PrinterStatus, LegacyPrinterStatus, JobInfo)
type PrusaLinkConfigEntry = ConfigEntry[dict[str, PrusaLinkUpdateCoordinator]]
class PrusaLinkUpdateCoordinator(DataUpdateCoordinator[T], ABC):
"""Update coordinator for the printer."""
config_entry: PrusaLinkConfigEntry
config_entry: ConfigEntry
expect_change_until = 0.0
def __init__(
self, hass: HomeAssistant, config_entry: PrusaLinkConfigEntry, api: PrusaLink
self, hass: HomeAssistant, config_entry: ConfigEntry, api: PrusaLink
) -> None:
"""Initialize the update coordinator."""
self.api = api

View File

@@ -16,6 +16,7 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
PERCENTAGE,
REVOLUTIONS_PER_MINUTE,
@@ -28,7 +29,8 @@ from homeassistant.helpers.typing import StateType
from homeassistant.util.dt import utcnow
from homeassistant.util.variance import ignore_variance
from .coordinator import PrusaLinkConfigEntry, PrusaLinkUpdateCoordinator
from .const import DOMAIN
from .coordinator import PrusaLinkUpdateCoordinator
from .entity import PrusaLinkEntity
T = TypeVar("T", PrinterStatus, LegacyPrinterStatus, JobInfo, PrinterInfo)
@@ -202,11 +204,13 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
async def async_setup_entry(
hass: HomeAssistant,
entry: PrusaLinkConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PrusaLink sensor based on a config entry."""
coordinators = entry.runtime_data
coordinators: dict[str, PrusaLinkUpdateCoordinator] = hass.data[DOMAIN][
entry.entry_id
]
entities: list[PrusaLinkEntity] = []

View File

@@ -122,7 +122,7 @@
},
"exceptions": {
"cannot_connect": {
"message": "Cannot connect to Rehlko servers."
"message": "Can not connect to Rehlko servers."
},
"invalid_auth": {
"message": "Authentication failed for email {email}."

View File

@@ -1042,7 +1042,7 @@
"title": "Reolink firmware update required"
},
"https_webhook": {
"description": "Reolink products cannot push motion events to an HTTPS address (SSL), please configure a (local) HTTP address under \"Home Assistant URL\" in the [network settings]({network_link}). The current (local) address is: `{base_url}`, a valid address could, for example, be `{example_url}` where `{example_ip}` is the IP of the Home Assistant device",
"description": "Reolink products can not push motion events to an HTTPS address (SSL), please configure a (local) HTTP address under \"Home Assistant URL\" in the [network settings]({network_link}). The current (local) address is: `{base_url}`, a valid address could, for example, be `{example_url}` where `{example_ip}` is the IP of the Home Assistant device",
"title": "Reolink webhook URL uses HTTPS (SSL)"
},
"password_too_long": {
@@ -1054,7 +1054,7 @@
"title": "Reolink incompatible with global SSL certificate"
},
"webhook_url": {
"description": "Did not receive initial ONVIF state from {name}. Most likely, the Reolink camera cannot reach the current (local) Home Assistant URL `{base_url}`, please configure a (local) HTTP address under \"Home Assistant URL\" in the [network settings]({network_link}) that points to Home Assistant. For example `{example_url}` where `{example_ip}` is the IP of the Home Assistant device. Also, make sure the Reolink camera can reach that URL. Using fast motion/AI state polling until the first ONVIF push is received.",
"description": "Did not receive initial ONVIF state from {name}. Most likely, the Reolink camera can not reach the current (local) Home Assistant URL `{base_url}`, please configure a (local) HTTP address under \"Home Assistant URL\" in the [network settings]({network_link}) that points to Home Assistant. For example `{example_url}` where `{example_ip}` is the IP of the Home Assistant device. Also, make sure the Reolink camera can reach that URL. Using fast motion/AI state polling until the first ONVIF push is received.",
"title": "Reolink webhook URL unreachable"
}
},

View File

@@ -19,6 +19,7 @@ is_option_selected:
required: true
selector:
state:
attribute: options
hide_states:
- unavailable
- unknown

View File

@@ -341,7 +341,7 @@
"charger_end": "Charge completed",
"charger_fault": "Error while charging",
"charger_free": "[%key:component::binary_sensor::entity_component::plug::state::off%]",
"charger_free_fault": "Cannot release plug",
"charger_free_fault": "Can not release plug",
"charger_insert": "[%key:component::binary_sensor::entity_component::plug::state::on%]",
"charger_pause": "Charging paused by charger",
"charger_wait": "Charging paused by vehicle"
@@ -795,7 +795,7 @@
},
"services": {
"get_kvs_value": {
"description": "Gets a value from a Shelly device's Key-Value Storage.",
"description": "Get a value from the device's Key-Value Storage.",
"fields": {
"device_id": {
"description": "The ID of the Shelly device to get the KVS value from.",
@@ -806,10 +806,10 @@
"name": "Key"
}
},
"name": "Get Shelly KVS value"
"name": "Get KVS value"
},
"set_kvs_value": {
"description": "Sets a value in a Shelly device's Key-Value Storage.",
"description": "Set a value in the device's Key-Value Storage.",
"fields": {
"device_id": {
"description": "The ID of the Shelly device to set the KVS value.",
@@ -824,7 +824,7 @@
"name": "Value"
}
},
"name": "Set Shelly KVS value"
"name": "Set KVS value"
}
}
}

View File

@@ -38,5 +38,5 @@
"iot_class": "cloud_push",
"loggers": ["pysmartthings"],
"quality_scale": "bronze",
"requirements": ["pysmartthings==3.7.3"]
"requirements": ["pysmartthings==3.7.2"]
}

View File

@@ -61,9 +61,7 @@ rules:
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-disabled-by-default:
status: exempt
comment: No noisy or non-essential entities to disable.
entity-disabled-by-default: todo
entity-translations: done
exception-translations: todo
icon-translations: todo

View File

@@ -12,11 +12,11 @@
},
"services": {
"clear": {
"description": "Deletes all system log entries.",
"name": "Clear system log"
"description": "Deletes all log entries.",
"name": "Clear"
},
"write": {
"description": "Writes a system log entry.",
"description": "Write log entry.",
"fields": {
"level": {
"description": "Log level.",
@@ -31,7 +31,7 @@
"name": "Message"
}
},
"name": "Write to system log"
"name": "Write"
}
}
}

View File

@@ -52,7 +52,7 @@ VEHICLE_DESCRIPTIONS: tuple[TeslaFleetNumberVehicleEntityDescription, ...] = (
mode=NumberMode.AUTO,
max_key="charge_state_charge_current_request_max",
func=lambda api, value: api.set_charging_amps(value),
scopes=[Scope.VEHICLE_CHARGING_CMDS, Scope.VEHICLE_CMDS],
scopes=[Scope.VEHICLE_CHARGING_CMDS],
),
TeslaFleetNumberVehicleEntityDescription(
key="charge_state_charge_limit_soc",

View File

@@ -30,8 +30,4 @@ class TeslaUserImplementation(AuthImplementation):
@property
def extra_authorize_data(self) -> dict[str, Any]:
"""Extra data that needs to be appended to the authorize url."""
return {
"prompt": "login",
"prompt_missing_scopes": "true",
"scope": " ".join(SCOPES),
}
return {"prompt": "login", "scope": " ".join(SCOPES)}

View File

@@ -50,7 +50,7 @@
"title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
},
"reauth_confirm": {
"description": "The {name} integration needs to re-authenticate your account. Reauthentication refreshes the Tesla API permissions granted to Home Assistant, including any newly enabled scopes.",
"description": "The {name} integration needs to re-authenticate your account",
"title": "[%key:common::config_flow::title::reauth%]"
},
"registration_complete": {
@@ -60,7 +60,7 @@
"data_description": {
"qr_code": "Scan this QR code with your phone to set up the virtual key."
},
"description": "To enable command signing, you must open the Tesla app, select your vehicle, and then visit the following URL to set up a virtual key. You must repeat this process for each vehicle.\n\n{virtual_key_url}\n\nIf you later enable additional Tesla API permissions, reauthenticate the integration to refresh the granted scopes.",
"description": "To enable command signing, you must open the Tesla app, select your vehicle, and then visit the following URL to set up a virtual key. You must repeat this process for each vehicle.\n\n{virtual_key_url}",
"title": "Command signing"
}
}

View File

@@ -41,7 +41,7 @@
"iot_class": "local_push",
"loggers": ["uiprotect", "unifi_discovery"],
"quality_scale": "platinum",
"requirements": ["uiprotect==10.2.3", "unifi-discovery==1.3.0"],
"requirements": ["uiprotect==10.2.3", "unifi-discovery==1.2.0"],
"ssdp": [
{
"manufacturer": "Ubiquiti Networks",

View File

@@ -531,13 +531,7 @@ async def websocket_release_notes(
"Entity does not support release notes",
)
return
if entity.available is False:
connection.send_error(
msg["id"],
websocket_api.ERR_HOME_ASSISTANT_ERROR,
"Entity is not available",
)
return
connection.send_result(
msg["id"],
await entity.async_release_notes(),

View File

@@ -40,7 +40,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: WAQIConfigEntry) -> bool
entry.runtime_data = {}
for subentry in entry.get_subentries_of_type(SUBENTRY_TYPE_STATION):
for subentry in entry.subentries.values():
if subentry.subentry_type != SUBENTRY_TYPE_STATION:
continue
# Create a coordinator for each station subentry
coordinator = WAQIDataUpdateCoordinator(hass, entry, subentry, client)
await coordinator.async_config_entry_first_refresh()

View File

@@ -17,11 +17,7 @@ from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN, INTEGRATION_TITLE
from .coordinator import (
WaterFurnaceCoordinator,
WaterFurnaceDeviceData,
WaterFurnaceEnergyCoordinator,
)
from .coordinator import WaterFurnaceCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -38,7 +34,7 @@ CONFIG_SCHEMA = vol.Schema(
},
extra=vol.ALLOW_EXTRA,
)
type WaterFurnaceConfigEntry = ConfigEntry[dict[str, WaterFurnaceDeviceData]]
type WaterFurnaceConfigEntry = ConfigEntry[dict[str, WaterFurnaceCoordinator]]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
@@ -99,7 +95,7 @@ async def _async_setup_coordinator(
password: str,
device_index: int,
entry: WaterFurnaceConfigEntry,
) -> tuple[str, WaterFurnaceDeviceData]:
) -> tuple[str, WaterFurnaceCoordinator]:
"""Set up a coordinator for a device."""
device_client = WaterFurnace(username, password, device=device_index)
@@ -111,18 +107,7 @@ async def _async_setup_coordinator(
raise ConfigEntryNotReady(
f"Invalid GWID for device at index {device_index}: {device_client.gwid}"
)
energy_coordinator = WaterFurnaceEnergyCoordinator(
hass, device_client, entry, device_client.gwid
)
# Use async_refresh() instead of async_config_entry_first_refresh() so that
# energy data failures (e.g. WFNoDataError for new accounts) don't block
# the integration from loading. Realtime sensor data is the primary concern.
await energy_coordinator.async_refresh()
return device_client.gwid, WaterFurnaceDeviceData(
realtime=coordinator, energy=energy_coordinator
)
return device_client.gwid, coordinator
async def async_setup_entry(
@@ -141,12 +126,10 @@ async def async_setup_entry(
"Authentication failed. Please update your credentials."
) from err
device_count = len(client.devices) if client.devices else 0
results = await asyncio.gather(
*[
_async_setup_coordinator(hass, username, password, index, entry)
for index in range(device_count)
for index in range(len(client.devices) if client.devices else 0)
]
)
entry.runtime_data = dict(results)

View File

@@ -6,4 +6,3 @@ from typing import Final
DOMAIN: Final = "waterfurnace"
INTEGRATION_TITLE: Final = "WaterFurnace"
UPDATE_INTERVAL: Final = timedelta(seconds=10)
ENERGY_UPDATE_INTERVAL: Final = timedelta(hours=2)

View File

@@ -1,38 +1,14 @@
"""Data update coordinator for WaterFurnace."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
import logging
from typing import TYPE_CHECKING
from waterfurnace.waterfurnace import (
WaterFurnace,
WFCredentialError,
WFException,
WFGateway,
WFNoDataError,
WFReading,
)
from waterfurnace.waterfurnace import WaterFurnace, WFException, WFGateway, WFReading
from homeassistant.components.recorder import get_instance
from homeassistant.components.recorder.models import StatisticMeanType
from homeassistant.components.recorder.models.statistics import (
StatisticData,
StatisticMetaData,
)
from homeassistant.components.recorder.statistics import (
async_add_external_statistics,
get_last_statistics,
)
from homeassistant.const import UnitOfEnergy
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter
from .const import DOMAIN, ENERGY_UPDATE_INTERVAL, UPDATE_INTERVAL
from .const import UPDATE_INTERVAL
if TYPE_CHECKING:
from . import WaterFurnaceConfigEntry
@@ -40,14 +16,6 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
@dataclass
class WaterFurnaceDeviceData:
"""Container for per-device coordinators."""
realtime: WaterFurnaceCoordinator
energy: WaterFurnaceEnergyCoordinator
class WaterFurnaceCoordinator(DataUpdateCoordinator[WFReading]):
"""WaterFurnace data update coordinator.
@@ -86,164 +54,3 @@ class WaterFurnaceCoordinator(DataUpdateCoordinator[WFReading]):
return await self.hass.async_add_executor_job(self.client.read_with_retry)
except WFException as err:
raise UpdateFailed(str(err)) from err
class WaterFurnaceEnergyCoordinator(DataUpdateCoordinator[None]):
"""WaterFurnace energy data coordinator.
Periodically fetches energy data and inserts external statistics
for the Energy Dashboard.
"""
config_entry: WaterFurnaceConfigEntry
def __init__(
self,
hass: HomeAssistant,
client: WaterFurnace,
config_entry: WaterFurnaceConfigEntry,
gwid: str,
) -> None:
"""Initialize the energy coordinator."""
super().__init__(
hass,
_LOGGER,
name=f"WaterFurnace Energy {gwid}",
update_interval=ENERGY_UPDATE_INTERVAL,
config_entry=config_entry,
)
self.client = client
self.gwid = gwid
self.statistic_id = f"{DOMAIN}:{gwid.lower()}_energy"
self._statistic_metadata = StatisticMetaData(
has_sum=True,
mean_type=StatisticMeanType.NONE,
name=f"WaterFurnace Energy {gwid}",
source=DOMAIN,
statistic_id=self.statistic_id,
unit_class=EnergyConverter.UNIT_CLASS,
unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
)
@callback
def _dummy_listener() -> None:
pass
# Ensure periodic polling even without entity listeners,
# since this coordinator only inserts external statistics.
self.async_add_listener(_dummy_listener)
async def _async_get_last_stat(self) -> tuple[float, float] | None:
"""Get the last recorded statistic timestamp and sum.
Returns (timestamp, sum) or None if no statistics exist.
"""
last_stat = await get_instance(self.hass).async_add_executor_job(
get_last_statistics, self.hass, 1, self.statistic_id, True, {"sum"}
)
if not last_stat:
return None
entry = last_stat[self.statistic_id][0]
if entry["sum"] is None:
return None
return (entry["start"], entry["sum"])
def _fetch_energy_data(
self, start_date: str, end_date: str
) -> list[tuple[datetime, float]]:
"""Fetch energy data and return list of (timestamp, kWh) tuples."""
# Re-login to refresh the HTTP session token, which expires between
# the 2-hour polling intervals.
try:
self.client.login()
except WFCredentialError as err:
raise UpdateFailed(
"Authentication failed during energy data fetch"
) from err
data = self.client.get_energy_data(
start_date,
end_date,
frequency="1H",
timezone_str=self.hass.config.time_zone,
)
return [
(reading.timestamp, reading.total_power)
for reading in data
if reading.total_power is not None
]
@staticmethod
def _build_statistics(
readings: list[tuple[datetime, float]],
last_ts: float,
last_sum: float,
now: datetime,
) -> list[StatisticData]:
"""Build hourly statistics from readings, skipping already-recorded ones."""
current_hour_ts = now.replace(minute=0, second=0, microsecond=0).timestamp()
statistics: list[StatisticData] = []
seen_hours: set[float] = set()
running_sum = last_sum
for timestamp, kwh in sorted(readings, key=lambda x: x[0]):
ts = timestamp.timestamp()
if ts <= last_ts:
continue
if ts >= current_hour_ts:
continue
hour_ts = timestamp.replace(minute=0, second=0, microsecond=0).timestamp()
if hour_ts in seen_hours:
continue
seen_hours.add(hour_ts)
running_sum += kwh
statistics.append(
StatisticData(
start=timestamp.replace(minute=0, second=0, microsecond=0),
state=kwh,
sum=running_sum,
)
)
return statistics
async def _async_update_data(self) -> None:
"""Fetch energy data and insert statistics."""
last = await self._async_get_last_stat()
now = dt_util.utcnow()
if last is None:
_LOGGER.info("No prior statistics found, fetching recent energy data")
last_ts = 0.0
last_sum = 0.0
start_dt = now - timedelta(days=1)
else:
last_ts, last_sum = last
start_dt = dt_util.utc_from_timestamp(last_ts)
_LOGGER.debug("Last stat: ts=%s, sum=%s", start_dt.isoformat(), last_sum)
local_tz = dt_util.DEFAULT_TIME_ZONE
start_date = start_dt.astimezone(local_tz).strftime("%Y-%m-%d")
end_date = (now.astimezone(local_tz) + timedelta(days=1)).strftime("%Y-%m-%d")
try:
readings = await self.hass.async_add_executor_job(
self._fetch_energy_data, start_date, end_date
)
except WFNoDataError:
_LOGGER.debug("No energy data available for %s to %s", start_date, end_date)
return
except WFException as err:
raise UpdateFailed(str(err)) from err
if not readings:
_LOGGER.debug("No readings returned for %s to %s", start_date, end_date)
return
_LOGGER.debug("Fetched %s readings", len(readings))
statistics = self._build_statistics(readings, last_ts, last_sum, now)
_LOGGER.debug("Built %s statistics to insert", len(statistics))
if statistics:
async_add_external_statistics(
self.hass, self._statistic_metadata, statistics
)

View File

@@ -1,7 +1,6 @@
{
"domain": "waterfurnace",
"name": "WaterFurnace",
"after_dependencies": ["recorder"],
"codeowners": ["@sdague", "@masterkoppa"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/waterfurnace",

View File

@@ -156,8 +156,8 @@ async def async_setup_entry(
) -> None:
"""Set up Waterfurnace sensors from a config entry."""
async_add_entities(
WaterFurnaceSensor(device_data.realtime, description)
for device_data in config_entry.runtime_data.values()
WaterFurnaceSensor(coordinator, description)
for coordinator in config_entry.runtime_data.values()
for description in SENSORS
)

View File

@@ -6,10 +6,10 @@
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]"
},
"error": {
"bulb_time_out": "Cannot connect to the bulb. Maybe the bulb is offline or a wrong IP was entered. Please turn on the light and try again!",
"bulb_time_out": "Can not connect to the bulb. Maybe the bulb is offline or a wrong IP was entered. Please turn on the light and try again!",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"no_ip": "Not a valid IP address.",
"no_wiz_light": "The bulb cannot be connected via WiZ integration.",
"no_wiz_light": "The bulb cannot be connected via WiZ Platform integration.",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "{name} ({host})",
@@ -26,7 +26,7 @@
"data": {
"host": "[%key:common::config_flow::data::ip%]"
},
"description": "If you leave the IP address empty, discovery will be used to find devices."
"description": "If you leave the IP Address empty, discovery will be used to find devices."
}
}
},

View File

@@ -6,7 +6,6 @@ To update, run python3 -m script.hassfest
APPLICATION_CREDENTIALS = [
"aladdin_connect",
"august",
"dropbox",
"ekeybionyx",
"electric_kiwi",
"fitbit",

View File

@@ -160,7 +160,6 @@ FLOWS = {
"downloader",
"dremel_3d_printer",
"drop_connect",
"dropbox",
"droplet",
"dsmr",
"dsmr_reader",

View File

@@ -1485,12 +1485,6 @@
"config_flow": true,
"iot_class": "local_push"
},
"dropbox": {
"name": "Dropbox",
"integration_type": "service",
"config_flow": true,
"iot_class": "cloud_polling"
},
"droplet": {
"name": "Droplet",
"integration_type": "device",

View File

@@ -772,11 +772,11 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
devices: ActiveDeviceRegistryItems
deleted_devices: DeviceRegistryItems[DeletedDeviceEntry]
_device_data: dict[str, DeviceEntry]
_loaded_event: asyncio.Event | None = None
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the device registry."""
self.hass = hass
self._loaded_event = asyncio.Event()
self._store = DeviceRegistryStore(
hass,
STORAGE_VERSION_MAJOR,
@@ -786,11 +786,6 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
serialize_in_event_loop=False,
)
@callback
def async_setup(self) -> None:
"""Set up the registry."""
self._loaded_event = asyncio.Event()
@callback
def async_get(self, device_id: str) -> DeviceEntry | None:
"""Get device.
@@ -1470,9 +1465,6 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
async def _async_load(self) -> None:
"""Load the device registry."""
assert self._loaded_event is not None
assert not self._loaded_event.is_set()
async_setup_cleanup(self.hass, self)
data = await self._store.async_load()
@@ -1577,8 +1569,7 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
Will only wait if the registry had already been set up.
"""
if self._loaded_event is not None:
await self._loaded_event.wait()
await self._loaded_event.wait()
@callback
def _data_to_save(self) -> dict[str, Any]:
@@ -1726,12 +1717,6 @@ def async_get(hass: HomeAssistant) -> DeviceRegistry:
return DeviceRegistry(hass)
def async_setup(hass: HomeAssistant) -> None:
"""Set up device registry."""
assert DATA_REGISTRY not in hass.data
async_get(hass).async_setup()
async def async_load(hass: HomeAssistant, *, load_empty: bool = False) -> None:
"""Load device registry."""
await async_get(hass).async_load(load_empty=load_empty)

View File

@@ -6,7 +6,7 @@ from functools import cache
from getpass import getuser
import logging
import platform
from typing import Any
from typing import TYPE_CHECKING, Any
from homeassistant.const import __version__ as current_version
from homeassistant.core import HomeAssistant
@@ -15,6 +15,7 @@ from homeassistant.util.package import is_docker_env, is_virtual_env
from homeassistant.util.system_info import is_official_image
from .hassio import is_hassio
from .importlib import async_import_module
from .singleton import singleton
_LOGGER = logging.getLogger(__name__)
@@ -53,6 +54,15 @@ cached_get_user = cache(getuser)
@bind_hass
async def async_get_system_info(hass: HomeAssistant) -> dict[str, Any]:
"""Return info about the system."""
# Local import to avoid circular dependencies
# We use the import helper because hassio
# may not be loaded yet and we don't want to
# do blocking I/O in the event loop to import it.
if TYPE_CHECKING:
from homeassistant.components import hassio # noqa: PLC0415
else:
hassio = await async_import_module(hass, "homeassistant.components.hassio")
is_hassio_ = is_hassio(hass)
info_object = {
@@ -95,9 +105,6 @@ async def async_get_system_info(hass: HomeAssistant) -> dict[str, Any]:
# Enrich with Supervisor information
if is_hassio_:
# Local import to avoid circular dependencies
from homeassistant.components import hassio # noqa: PLC0415
if not (info := hassio.get_info(hass)):
_LOGGER.warning("No Home Assistant Supervisor info available")
info = {}

View File

@@ -55,7 +55,6 @@ def run(args: Sequence[str] | None) -> None:
async def run_command(args: argparse.Namespace) -> None:
"""Run the command."""
hass = HomeAssistant(os.path.join(os.getcwd(), args.config))
dr.async_setup(hass)
await asyncio.gather(dr.async_load(hass), er.async_load(hass))
hass.auth = await auth_manager_from_config(hass, [{"type": "homeassistant"}], [])
provider = hass.auth.auth_providers[0]

View File

@@ -302,7 +302,6 @@ async def async_check_config(config_dir):
hass = core.HomeAssistant(config_dir)
loader.async_setup(hass)
hass.config_entries = ConfigEntries(hass, {})
dr.async_setup(hass)
await ar.async_load(hass)
await dr.async_load(hass)
await er.async_load(hass)

Some files were not shown because too many files have changed in this diff.