Compare commits


20 Commits

Author SHA1 Message Date
copilot-swe-agent[bot] 52a4fd9a73 Fix workflow lock-file excludes in pre-commit hooks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/d5c577c8-03af-4d69-80c9-a09a3c2068c0

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-05-14 14:55:43 +00:00
copilot-swe-agent[bot] 8cbdf389a1 Remove unintended merge driver from generated workflow locks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/c4578220-db95-469f-bc6b-a823e609547f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-05-14 08:57:36 +00:00
copilot-swe-agent[bot] 30ca6d7fc5 Sync check-requirements lock metadata hash
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/c4578220-db95-469f-bc6b-a823e609547f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-05-14 08:56:37 +00:00
Robert Resch 64eaa12cac Apply suggestions from code review
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Robert Resch <robert@resch.dev>
2026-05-14 10:46:14 +02:00
copilot-swe-agent[bot] 4a30a697f1 check-requirements: update existing comment in place instead of delete+recreate
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/0c57df2f-81a3-4ab1-9343-465523db657f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-29 06:24:07 +00:00
copilot-swe-agent[bot] 0f738ce5b0 check-requirements: add workflow_dispatch trigger, deduplicate comment on each run
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/7ca80b22-68f1-4a3b-ad94-2d4c054ac0f0

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-24 14:57:03 +00:00
Robert Resch 93a8f4d94e Apply suggestions from code review
Co-authored-by: Robert Resch <robert@resch.dev>
2026-04-24 01:06:21 +02:00
copilot-swe-agent[bot] 3303339797 check-requirements: move overall summary line to top of comment (before table)
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/9228f3c1-ac84-42f9-aed1-c8c6156cef03

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 23:03:28 +00:00
copilot-swe-agent[bot] eacfd0ce50 check-requirements: use icon-only table, add collapsible per-package detail sections
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/5314c056-b511-48aa-bace-bb9c43fac637

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 22:59:18 +00:00
Robert Resch 609f935430 Recompile 2026-04-23 22:19:29 +00:00
copilot-swe-agent[bot] 30151a484b Exclude auto-generated lock file from prettier check
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/0ab3b575-2a57-48e7-a15f-cd55aa410f41

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 22:15:44 +00:00
copilot-swe-agent[bot] df1cf178e8 Exclude auto-generated lock file from yamllint and zizmor checks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/f728bfbc-371b-44a3-bce9-3ecdc9cce4fb

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 09:23:26 +00:00
copilot-swe-agent[bot] fe2214e071 check-requirements: tighten step 4a, add public-repo check, always comment
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/dc04d9a1-1c24-4abd-8379-58a473ba3f25

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 15:17:44 +00:00
copilot-swe-agent[bot] 36488d5d26 Revert "Add PyPI wheel availability info output to hassfest requirements check"
This reverts commit 4a895255d6.

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 14:14:35 +00:00
copilot-swe-agent[bot] 4a895255d6 Add PyPI wheel availability info output to hassfest requirements check
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/846855d5-b238-485c-ad9c-9def58ab5de5

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:51:58 +00:00
copilot-swe-agent[bot] b3075ecc9b Restore forks trigger; generalize release pipeline check to GitLab and other hosts
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/774c8674-5a55-4b8c-a48c-44ebfe4ca73d

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:28:56 +00:00
copilot-swe-agent[bot] fdfe4365a1 Restrict workflow to non-fork PRs; add PyPI CI-upload and release pipeline sanity checks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/a9e9c91a-f16e-4237-8693-f301733062a3

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:17:15 +00:00
copilot-swe-agent[bot] 5043c8b87d Expand requirements check: test deps, repo-specific link validation, diff consistency
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/552c9b6d-5829-411f-b3cd-a86c7ffb7ac7

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:56:08 +00:00
copilot-swe-agent[bot] 7864a661e1 Fix duplicate .gitattributes entry for lock files
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/175e4dde-73f0-4164-bf5f-7a839518bf1f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:39:13 +00:00
copilot-swe-agent[bot] aa40340068 Add agentic workflow to check requirements licenses and PR description links
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/175e4dde-73f0-4164-bf5f-7a839518bf1f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:38:07 +00:00
570 changed files with 5172 additions and 17435 deletions
+4 -5
@@ -27,13 +27,12 @@ description: Reviews GitHub pull requests and provides feedback comments. This i
- No need to highlight things that are already good.
## Output format:
- List specific comments for each file/line that needs attention.
- List specific comments for each file/line that needs attention
- In the end, summarize with an overall assessment (approve, request changes, or comment) and bullet point list of changes suggested, if any.
- Example output:
```
Overall assessment: request changes.
- [CRITICAL] sensor.py:143 - Memory leak
- [PROBLEM] data_processing.py:87 - Inefficient algorithm
- [SUGGESTION] test_init.py:45 - Improve x variable name
- [CRITICAL] Memory leak in homeassistant/components/sensor/my_sensor.py:143
- [PROBLEM] Inefficient algorithm in homeassistant/helpers/data_processing.py:87
- [SUGGESTION] Improve variable naming in homeassistant/helpers/config_validation.py:45
```
- Make sure to include the file and line number when possible in the bullet points.
@@ -1,5 +1,5 @@
---
name: ha-integration-knowledge
name: Home Assistant Integration knowledge
description: Everything you need to know to build, test and review Home Assistant Integrations. If you're looking at an integration, you must use this as your primary reference.
---
@@ -14,8 +14,6 @@ description: Everything you need to know to build, test and review Home Assistan
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in UI. Exception: helper integrations may allow custom names.
- For entity actions and entity services, avoid requesting redundant defensive checks for fields already enforced by Home Assistant validation schemas and entity filters; only request extra guards when values bypass validation or are transformed unsafely.
- When validation guarantees a key is present, prefer direct dictionary indexing (`data["key"]`) over `.get("key")` so invalid assumptions fail fast.
- Integrations should be thin wrappers. Protocol parsing, device state machines, or other domain logic belong in a separate PyPI library, not in the integration itself. If unsure, ask before inlining.
- "potato" is a forbidden word for an integration and should never be used.
The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
-1
@@ -36,7 +36,6 @@ base_platforms: &base_platforms
- homeassistant/components/image_processing/**
- homeassistant/components/infrared/**
- homeassistant/components/lawn_mower/**
- homeassistant/components/radio_frequency/**
- homeassistant/components/light/**
- homeassistant/components/lock/**
- homeassistant/components/media_player/**
+1
@@ -23,3 +23,4 @@ requirements_all.txt linguist-generated=true
requirements_test_all.txt linguist-generated=true
requirements_test_pre_commit.txt linguist-generated=true
script/hassfest/docker/Dockerfile linguist-generated=true
.github/workflows/*.lock.yml linguist-generated=true
+1 -1
@@ -38,4 +38,4 @@ When validation guarantees a dict key exists, prefer direct key access (`data["k
# Skills
- ha-integration-knowledge: .claude/skills/ha-integration-knowledge/SKILL.md
- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md
File diff suppressed because it is too large.
+402
@@ -0,0 +1,402 @@
---
on:
pull_request:
types: [opened, synchronize, reopened]
paths:
- "requirements*.txt"
- "homeassistant/package_constraints.txt"
- "pyproject.toml"
forks: ["*"]
workflow_dispatch:
inputs:
pull_request_number:
description: "Pull request number to (re-)check"
required: true
type: number
permissions:
contents: read
pull-requests: read
issues: read
network:
allowed:
- python
tools:
web-fetch: {}
github:
toolsets: [default]
safe-outputs:
add-comment:
max: 1
description: >
Checks changed Python package requirements on PRs targeting the core repo
(including fork PRs): verifies licenses match PyPI metadata, source
repositories are publicly accessible, PyPI releases were uploaded via
automated CI (Trusted Publisher attestation), the package's release pipeline
uses OIDC or equivalent automated credentials (not static tokens), and the PR
description contains the required links.
---
# Requirements License and Availability Check
You are a code review assistant for the Home Assistant project. Your job is to
review changes to Python package requirements and verify they meet the project's
standards.
## Context
- Home Assistant uses `requirements_all.txt` (all integration packages),
`requirements.txt` (core packages), `requirements_test.txt` (test
dependencies), and `requirements_test_all.txt` (all test dependencies) to
declare Python dependencies.
- Each integration lists its packages in `homeassistant/components/<name>/manifest.json`
under the `requirements` field.
- Allowed licenses are maintained in `script/licenses.py` under
`OSI_APPROVED_LICENSES_SPDX` (SPDX identifiers) and `OSI_APPROVED_LICENSES`
(classifier strings).
## Step 1 — Identify Changed Packages
Use the GitHub tool to fetch the PR diff. Look for lines that were added (`+`)
or removed (`-`) in **all** of these files:
- `requirements.txt`
- `requirements_all.txt`
- `requirements_test.txt`
- `requirements_test_all.txt`
- `homeassistant/package_constraints.txt`
- `pyproject.toml`
For each changed line that contains a package pin (e.g. `SomePackage==1.2.3`),
classify it as:
- **New package**: the package name appears only in `+` lines, with no
corresponding `-` line for the same package name.
- **Version bump**: the same package name appears in both `+` lines (new
version) and `-` lines (old version), with different version numbers.
Record the **old version** and **new version** for every version bump — you
will need these values in Step 4.
Ignore comment lines (starting with `#`), lines that start with `-r ` (file
includes), and lines that don't contain `==`.
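The classification above can be sketched in Python (the function and regex names are illustrative, not part of the workflow):

```python
import re

# Matches a diff line like "+SomePackage==1.2.3" or "-SomePackage==1.2.3".
PIN_RE = re.compile(r"^[+-](?P<name>[A-Za-z0-9._-]+)==(?P<version>[^\s;#]+)")

def classify_changes(diff_lines):
    """Classify added/removed pins into new packages and version bumps."""
    added, removed = {}, {}
    for line in diff_lines:
        if line.startswith(("+++", "---")):  # file headers, not pins
            continue
        m = PIN_RE.match(line)
        if not m:
            continue  # skips comments, `-r` includes, and unpinned lines
        target = added if line[0] == "+" else removed
        target[m.group("name").lower()] = m.group("version")
    changes = {}
    for name, new in added.items():
        old = removed.get(name)
        if old is None:
            changes[name] = {"type": "new", "new": new}
        elif old != new:
            changes[name] = {"type": "bump", "old": old, "new": new}
    return changes
```

A package whose pin appears unchanged on both sides (a re-sorted line, for example) produces no entry at all, which matches the intent of Step 1.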
## Step 2 — Check License via PyPI
For each new or bumped package:
1. Fetch `https://pypi.org/pypi/{package_name}/json` (use the exact
package name as it appears on PyPI).
2. From the JSON response, extract:
- `info.license` — free-text license field
- `info.license_expression` — SPDX expression (if present)
- `info.classifiers` — filter for entries starting with `"License ::"`.
3. Determine if the license is in the approved list from `script/licenses.py`:
- SPDX identifiers: compare against `OSI_APPROVED_LICENSES_SPDX`
- Classifier strings: compare against `OSI_APPROVED_LICENSES`
4. Flag a package as ❌ if the license is unknown, missing, or not in the
approved list. Flag as ⚠️ if the license information is ambiguous or cannot
be definitively determined.
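The decision order in Step 2 can be sketched as follows; the approved sets would come from `script/licenses.py`, and `check_license` is an illustrative name:

```python
def check_license(info, approved_spdx, approved_classifiers):
    """Return '✅', '⚠️', or '❌' for a PyPI `info` dict per Step 2."""
    # Prefer the SPDX expression when present: it is unambiguous.
    expr = (info.get("license_expression") or "").strip()
    if expr:
        return "✅" if expr in approved_spdx else "❌"
    # Next, trove classifiers starting with "License ::".
    classifiers = [c for c in info.get("classifiers", []) if c.startswith("License ::")]
    if any(c in approved_classifiers for c in classifiers):
        return "✅"
    if classifiers:
        return "❌"
    # Only the free-text `license` field is left: missing means ❌,
    # anything else is ambiguous and needs a human look.
    free_text = (info.get("license") or "").strip()
    if not free_text or free_text.upper() == "UNKNOWN":
        return "❌"
    return "⚠️"
```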
## Step 2b — Verify PyPI Release Was Uploaded by CI
For each new or bumped package, verify that the release on PyPI was published
automatically by a CI pipeline (via OIDC Trusted Publisher), not uploaded
manually.
1. Fetch the PyPI JSON for the specific version being introduced or bumped:
`https://pypi.org/pypi/{package_name}/{version}/json`
2. Inspect the `urls` array in the response. For each distribution file (wheel
or sdist), note the filename.
3. For each filename, attempt to fetch the PyPI provenance attestation:
`https://pypi.org/integrity/{package_name}/{version}/{filename}/provenance`
- If the response is HTTP 200 and contains a valid attestation object,
inspect `attestation_bundles[*].publisher`. A Trusted Publisher attestation
will have a `kind` identifying the CI system (e.g. `"GitHub Actions"`,
`"GitLab"`) and a `repository` or `project` field matching the source
repository.
- If at least one distribution file has a valid Trusted Publisher attestation,
mark ✅ CI-uploaded.
- If no attestation is found for any file (404 for all), mark ❌ — "Release
has no provenance attestation; it may have been uploaded manually".
- If an attestation exists but the `publisher` does not identify a recognized
CI system or Trusted Publisher, mark ⚠️ — "Attestation present but
publisher cannot be verified as automated CI".
Note: if PyPI returns an error fetching the per-version JSON, fall back to the
latest JSON (`https://pypi.org/pypi/{package_name}/json`) and look up the
specific version in the `releases` dict.
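Assuming the response shape described above (an `attestation_bundles` list with a `publisher` object per bundle), the Step 2b classification can be sketched as; `check_provenance` and the recognized-kind set are illustrative:

```python
TRUSTED_CI_KINDS = {"GitHub Actions", "GitLab"}  # recognized Trusted Publisher kinds

def check_provenance(responses):
    """Classify CI-upload status from per-file provenance responses.

    `responses` maps each distribution filename to its decoded provenance
    JSON, or None when PyPI returned 404 for that file.
    """
    any_found = False
    for doc in responses.values():
        if doc is None:
            continue
        any_found = True
        for bundle in doc.get("attestation_bundles", []):
            publisher = bundle.get("publisher", {})
            if publisher.get("kind") in TRUSTED_CI_KINDS:
                return "✅"  # at least one file has a Trusted Publisher attestation
    if not any_found:
        return "❌"  # no attestation for any file: possibly a manual upload
    return "⚠️"  # attestation present, but publisher is not a recognized CI system
```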
## Step 3 — Check Repository Availability
For each new or bumped package:
1. From the PyPI JSON at `info.project_urls`, find the source repository URL
(keys such as `"Source"`, `"Homepage"`, `"Repository"`, or `"Source Code"`).
2. Use web-fetch to perform a GET request to the repository URL.
3. If the response returns HTTP 200 and the page is publicly accessible, mark ✅.
4. If the URL is missing, returns a non-200 status, or redirects to a login
page, mark ❌ with a note that the repository could not be verified as public.
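Picking the repository URL out of `info.project_urls` (item 1 above) can be sketched like this; the key preference order and fallback hosts are an assumption, not a fixed rule:

```python
PREFERRED_KEYS = ("Source", "Repository", "Source Code", "Homepage")

def find_repo_url(project_urls):
    """Pick the most likely source-repository URL from `info.project_urls`."""
    if not project_urls:
        return None
    # Exact key match first, in preference order (case-insensitive).
    lowered = {k.lower(): v for k, v in project_urls.items()}
    for key in PREFERRED_KEYS:
        if key.lower() in lowered:
            return lowered[key.lower()]
    # Fall back to any URL on a known code host.
    for url in project_urls.values():
        if any(host in url for host in ("github.com", "gitlab.com", "codeberg.org")):
            return url
    return None
```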
## Step 4 — Check PR Description
Read the PR body from the GitHub API using the PR number `${{ github.event.pull_request.number }}`.
Extract all URLs present in the PR body.
### 4a — New packages: repository link required
For **new packages** (brand-new dependency not previously in any requirements
file): the PR description must contain a link that points to the package's
**source repository** as identified in Step 3 (the URL recorded from
`info.project_urls`). A PyPI page link alone is **not** acceptable — the link
must point directly to the source repository (e.g. a GitHub or GitLab URL).
- If a URL in the PR body matches (or is a sub-path of) the source repository
URL identified via PyPI, mark ✅.
- If the PR body contains a source repository URL that does **not** match the
repository URL found in the package's PyPI metadata (`info.project_urls`),
mark ❌ — "PR description links to `<pr_url>` but PyPI reports the source
repository as `<pypi_repo_url>`; please use the correct repository URL."
- If no source repository URL is present in the PR body at all, mark ❌ —
"PR description must link to the source repository at `<repo_url>` (found
via PyPI). A PyPI page link is not sufficient."
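The sub-path matching described in 4a can be sketched as (`check_new_package_link` is an illustrative name; the URL regex is a simplification):

```python
import re

URL_RE = re.compile(r"https?://[^\s)>\]]+")

def check_new_package_link(pr_body, repo_url):
    """Step 4a: the PR body must link to the source repository itself."""
    repo = repo_url.rstrip("/").lower()
    for url in URL_RE.findall(pr_body):
        candidate = url.rstrip("/").lower()
        # The repo URL itself, or any path underneath it, counts.
        if candidate == repo or candidate.startswith(repo + "/"):
            return "✅"
    return "❌"  # only PyPI links (or no link at all) found
```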
### 4b — Version bumps: changelog or diff link required
For **version bumps**: the PR description must contain a link to a changelog,
release notes page, or a diff/comparison URL that references the **correct
versions** being bumped (old → new).
Checks to perform for each bumped package (old version = X, new version = Y):
1. Extract all URLs from the PR body that contain the repository's domain or
path (as identified in Step 3).
2. Verify that at least one such URL includes both the old version string and
new version string in some form — e.g. a GitHub compare URL like
`compare/vX...vY`, a releases URL mentioning version Y, or a
`CHANGELOG.md` anchor referencing Y.
3. If no URL matches, check if the PR body contains any changelog/diff link at
all for this package.
Outcome:
- ✅ — a URL pointing to the correct repo with version references covering the
exact bump (X → Y).
- ⚠️ — a changelog/diff link exists but does not clearly reference the correct
versions or the correct repository; explain what was found and what is
expected.
- ❌ — no changelog or diff link found at all in the PR description for this
package.
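The three-way outcome above can be sketched as a small matcher; the substring heuristics (`releases`, `changelog`) are an assumption about common URL shapes, not an exhaustive check:

```python
import re

URL_RE = re.compile(r"https?://[^\s)>\]]+")

def check_bump_link(pr_body, repo_url, old, new):
    """Step 4b: look for a changelog/diff link covering the exact bump."""
    repo_host_path = repo_url.split("://", 1)[-1].rstrip("/").lower()
    repo_urls = [u for u in URL_RE.findall(pr_body) if repo_host_path in u.lower()]
    for url in repo_urls:
        if old in url and new in url:
            return "✅"  # e.g. a compare URL like .../compare/v1.2.3...v1.3.0
        if new in url and ("releases" in url or "changelog" in url.lower()):
            return "✅"  # release notes or changelog anchor mentioning the new version
    if repo_urls:
        return "⚠️"  # repo link present, but the versions are not clearly referenced
    return "❌"  # no changelog or diff link for this package at all
```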
### 4c — Diff consistency check
For each **version bump**, verify that the version change recorded in the diff
(Step 1) is internally consistent:
- The `-` line must contain the old version and the `+` line must contain the
new version for the same package name.
- Flag ❌ if the diff shows a downgrade (new version < old version) without an
explanation, or if the version strings cannot be parsed.
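A minimal sketch of the 4c consistency check, using naive dotted-tuple parsing (a real implementation should use `packaging.version`, which also handles pre-releases and local versions):

```python
def parse_version(v):
    """Parse a dotted version into a comparable tuple; None if unparseable."""
    try:
        return tuple(int(part) for part in v.split("."))
    except ValueError:
        return None

def check_diff_consistency(old, new):
    """Step 4c: flag downgrades and unparseable version strings."""
    old_t, new_t = parse_version(old), parse_version(new)
    if old_t is None or new_t is None:
        return "❌"  # cannot parse one of the version strings
    if new_t < old_t:
        return "❌"  # downgrade: needs an explicit explanation in the PR
    return "✅"
```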
## Step 5 — Verify Source Repository is Publicly Accessible
Before inspecting the release pipeline, confirm that the source repository
identified in Step 3 is publicly reachable.
For each new or bumped package:
1. Use the source repository URL recorded in Step 3.
2. If no repository URL was found in `info.project_urls`, mark ❌ — "No source
repository URL found in PyPI metadata; a public source repository is
required."
3. If a repository URL was found, perform a GET request to that URL (using
web-fetch). If the response is HTTP 200 and returns a publicly accessible
page (not a login redirect or error page), mark ✅.
4. If the response is non-200, the URL redirects to a login/authentication page,
or the repository appears private or unavailable, mark ❌ — "Source
repository at `<repo_url>` is not publicly accessible. Home Assistant
requires all dependencies to have publicly available source code." **Do not
proceed with the release pipeline check (Step 6) for this package.**
## Step 6 — Check Release Pipeline Sanity
For each new or bumped package, determine the source repository host from the
URL identified in Step 3, then inspect whether the project's release/publish CI
workflow is sane. The checks differ by hosting provider.
### GitHub repositories (`github.com`)
1. Using the GitHub API, list the workflows in the source repository:
`GET /repos/{owner}/{repo}/actions/workflows`
2. Identify any workflow whose name or filename suggests publishing to PyPI
(e.g., contains "release", "publish", "pypi", or "deploy").
3. Fetch the workflow file content and check the following:
a. **Trigger sanity**: The publish job should be triggered by `push` to tags,
`release: published`, or `workflow_run` on a release job — **not** solely
by `workflow_dispatch` with no additional guards. A `workflow_dispatch`
trigger alongside other triggers is acceptable. Mark ❌ if the only trigger
is manual `workflow_dispatch` with no environment protection rules.
   b. **OIDC / Trusted Publisher**: The workflow should use OIDC-based publishing.
      Look for the `id-token: write` permission together with one of:
      - the `pypa/gh-action-pypi-publish` action
      - the `actions/attest-build-provenance` action
      Conversely, flag ❌ if a step sets `TWINE_PASSWORD` from a long-lived
      secret such as `secrets.PYPI_TOKEN` instead of using OIDC.
      Mark ✅ if OIDC is used, ⚠️ if the publish method cannot be determined,
      ❌ if a static secret token is the only credential.
c. **No manual upload bypass**: Verify there is no step that calls
`twine upload` or `pip upload` outside of a properly gated job (e.g., one
that requires an environment approval). Flag ⚠️ if such steps exist.
4. If no publish workflow is found in the repository, mark ⚠️ — "No publish
workflow found; it is unclear how this package is released to PyPI."
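The trigger and credential checks above can be approximated with string heuristics over the raw workflow text; a real check should parse the YAML, and `assess_publish_workflow` is an illustrative name:

```python
def assess_publish_workflow(workflow_text):
    """Heuristic Step 6 checks over a GitHub workflow file's raw text."""
    text = workflow_text
    automated = any(t in text for t in ("release:", "push:", "workflow_run:"))
    # Manual-only means workflow_dispatch is present with no automated trigger.
    manual_only = "workflow_dispatch:" in text and not automated
    uses_oidc = "id-token: write" in text and (
        "pypa/gh-action-pypi-publish" in text
        or "actions/attest-build-provenance" in text
    )
    static_token = "TWINE_PASSWORD" in text and "secrets." in text
    results = {"trigger": "❌" if manual_only else "✅"}
    if uses_oidc:
        results["credentials"] = "✅"
    elif static_token:
        results["credentials"] = "❌"
    else:
        results["credentials"] = "⚠️"  # publish method cannot be determined
    return results
```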
### GitLab repositories (`gitlab.com` or self-hosted GitLab)
1. Use the GitLab REST API to list CI/CD pipeline configuration files. First
resolve the project ID via
`GET https://gitlab.com/api/v4/projects/{url-encoded-namespace-and-name}`
and note the `id` field.
2. Fetch the repository's `.gitlab-ci.yml` (and any included files) using
`GET https://gitlab.com/api/v4/projects/{id}/repository/files/.gitlab-ci.yml/raw?ref=HEAD`
(use web-fetch for public repos).
3. Identify any job whose name or `stage` suggests publishing to PyPI
(e.g., "publish", "deploy", "release", "pypi").
4. For each such job, check:
a. **Trigger sanity**: The job should run only on tag pipelines (`only: tags`
or `rules: - if: $CI_COMMIT_TAG`) or on protected branches — **not**
solely on manual triggers (`when: manual`) with no additional protection.
Mark ❌ if the only trigger is manual with no environment or protected-branch
guard.
   b. **Automated credentials**: The job should use GitLab's OIDC ID token
      (`id_tokens:` block) with PyPI Trusted Publishing, or reference a
      `$PYPI_TOKEN` injected from protected GitLab CI/CD variables (flag ❌
      if the token is hard-coded or unprotected).
      Mark ✅ if OIDC or protected CI variables are used, ⚠️ if the method
      cannot be determined, ❌ if credentials appear to be insecure.
c. **No manual upload bypass**: Flag ⚠️ if any job calls `twine upload`
without being behind a protected-variable or environment guard.
5. If no publish job is found, mark ⚠️ — "No publish job found in .gitlab-ci.yml;
it is unclear how this package is released to PyPI."
### Other code hosting providers
For repositories hosted on platforms other than GitHub or GitLab (e.g.,
Bitbucket, Codeberg, Gitea, Sourcehut):
1. Use web-fetch to retrieve the repository's root page and look for any
publicly visible CI configuration files (e.g., `.circleci/config.yml`,
`Jenkinsfile`, `azure-pipelines.yml`, `bitbucket-pipelines.yml`,
`.builds/*.yml` for Sourcehut).
2. Apply the same conceptual checks as above:
- Does publishing run on automated triggers (tags/releases), not solely
manual ones?
- Are credentials injected by the CI system (not hard-coded)?
- Is there a `twine upload` or equivalent step that could be run manually?
3. If no CI configuration can be retrieved, mark ⚠️ — "Release pipeline could
not be inspected; hosting provider is not GitHub or GitLab."
## Step 7 — Post a Review Comment
**Always** post a review comment using `add-comment`, regardless of whether
packages pass or fail. Use the following structure:
> **Note on deduplication**: The workflow automatically updates any previous
> requirements-check comment on the PR in place (preserving its position in the
> thread). If no previous comment exists, the newly created comment is kept as-is.
> You do not need to search for or update previous comments yourself.
### Comment structure
Begin every comment with the HTML marker `<!-- requirements-check -->` on its
own line (this is used by the workflow to find the previous comment and update
it on the next run).
### 7a — Overall summary line
Immediately after the `<!-- requirements-check -->` marker, add a single
summary line before anything else:
- If everything passed: `All requirements checks passed. ✅`
- If there are failures or warnings: `⚠️ Some checks require attention — see the details below.`
### 7b — Summary table
Render a compact table where every check column contains **only the status
icon** (✅, ⚠️, or ❌). No explanatory text belongs inside the table cells —
all detail goes in the per-package sections below.
Use `—` (em dash) when a check was skipped (e.g. Release Pipeline is skipped
when the repository is not publicly accessible).
```
<!-- requirements-check -->
## Requirements Check
| Package | Type | Old→New | License | Repo Public | CI Upload | Release Pipeline | PR Link | Diff Consistent |
|---------|------|---------|---------|-------------|-----------|------------------|---------|-----------------|
| PackageA | bump | 1.2.3→1.3.0 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| PackageB | new | —→4.5.6 | ❌ | ✅ | ❌ | ⚠️ | ❌ | ✅ |
| PackageC | bump | 2.0.0→2.1.0 | ✅ | ❌ | — | — | ⚠️ | ✅ |
```
### 7c — Per-package detail sections
After the table, add one collapsible `<details>` block per package.
- If **all checks passed** for that package, render the block **collapsed**
(no `open` attribute) so the comment stays concise.
- If **any check failed or produced a warning**, render the block **open**
(`<details open>`) so the contributor sees the issues immediately.
Each block must include the full detail for every check: the license found, the
repository URL, whether a provenance attestation was found, the release
pipeline findings, the PR link found (or missing), and whether the diff is
consistent. For failed or warned checks, explain exactly what the contributor
must fix, including the expected source repository URL, expected version range,
etc.
Template (repeat for each package):
```
<details open>
<summary><strong>PackageB 📦 new —→4.5.6</strong></summary>
- **License**: ❌ License is `UNKNOWN` — not in the approved list. Check PyPI metadata and `script/licenses.py`.
- **Repository Public**: ✅ https://github.com/example/packageb is publicly accessible.
- **CI Upload**: ❌ No provenance attestation found for any distribution file. The release may have been uploaded manually.
- **Release Pipeline**: ⚠️ No publish workflow found in the repository; it is unclear how this package is released to PyPI.
- **PR Link**: ❌ PR description must link to the source repository at https://github.com/example/packageb (a PyPI page link is not sufficient).
- **Diff Consistent**: ✅
</details>
```
Collapsed example (all checks passed):
```
<details>
<summary><strong>PackageA 📦 bump 1.2.3→1.3.0</strong></summary>
- **License**: ✅ MIT
- **Repository Public**: ✅ https://github.com/example/packagea
- **CI Upload**: ✅ Trusted Publisher attestation found (GitHub Actions).
- **Release Pipeline**: ✅ OIDC via `pypa/gh-action-pypi-publish`; triggered on `release: published`; `environment: release` gate.
- **PR Link**: ✅ https://github.com/example/packagea/compare/v1.2.3...v1.3.0
- **Diff Consistent**: ✅
</details>
```
## Notes
- Be constructive and helpful. Provide direct links where possible so the
contributor can quickly fix the issue.
- If PyPI returns an error for a package, mention that it could not be found and
suggest the contributor verify the package name.
- For packages that only appear in `homeassistant/package_constraints.txt` or
`pyproject.toml` without being tied to a specific integration, the PR
description link requirement still applies.
- When checking test-only packages (from `requirements_test.txt` or
`requirements_test_all.txt`), apply the same license, repository, and PR
description checks as for production dependencies.
- A package that appears in both a production file and a test file should only
be reported once; use the production file entry as the canonical one.
- This workflow is only triggered when a commit actually changes one of the
tracked requirements files (for `synchronize` events GitHub compares the
before/after SHAs of the push, not the entire PR diff). Members can manually
retrigger the workflow via `workflow_dispatch` with the PR number to re-run
the check after updating the PR description or fixing issues without changing
any requirements files. On a retrigger the existing comment is updated in
place so there is always exactly one requirements-check comment in the PR.
+2
@@ -23,6 +23,7 @@ repos:
- id: zizmor
args:
- --pedantic
exclude: ^\.github/workflows/.*\.lock\.yml$
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v6.0.0
hooks:
@@ -46,6 +47,7 @@ repos:
additional_dependencies:
- prettier@3.6.2
- prettier-plugin-sort-json@4.2.0
exclude: ^\.github/workflows/.*\.lock\.yml$
- repo: https://github.com/cdce8p/python-typing-update
rev: v0.6.0
hooks:
-1
@@ -599,7 +599,6 @@ homeassistant.components.vallox.*
homeassistant.components.valve.*
homeassistant.components.velbus.*
homeassistant.components.velux.*
homeassistant.components.victron_gx.*
homeassistant.components.vivotek.*
homeassistant.components.vlc_telnet.*
homeassistant.components.vodafone_station.*
+1
@@ -1,5 +1,6 @@
ignore: |
tests/fixtures/core/config/yaml_errors/
.github/workflows/*.lock.yml
rules:
braces:
level: error
-4
@@ -758,8 +758,6 @@ CLAUDE.md @home-assistant/core
/tests/components/homewizard/ @DCSBL
/homeassistant/components/honeywell/ @rdfurman @mkmer
/tests/components/honeywell/ @rdfurman @mkmer
/homeassistant/components/honeywell_string_lights/ @balloob
/tests/components/honeywell_string_lights/ @balloob
/homeassistant/components/hr_energy_qube/ @MattieGit
/tests/components/hr_energy_qube/ @MattieGit
/homeassistant/components/html5/ @alexyao2015 @tr4nt0r
@@ -1417,8 +1415,6 @@ CLAUDE.md @home-assistant/core
/tests/components/radarr/ @tkdrob
/homeassistant/components/radio_browser/ @frenck
/tests/components/radio_browser/ @frenck
/homeassistant/components/radio_frequency/ @home-assistant/core
/tests/components/radio_frequency/ @home-assistant/core
/homeassistant/components/radiotherm/ @vinnyfuria
/tests/components/radiotherm/ @vinnyfuria
/homeassistant/components/rainbird/ @konikvranik @allenporter
+1 -1
@@ -1,5 +1,5 @@
{
"domain": "honeywell",
"name": "Honeywell",
"integrations": ["lyric", "evohome", "honeywell", "honeywell_string_lights"]
"integrations": ["lyric", "evohome", "honeywell"]
}
@@ -25,7 +25,7 @@ async def async_get_media_source(hass: HomeAssistant) -> MediaSource:
hass.data[DATA_MEDIA_SOURCE] = source = local_source.LocalSource(
hass,
DOMAIN,
"AI generated images",
"AI Generated Images",
{IMAGE_DIR: str(media_dir)},
f"/{DOMAIN}",
)
@@ -36,8 +36,6 @@ class AirTouch5ConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected exception")
errors = {"base": "cannot_connect"}
else:
# Uses the host/IP value from CONF_HOST as unique ID, which is no longer allowed
# pylint: disable-next=hass-unique-id-ip-based
await self.async_set_unique_id(user_input[CONF_HOST])
self._abort_if_unique_id_configured()
return self.async_create_entry(
+2 -1
@@ -39,6 +39,7 @@ from homeassistant.helpers.typing import ConfigType
from .binary_sensor import BINARY_SENSOR_KEYS, BINARY_SENSORS, check_binary_sensors
from .camera import STREAM_SOURCE_LIST
from .const import (
CAMERAS,
COMM_RETRIES,
COMM_TIMEOUT,
DATA_AMCREST,
@@ -358,7 +359,7 @@ def _start_event_monitor(
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Amcrest IP Camera component."""
hass.data.setdefault(DATA_AMCREST, {DEVICES: {}})
hass.data.setdefault(DATA_AMCREST, {DEVICES: {}, CAMERAS: []})
for device in config[DOMAIN]:
name: str = device[CONF_NAME]
+74 -10
@@ -12,11 +12,13 @@ import aiohttp
from aiohttp import web
from amcrest import AmcrestError
from haffmpeg.camera import CameraMjpeg
import voluptuous as vol
from homeassistant.components.camera import Camera, CameraEntityFeature
from homeassistant.components.ffmpeg import FFmpegManager, get_ffmpeg_manager
from homeassistant.const import CONF_NAME, STATE_OFF, STATE_ON
from homeassistant.const import ATTR_ENTITY_ID, CONF_NAME, STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import (
async_aiohttp_proxy_stream,
async_aiohttp_proxy_web,
@@ -27,13 +29,11 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import (
ATTR_COLOR_BW,
CAMERA_WEB_SESSION_TIMEOUT,
CBW,
CAMERAS,
COMM_TIMEOUT,
DATA_AMCREST,
DEVICES,
MOV,
RESOLUTION_TO_STREAM,
SERVICE_UPDATE,
SNAPSHOT_TIMEOUT,
@@ -49,11 +49,65 @@ SCAN_INTERVAL = timedelta(seconds=15)
STREAM_SOURCE_LIST = ["snapshot", "mjpeg", "rtsp"]
_ATTR_PTZ_TT = "travel_time"
_ATTR_PTZ_MOV = "movement"
_MOV = [
"zoom_out",
"zoom_in",
"right",
"left",
"up",
"down",
"right_down",
"right_up",
"left_down",
"left_up",
]
_ZOOM_ACTIONS = ["ZoomWide", "ZoomTele"]
_MOVE_1_ACTIONS = ["Right", "Left", "Up", "Down"]
_MOVE_2_ACTIONS = ["RightDown", "RightUp", "LeftDown", "LeftUp"]
_ACTION = _ZOOM_ACTIONS + _MOVE_1_ACTIONS + _MOVE_2_ACTIONS
_DEFAULT_TT = 0.2
_ATTR_PRESET = "preset"
_ATTR_COLOR_BW = "color_bw"
_CBW_COLOR = "color"
_CBW_AUTO = "auto"
_CBW_BW = "bw"
_CBW = [_CBW_COLOR, _CBW_AUTO, _CBW_BW]
_SRV_SCHEMA = vol.Schema({vol.Optional(ATTR_ENTITY_ID): cv.comp_entity_ids})
_SRV_GOTO_SCHEMA = _SRV_SCHEMA.extend(
{vol.Required(_ATTR_PRESET): vol.All(vol.Coerce(int), vol.Range(min=1))}
)
_SRV_CBW_SCHEMA = _SRV_SCHEMA.extend({vol.Required(_ATTR_COLOR_BW): vol.In(_CBW)})
_SRV_PTZ_SCHEMA = _SRV_SCHEMA.extend(
{
vol.Required(_ATTR_PTZ_MOV): vol.In(_MOV),
vol.Optional(_ATTR_PTZ_TT, default=_DEFAULT_TT): cv.small_float,
}
)
CAMERA_SERVICES = {
"enable_recording": (_SRV_SCHEMA, "async_enable_recording", ()),
"disable_recording": (_SRV_SCHEMA, "async_disable_recording", ()),
"enable_audio": (_SRV_SCHEMA, "async_enable_audio", ()),
"disable_audio": (_SRV_SCHEMA, "async_disable_audio", ()),
"enable_motion_recording": (_SRV_SCHEMA, "async_enable_motion_recording", ()),
"disable_motion_recording": (_SRV_SCHEMA, "async_disable_motion_recording", ()),
"goto_preset": (_SRV_GOTO_SCHEMA, "async_goto_preset", (_ATTR_PRESET,)),
"set_color_bw": (_SRV_CBW_SCHEMA, "async_set_color_bw", (_ATTR_COLOR_BW,)),
"start_tour": (_SRV_SCHEMA, "async_start_tour", ()),
"stop_tour": (_SRV_SCHEMA, "async_stop_tour", ()),
"ptz_control": (
_SRV_PTZ_SCHEMA,
"async_ptz_control",
(_ATTR_PTZ_MOV, _ATTR_PTZ_TT),
),
}
_BOOL_TO_STATE = {True: STATE_ON, False: STATE_OFF}
@@ -221,7 +275,7 @@ class AmcrestCam(Camera):
self._motion_recording_enabled
)
if self._color_bw is not None:
attr[ATTR_COLOR_BW] = self._color_bw
attr[_ATTR_COLOR_BW] = self._color_bw
return attr
@property
@@ -268,7 +322,15 @@ class AmcrestCam(Camera):
self.async_schedule_update_ha_state(True)
async def async_added_to_hass(self) -> None:
"""Subscribe to signals."""
"""Subscribe to signals and add camera to list."""
self._unsub_dispatcher.extend(
async_dispatcher_connect(
self.hass,
service_signal(service, self.entity_id),
getattr(self, callback_name),
)
for service, (_, callback_name, _) in CAMERA_SERVICES.items()
)
self._unsub_dispatcher.append(
async_dispatcher_connect(
self.hass,
@@ -276,9 +338,11 @@ class AmcrestCam(Camera):
self.async_on_demand_update,
)
)
self.hass.data[DATA_AMCREST][CAMERAS].append(self.entity_id)
async def async_will_remove_from_hass(self) -> None:
"""Disconnect from signals."""
"""Remove camera from list and disconnect from signals."""
self.hass.data[DATA_AMCREST][CAMERAS].remove(self.entity_id)
for unsub_dispatcher in self._unsub_dispatcher:
unsub_dispatcher()
@@ -392,7 +456,7 @@ class AmcrestCam(Camera):
async def async_ptz_control(self, movement: str, travel_time: float) -> None:
"""Move or zoom camera in specified direction."""
code = _ACTION[MOV.index(movement)]
code = _ACTION[_MOV.index(movement)]
kwargs = {"code": code, "arg1": 0, "arg2": 0, "arg3": 0}
if code in _MOVE_1_ACTIONS:
@@ -549,10 +613,10 @@ class AmcrestCam(Camera):
)
async def _async_get_color_mode(self) -> str:
return CBW[await self._api.async_day_night_color]
return _CBW[await self._api.async_day_night_color]
async def _async_set_color_mode(self, cbw: str) -> None:
await self._api.async_set_day_night_color(CBW.index(cbw), channel=0)
await self._api.async_set_day_night_color(_CBW.index(cbw), channel=0)
async def _async_set_color_bw(self, cbw: str) -> None:
"""Set camera color mode."""
+1 -15
@@ -2,6 +2,7 @@
DOMAIN = "amcrest"
DATA_AMCREST = DOMAIN
CAMERAS = "cameras"
DEVICES = "devices"
BINARY_SENSOR_SCAN_INTERVAL_SECS = 5
@@ -16,18 +17,3 @@ SERVICE_UPDATE = "update"
RESOLUTION_LIST = {"high": 0, "low": 1}
RESOLUTION_TO_STREAM = {0: "Main", 1: "Extra"}
ATTR_COLOR_BW = "color_bw"
CBW = ["color", "auto", "bw"]
MOV = [
"zoom_out",
"zoom_in",
"right",
"left",
"up",
"down",
"right_down",
"right_up",
"left_down",
"left_up",
]
+52 -57
@@ -1,67 +1,62 @@
"""Services for Amcrest IP cameras."""
"""Support for Amcrest IP cameras."""
from __future__ import annotations
import voluptuous as vol
from homeassistant.auth.models import User
from homeassistant.auth.permissions.const import POLICY_CONTROL
from homeassistant.const import ATTR_ENTITY_ID, ENTITY_MATCH_ALL, ENTITY_MATCH_NONE
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import Unauthorized, UnknownUser
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.service import async_extract_entity_ids
from homeassistant.components.camera import DOMAIN as CAMERA_DOMAIN
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv, service
from .const import ATTR_COLOR_BW, CBW, DOMAIN, MOV
_ATTR_PRESET = "preset"
_ATTR_PTZ_MOV = "movement"
_ATTR_PTZ_TT = "travel_time"
_DEFAULT_TT = 0.2
from .camera import CAMERA_SERVICES
from .const import CAMERAS, DATA_AMCREST, DOMAIN
from .helpers import service_signal
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up the Amcrest IP Camera services."""
for service_name, func in (
("enable_recording", "async_enable_recording"),
("disable_recording", "async_disable_recording"),
("enable_audio", "async_enable_audio"),
("disable_audio", "async_disable_audio"),
("enable_motion_recording", "async_enable_motion_recording"),
("disable_motion_recording", "async_disable_motion_recording"),
("start_tour", "async_start_tour"),
("stop_tour", "async_stop_tour"),
):
service.async_register_platform_entity_service(
hass,
DOMAIN,
service_name,
entity_domain=CAMERA_DOMAIN,
schema=None,
func=func,
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
"goto_preset",
entity_domain=CAMERA_DOMAIN,
schema={vol.Required(_ATTR_PRESET): vol.All(vol.Coerce(int), vol.Range(min=1))},
func="async_goto_preset",
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
"set_color_bw",
entity_domain=CAMERA_DOMAIN,
schema={vol.Required(ATTR_COLOR_BW): vol.In(CBW)},
func="async_set_color_bw",
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
"ptz_control",
entity_domain=CAMERA_DOMAIN,
schema={
vol.Required(_ATTR_PTZ_MOV): vol.In(MOV),
vol.Optional(_ATTR_PTZ_TT, default=_DEFAULT_TT): cv.small_float,
},
func="async_ptz_control",
)
def have_permission(user: User | None, entity_id: str) -> bool:
return not user or user.permissions.check_entity(entity_id, POLICY_CONTROL)
async def async_extract_from_service(call: ServiceCall) -> list[str]:
if call.context.user_id:
user = await hass.auth.async_get_user(call.context.user_id)
if user is None:
raise UnknownUser(context=call.context)
else:
user = None
if call.data.get(ATTR_ENTITY_ID) == ENTITY_MATCH_ALL:
# Return all entity_ids user has permission to control.
return [
entity_id
for entity_id in hass.data[DATA_AMCREST][CAMERAS]
if have_permission(user, entity_id)
]
if call.data.get(ATTR_ENTITY_ID) == ENTITY_MATCH_NONE:
return []
call_ids = await async_extract_entity_ids(call)
entity_ids = []
for entity_id in hass.data[DATA_AMCREST][CAMERAS]:
if entity_id not in call_ids:
continue
if not have_permission(user, entity_id):
raise Unauthorized(
context=call.context, entity_id=entity_id, permission=POLICY_CONTROL
)
entity_ids.append(entity_id)
return entity_ids
async def async_service_handler(call: ServiceCall) -> None:
args = [call.data[arg] for arg in CAMERA_SERVICES[call.service][2]]
for entity_id in await async_extract_from_service(call):
async_dispatcher_send(hass, service_signal(call.service, entity_id), *args)
for service, params in CAMERA_SERVICES.items():
hass.services.async_register(DOMAIN, service, async_service_handler, params[0])
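The rewritten services module registers one handler per service and fans the call out to each extracted entity via a per-entity dispatcher signal, which the camera entities subscribe to in `async_added_to_hass`. A dependency-free sketch of that fan-out pattern (the signal name format here is an assumption standing in for the real `helpers.service_signal`; the dispatcher functions mimic `async_dispatcher_connect`/`async_dispatcher_send`):

```python
# A minimal signal dispatcher: services fan out to per-entity signals,
# and each entity subscribes only to its own signals.
from collections import defaultdict
from typing import Callable

_listeners: dict[str, list[Callable]] = defaultdict(list)

def service_signal(service: str, entity_id: str) -> str:
    """Build a per-entity signal name (format is illustrative only)."""
    return f"amcrest_{service}_{entity_id}".replace(".", "_")

def dispatcher_connect(signal: str, target: Callable) -> None:
    _listeners[signal].append(target)

def dispatcher_send(signal: str, *args) -> None:
    for target in _listeners[signal]:
        target(*args)

# An "entity" subscribing to its own goto_preset signal.
received: list[int] = []
dispatcher_connect(service_signal("goto_preset", "camera.front"), received.append)

# The service-handler side: dispatch to every extracted entity_id.
for entity_id in ["camera.front", "camera.back"]:
    dispatcher_send(service_signal("goto_preset", entity_id), 3)

assert received == [3]  # only camera.front was subscribed
```

This decoupling is what makes the permission filtering in `async_extract_from_service` possible: the handler resolves and authorizes entity ids once, then signals only the permitted entities.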
+36 -33
@@ -703,14 +703,15 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
entry_type=dr.DeviceEntryType.SERVICE,
)
async def _get_model_args( # noqa: C901
async def _async_handle_chat_log( # noqa: C901
self,
chat_log: conversation.ChatLog,
structure_name: str | None = None,
structure: vol.Schema | None = None,
) -> tuple[MessageCreateParamsStreaming, str | None]:
"""Get the model arguments."""
options: dict[str, Any] = DEFAULT | self.subentry.data
max_iterations: int = MAX_TOOL_ITERATIONS,
) -> None:
"""Generate an answer for the chat log."""
options = self.subentry.data
preloaded_tools = [
"HassTurnOn",
@@ -728,18 +729,21 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
messages, container_id = _convert_content(chat_log.content[1:])
model = options[CONF_CHAT_MODEL]
model = options.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
model_args = MessageCreateParamsStreaming(
model=model,
messages=messages,
max_tokens=options[CONF_MAX_TOKENS],
max_tokens=options.get(CONF_MAX_TOKENS, DEFAULT[CONF_MAX_TOKENS]),
system=system.content,
stream=True,
container=container_id,
)
if options[CONF_PROMPT_CACHING] == PromptCaching.PROMPT:
if (
options.get(CONF_PROMPT_CACHING, DEFAULT[CONF_PROMPT_CACHING])
== PromptCaching.PROMPT
):
model_args["system"] = [
{
"type": "text",
@@ -747,14 +751,19 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
"cache_control": {"type": "ephemeral"},
}
]
elif options[CONF_PROMPT_CACHING] == PromptCaching.AUTOMATIC:
elif (
options.get(CONF_PROMPT_CACHING, DEFAULT[CONF_PROMPT_CACHING])
== PromptCaching.AUTOMATIC
):
model_args["cache_control"] = {"type": "ephemeral"}
if (
self.model_info.capabilities
and self.model_info.capabilities.thinking.types.adaptive.supported
):
thinking_effort = options[CONF_THINKING_EFFORT]
thinking_effort = options.get(
CONF_THINKING_EFFORT, DEFAULT[CONF_THINKING_EFFORT]
)
if thinking_effort != "none":
model_args["thinking"] = ThinkingConfigAdaptiveParam(
type="adaptive", display="summarized"
@@ -763,7 +772,9 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
else:
model_args["thinking"] = ThinkingConfigDisabledParam(type="disabled")
else:
thinking_budget = options[CONF_THINKING_BUDGET]
thinking_budget = options.get(
CONF_THINKING_BUDGET, DEFAULT[CONF_THINKING_BUDGET]
)
if (
self.model_info.capabilities
and self.model_info.capabilities.thinking.types.enabled.supported
@@ -780,7 +791,9 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
and self.model_info.capabilities.effort.supported
):
model_args["output_config"] = OutputConfigParam(
effort=options[CONF_THINKING_EFFORT]
effort=options.get(
CONF_THINKING_EFFORT, DEFAULT[CONF_THINKING_EFFORT]
)
)
tools: list[ToolUnionParam] = []
@@ -790,12 +803,12 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
for tool in chat_log.llm_api.tools
]
if options[CONF_CODE_EXECUTION]:
if options.get(CONF_CODE_EXECUTION):
# The `web_search_20260209` tool automatically enables the `code_execution_20260120` tool
if (
not self.model_info.capabilities
or not self.model_info.capabilities.code_execution.supported
or not options[CONF_WEB_SEARCH]
or not options.get(CONF_WEB_SEARCH)
):
tools.append(
CodeExecutionTool20250825Param(
@@ -804,26 +817,26 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
),
)
if options[CONF_WEB_SEARCH]:
if options.get(CONF_WEB_SEARCH):
if (
not self.model_info.capabilities
or not self.model_info.capabilities.code_execution.supported
or not options[CONF_CODE_EXECUTION]
or not options.get(CONF_CODE_EXECUTION)
):
web_search: WebSearchTool20250305Param | WebSearchTool20260209Param = (
WebSearchTool20250305Param(
name="web_search",
type="web_search_20250305",
max_uses=options[CONF_WEB_SEARCH_MAX_USES],
max_uses=options.get(CONF_WEB_SEARCH_MAX_USES),
)
)
else:
web_search = WebSearchTool20260209Param(
name="web_search",
type="web_search_20260209",
max_uses=options[CONF_WEB_SEARCH_MAX_USES],
max_uses=options.get(CONF_WEB_SEARCH_MAX_USES),
)
if options[CONF_WEB_SEARCH_USER_LOCATION]:
if options.get(CONF_WEB_SEARCH_USER_LOCATION):
web_search["user_location"] = {
"type": "approximate",
"city": options.get(CONF_WEB_SEARCH_CITY, ""),
@@ -924,7 +937,10 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
preloaded_tools.append(structure_name)
if tools:
if options[CONF_TOOL_SEARCH] and len(tools) > len(preloaded_tools) + 1:
if (
options.get(CONF_TOOL_SEARCH, DEFAULT[CONF_TOOL_SEARCH])
and len(tools) > len(preloaded_tools) + 1
):
for tool in tools:
if not tool["name"].endswith(tuple(preloaded_tools)):
tool["defer_loading"] = True
@@ -937,19 +953,6 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
model_args["tools"] = tools
return model_args, structure_name
async def _async_handle_chat_log(
self,
chat_log: conversation.ChatLog,
structure_name: str | None = None,
structure: vol.Schema | None = None,
max_iterations: int = MAX_TOOL_ITERATIONS,
) -> None:
"""Generate an answer for the chat log."""
model_args, structure_name = await self._get_model_args(
chat_log, structure_name, structure
)
coordinator = self.entry.runtime_data
client = coordinator.client
@@ -971,7 +974,7 @@ class AnthropicBaseLLMEntity(CoordinatorEntity[AnthropicCoordinator]):
)
]
)
cast(list[MessageParam], model_args["messages"]).extend(new_messages)
messages.extend(new_messages)
except anthropic.AuthenticationError as err:
# Trigger coordinator to confirm the auth failure and trigger the reauth flow.
await coordinator.async_request_refresh()
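The Anthropic hunks above replace direct `options[key]` indexing with `options.get(key, DEFAULT[key])` after dropping the eager `DEFAULT | self.subentry.data` merge. The stdlib `ChainMap` offers the same fallback semantics without repeating the default at each call site; a sketch with generic dicts and hypothetical option keys (not the integration's actual constants):

```python
# options.get(key, DEFAULT[key]) at every call site is equivalent to a
# lookup through ChainMap(options, DEFAULT): user values win, defaults
# fill the gaps, and nothing is copied eagerly.
from collections import ChainMap

DEFAULT = {"chat_model": "claude", "max_tokens": 3000, "tool_search": False}
options = {"max_tokens": 1024}  # hypothetical subentry data

merged = ChainMap(options, DEFAULT)
assert merged["max_tokens"] == 1024      # user override wins
assert merged["chat_model"] == "claude"  # default fills the gap

# Unlike DEFAULT | options, later mutations of either dict stay visible.
options["chat_model"] = "haiku"
assert merged["chat_model"] == "haiku"
```

Whether the per-call `get` or a single `ChainMap` reads better is a style choice; the diff opts for explicit per-key fallbacks.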
@@ -945,10 +945,7 @@ class PipelineRun:
try:
# Transcribe audio stream
stt_vad: VoiceCommandSegmenter | None = None
if (
self.audio_settings.is_vad_enabled
and self.stt_provider.audio_processing.requires_external_vad
):
if self.audio_settings.is_vad_enabled:
stt_vad = VoiceCommandSegmenter(
silence_seconds=self.audio_settings.silence_seconds
)
@@ -21,9 +21,8 @@ from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.ssl import get_default_context
from .const import DOMAIN, MANUFACTURER, BeoModel
from .const import DOMAIN
from .services import async_setup_services
from .util import get_remotes
from .websocket import BeoWebsocket
@@ -59,6 +58,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
# Remove casts to str
assert entry.unique_id
# Create device now as BeoWebsocket needs a device for debug logging, firing events etc.
device_registry = dr.async_get(hass)
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
identifiers={(DOMAIN, entry.unique_id)},
name=entry.title,
model=entry.data[CONF_MODEL],
)
client = MozartClient(host=entry.data[CONF_HOST], ssl_context=get_default_context())
# Check API and WebSocket connection
@@ -75,27 +83,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:
await client.close_api_client()
raise ConfigEntryNotReady(f"Unable to connect to {entry.title}") from error
# Create device now as BeoWebsocket needs a device for debug logging, firing events etc.
device_registry = dr.async_get(hass)
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
identifiers={(DOMAIN, entry.unique_id)},
model=entry.data[CONF_MODEL],
)
# Create devices for paired Beoremote One remotes
for remote in await get_remotes(client):
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
identifiers={(DOMAIN, f"{remote.serial_number}_{entry.unique_id}")},
name=f"{BeoModel.BEOREMOTE_ONE}-{remote.serial_number}-{entry.unique_id}",
model=BeoModel.BEOREMOTE_ONE,
serial_number=remote.serial_number,
sw_version=remote.app_version,
manufacturer=MANUFACTURER,
via_device=(DOMAIN, entry.unique_id),
)
websocket = BeoWebsocket(hass, entry, client)
# Add the websocket and API client
@@ -52,7 +52,6 @@ class BeoConfigFlowHandler(ConfigFlow, domain=DOMAIN):
_beolink_jid = ""
_client: MozartClient
_friendly_name = ""
_host = ""
_model = ""
_name = ""
@@ -112,7 +111,6 @@ class BeoConfigFlowHandler(ConfigFlow, domain=DOMAIN):
)
self._beolink_jid = beolink_self.jid
self._friendly_name = beolink_self.friendly_name
self._serial_number = get_serial_number_from_jid(beolink_self.jid)
await self.async_set_unique_id(self._serial_number)
@@ -151,7 +149,6 @@ class BeoConfigFlowHandler(ConfigFlow, domain=DOMAIN):
return self.async_abort(reason="invalid_address")
self._model = discovery_info.hostname[:-16].replace("-", " ")
self._friendly_name = discovery_info.properties[ATTR_FRIENDLY_NAME]
self._serial_number = discovery_info.properties[ATTR_SERIAL_NUMBER]
self._beolink_jid = f"{discovery_info.properties[ATTR_TYPE_NUMBER]}.{discovery_info.properties[ATTR_ITEM_NUMBER]}.{self._serial_number}@products.bang-olufsen.com"
@@ -167,13 +164,16 @@ class BeoConfigFlowHandler(ConfigFlow, domain=DOMAIN):
async def _create_entry(self) -> ConfigFlowResult:
"""Create the config entry for a discovered or manually configured Bang & Olufsen device."""
# Ensure that created entities have a unique and easily identifiable id and not a "friendly name"
self._name = f"{self._model}-{self._serial_number}"
return self.async_create_entry(
title=self._friendly_name,
title=self._name,
data=EntryData(
host=self._host,
jid=self._beolink_jid,
model=self._model,
name=self._friendly_name,
name=self._name,
),
)
@@ -20,6 +20,7 @@ from .const import (
CONNECTION_STATUS,
DEVICE_BUTTON_EVENTS,
DOMAIN,
MANUFACTURER,
BeoModel,
WebsocketNotification,
)
@@ -141,6 +142,12 @@ class BeoRemoteKeyEvent(BeoEvent):
self._attr_unique_id = f"{remote.serial_number}_{self._unique_id}_{key_type}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{remote.serial_number}_{self._unique_id}")},
name=f"{BeoModel.BEOREMOTE_ONE}-{remote.serial_number}-{self._unique_id}",
model=BeoModel.BEOREMOTE_ONE,
serial_number=remote.serial_number,
sw_version=remote.app_version,
manufacturer=MANUFACTURER,
via_device=(DOMAIN, self._unique_id),
)
# Make the native key name Home Assistant compatible
@@ -115,7 +115,7 @@ class BeoSensorRemoteBatteryLevel(BeoSensor):
f"{remote.serial_number}_{self._unique_id}_remote_battery_level"
)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{remote.serial_number}_{self._unique_id}")},
identifiers={(DOMAIN, f"{remote.serial_number}_{self._unique_id}")}
)
self._attr_native_value = remote.battery_level
self._remote = remote
+5 -19
@@ -30,33 +30,19 @@ BATTERY_PERCENTAGE_DOMAIN_SPECS = {
CONDITIONS: dict[str, type[Condition]] = {
"is_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS,
STATE_ON,
support_duration=True,
primary_entities_only=False,
BATTERY_DOMAIN_SPECS, STATE_ON, support_duration=True
),
"is_not_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS,
STATE_OFF,
support_duration=True,
primary_entities_only=False,
BATTERY_DOMAIN_SPECS, STATE_OFF, support_duration=True
),
"is_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS,
STATE_ON,
support_duration=True,
primary_entities_only=False,
BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON, support_duration=True
),
"is_not_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS,
STATE_OFF,
support_duration=True,
primary_entities_only=False,
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF, support_duration=True
),
"is_level": make_entity_numerical_condition(
BATTERY_PERCENTAGE_DOMAIN_SPECS,
PERCENTAGE,
primary_entities_only=False,
BATTERY_PERCENTAGE_DOMAIN_SPECS, PERCENTAGE
),
}
@@ -3,7 +3,6 @@
entity:
- domain: binary_sensor
device_class: battery
primary_entities_only: false
fields:
behavior: &condition_behavior
required: true
@@ -43,7 +42,6 @@ is_charging:
entity:
- domain: binary_sensor
device_class: battery_charging
primary_entities_only: false
fields:
behavior: *condition_behavior
for: *condition_for
@@ -53,7 +51,6 @@ is_not_charging:
entity:
- domain: binary_sensor
device_class: battery_charging
primary_entities_only: false
fields:
behavior: *condition_behavior
for: *condition_for
@@ -63,7 +60,6 @@ is_level:
entity:
- domain: sensor
device_class: battery
primary_entities_only: false
fields:
behavior: *condition_behavior
threshold:
+6 -14
@@ -32,27 +32,19 @@ BATTERY_PERCENTAGE_DOMAIN_SPECS: dict[str, DomainSpec] = {
}
TRIGGERS: dict[str, type[Trigger]] = {
"low": make_entity_target_state_trigger(
BATTERY_LOW_DOMAIN_SPECS, STATE_ON, primary_entities_only=False
),
"not_low": make_entity_target_state_trigger(
BATTERY_LOW_DOMAIN_SPECS, STATE_OFF, primary_entities_only=False
),
"low": make_entity_target_state_trigger(BATTERY_LOW_DOMAIN_SPECS, STATE_ON),
"not_low": make_entity_target_state_trigger(BATTERY_LOW_DOMAIN_SPECS, STATE_OFF),
"started_charging": make_entity_target_state_trigger(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON, primary_entities_only=False
BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON
),
"stopped_charging": make_entity_target_state_trigger(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF, primary_entities_only=False
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF
),
"level_changed": make_entity_numerical_state_changed_trigger(
BATTERY_PERCENTAGE_DOMAIN_SPECS,
valid_unit="%",
primary_entities_only=False,
BATTERY_PERCENTAGE_DOMAIN_SPECS, valid_unit="%"
),
"level_crossed_threshold": make_entity_numerical_state_crossed_threshold_trigger(
BATTERY_PERCENTAGE_DOMAIN_SPECS,
valid_unit="%",
primary_entities_only=False,
BATTERY_PERCENTAGE_DOMAIN_SPECS, valid_unit="%"
),
}
@@ -33,19 +33,16 @@
entity:
- domain: binary_sensor
device_class: battery
primary_entities_only: false
.trigger_target_charging: &trigger_target_charging
entity:
- domain: binary_sensor
device_class: battery_charging
primary_entities_only: false
.trigger_target_percentage: &trigger_target_percentage
entity:
- domain: sensor
device_class: battery
primary_entities_only: false
low:
fields:
@@ -1,5 +1,4 @@
"""The Broadlink integration."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -34,8 +34,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Broadlink climate entities."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
if device.api.type in DOMAINS_AND_TYPES[Platform.CLIMATE]:
@@ -133,8 +133,6 @@ class BroadlinkDevice[_ApiT: blk.Device = blk.Device]:
await coordinator.async_config_entry_first_refresh()
self.update_manager = update_manager
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
self.hass.data[DOMAIN].devices[config.entry_id] = self
self.reset_jobs.append(config.add_update_listener(self.async_update))
@@ -32,8 +32,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Broadlink light."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
lights = []
@@ -95,8 +95,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up a Broadlink remote."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
remote = BroadlinkRemote(
device,
@@ -31,8 +31,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Broadlink select."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
async_add_entities([BroadlinkDayOfWeek(device)])
@@ -108,8 +108,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Broadlink sensor."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
sensor_data = device.update_manager.coordinator.data
sensors = [
@@ -1,5 +1,4 @@
"""Support for Broadlink switches."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -22,8 +22,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Broadlink time."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
device = hass.data[DOMAIN].devices[config_entry.entry_id]
async_add_entities([BroadlinkTime(device)])
+15 -131
@@ -13,7 +13,6 @@ from bsblan import (
Info,
StaticState,
)
from yarl import URL
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -29,16 +28,11 @@ from homeassistant.exceptions import (
ConfigEntryError,
ConfigEntryNotReady,
)
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import (
CONNECTION_NETWORK_MAC,
DeviceInfo,
format_mac,
)
from homeassistant.helpers.typing import ConfigType
from .const import CONF_HEATING_CIRCUITS, CONF_PASSKEY, DEFAULT_PORT, DOMAIN, LOGGER
from .const import CONF_PASSKEY, DOMAIN, LOGGER
from .coordinator import BSBLanFastCoordinator, BSBLanSlowCoordinator
from .services import async_setup_services
@@ -58,35 +52,7 @@ class BSBLanData:
client: BSBLAN
device: Device
info: Info
static: dict[int, StaticState | None]
available_circuits: list[int]
def get_bsblan_device_info(
device: Device, info: Info, host: str, port: int
) -> DeviceInfo:
"""Build DeviceInfo for the main BSB-LAN controller device."""
return DeviceInfo(
identifiers={(DOMAIN, device.MAC)},
connections={(CONNECTION_NETWORK_MAC, format_mac(device.MAC))},
name=device.name,
manufacturer="BSBLAN Inc.",
model=(
info.device_identification.value
if info.device_identification and info.device_identification.value
else None
),
model_id=(
f"{info.controller_family.value}_{info.controller_variant.value}"
if info.controller_family
and info.controller_variant
and info.controller_family.value
and info.controller_variant.value
else None
),
sw_version=device.version,
configuration_url=str(URL.build(scheme="http", host=host, port=port)),
)
static: StaticState | None
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
@@ -109,17 +75,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
# create BSBLAN client
session = async_get_clientsession(hass)
bsblan = BSBLAN(config=config, session=session)
bsblan = BSBLAN(config, session)
try:
# Initialize the client first - this sets up internal caches and validates
# the connection by fetching firmware version
await bsblan.initialize()
# Read available heating circuits from config entry data
# (populated by config flow or migration)
circuits: list[int] = entry.data[CONF_HEATING_CIRCUITS]
# Fetch required device metadata in parallel for faster startup
device, info = await asyncio.gather(
bsblan.device(),
@@ -148,25 +110,18 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
translation_key="setup_general_error",
) from err
# Fetch static values per configured circuit.
# BSB-LAN is a serial bus — it processes one parameter at a time,
# so concurrent requests offer no speed benefit over sequential.
# Static values are optional — some devices may not support them.
static_per_circuit: dict[int, StaticState | None] = {}
for circuit in circuits:
try:
static_per_circuit[circuit] = await bsblan.static_values(circuit=circuit)
except (BSBLANError, TimeoutError) as err:
LOGGER.debug(
"Static values not available for %s circuit %d: %s",
entry.data[CONF_HOST],
circuit,
err,
)
static_per_circuit[circuit] = None
try:
static = await bsblan.static_values()
except (BSBLANError, TimeoutError) as err:
LOGGER.debug(
"Static values not available for %s: %s",
entry.data[CONF_HOST],
err,
)
static = None
# Create coordinators with the already-initialized client
fast_coordinator = BSBLanFastCoordinator(hass, entry, bsblan, circuits)
fast_coordinator = BSBLanFastCoordinator(hass, entry, bsblan)
slow_coordinator = BSBLanSlowCoordinator(hass, entry, bsblan)
# Perform first refresh of fast coordinator (required for entities)
@@ -182,25 +137,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
slow_coordinator=slow_coordinator,
device=device,
info=info,
static=static_per_circuit,
available_circuits=circuits,
)
# Register main device before forwarding platforms, so sub-devices
# (heating circuits, water heater) can reference it via via_device
device_registry = dr.async_get(hass)
port = entry.data.get(CONF_PORT, DEFAULT_PORT)
main_device_info = get_bsblan_device_info(device, info, entry.data[CONF_HOST], port)
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
identifiers=main_device_info["identifiers"],
connections=main_device_info["connections"],
name=main_device_info["name"],
manufacturer=main_device_info["manufacturer"],
model=main_device_info.get("model"),
model_id=main_device_info.get("model_id"),
sw_version=main_device_info.get("sw_version"),
configuration_url=main_device_info.get("configuration_url"),
static=static,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -211,56 +148,3 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
async def async_unload_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bool:
"""Unload BSBLAN config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def async_migrate_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bool:
"""Migrate old config entries to the latest schema."""
LOGGER.debug(
"Migrating BSB-LAN entry from version %s.%s",
entry.version,
entry.minor_version,
)
if entry.version > 1:
# Downgraded from a future version; cannot migrate.
return False
# 1.1 -> 1.2: Add CONF_HEATING_CIRCUITS. Attempt to discover available
# heating circuits from the device; fall back to [1] (pre-multi-circuit
# default) if the device is unreachable or the endpoint is unsupported.
if entry.version == 1 and entry.minor_version < 2:
circuits: list[int] = [1]
config = BSBLANConfig(
host=entry.data[CONF_HOST],
passkey=entry.data[CONF_PASSKEY],
port=entry.data[CONF_PORT],
username=entry.data.get(CONF_USERNAME),
password=entry.data.get(CONF_PASSWORD),
)
session = async_get_clientsession(hass)
bsblan = BSBLAN(config=config, session=session)
try:
await bsblan.initialize()
circuits = await bsblan.get_available_circuits()
except (BSBLANError, TimeoutError) as err:
LOGGER.warning(
"Circuit discovery during migration failed for %s (%s); "
"defaulting to single circuit [1]. Use Reconfigure to "
"rediscover additional circuits later",
entry.data[CONF_HOST],
err,
)
hass.config_entries.async_update_entry(
entry,
data={**entry.data, CONF_HEATING_CIRCUITS: circuits},
minor_version=2,
)
LOGGER.debug(
"Migrated BSB-LAN entry to version %s.%s with circuits %s",
entry.version,
entry.minor_version,
circuits,
)
return True
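The migration above bumps an entry from schema 1.1 to 1.2 by discovering heating circuits, falling back to `[1]` when the device is unreachable. A stripped-down, dependency-free sketch of that version-gated pattern (`discover` stands in for the real `bsblan.get_available_circuits` network call, and `OSError` stands in for `BSBLANError`/`TimeoutError`):

```python
# Version-gated migration: refuse downgrades, apply each step at most
# once, and fall back to a safe default when discovery fails.
from typing import Callable

def migrate_entry(entry: dict, discover: Callable[[], list[int]]) -> bool:
    """Migrate entry data in place; return False on downgrade."""
    if entry["version"] > 1:
        return False  # downgraded from a future version; cannot migrate
    if entry["version"] == 1 and entry["minor_version"] < 2:
        try:
            circuits = discover()
        except OSError:
            circuits = [1]  # pre-multi-circuit default
        entry["data"]["heating_circuits"] = circuits
        entry["minor_version"] = 2
    return True

entry = {"version": 1, "minor_version": 1, "data": {}}
assert migrate_entry(entry, lambda: [1, 2])
assert entry["data"]["heating_circuits"] == [1, 2]

def failing() -> list[int]:
    raise OSError("unreachable")

old = {"version": 1, "minor_version": 1, "data": {}}
assert migrate_entry(old, failing)  # discovery failure still migrates
assert old["data"]["heating_circuits"] == [1]
```

The key property, mirrored from the diff, is that a failed discovery does not block the migration: the entry still advances to 1.2 with a conservative default, and the user can rediscover circuits later via reconfigure.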
+15 -28
@@ -4,7 +4,7 @@ from __future__ import annotations
from typing import Any, Final
from bsblan import BSBLANError, State, get_hvac_action_category
from bsblan import BSBLANError, get_hvac_action_category
from homeassistant.components.climate import (
ATTR_HVAC_MODE,
@@ -24,7 +24,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import BSBLanConfigEntry, BSBLanData
from .const import ATTR_TARGET_TEMPERATURE, DOMAIN
from .entity import BSBLanCircuitEntity
from .entity import BSBLanEntity
PARALLEL_UPDATES = 1
@@ -63,12 +63,10 @@ async def async_setup_entry(
) -> None:
"""Set up BSBLAN device based on a config entry."""
data = entry.runtime_data
async_add_entities(
BSBLANClimate(data, circuit) for circuit in data.available_circuits
)
async_add_entities([BSBLANClimate(data)])
class BSBLANClimate(BSBLanCircuitEntity, ClimateEntity):
class BSBLANClimate(BSBLanEntity, ClimateEntity):
"""Defines a BSBLAN climate device."""
_attr_name = None
@@ -86,50 +84,37 @@ class BSBLANClimate(BSBLanCircuitEntity, ClimateEntity):
def __init__(
self,
data: BSBLanData,
circuit: int,
) -> None:
"""Initialize BSBLAN climate device."""
super().__init__(data.fast_coordinator, data, circuit)
self._circuit = circuit
mac = format_mac(data.device.MAC)
super().__init__(data.fast_coordinator, data)
self._attr_unique_id = f"{format_mac(data.device.MAC)}-climate"
# Backward compatible unique ID: circuit 1 keeps old format
if circuit == 1:
self._attr_unique_id = f"{mac}-climate"
else:
self._attr_unique_id = f"{mac}-climate-{circuit}"
# Set temperature range from per-circuit static data
if (static := data.static.get(circuit)) is not None:
# Set temperature range if available, otherwise use Home Assistant defaults
if (static := data.static) is not None:
if (min_temp := static.min_temp) is not None and min_temp.value is not None:
self._attr_min_temp = min_temp.value
if (max_temp := static.max_temp) is not None and max_temp.value is not None:
self._attr_max_temp = max_temp.value
self._attr_temperature_unit = data.fast_coordinator.client.get_temperature_unit
@property
def _circuit_state(self) -> State:
"""Return the state for this circuit."""
return self.coordinator.data.states[self._circuit]
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
if (current_temp := self._circuit_state.current_temperature) is None:
if (current_temp := self.coordinator.data.state.current_temperature) is None:
return None
return current_temp.value
@property
def target_temperature(self) -> float | None:
"""Return the temperature we try to reach."""
if (target_temp := self._circuit_state.target_temperature) is None:
if (target_temp := self.coordinator.data.state.target_temperature) is None:
return None
return target_temp.value
@property
def _hvac_mode_value(self) -> int | None:
"""Return the raw hvac_mode value from the coordinator."""
if (hvac_mode := self._circuit_state.hvac_mode) is None:
if (hvac_mode := self.coordinator.data.state.hvac_mode) is None:
return None
return hvac_mode.value
@@ -143,7 +128,9 @@ class BSBLANClimate(BSBLanCircuitEntity, ClimateEntity):
@property
def hvac_action(self) -> HVACAction | None:
"""Return the current running hvac action."""
if (action := self._circuit_state.hvac_action) is None or action.value is None:
if (
action := self.coordinator.data.state.hvac_action
) is None or action.value is None:
return None
category = get_hvac_action_category(action.value)
return HVACAction(category.name.lower())
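The `hvac_action` property above relies on the category enum's member name, lowercased, matching one of `HVACAction`'s string values. A small sketch of that mapping with simplified stand-in enums (the real `HVACAction` and the category enum live in Home Assistant and python-bsblan respectively):

```python
from enum import Enum


class ActionCategory(Enum):
    # stand-in for the bsblan hvac-action category enum
    HEATING = 1
    IDLE = 2


class HVACAction(Enum):
    # stand-in for Home Assistant's string-backed HVACAction
    HEATING = "heating"
    IDLE = "idle"


def to_hvac_action(category: ActionCategory) -> HVACAction:
    # HVACAction is looked up by value, so the lowercased member
    # name of the category must match an HVACAction value exactly
    return HVACAction(category.name.lower())
```

If the two enums ever drift apart, the value lookup raises `ValueError`, which is why the property bails out early when the raw action value is `None`.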
@@ -183,7 +170,7 @@ class BSBLANClimate(BSBLanCircuitEntity, ClimateEntity):
data[ATTR_HVAC_MODE] = 1
try:
await self.coordinator.client.thermostat(**data, circuit=self._circuit)
await self.coordinator.client.thermostat(**data)
except BSBLANError as err:
raise HomeAssistantError(
"An error occurred while updating the BSBLAN device",
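The removed multi-circuit code above kept unique IDs backward compatible by letting circuit 1 retain the pre-existing `{mac}-climate` format. A sketch of that scheme (pure function, no Home Assistant imports):

```python
def climate_unique_id(mac: str, circuit: int) -> str:
    """Backward-compatible unique ID as in the removed circuit code.

    Circuit 1 keeps the original single-circuit format so entities
    registered before circuit support existed are not orphaned;
    additional circuits get a suffixed variant.
    """
    if circuit == 1:
        return f"{mac}-climate"
    return f"{mac}-climate-{circuit}"
```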
@@ -15,21 +15,19 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import CONF_HEATING_CIRCUITS, CONF_PASSKEY, DEFAULT_PORT, DOMAIN, LOGGER
from .const import CONF_PASSKEY, DEFAULT_PORT, DOMAIN
class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a BSBLAN config flow."""
VERSION = 1
MINOR_VERSION = 2
def __init__(self) -> None:
"""Initialize BSBLan flow."""
self.host: str = ""
self.port: int = DEFAULT_PORT
self.mac: str | None = None
self.circuits: list[int] = [1]
self.passkey: str | None = None
self.username: str | None = None
self.password: str | None = None
@@ -79,7 +77,7 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
# Try to get device info without authentication to minimize discovery popup
config = BSBLANConfig(host=self.host, port=self.port)
session = async_get_clientsession(self.hass)
bsblan = BSBLAN(config=config, session=session)
bsblan = BSBLAN(config, session)
try:
device = await bsblan.device()
except BSBLANError:
@@ -125,8 +123,6 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
)
if not self._auth_required:
# Discover available heating circuits
await self._discover_circuits()
return self._async_create_entry()
self.passkey = user_input.get(CONF_PASSKEY)
@@ -141,7 +137,6 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
"""Validate device connection and create entry."""
try:
await self._get_bsblan_info()
await self._discover_circuits()
except BSBLANAuthError:
if is_discovery:
return self.async_show_form(
@@ -235,12 +230,9 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
# it gets the unique ID from the device info when it validates credentials
self._abort_if_unique_id_mismatch()
# Rediscover circuits in case hardware changed
await self._discover_circuits()
return self.async_update_reload_and_abort(
existing_entry,
data_updates={**user_input, CONF_HEATING_CIRCUITS: self.circuits},
data_updates=user_input,
reason="reconfigure_successful",
)
@@ -324,14 +316,13 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
def _async_create_entry(self) -> ConfigFlowResult:
"""Create the config entry."""
return self.async_create_entry(
title="BSB-LAN",
title=format_mac(self.mac),
data={
CONF_HOST: self.host,
CONF_PORT: self.port,
CONF_PASSKEY: self.passkey,
CONF_USERNAME: self.username,
CONF_PASSWORD: self.password,
CONF_HEATING_CIRCUITS: self.circuits,
},
)
@@ -349,7 +340,7 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
password=self.password,
)
session = async_get_clientsession(self.hass)
bsblan = BSBLAN(config=config, session=session)
bsblan = BSBLAN(config, session)
device = await bsblan.device()
retrieved_mac = device.MAC
@@ -371,27 +362,3 @@ class BSBLANFlowHandler(ConfigFlow, domain=DOMAIN):
CONF_PORT: self.port,
}
)
async def _discover_circuits(self) -> None:
"""Discover available heating circuits."""
config = BSBLANConfig(
host=self.host,
passkey=self.passkey,
port=self.port,
username=self.username,
password=self.password,
)
session = async_get_clientsession(self.hass)
bsblan = BSBLAN(config=config, session=session)
try:
await bsblan.initialize()
self.circuits = await bsblan.get_available_circuits()
except (
BSBLANError,
TimeoutError,
):
LOGGER.debug(
"Circuit discovery not available for %s, defaulting to single circuit",
self.host,
)
self.circuits = [1]
@@ -22,6 +22,5 @@ ATTR_INSIDE_TEMPERATURE: Final = "inside_temperature"
ATTR_OUTSIDE_TEMPERATURE: Final = "outside_temperature"
CONF_PASSKEY: Final = "passkey"
CONF_HEATING_CIRCUITS: Final = "heating_circuits"
DEFAULT_PORT: Final = 80
@@ -49,7 +49,7 @@ DHW_CONFIG_INCLUDE = ["reduced_setpoint", "nominal_setpoint_max"]
class BSBLanFastData:
"""BSBLan fast-polling data."""
states: dict[int, State]
state: State
sensor: Sensor
dhw: HotWaterState | None = None
@@ -94,7 +94,6 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
hass: HomeAssistant,
config_entry: BSBLanConfigEntry,
client: BSBLAN,
circuits: list[int],
) -> None:
"""Initialize the BSB-LAN fast coordinator."""
super().__init__(
@@ -104,19 +103,14 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
name=f"{DOMAIN}_fast_{config_entry.data[CONF_HOST]}",
update_interval=SCAN_INTERVAL_FAST,
)
self.circuits: list[int] = circuits
async def _async_update_data(self) -> BSBLanFastData:
"""Fetch fast-changing data from the BSB-LAN device."""
states: dict[int, State] = {}
try:
# Use include filtering to only fetch parameters we actually use.
# BSB-LAN is a serial bus — it processes one parameter at a time,
# so concurrent requests offer no speed benefit over sequential.
for circuit in self.circuits:
states[circuit] = await self.client.state(
include=STATE_INCLUDE, circuit=circuit
)
# Client is already initialized in async_setup_entry
# Use include filtering to only fetch parameters we actually use
# This reduces response time significantly (~0.2s per parameter)
state = await self.client.state(include=STATE_INCLUDE)
sensor = await self.client.sensor(include=SENSOR_INCLUDE)
except BSBLANAuthError as err:
@@ -146,7 +140,7 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
)
return BSBLanFastData(
states=states,
state=state,
sensor=sensor,
dhw=dhw,
)
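The removed coordinator code above polled each circuit sequentially into a dict, on the stated grounds that BSB-LAN is a serial bus and concurrency buys nothing. A standalone sketch of that per-circuit loop (`fetch` is a hypothetical async callable standing in for `client.state(circuit=...)`, and `FastData` is a simplified stand-in for `BSBLanFastData`):

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class FastData:
    """Simplified stand-in for the coordinator's fast-polling data."""

    states: dict[int, str] = field(default_factory=dict)


async def poll_circuits(fetch, circuits):
    """Fetch each circuit's state one at a time, keyed by circuit number.

    Sequential on purpose: the device processes one parameter at a
    time, so gathering the requests concurrently would not be faster.
    """
    data = FastData()
    for circuit in circuits:
        data.states[circuit] = await fetch(circuit)
    return data
```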
@@ -20,20 +20,13 @@ async def async_get_config_entry_diagnostics(
"info": data.info.model_dump(),
"device": data.device.model_dump(),
"fast_coordinator_data": {
"states": {
str(circuit): state.model_dump()
for circuit, state in data.fast_coordinator.data.states.items()
},
"state": data.fast_coordinator.data.state.model_dump(),
"sensor": data.fast_coordinator.data.sensor.model_dump(),
"dhw": data.fast_coordinator.data.dhw.model_dump()
if data.fast_coordinator.data.dhw
else None,
},
"static": {
str(circuit): static.model_dump() if static is not None else None
for circuit, static in data.static.items()
},
"available_circuits": data.available_circuits,
"static": data.static.model_dump() if data.static is not None else None,
}
# Add DHW config and schedule from slow coordinator if available
@@ -2,11 +2,17 @@
from __future__ import annotations
from yarl import URL
from homeassistant.const import CONF_HOST, CONF_PORT
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.device_registry import (
CONNECTION_NETWORK_MAC,
DeviceInfo,
format_mac,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import BSBLanData, get_bsblan_device_info
from . import BSBLanData
from .const import DEFAULT_PORT, DOMAIN
from .coordinator import BSBLanCoordinator, BSBLanFastCoordinator, BSBLanSlowCoordinator
@@ -21,8 +27,28 @@ class BSBLanEntityBase[_T: BSBLanCoordinator](CoordinatorEntity[_T]):
super().__init__(coordinator)
host = coordinator.config_entry.data[CONF_HOST]
port = coordinator.config_entry.data.get(CONF_PORT, DEFAULT_PORT)
self._attr_device_info = get_bsblan_device_info(
data.device, data.info, host, port
mac = data.device.MAC
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, mac)},
connections={(CONNECTION_NETWORK_MAC, format_mac(mac))},
name=data.device.name,
manufacturer="BSBLAN Inc.",
model=(
data.info.device_identification.value
if data.info.device_identification
and data.info.device_identification.value
else None
),
model_id=(
f"{data.info.controller_family.value}_{data.info.controller_variant.value}"
if data.info.controller_family
and data.info.controller_variant
and data.info.controller_family.value
and data.info.controller_variant.value
else None
),
sw_version=data.device.version,
configuration_url=str(URL.build(scheme="http", host=host, port=port)),
)
@@ -34,32 +60,6 @@ class BSBLanEntity(BSBLanEntityBase[BSBLanFastCoordinator]):
super().__init__(coordinator, data)
class BSBLanCircuitEntity(BSBLanEntity):
"""BSBLan entity belonging to a heating circuit sub-device."""
def __init__(
self,
coordinator: BSBLanFastCoordinator,
data: BSBLanData,
circuit: int,
) -> None:
"""Initialize BSBLan circuit entity with sub-device info."""
super().__init__(coordinator, data)
mac = data.device.MAC
host = coordinator.config_entry.data[CONF_HOST]
port = coordinator.config_entry.data.get(CONF_PORT, DEFAULT_PORT)
main_info = get_bsblan_device_info(data.device, data.info, host, port)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{mac}-circuit-{circuit}")},
translation_key="heating_circuit",
translation_placeholders={"circuit": str(circuit)},
via_device=(DOMAIN, mac),
manufacturer=main_info["manufacturer"],
model=main_info.get("model"),
model_id=main_info.get("model_id"),
)
class BSBLanDualCoordinatorEntity(BSBLanEntity):
"""Entity that listens to both fast and slow coordinators."""
@@ -80,28 +80,3 @@ class BSBLanDualCoordinatorEntity(BSBLanEntity):
self.async_on_remove(
self.slow_coordinator.async_add_listener(self._handle_coordinator_update)
)
class BSBLanWaterHeaterDeviceEntity(BSBLanDualCoordinatorEntity):
"""BSBLan entity belonging to the water heater sub-device."""
def __init__(
self,
fast_coordinator: BSBLanFastCoordinator,
slow_coordinator: BSBLanSlowCoordinator,
data: BSBLanData,
) -> None:
"""Initialize BSBLan water heater sub-device entity."""
super().__init__(fast_coordinator, slow_coordinator, data)
mac = data.device.MAC
host = fast_coordinator.config_entry.data[CONF_HOST]
port = fast_coordinator.config_entry.data.get(CONF_PORT, DEFAULT_PORT)
main_info = get_bsblan_device_info(data.device, data.info, host, port)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{mac}-water-heater")},
translation_key="water_heater",
via_device=(DOMAIN, mac),
manufacturer=main_info["manufacturer"],
model=main_info.get("model"),
model_id=main_info.get("model_id"),
)
@@ -8,7 +8,7 @@
"iot_class": "local_polling",
"loggers": ["bsblan"],
"quality_scale": "silver",
"requirements": ["python-bsblan==5.2.0"],
"requirements": ["python-bsblan==5.1.4"],
"zeroconf": [
{
"name": "bsb-lan*",
@@ -48,10 +48,13 @@ rules:
dynamic-devices:
status: exempt
comment: |
Devices and sub-devices are determined at config entry setup and do not change at runtime.
This integration has a fixed single device.
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-disabled-by-default:
status: exempt
comment: |
This integration provides a limited number of entities, all of which are useful to users.
entity-translations: done
exception-translations: done
icon-translations: todo
@@ -63,7 +66,7 @@ rules:
stale-devices:
status: exempt
comment: |
Devices and sub-devices are determined at config entry setup and do not change at runtime.
This integration has a fixed single device.
# Platinum
async-dependency: done
@@ -79,14 +79,6 @@
}
}
},
"device": {
"heating_circuit": {
"name": "Heating circuit {circuit}"
},
"water_heater": {
"name": "Water heater"
}
},
"entity": {
"button": {
"sync_time": {
@@ -21,7 +21,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import BSBLanConfigEntry, BSBLanData
from .const import DOMAIN
from .entity import BSBLanWaterHeaterDeviceEntity
from .entity import BSBLanDualCoordinatorEntity
PARALLEL_UPDATES = 1
@@ -61,7 +61,7 @@ async def async_setup_entry(
async_add_entities([BSBLANWaterHeater(data)])
class BSBLANWaterHeater(BSBLanWaterHeaterDeviceEntity, WaterHeaterEntity):
class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
"""Defines a BSBLAN water heater entity."""
_attr_name = None
@@ -1,5 +1,4 @@
"""Component to embed Google Cast."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -65,8 +65,6 @@ class ChromecastInfo:
"""
cast_info = self.cast_info
if self.cast_info.cast_type is None or self.cast_info.manufacturer is None:
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
unknown_models = hass.data[DOMAIN]["unknown_models"]
if self.cast_info.model_name not in unknown_models:
# Manufacturer and cast type is not available in mDNS data,
@@ -1,5 +1,4 @@
"""Provide functionality to interact with Cast devices on the network."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -9,34 +9,34 @@
},
"conditions": {
"is_cooling": {
"description": "Tests if one or more thermostats are cooling.",
"description": "Tests if one or more climate-control devices are cooling.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
}
},
"name": "Thermostat is cooling"
"name": "Climate-control device is cooling"
},
"is_drying": {
"description": "Tests if one or more thermostats are drying.",
"description": "Tests if one or more climate-control devices are drying.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
}
},
"name": "Thermostat is drying"
"name": "Climate-control device is drying"
},
"is_heating": {
"description": "Tests if one or more thermostats are heating.",
"description": "Tests if one or more climate-control devices are heating.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
}
},
"name": "Thermostat is heating"
"name": "Climate-control device is heating"
},
"is_hvac_mode": {
"description": "Tests if one or more thermostats are set to a specific HVAC mode.",
"description": "Tests if one or more climate-control devices are set to a specific HVAC mode.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
@@ -46,10 +46,10 @@
"name": "Modes"
}
},
"name": "Thermostat HVAC mode"
"name": "Climate-control device HVAC mode"
},
"is_off": {
"description": "Tests if one or more thermostats are off.",
"description": "Tests if one or more climate-control devices are off.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
@@ -58,19 +58,19 @@
"name": "[%key:component::climate::common::condition_for_name%]"
}
},
"name": "Thermostat is off"
"name": "Climate-control device is off"
},
"is_on": {
"description": "Tests if one or more thermostats are on.",
"description": "Tests if one or more climate-control devices are on.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
}
},
"name": "Thermostat is on"
"name": "Climate-control device is on"
},
"target_humidity": {
"description": "Tests the humidity setpoint of one or more thermostats.",
"description": "Tests the humidity setpoint of one or more climate-control devices.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
@@ -79,10 +79,10 @@
"name": "[%key:component::climate::common::condition_threshold_name%]"
}
},
"name": "Thermostat target humidity"
"name": "Climate-control device target humidity"
},
"target_temperature": {
"description": "Tests the temperature setpoint of one or more thermostats.",
"description": "Tests the temperature setpoint of one or more climate-control devices.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
@@ -91,7 +91,7 @@
"name": "[%key:component::climate::common::condition_threshold_name%]"
}
},
"name": "Thermostat target temperature"
"name": "Climate-control device target temperature"
}
},
"device_automation": {
@@ -288,67 +288,67 @@
},
"services": {
"set_fan_mode": {
"description": "Sets the fan mode of a thermostat.",
"description": "Sets the fan mode of a climate-control device.",
"fields": {
"fan_mode": {
"description": "Fan operation mode.",
"name": "Fan mode"
}
},
"name": "Set thermostat fan mode"
"name": "Set climate-control device fan mode"
},
"set_humidity": {
"description": "Sets the target humidity of a thermostat.",
"description": "Sets the target humidity of a climate-control device.",
"fields": {
"humidity": {
"description": "Target humidity.",
"name": "Humidity"
}
},
"name": "Set thermostat target humidity"
"name": "Set climate-control device target humidity"
},
"set_hvac_mode": {
"description": "Sets the HVAC mode of a thermostat.",
"description": "Sets the HVAC mode of a climate-control device.",
"fields": {
"hvac_mode": {
"description": "HVAC operation mode.",
"name": "HVAC mode"
}
},
"name": "Set thermostat HVAC mode"
"name": "Set climate-control device HVAC mode"
},
"set_preset_mode": {
"description": "Sets the preset mode of a thermostat.",
"description": "Sets the preset mode of a climate-control device.",
"fields": {
"preset_mode": {
"description": "Preset mode.",
"name": "Preset mode"
}
},
"name": "Set thermostat preset mode"
"name": "Set climate-control device preset mode"
},
"set_swing_horizontal_mode": {
"description": "Sets the horizontal swing mode of a thermostat.",
"description": "Sets the horizontal swing mode of a climate-control device.",
"fields": {
"swing_horizontal_mode": {
"description": "Horizontal swing operation mode.",
"name": "Horizontal swing mode"
}
},
"name": "Set thermostat horizontal swing mode"
"name": "Set climate-control device horizontal swing mode"
},
"set_swing_mode": {
"description": "Sets the swing mode of a thermostat.",
"description": "Sets the swing mode of a climate-control device.",
"fields": {
"swing_mode": {
"description": "Swing operation mode.",
"name": "Swing mode"
}
},
"name": "Set thermostat swing mode"
"name": "Set climate-control device swing mode"
},
"set_temperature": {
"description": "Sets the target temperature of a thermostat.",
"description": "Sets the target temperature of a climate-control device.",
"fields": {
"hvac_mode": {
"description": "HVAC operation mode.",
@@ -367,25 +367,25 @@
"name": "Target temperature"
}
},
"name": "Set thermostat target temperature"
"name": "Set climate-control device target temperature"
},
"toggle": {
"description": "Toggles a thermostat on/off.",
"name": "Toggle thermostat"
"description": "Toggles a climate-control device on/off.",
"name": "Toggle climate-control device"
},
"turn_off": {
"description": "Turns off a thermostat.",
"name": "Turn off thermostat"
"description": "Turns off a climate-control device.",
"name": "Turn off climate-control device"
},
"turn_on": {
"description": "Turns on a thermostat.",
"name": "Turn on thermostat"
"description": "Turns on a climate-control device.",
"name": "Turn on climate-control device"
}
},
"title": "Climate",
"triggers": {
"hvac_mode_changed": {
"description": "Triggers after the mode of one or more thermostats changes.",
"description": "Triggers after the mode of one or more climate-control devices changes.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -398,10 +398,10 @@
"name": "Modes"
}
},
"name": "Thermostat mode changed"
"name": "Climate-control device mode changed"
},
"started_cooling": {
"description": "Triggers after one or more thermostats start cooling.",
"description": "Triggers after one or more climate-control devices start cooling.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -410,10 +410,10 @@
"name": "[%key:component::climate::common::trigger_for_name%]"
}
},
"name": "Thermostat started cooling"
"name": "Climate-control device started cooling"
},
"started_drying": {
"description": "Triggers after one or more thermostats start drying.",
"description": "Triggers after one or more climate-control devices start drying.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -422,10 +422,10 @@
"name": "[%key:component::climate::common::trigger_for_name%]"
}
},
"name": "Thermostat started drying"
"name": "Climate-control device started drying"
},
"started_heating": {
"description": "Triggers after one or more thermostats start heating.",
"description": "Triggers after one or more climate-control devices start heating.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -434,19 +434,19 @@
"name": "[%key:component::climate::common::trigger_for_name%]"
}
},
"name": "Thermostat started heating"
"name": "Climate-control device started heating"
},
"target_humidity_changed": {
"description": "Triggers after the humidity setpoint of one or more thermostats changes.",
"description": "Triggers after the humidity setpoint of one or more climate-control devices changes.",
"fields": {
"threshold": {
"name": "[%key:component::climate::common::trigger_threshold_name%]"
}
},
"name": "Thermostat target humidity changed"
"name": "Climate-control device target humidity changed"
},
"target_humidity_crossed_threshold": {
"description": "Triggers after the humidity setpoint of one or more thermostats crosses a threshold.",
"description": "Triggers after the humidity setpoint of one or more climate-control devices crosses a threshold.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -458,19 +458,19 @@
"name": "[%key:component::climate::common::trigger_threshold_name%]"
}
},
"name": "Thermostat target humidity crossed threshold"
"name": "Climate-control device target humidity crossed threshold"
},
"target_temperature_changed": {
"description": "Triggers after the temperature setpoint of one or more thermostats changes.",
"description": "Triggers after the temperature setpoint of one or more climate-control devices changes.",
"fields": {
"threshold": {
"name": "[%key:component::climate::common::trigger_threshold_name%]"
}
},
"name": "Thermostat target temperature changed"
"name": "Climate-control device target temperature changed"
},
"target_temperature_crossed_threshold": {
"description": "Triggers after the temperature setpoint of one or more thermostats crosses a threshold.",
"description": "Triggers after the temperature setpoint of one or more climate-control devices crosses a threshold.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -482,10 +482,10 @@
"name": "[%key:component::climate::common::trigger_threshold_name%]"
}
},
"name": "Thermostat target temperature crossed threshold"
"name": "Climate-control device target temperature crossed threshold"
},
"turned_off": {
"description": "Triggers after one or more thermostats turn off.",
"description": "Triggers after one or more climate-control devices turn off.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -494,10 +494,10 @@
"name": "[%key:component::climate::common::trigger_for_name%]"
}
},
"name": "Thermostat turned off"
"name": "Climate-control device turned off"
},
"turned_on": {
"description": "Triggers after one or more thermostats turn on, regardless of the mode.",
"description": "Triggers after one or more climate-control devices turn on, regardless of the mode.",
"fields": {
"behavior": {
"name": "[%key:component::climate::common::trigger_behavior_name%]"
@@ -506,7 +506,7 @@
"name": "[%key:component::climate::common::trigger_for_name%]"
}
},
"name": "Thermostat turned on"
"name": "Climate-control device turned on"
}
}
}
@@ -169,8 +169,6 @@ class OptionsFlowHandler(OptionsFlowWithReload):
data_schema = vol.Schema(
{
# Polling interval is user-configurable, which is no longer allowed
# pylint: disable-next=hass-config-flow-polling-field
vol.Optional(
CONF_SCAN_INTERVAL,
default=self.config_entry.options.get(
@@ -11,6 +11,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.condition import (
Condition,
ConditionChecker,
ConditionCheckerType,
ConditionConfig,
)
@@ -53,7 +54,6 @@ class DeviceCondition(Condition):
"""Device condition."""
_config: ConfigType
_platform_checker: ConditionCheckerType
@classmethod
async def async_validate_complete_config(
@@ -87,19 +87,20 @@ class DeviceCondition(Condition):
assert config.options is not None
self._config = config.options
async def async_setup(self) -> None:
"""Set up a device condition."""
async def async_get_checker(self) -> ConditionChecker:
"""Test a device condition."""
platform = await async_get_device_automation_platform(
self._hass, self._config[CONF_DOMAIN], DeviceAutomationType.CONDITION
)
self._platform_checker = platform.async_condition_from_config(
platform_checker = platform.async_condition_from_config(
self._hass, self._config
)
def _async_check(self, variables: TemplateVarsType = None, **kwargs: Any) -> bool:
"""Check the condition."""
result = self._platform_checker(self._hass, variables)
return result is not False
def checker(variables: TemplateVarsType = None, **kwargs: Any) -> bool:
result = platform_checker(self._hass, variables)
return result is not False
return checker
CONDITIONS: dict[str, type[Condition]] = {
@@ -1,5 +1,4 @@
"""Data used by this integration."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -1,5 +1,4 @@
"""Wrapper for media_source around async_upnp_client's DmsDevice ."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -133,8 +133,6 @@ class DnsIPConfigFlow(ConfigFlow, domain=DOMAIN):
):
errors["base"] = "invalid_hostname"
else:
# Uses hostname as unique ID, which is no longer allowed
# pylint: disable-next=hass-unique-id-ip-based
await self.async_set_unique_id(hostname)
self._abort_if_unique_id_configured()
@@ -13,7 +13,7 @@
"iot_class": "local_polling",
"loggers": ["duco"],
"quality_scale": "platinum",
"requirements": ["python-duco-client==0.3.6"],
"requirements": ["python-duco-client==0.3.4"],
"zeroconf": [
{
"name": "duco [[][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][]].*",
@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from duco.models import Node, NodeType, VentilationState
@@ -28,8 +27,6 @@ from .const import DOMAIN
from .coordinator import DucoConfigEntry, DucoCoordinator
from .entity import DucoEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
@@ -82,7 +79,7 @@ SENSOR_DESCRIPTIONS: tuple[DucoSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda node: node.sensor.rh if node.sensor else None,
node_types=(NodeType.BSRH, NodeType.UCRH),
node_types=(NodeType.BSRH,),
),
DucoSensorEntityDescription(
key="iaq_rh",
@@ -91,7 +88,7 @@ SENSOR_DESCRIPTIONS: tuple[DucoSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
value_fn=lambda node: node.sensor.iaq_rh if node.sensor else None,
node_types=(NodeType.BSRH, NodeType.UCRH),
node_types=(NodeType.BSRH,),
),
)
@@ -147,13 +144,6 @@ async def async_setup_entry(
if node.node_id in known_nodes:
continue
known_nodes.add(node.node_id)
if node.general.node_type == NodeType.UNKNOWN:
_LOGGER.warning(
"Duco node %s (%s) has an unsupported device type and will be ignored",
node.node_id,
node.general.name,
)
continue
new_entities.extend(
DucoSensorEntity(coordinator, node, description)
for description in SENSOR_DESCRIPTIONS
@@ -1,5 +1,4 @@
"""The EARN-E P1 Meter integration."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -213,13 +213,11 @@ ECOWITT_SENSORS_MAPPING: Final = {
),
EcoWittSensorTypes.LIGHTNING_DISTANCE_KM: SensorEntityDescription(
key="LIGHTNING_DISTANCE_KM",
device_class=SensorDeviceClass.DISTANCE,
native_unit_of_measurement=UnitOfLength.KILOMETERS,
state_class=SensorStateClass.MEASUREMENT,
),
EcoWittSensorTypes.LIGHTNING_DISTANCE_MILES: SensorEntityDescription(
key="LIGHTNING_DISTANCE_MILES",
device_class=SensorDeviceClass.DISTANCE,
native_unit_of_measurement=UnitOfLength.MILES,
state_class=SensorStateClass.MEASUREMENT,
),
@@ -8,24 +8,18 @@ from aioesphomeapi import APIClient, APIConnectionError
from homeassistant.components import zeroconf
from homeassistant.components.bluetooth import async_remove_scanner
from homeassistant.components.usb import (
SerialDevice,
USBDevice,
async_register_serial_port_scanner,
)
from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
CONF_PORT,
__version__ as ha_version,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.issue_registry import async_delete_issue
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import slugify
from . import assist_satellite, dashboard, ffmpeg_proxy, serial_proxy
from . import assist_satellite, dashboard, ffmpeg_proxy
from .const import CONF_BLUETOOTH_MAC_ADDRESS, CONF_NOISE_PSK, DOMAIN
from .domain_data import DomainData
from .encryption_key_storage import async_get_encryption_key_storage
@@ -40,48 +34,12 @@ CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
CLIENT_INFO = f"Home Assistant {ha_version}"
@callback
def _async_scan_serial_ports(
hass: HomeAssistant,
) -> list[USBDevice | SerialDevice]:
"""Return serial-proxy ports exposed by connected ESPHome devices."""
ports: list[USBDevice | SerialDevice] = []
for entry in hass.config_entries.async_loaded_entries(DOMAIN):
entry_data = entry.runtime_data
if not entry_data.available:
continue
device_info = entry_data.device_info
if device_info is None:
continue
ports.extend(
SerialDevice(
device=str(serial_proxy.build_url(entry.entry_id, proxy.name)),
serial_number=(
device_info.mac_address.replace(":", "") + "-" + slugify(proxy.name)
),
manufacturer=device_info.manufacturer,
description=f"{device_info.model} ({proxy.name})",
)
for proxy in device_info.serial_proxies
)
return ports
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the esphome component."""
ffmpeg_proxy.async_setup(hass)
await assist_satellite.async_setup(hass)
await dashboard.async_setup(hass)
async_setup_websocket_api(hass)
if "usb" in hass.config.components:
async_register_serial_port_scanner(hass, _async_scan_serial_ports)
serial_proxy.set_hass_loop(hass.loop)
return True
@@ -40,7 +40,5 @@ class DomainData:
@cache
def get(cls, hass: HomeAssistant) -> Self:
"""Get the global DomainData instance stored in hass.data."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
ret = hass.data[DOMAIN] = cls()
return ret
@@ -35,7 +35,6 @@ from aioesphomeapi import (
MediaPlayerInfo,
MediaPlayerSupportedFormat,
NumberInfo,
RadioFrequencyInfo,
SelectInfo,
SensorInfo,
SensorState,
@@ -89,7 +88,6 @@ INFO_TYPE_TO_PLATFORM: dict[type[EntityInfo], Platform] = {
FanInfo: Platform.FAN,
InfraredInfo: Platform.INFRARED,
LightInfo: Platform.LIGHT,
RadioFrequencyInfo: Platform.RADIO_FREQUENCY,
LockInfo: Platform.LOCK,
MediaPlayerInfo: Platform.MEDIA_PLAYER,
NumberInfo: Platform.NUMBER,
@@ -1,7 +1,7 @@
{
"domain": "esphome",
"name": "ESPHome",
"after_dependencies": ["hassio", "tag", "usb", "zeroconf"],
"after_dependencies": ["hassio", "zeroconf", "tag"],
"codeowners": ["@jesserockz", "@kbx81", "@bdraco"],
"config_flow": true,
"dependencies": ["assist_pipeline", "bluetooth", "intent", "ffmpeg", "http"],
@@ -17,7 +17,7 @@
"mqtt": ["esphome/discover/#"],
"quality_scale": "platinum",
"requirements": [
"aioesphomeapi==44.21.0",
"aioesphomeapi==44.18.0",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.7.3"
],
@@ -1,77 +0,0 @@
"""Radio Frequency platform for ESPHome."""
from __future__ import annotations
from functools import partial
import logging
from aioesphomeapi import (
EntityState,
RadioFrequencyCapability,
RadioFrequencyInfo,
RadioFrequencyModulation,
)
from rf_protocols import ModulationType, RadioFrequencyCommand
from homeassistant.components.radio_frequency import RadioFrequencyTransmitterEntity
from homeassistant.core import callback
from .entity import (
EsphomeEntity,
convert_api_error_ha_error,
platform_async_setup_entry,
)
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
MODULATION_TYPE_TO_ESPHOME: dict[ModulationType, RadioFrequencyModulation] = {
ModulationType.OOK: RadioFrequencyModulation.OOK,
}
class EsphomeRadioFrequencyEntity(
EsphomeEntity[RadioFrequencyInfo, EntityState], RadioFrequencyTransmitterEntity
):
"""ESPHome radio frequency entity using native API."""
@property
def supported_frequency_ranges(self) -> list[tuple[int, int]]:
"""Return supported frequency ranges from device info."""
return [(self._static_info.frequency_min, self._static_info.frequency_max)]
@callback
def _on_device_update(self) -> None:
"""Call when device updates or entry data changes."""
super()._on_device_update()
if self._entry_data.available:
self.async_write_ha_state()
@convert_api_error_ha_error
async def async_send_command(self, command: RadioFrequencyCommand) -> None:
"""Send an RF command."""
timings = command.get_raw_timings()
_LOGGER.debug("Sending RF command: %s", timings)
self._client.radio_frequency_transmit_raw_timings(
self._static_info.key,
frequency=command.frequency,
timings=timings,
modulation=MODULATION_TYPE_TO_ESPHOME[command.modulation],
# In ESPHome, repeat_count is total number of times to send the command, while in rf_protocols
# it's the number of additional times to send it, so we need to add 1 here.
repeat_count=command.repeat_count + 1,
device_id=self._static_info.device_id,
)
async_setup_entry = partial(
platform_async_setup_entry,
info_type=RadioFrequencyInfo,
entity_type=EsphomeRadioFrequencyEntity,
state_type=EntityState,
info_filter=lambda info: bool(
info.capabilities & RadioFrequencyCapability.TRANSMITTER
),
)
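The comment above explains an off-by-one between the two libraries' repeat semantics: `rf_protocols` counts *additional* repeats, while ESPHome counts *total* sends. A minimal sketch of that conversion (the helper name is hypothetical, not part of either library):

```python
def to_esphome_repeat_count(rf_protocols_repeats: int) -> int:
    """Convert rf_protocols' additional-repeat count to ESPHome's total-send count.

    rf_protocols: 0 means "send once, no repeats".
    ESPHome: repeat_count is the total number of transmissions.
    """
    return rf_protocols_repeats + 1


assert to_esphome_repeat_count(0) == 1  # send once, no extra repeats
assert to_esphome_repeat_count(2) == 3  # three transmissions in total
```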
@@ -1,113 +0,0 @@
"""Home Assistant-aware ESPHome serial proxy URI handler for serialx."""
from __future__ import annotations
import asyncio
from typing import cast
from aioesphomeapi import APIClient
from serialx import register_uri_handler
from serialx.platforms.serial_esphome import (
ESPHomeSerial,
ESPHomeSerialTransport,
InvalidSettingsError,
)
from yarl import URL
from homeassistant.config_entries import ConfigEntryState
from homeassistant.core import HomeAssistant, async_get_hass
from .const import DOMAIN
from .entry_data import ESPHomeConfigEntry
SCHEME = "esphome-hass://"
# This is required so that serialx can safely query Core for an instance of an
# aioesphomeapi client. We cannot make any assumptions here; some packages run
# separate asyncio event loops in dedicated threads.
_HASS_LOOP: asyncio.AbstractEventLoop | None = None
def set_hass_loop(loop: asyncio.AbstractEventLoop) -> None:
"""Store a reference to the Core event loop."""
global _HASS_LOOP # noqa: PLW0603 # pylint: disable=global-statement
_HASS_LOOP = loop
def build_url(entry_id: str, port_name: str) -> URL:
"""Build a canonical `esphome-hass://` URL."""
return URL.build(
scheme="esphome-hass",
host="esphome",
path=f"/{entry_id}",
query={"port_name": port_name},
)
async def _resolve_client(entry_id: str) -> APIClient:
"""Look up the `APIClient` for a specific config entry."""
# This function is async specifically so that we can get a reference to the Home
# Assistant Core instance from its own thread
hass: HomeAssistant = async_get_hass()
entry = cast(ESPHomeConfigEntry, hass.config_entries.async_get_entry(entry_id))
if entry is None or entry.domain != DOMAIN:
raise InvalidSettingsError(f"No ESPHome config entry with id {entry_id!r}")
if entry.state is not ConfigEntryState.LOADED:
raise InvalidSettingsError(f"ESPHome config entry {entry_id!r} is not loaded")
return entry.runtime_data.client
class HassESPHomeSerial(ESPHomeSerial):
"""ESPHomeSerial that resolves an HA config entry's APIClient from the URL."""
_api: APIClient | None
_path: str | None
async def _async_open(self) -> None:
"""Resolve the HA config entry's APIClient, then open the proxy."""
if self._api is None and self._path is not None:
parsed = URL(str(self._path))
entry_id = parsed.path.lstrip("/")
if not entry_id:
raise InvalidSettingsError(
f"No ESPHome config entry id in URL {self._path!r}"
)
if "port_name" not in parsed.query:
raise InvalidSettingsError("Port name is required")
self._port_name = parsed.query["port_name"]
hass_loop = _HASS_LOOP
if hass_loop is None:
raise InvalidSettingsError(
"ESPHome integration has not registered its event loop"
)
# Fetch the `APIClient` from the Core via the appropriate event loop
self._api = await asyncio.wrap_future(
asyncio.run_coroutine_threadsafe(_resolve_client(entry_id), hass_loop)
)
self._client_loop = self._api._loop # noqa: SLF001
await super()._async_open()
class HassESPHomeSerialTransport(ESPHomeSerialTransport):
"""Transport variant that constructs :class:`HassESPHomeSerial`."""
transport_name = "esphome-hass"
_serial_cls = HassESPHomeSerial
register_uri_handler(
scheme=SCHEME,
unique_scheme=SCHEME,
sync_cls=HassESPHomeSerial,
async_transport_cls=HassESPHomeSerialTransport,
)
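The `_HASS_LOOP` plumbing above exists because serialx callers may run on a different thread and event loop than Home Assistant Core. The pattern is `asyncio.run_coroutine_threadsafe` to schedule the lookup on the Core loop, then `asyncio.wrap_future` to await the resulting `concurrent.futures.Future` from the caller's own loop. A stripped-down, non-ESPHome sketch of that hop (all names here are illustrative):

```python
import asyncio
import threading

# A stand-in for the Home Assistant Core loop, run in its own thread.
core_loop = asyncio.new_event_loop()


def _run_core_loop() -> None:
    asyncio.set_event_loop(core_loop)
    core_loop.run_forever()


threading.Thread(target=_run_core_loop, daemon=True).start()


async def resolve() -> str:
    # Stand-in for _resolve_client(): must execute on the core loop.
    return "client"


async def caller() -> str:
    # Runs on a *different* loop; hops onto core_loop for the lookup,
    # then awaits the cross-thread Future as a local awaitable.
    return await asyncio.wrap_future(
        asyncio.run_coroutine_threadsafe(resolve(), core_loop)
    )


result = asyncio.run(caller())
core_loop.call_soon_threadsafe(core_loop.stop)
```

`run_coroutine_threadsafe` is the only safe way to submit work to a loop owned by another thread; awaiting the coroutine directly from the wrong loop would raise or silently misbehave.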
@@ -11,7 +11,7 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import UnitOfVolume, UnitOfVolumeFlowRate
from homeassistant.const import UnitOfVolume
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
@@ -34,8 +34,7 @@ FLUME_QUERIES_SENSOR: tuple[SensorEntityDescription, ...] = (
key="current_interval",
translation_key="current_interval",
suggested_display_precision=2,
native_unit_of_measurement=UnitOfVolumeFlowRate.GALLONS_PER_MINUTE,
device_class=SensorDeviceClass.VOLUME_FLOW_RATE,
native_unit_of_measurement=f"{UnitOfVolume.GALLONS}/m",
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
@@ -66,16 +65,14 @@ FLUME_QUERIES_SENSOR: tuple[SensorEntityDescription, ...] = (
key="last_60_min",
translation_key="last_60_min",
suggested_display_precision=2,
native_unit_of_measurement=UnitOfVolumeFlowRate.GALLONS_PER_HOUR,
device_class=SensorDeviceClass.VOLUME_FLOW_RATE,
native_unit_of_measurement=f"{UnitOfVolume.GALLONS}/h",
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key="last_24_hrs",
translation_key="last_24_hrs",
suggested_display_precision=2,
native_unit_of_measurement=UnitOfVolumeFlowRate.GALLONS_PER_DAY,
device_class=SensorDeviceClass.VOLUME_FLOW_RATE,
native_unit_of_measurement=f"{UnitOfVolume.GALLONS}/d",
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
@@ -87,7 +87,8 @@ def async_wifi_bulb_for_host(
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the flux_led component."""
hass.data[FLUX_LED_DISCOVERY] = []
domain_data = hass.data.setdefault(DOMAIN, {})
domain_data[FLUX_LED_DISCOVERY] = []
@callback
def _async_start_background_discovery(*_: Any) -> None:
@@ -9,10 +9,8 @@ from flux_led.const import (
COLOR_MODE_RGBW as FLUX_COLOR_MODE_RGBW,
COLOR_MODE_RGBWW as FLUX_COLOR_MODE_RGBWW,
)
from flux_led.scanner import FluxLEDDiscovery
from homeassistant.components.light import ColorMode
from homeassistant.util.hass_dict import HassKey
DOMAIN: Final = "flux_led"
@@ -36,7 +34,7 @@ DEFAULT_NETWORK_SCAN_INTERVAL: Final = 120
DEFAULT_SCAN_INTERVAL: Final = 5
DEFAULT_EFFECT_SPEED: Final = 50
FLUX_LED_DISCOVERY: HassKey[list[FluxLEDDiscovery]] = HassKey(DOMAIN)
FLUX_LED_DISCOVERY: Final = "flux_led_discovery"
FLUX_LED_EXCEPTIONS: Final = (
TimeoutError,
@@ -153,7 +153,8 @@ def async_update_entry_from_discovery(
@callback
def async_get_discovery(hass: HomeAssistant, host: str) -> FluxLEDDiscovery | None:
"""Check if a device was already discovered via a broadcast discovery."""
for discovery in hass.data[FLUX_LED_DISCOVERY]:
discoveries: list[FluxLEDDiscovery] = hass.data[DOMAIN][FLUX_LED_DISCOVERY]
for discovery in discoveries:
if discovery[ATTR_IPADDR] == host:
return discovery
return None
@@ -162,10 +163,10 @@ def async_get_discovery(hass: HomeAssistant, host: str) -> FluxLEDDiscovery | No
@callback
def async_clear_discovery_cache(hass: HomeAssistant, host: str) -> None:
"""Clear the host from the discovery cache."""
hass.data[FLUX_LED_DISCOVERY] = [
discovery
for discovery in hass.data[FLUX_LED_DISCOVERY]
if discovery[ATTR_IPADDR] != host
domain_data = hass.data[DOMAIN]
discoveries: list[FluxLEDDiscovery] = domain_data[FLUX_LED_DISCOVERY]
domain_data[FLUX_LED_DISCOVERY] = [
discovery for discovery in discoveries if discovery[ATTR_IPADDR] != host
]
@@ -44,8 +44,6 @@ class FreeboxFlowHandler(ConfigFlow, domain=DOMAIN):
self._data = user_input
# Check if already configured
# Uses the host/IP value from CONF_HOST as unique ID, which is no longer allowed
# pylint: disable-next=hass-unique-id-ip-based
await self.async_set_unique_id(self._data[CONF_HOST])
self._abort_if_unique_id_configured()
@@ -66,6 +66,8 @@ SWITCH_TYPE_WIFINETWORK = "WiFiNetwork"
BUTTON_TYPE_WOL = "WakeOnLan"
UPTIME_DEVIATION = 5
FRITZ_EXCEPTIONS = (
ConnectionError,
FritzActionError,
@@ -28,7 +28,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.util.dt import utcnow
from .const import DSL_CONNECTION
from .const import DSL_CONNECTION, UPTIME_DEVIATION
from .coordinator import FritzConfigEntry
from .entity import FritzBoxBaseCoordinatorEntity, FritzEntityDescription
from .models import ConnectionInfo
@@ -39,18 +39,31 @@ _LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
def _uptime_calculation(seconds_uptime: float, last_value: datetime | None) -> datetime:
"""Calculate uptime with deviation."""
delta_uptime = utcnow() - timedelta(seconds=seconds_uptime)
if (
not last_value
or abs((delta_uptime - last_value).total_seconds()) > UPTIME_DEVIATION
):
return delta_uptime
return last_value
def _retrieve_device_uptime_state(
status: FritzStatus, last_value: datetime | None
status: FritzStatus, last_value: datetime
) -> datetime:
"""Return uptime from device."""
return utcnow() - timedelta(seconds=status.device_uptime)
return _uptime_calculation(status.device_uptime, last_value)
def _retrieve_connection_uptime_state(
status: FritzStatus, last_value: datetime | None
) -> datetime:
"""Return uptime from connection."""
return utcnow() - timedelta(seconds=status.connection_uptime)
return _uptime_calculation(status.connection_uptime, last_value)
def _retrieve_external_ip_state(status: FritzStatus, last_value: str) -> str:
@@ -187,7 +200,7 @@ CONNECTION_SENSOR_TYPES: tuple[FritzConnectionSensorEntityDescription, ...] = (
FritzConnectionSensorEntityDescription(
key="connection_uptime",
translation_key="connection_uptime",
device_class=SensorDeviceClass.UPTIME,
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=_retrieve_connection_uptime_state,
),
@@ -295,7 +308,7 @@ DEVICE_SENSOR_TYPES: tuple[FritzDeviceSensorEntityDescription, ...] = (
FritzDeviceSensorEntityDescription(
key="device_uptime",
translation_key="device_uptime",
device_class=SensorDeviceClass.UPTIME,
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=_retrieve_device_uptime_state,
),
@@ -407,12 +407,6 @@ def async_remove_panel(
hass.bus.async_fire(EVENT_PANELS_UPDATED)
@callback
def async_panel_exists(hass: HomeAssistant, frontend_url_path: str) -> bool:
"""Return if a panel is registered for the given frontend URL path."""
return frontend_url_path in hass.data.get(DATA_PANELS, {})
def add_extra_js_url(hass: HomeAssistant, url: str, es5: bool = False) -> None:
"""Register extra js or module url to load.
@@ -9,7 +9,6 @@ from fumis import (
Fumis,
FumisAuthenticationError,
FumisConnectionError,
FumisInfo,
FumisStoveOfflineError,
)
import voluptuous as vol
@@ -52,10 +51,23 @@ class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
errors: dict[str, str] = {}
if user_input is not None:
errors, info = await self._validate_input(
self._discovered_mac, user_input[CONF_PIN]
fumis = Fumis(
mac=self._discovered_mac,
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
if info:
try:
info = await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return self.async_create_entry(
title=info.controller.model_name or "Fumis",
data={
@@ -84,8 +96,23 @@ class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
if user_input is not None:
mac = user_input[CONF_MAC].replace(":", "").replace("-", "").upper()
errors, info = await self._validate_input(mac, user_input[CONF_PIN])
if info:
fumis = Fumis(
mac=mac,
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
try:
info = await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(format_mac(mac), raise_on_progress=False)
self._abort_if_unique_id_configured()
return self.async_create_entry(
@@ -114,35 +141,6 @@ class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
errors=errors,
)
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reconfiguration of a Fumis stove."""
errors: dict[str, str] = {}
reconfigure_entry = self._get_reconfigure_entry()
if user_input is not None:
errors, _ = await self._validate_input(
reconfigure_entry.data[CONF_MAC], user_input[CONF_PIN]
)
if not errors:
return self.async_update_reload_and_abort(
reconfigure_entry,
data_updates={CONF_PIN: user_input[CONF_PIN]},
)
return self.async_show_form(
step_id="reconfigure",
data_schema=vol.Schema(
{
vol.Required(CONF_PIN): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
}
),
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
@@ -157,10 +155,23 @@ class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
if user_input is not None:
reauth_entry = self._get_reauth_entry()
errors, _ = await self._validate_input(
reauth_entry.data[CONF_MAC], user_input[CONF_PIN]
fumis = Fumis(
mac=reauth_entry.data[CONF_MAC],
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
if not errors:
try:
await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data_updates={CONF_PIN: user_input[CONF_PIN]},
@@ -177,28 +188,3 @@ class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
),
errors=errors,
)
async def _validate_input(
self, mac: str, pin: str
) -> tuple[dict[str, str], FumisInfo | None]:
"""Validate credentials, returning errors and info."""
errors: dict[str, str] = {}
fumis = Fumis(
mac=mac,
password=pin,
session=async_get_clientsession(self.hass),
)
try:
info = await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return errors, info
return errors, None
@@ -13,5 +13,5 @@
"iot_class": "cloud_polling",
"loggers": ["fumis"],
"quality_scale": "bronze",
"requirements": ["fumis==0.3.0"]
"requirements": ["fumis==0.2.1"]
}
@@ -62,7 +62,7 @@ rules:
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: This integration does not raise any repairable issues.
@@ -202,7 +202,10 @@ SENSORS: tuple[FumisSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.DURATION,
native_unit_of_measurement=UnitOfTime.HOURS,
entity_category=EntityCategory.DIAGNOSTIC,
has_fn=lambda data: data.controller.time_to_service is not None,
has_fn=lambda data: (
data.controller.time_to_service is not None
and data.controller.time_to_service >= 0
),
value_fn=lambda data: data.controller.time_to_service,
),
FumisSensorEntityDescription(
@@ -2,8 +2,7 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -30,15 +29,6 @@
},
"description": "The PIN code for your stove has changed. Please enter the new PIN code to re-authenticate."
},
"reconfigure": {
"data": {
"pin": "[%key:component::fumis::config::step::user::data::pin%]"
},
"data_description": {
"pin": "[%key:component::fumis::config::step::user::data_description::pin%]"
},
"description": "Reconfigure your Fumis pellet stove connection."
},
"user": {
"data": {
"mac": "MAC address",
@@ -2,19 +2,17 @@
from __future__ import annotations
from types import MappingProxyType
from aiogithubapi import GitHubAPI
from homeassistant.config_entries import ConfigSubentry
from homeassistant.const import CONF_ACCESS_TOKEN, Platform
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.aiohttp_client import (
SERVER_SOFTWARE,
async_get_clientsession,
)
from .const import CONF_REPOSITORIES, CONF_REPOSITORY, SUBENTRY_TYPE_REPOSITORY
from .const import CONF_REPOSITORIES, DOMAIN, LOGGER
from .coordinator import GithubConfigEntry, GitHubDataUpdateCoordinator
PLATFORMS: list[Platform] = [Platform.SENSOR]
@@ -28,9 +26,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> bo
client_name=SERVER_SOFTWARE,
)
repositories: list[str] = entry.options[CONF_REPOSITORIES]
entry.runtime_data = {}
for repository_subentry in entry.get_subentries_of_type(SUBENTRY_TYPE_REPOSITORY):
repository = repository_subentry.data[CONF_REPOSITORY]
for repository in repositories:
coordinator = GitHubDataUpdateCoordinator(
hass=hass,
config_entry=entry,
@@ -43,17 +42,41 @@ async def async_setup_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> bo
if not entry.pref_disable_polling:
await coordinator.subscribe()
entry.runtime_data[repository_subentry.subentry_id] = coordinator
entry.runtime_data[repository] = coordinator
entry.async_on_unload(entry.add_update_listener(async_update_entry))
async_cleanup_device_registry(hass=hass, entry=entry)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_update_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> None:
"""Update entry."""
await hass.config_entries.async_reload(entry.entry_id)
@callback
def async_cleanup_device_registry(
hass: HomeAssistant,
entry: GithubConfigEntry,
) -> None:
"""Remove entries form device registry if we no longer track the repository."""
device_registry = dr.async_get(hass)
devices = dr.async_entries_for_config_entry(
registry=device_registry,
config_entry_id=entry.entry_id,
)
for device in devices:
for item in device.identifiers:
if item[0] == DOMAIN and item[1] not in entry.options[CONF_REPOSITORIES]:
LOGGER.debug(
(
"Unlinking device %s for untracked repository %s from config"
" entry %s"
),
device.id,
item[1],
entry.entry_id,
)
device_registry.async_update_device(
device.id, remove_config_entry_id=entry.entry_id
)
break
async def async_unload_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> bool:
@@ -63,23 +86,3 @@ async def async_unload_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> b
coordinator.unsubscribe()
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def async_migrate_entry(hass: HomeAssistant, entry: GithubConfigEntry) -> bool:
"""Migrate old entry."""
if entry.minor_version == 1:
# In minor version 2 we migrated repositories from entry options to
# subentries, so we need to convert the list from
# entry.options[CONF_REPOSITORIES] into individual subentries.
for repository in entry.options[CONF_REPOSITORIES]:
subentry = ConfigSubentry(
data=MappingProxyType({CONF_REPOSITORY: repository}),
subentry_type=SUBENTRY_TYPE_REPOSITORY,
title=repository,
unique_id=repository,
)
hass.config_entries.async_add_subentry(entry, subentry)
hass.config_entries.async_update_entry(entry, minor_version=2)
return True
+61 -52
View File
@@ -19,31 +19,23 @@ from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryFlow,
SubentryFlowResult,
OptionsFlowWithReload,
)
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import (
SERVER_SOFTWARE,
async_get_clientsession,
)
from homeassistant.helpers.selector import SelectSelector, SelectSelectorConfig
from .const import (
CLIENT_ID,
CONF_REPOSITORY,
DEFAULT_REPOSITORIES,
DOMAIN,
LOGGER,
SUBENTRY_TYPE_REPOSITORY,
)
from .const import CLIENT_ID, CONF_REPOSITORIES, DEFAULT_REPOSITORIES, DOMAIN, LOGGER
async def get_repositories(hass: HomeAssistant, access_token: str) -> list[str]:
"""Return a list of repositories that the user owns or has starred."""
client = GitHubAPI(token=access_token, session=async_get_clientsession(hass))
repositories: set[str] = set()
repositories = set()
async def _get_starred_repositories() -> None:
response = await client.user.starred(params={"per_page": 100})
@@ -61,7 +53,7 @@ async def get_repositories(hass: HomeAssistant, access_token: str) -> list[str]:
for result in results:
response.data.extend(result.data)
repositories.update(repo.full_name for repo in response.data)
repositories.update(response.data)
async def _get_personal_repositories() -> None:
response = await client.user.repos(params={"per_page": 100})
@@ -79,7 +71,7 @@ async def get_repositories(hass: HomeAssistant, access_token: str) -> list[str]:
for result in results:
response.data.extend(result.data)
repositories.update(repo.full_name for repo in response.data)
repositories.update(response.data)
try:
await asyncio.gather(
@@ -90,26 +82,21 @@ async def get_repositories(hass: HomeAssistant, access_token: str) -> list[str]:
)
except GitHubException:
repositories.update(DEFAULT_REPOSITORIES)
return DEFAULT_REPOSITORIES
if len(repositories) == 0:
repositories.update(DEFAULT_REPOSITORIES)
return DEFAULT_REPOSITORIES
current_repositories = {
subentry.data[CONF_REPOSITORY]
for entry in hass.config_entries.async_entries(DOMAIN)
for subentry in entry.subentries.values()
if subentry.subentry_type == SUBENTRY_TYPE_REPOSITORY
}
repositories = repositories - current_repositories
return sorted(repositories, key=str.casefold)
return sorted(
(repo.full_name for repo in repositories),
key=str.casefold,
)
class GitHubConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for GitHub."""
MINOR_VERSION = 2
VERSION = 1
login_task: asyncio.Task | None = None
@@ -119,14 +106,6 @@ class GitHubConfigFlow(ConfigFlow, domain=DOMAIN):
self._login: GitHubLoginOauthModel | None = None
self._login_device: GitHubLoginDeviceModel | None = None
@classmethod
@callback
def async_get_supported_subentry_types(
cls, config_entry: ConfigEntry
) -> dict[str, type[ConfigSubentryFlow]]:
"""Return subentries supported by this handler."""
return {SUBENTRY_TYPE_REPOSITORY: RepositoryFlowHandler}
async def async_step_user(
self,
user_input: dict[str, Any] | None = None,
@@ -174,7 +153,7 @@ class GitHubConfigFlow(ConfigFlow, domain=DOMAIN):
if self.login_task.done():
if self.login_task.exception():
return self.async_show_progress_done(next_step_id="could_not_register")
return self.async_show_progress_done(next_step_id="done")
return self.async_show_progress_done(next_step_id="repositories")
if TYPE_CHECKING:
# mypy is not aware that we can't get here without having this set already
@@ -190,18 +169,33 @@ class GitHubConfigFlow(ConfigFlow, domain=DOMAIN):
progress_task=self.login_task,
)
async def async_step_done(
async def async_step_repositories(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Create the config entry after successful device authentication."""
"""Handle repositories step."""
if TYPE_CHECKING:
# mypy is not aware that we can't get here without having this set already
assert self._login is not None
if not user_input:
repositories = await get_repositories(self.hass, self._login.access_token)
return self.async_show_form(
step_id="repositories",
data_schema=vol.Schema(
{
vol.Required(CONF_REPOSITORIES): cv.multi_select(
{k: k for k in repositories}
),
}
),
)
return self.async_create_entry(
title="",
data={CONF_ACCESS_TOKEN: self._login.access_token},
options={CONF_REPOSITORIES: user_input[CONF_REPOSITORIES]},
)
async def async_step_could_not_register(
@@ -211,31 +205,46 @@ class GitHubConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle issues that need transition await from progress step."""
return self.async_abort(reason="could_not_register")
@staticmethod
@callback
def async_get_options_flow(
config_entry: ConfigEntry,
) -> OptionsFlowHandler:
"""Get the options flow for this handler."""
return OptionsFlowHandler()
class RepositoryFlowHandler(ConfigSubentryFlow):
"""Handle repository subentry flow."""
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle repository subentry flow."""
class OptionsFlowHandler(OptionsFlowWithReload):
"""Handle a option flow for GitHub."""
async def async_step_init(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle options flow."""
if not user_input:
configured_repositories: list[str] = self.config_entry.options[
CONF_REPOSITORIES
]
repositories = await get_repositories(
self.hass, self._get_entry().data[CONF_ACCESS_TOKEN]
self.hass, self.config_entry.data[CONF_ACCESS_TOKEN]
)
# In case the user has removed a starred repository that is already tracked
for repository in configured_repositories:
if repository not in repositories:
repositories.append(repository)
return self.async_show_form(
step_id="user",
step_id="init",
data_schema=vol.Schema(
{
vol.Required(CONF_REPOSITORY): SelectSelector(
SelectSelectorConfig(sort=True, options=repositories)
),
vol.Required(
CONF_REPOSITORIES,
default=configured_repositories,
): cv.multi_select({k: k for k in repositories}),
}
),
)
repository = user_input[CONF_REPOSITORY]
return self.async_create_entry(
title=repository, data=user_input, unique_id=repository
)
return self.async_create_entry(title="", data=user_input)
@@ -15,9 +15,6 @@ DEFAULT_REPOSITORIES = ["home-assistant/core", "esphome/esphome"]
FALLBACK_UPDATE_INTERVAL = timedelta(hours=1, minutes=30)
CONF_REPOSITORIES = "repositories"
CONF_REPOSITORY = "repository"
SUBENTRY_TYPE_REPOSITORY = "repository"
REFRESH_EVENT_TYPES = (
@@ -21,7 +21,7 @@ async def async_get_config_entry_diagnostics(
config_entry: GithubConfigEntry,
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
data: dict[str, Any] = {}
data = {"options": {**config_entry.options}}
client = GitHubAPI(
token=config_entry.data[CONF_ACCESS_TOKEN],
session=async_get_clientsession(hass),
@@ -38,7 +38,7 @@ async def async_get_config_entry_diagnostics(
repositories = config_entry.runtime_data
data["repositories"] = {}
for coordinator in repositories.values():
data["repositories"][coordinator.data["full_name"]] = coordinator.data
for repository, coordinator in repositories.items():
data["repositories"][repository] = coordinator.data
return data
@@ -150,14 +150,13 @@ async def async_setup_entry(
) -> None:
"""Set up GitHub sensor based on a config entry."""
repositories = entry.runtime_data
for subentry_id, coordinator in repositories.items():
async_add_entities(
(
GitHubSensorEntity(coordinator, description)
for description in SENSOR_DESCRIPTIONS
),
config_subentry_id=subentry_id,
)
async_add_entities(
(
GitHubSensorEntity(coordinator, description)
for description in SENSOR_DESCRIPTIONS
for coordinator in repositories.values()
),
)
class GitHubSensorEntity(CoordinatorEntity[GitHubDataUpdateCoordinator], SensorEntity):
@@ -7,26 +7,12 @@
"progress": {
"wait_for_device": "Open {url}, and paste the following code to authorize the integration: \n```\n{code}\n```"
},
"step": {}
},
"config_subentries": {
"repository": {
"abort": {
"already_configured": "Repository is already configured"
},
"entry_type": "[%key:component::github::config_subentries::repository::step::user::data::repository%]",
"initiate_flow": {
"user": "Add repository"
},
"step": {
"user": {
"data": {
"repository": "Repository"
},
"data_description": {
"repository": "The repository to track"
}
}
"step": {
"repositories": {
"data": {
"repositories": "Select repositories to track."
},
"title": "Configure repositories"
}
}
},
@@ -1,5 +1,4 @@
"""Support for Actions on Google Assistant Smart Home Control."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -21,8 +21,6 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the platform."""
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
yaml_config: ConfigType = hass.data[DOMAIN][DATA_CONFIG]
google_config = config_entry.runtime_data
@@ -54,8 +54,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoogleMailConfigEntry) -
Platform.NOTIFY,
DOMAIN,
{DATA_AUTH: auth, CONF_NAME: entry.title},
# Uses legacy hass.data[DOMAIN] pattern
# pylint: disable-next=hass-use-runtime-data
hass.data[DOMAIN][DATA_HASS_CONFIG],
)
)
+4 -45
@@ -87,18 +87,7 @@ async def async_setup_entry(
class SupervisorAddonUpdateEntity(HassioAddonEntity, UpdateEntity):
"""Update entity to handle updates for the Supervisor add-ons.
The ``addon_manager_update`` job emits a ``done=True`` WS event as soon as
Supervisor finishes the container work, a few milliseconds before the
``/store/addons/<slug>/update`` HTTP call returns. If we clear
``_attr_in_progress`` on that event while the coordinator data still
carries the pre-update version, the UI briefly flips back to
"Update available" before ``async_install`` can refresh. ``_update_ongoing``
survives both the WS done event and the base ``UpdateEntity`` reset, so
the installing state remains until the coordinator confirms a new
``installed_version``.
"""
"""Update entity to handle updates for the Supervisor add-ons."""
_attr_supported_features = (
UpdateEntityFeature.INSTALL
@@ -106,8 +95,6 @@ class SupervisorAddonUpdateEntity(HassioAddonEntity, UpdateEntity):
| UpdateEntityFeature.RELEASE_NOTES
| UpdateEntityFeature.PROGRESS
)
_update_ongoing: bool = False
_version_before_update: str | None = None
@property
def _addon_data(self) -> dict:
@@ -134,13 +121,6 @@ class SupervisorAddonUpdateEntity(HassioAddonEntity, UpdateEntity):
"""Version installed and in use."""
return self._addon_data[ATTR_VERSION]
@property
def in_progress(self) -> bool | None:
"""Return combined progress from the update job and refresh phase."""
if self._update_ongoing:
return True
return self._attr_in_progress
@property
def entity_picture(self) -> str | None:
"""Return the icon of the add-on if any."""
@@ -174,34 +154,13 @@ class SupervisorAddonUpdateEntity(HassioAddonEntity, UpdateEntity):
**kwargs: Any,
) -> None:
"""Install an update."""
self._version_before_update = self.installed_version
self._update_ongoing = True
self._attr_in_progress = True
self.async_write_ha_state()
try:
await update_addon(
self.hass, self._addon_slug, backup, self.title, self.installed_version
)
except HomeAssistantError:
self._update_ongoing = False
self._version_before_update = None
self._attr_in_progress = False
self._attr_update_percentage = None
self.async_write_ha_state()
raise
await update_addon(
self.hass, self._addon_slug, backup, self.title, self.installed_version
)
await self.coordinator.async_refresh()
@callback
def _handle_coordinator_update(self) -> None:
"""Clear the ongoing flag once the installed version has changed."""
if (
self._update_ongoing
and self.installed_version != self._version_before_update
):
self._update_ongoing = False
self._version_before_update = None
super()._handle_coordinator_update()
@callback
def _update_job_changed(self, job: Job) -> None:
"""Process update for this entity's update job."""
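The docstring removed in the hunk above describes a race: the WS job reports `done=True` a few milliseconds before the coordinator data carries the new version, so clearing `_attr_in_progress` on the WS event makes the UI briefly flip back to "Update available". A plain-Python sketch of the removed `_update_ongoing` guard, with `UpdateEntityStub` as a hypothetical stand-in for the Home Assistant entity classes (only the flag/version logic mirrors the diff):

```python
# Sketch of the "_update_ongoing" guard: stay "installing" until the
# coordinator confirms a new installed_version, even after the WS job's
# done=True event clears the base class's _attr_in_progress.

class UpdateEntityStub:
    def __init__(self, installed_version: str) -> None:
        self.installed_version = installed_version
        self._attr_in_progress = False
        self._update_ongoing = False
        self._version_before_update: str | None = None

    @property
    def in_progress(self) -> bool:
        # The guard flag outlives the base-class reset.
        return self._update_ongoing or self._attr_in_progress

    def start_install(self) -> None:
        self._version_before_update = self.installed_version
        self._update_ongoing = True
        self._attr_in_progress = True

    def on_ws_job_done(self) -> None:
        # What the base UpdateEntity does when the job reports done=True.
        self._attr_in_progress = False

    def on_coordinator_update(self, new_version: str) -> None:
        self.installed_version = new_version
        if self._update_ongoing and new_version != self._version_before_update:
            self._update_ongoing = False
            self._version_before_update = None


entity = UpdateEntityStub("1.0.0")
entity.start_install()
entity.on_ws_job_done()            # WS done arrives before coordinator refresh
assert entity.in_progress          # still shown as installing
entity.on_coordinator_update("1.1.0")
assert not entity.in_progress      # cleared once the new version is confirmed
```

The diff removes this guard, reverting to the base class's behavior where `in_progress` follows `_attr_in_progress` alone.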
@@ -1,5 +1,4 @@
"""The Hisense AEH-W4A1 integration."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
import ipaddress
import logging
@@ -1,5 +1,4 @@
"""Pyaehw4a1 platform to control of Hisense AEH-W4A1 Climate Devices."""
# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern
from __future__ import annotations
@@ -219,8 +219,6 @@ class HiveOptionsFlowHandler(OptionsFlow):
schema = vol.Schema(
{
# Polling interval is user-configurable, which is no longer allowed
# pylint: disable-next=hass-config-flow-polling-field
vol.Optional(CONF_SCAN_INTERVAL, default=self.interval): vol.All(
vol.Coerce(int), vol.Range(min=30)
)
@@ -225,7 +225,7 @@ async def async_attach_trigger( # noqa: C901
elif (
new_state.domain == "sensor"
and new_state.attributes.get(ATTR_DEVICE_CLASS)
in (sensor.SensorDeviceClass.TIMESTAMP, sensor.SensorDeviceClass.UPTIME)
== sensor.SensorDeviceClass.TIMESTAMP
and new_state.state not in (STATE_UNAVAILABLE, STATE_UNKNOWN)
):
trigger_dt = dt_util.parse_datetime(new_state.state)
+8 -65
@@ -1,11 +1,11 @@
"""The Homee lock platform."""
from typing import TYPE_CHECKING, Any
from typing import Any
from pyHomee.const import AttributeChangedBy, AttributeType
from pyHomee.model import HomeeAttribute, HomeeNode
from pyHomee.model import HomeeNode
from homeassistant.components.lock import LockEntity, LockEntityFeature
from homeassistant.components.lock import LockEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -15,24 +15,6 @@ from .helpers import get_name_for_enum, setup_homee_platform
PARALLEL_UPDATES = 0
LOCK_STATE_UNLOCKED = 0.0
LOCK_STATE_LOCKED = 1.0
def _determine_lock_state_open(attribute: HomeeAttribute) -> float | None:
"""Return the attribute value that momentarily unlatches the lock.
Different homee-compatible locks encode the "open" (unlatch) command
differently. The Hörmann SmartKey uses a signed range {-1, 0, 1}
where -1 is unlatch; other devices extend above with {0, 1, 2}.
Returns None when the device only supports two states.
"""
if attribute.maximum == 2.0:
return 2.0
if attribute.minimum == -1.0:
return -1.0
return None
async def add_lock_entities(
config_entry: HomeeConfigEntry,
@@ -63,53 +45,20 @@ class HomeeLock(HomeeEntity, LockEntity):
_attr_name = None
def __init__(self, attribute: HomeeAttribute, entry: HomeeConfigEntry) -> None:
"""Initialize the homee lock."""
super().__init__(attribute, entry)
self._lock_state_open = _determine_lock_state_open(attribute)
if self._lock_state_open is not None:
self._attr_supported_features = LockEntityFeature.OPEN
@property
def is_locked(self) -> bool:
"""Return if lock is locked."""
return self._attribute.current_value == LOCK_STATE_LOCKED
@property
def is_open(self) -> bool:
"""Return if lock is open (unlatched)."""
# Require target_value too, so mid-transition away from "open" resolves
# to is_locking/is_unlocking rather than OPEN (HA state precedence).
return (
self._lock_state_open is not None
and self._attribute.current_value == self._lock_state_open
and self._attribute.target_value == self._lock_state_open
)
return self._attribute.current_value == 1.0
@property
def is_locking(self) -> bool:
"""Return if lock is locking."""
return (
self._attribute.target_value == LOCK_STATE_LOCKED
and self._attribute.current_value != LOCK_STATE_LOCKED
)
return self._attribute.target_value > self._attribute.current_value
@property
def is_unlocking(self) -> bool:
"""Return if lock is unlocking."""
return (
self._attribute.target_value == LOCK_STATE_UNLOCKED
and self._attribute.current_value != LOCK_STATE_UNLOCKED
)
@property
def is_opening(self) -> bool:
"""Return if lock is opening (unlatching)."""
return (
self._lock_state_open is not None
and self._attribute.target_value == self._lock_state_open
and self._attribute.current_value != self._lock_state_open
)
return self._attribute.target_value < self._attribute.current_value
@property
def changed_by(self) -> str:
@@ -131,14 +80,8 @@ class HomeeLock(HomeeEntity, LockEntity):
async def async_lock(self, **kwargs: Any) -> None:
"""Lock specified lock. A code to lock the lock with may be specified."""
await self.async_set_homee_value(LOCK_STATE_LOCKED)
await self.async_set_homee_value(1)
async def async_unlock(self, **kwargs: Any) -> None:
"""Unlock specified lock. A code to unlock the lock with may be specified."""
await self.async_set_homee_value(LOCK_STATE_UNLOCKED)
async def async_open(self, **kwargs: Any) -> None:
"""Open (unlatch) the lock."""
if TYPE_CHECKING:
assert self._lock_state_open is not None
await self.async_set_homee_value(self._lock_state_open)
await self.async_set_homee_value(0)
@@ -1,20 +0,0 @@
"""The Honeywell String Lights integration."""
from __future__ import annotations
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
PLATFORMS: list[Platform] = [Platform.LIGHT]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Honeywell String Lights from a config entry."""
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -1,61 +0,0 @@
"""Config flow for the Honeywell String Lights integration."""
from __future__ import annotations
from typing import Any
from rf_protocols import RadioFrequencyCommand
import voluptuous as vol
from homeassistant.components.radio_frequency import async_get_transmitters
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er, selector
from .const import CONF_TRANSMITTER, DOMAIN
from .light import COMMANDS
class HoneywellStringLightsConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Honeywell String Lights."""
VERSION = 1
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
sample_command: RadioFrequencyCommand = await self.hass.async_add_executor_job(
COMMANDS.load_command, "turn_on"
)
try:
transmitters = async_get_transmitters(
self.hass, sample_command.frequency, sample_command.modulation
)
except HomeAssistantError:
return self.async_abort(reason="no_transmitters")
if not transmitters:
return self.async_abort(reason="no_compatible_transmitters")
if user_input is not None:
registry = er.async_get(self.hass)
entity_entry = registry.async_get(user_input[CONF_TRANSMITTER])
assert entity_entry is not None
await self.async_set_unique_id(entity_entry.id)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title="Honeywell String Lights",
data={CONF_TRANSMITTER: entity_entry.id},
)
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_TRANSMITTER): selector.EntitySelector(
selector.EntitySelectorConfig(include_entities=transmitters),
),
}
),
)
@@ -1,9 +0,0 @@
"""Constants for the Honeywell String Lights integration."""
from __future__ import annotations
from typing import Final
DOMAIN: Final = "honeywell_string_lights"
CONF_TRANSMITTER: Final = "transmitter"
