mirror of https://github.com/home-assistant/core.git (synced 2026-05-13 06:23:59 +00:00)
Compare commits
10 Commits
| SHA1 |
|---|
| a39df97563 |
| f0445a792d |
| 24e3842319 |
| 54aae2c7de |
| ea3e8cf9b0 |
| a16f6f965e |
| d772320f06 |
| 8a74b41db5 |
| fddc6aaf38 |
| fab59d7a13 |
@@ -27,12 +27,13 @@ description: Reviews GitHub pull requests and provides feedback comments. This i
 - No need to highlight things that are already good.

 ## Output format:

-- List specific comments for each file/line that needs attention
+- List specific comments for each file/line that needs attention.
 - In the end, summarize with an overall assessment (approve, request changes, or comment) and bullet point list of changes suggested, if any.
 - Example output:
 ```
 Overall assessment: request changes.
-- [CRITICAL] Memory leak in homeassistant/components/sensor/my_sensor.py:143
-- [PROBLEM] Inefficient algorithm in homeassistant/helpers/data_processing.py:87
-- [SUGGESTION] Improve variable naming in homeassistant/helpers/config_validation.py:45
+- [CRITICAL] sensor.py:143 - Memory leak
+- [PROBLEM] data_processing.py:87 - Inefficient algorithm
+- [SUGGESTION] test_init.py:45 - Improve x variable name
 ```
 - Make sure to include the file and line number when possible in the bullet points.
@@ -1,5 +1,5 @@
 ---
-name: Home Assistant Integration knowledge
+name: ha-integration-knowledge
 description: Everything you need to know to build, test and review Home Assistant Integrations. If you're looking at an integration, you must use this as your primary reference.
 ---
@@ -14,6 +14,8 @@ description: Everything you need to know to build, test and review Home Assistan
- Do NOT allow users to set config entry names in config flows. Names are automatically generated or can be customized later in UI. Exception: helper integrations may allow custom names.
- For entity actions and entity services, avoid requesting redundant defensive checks for fields already enforced by Home Assistant validation schemas and entity filters; only request extra guards when values bypass validation or are transformed unsafely.
- When validation guarantees a key is present, prefer direct dictionary indexing (`data["key"]`) over `.get("key")` so invalid assumptions fail fast (see the sketch below).
- Integrations should be thin wrappers. Protocol parsing, device state machines, or other domain logic belong in a separate PyPI library, not in the integration itself. If unsure, ask before inlining.
- "potato" is a forbidden word for an integration and should never be used.

The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
@@ -23,4 +23,3 @@ requirements_all.txt linguist-generated=true
 requirements_test_all.txt linguist-generated=true
 requirements_test_pre_commit.txt linguist-generated=true
 script/hassfest/docker/Dockerfile linguist-generated=true
-.github/workflows/*.lock.yml linguist-generated=true merge=ours
@@ -38,4 +38,4 @@ When validation guarantees a dict key exists, prefer direct key access (`data["k

 # Skills

-- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md
+- ha-integration-knowledge: .claude/skills/ha-integration-knowledge/SKILL.md
@@ -1,402 +0,0 @@
---
on:
  pull_request:
    types: [opened, synchronize, reopened]
    paths:
      - "requirements*.txt"
      - "homeassistant/package_constraints.txt"
      - "pyproject.toml"
    forks: ["*"]
  workflow_dispatch:
    inputs:
      pull_request_number:
        description: "Pull request number to (re-)check"
        required: true
        type: number
permissions:
  contents: read
  pull-requests: read
  issues: read
network:
  allowed:
    - python
tools:
  web-fetch: {}
  github:
    toolsets: [default]
safe-outputs:
  add-comment:
    max: 1
description: >
  Checks changed Python package requirements on PRs targeting the core repo
  (including fork PRs): verifies licenses match PyPI metadata, source
  repositories are publicly accessible, PyPI releases were uploaded via
  automated CI (Trusted Publisher attestation), the package's release pipeline
  uses OIDC or equivalent automated credentials (not static tokens), and the PR
  description contains the required links.
---

# Requirements License and Availability Check

You are a code review assistant for the Home Assistant project. Your job is to
review changes to Python package requirements and verify they meet the project's
standards.

## Context

- Home Assistant uses `requirements_all.txt` (all integration packages),
  `requirements.txt` (core packages), `requirements_test.txt` (test
  dependencies), and `requirements_test_all.txt` (all test dependencies) to
  declare Python dependencies.
- Each integration lists its packages in `homeassistant/components/<name>/manifest.json`
  under the `requirements` field.
- Allowed licenses are maintained in `script/licenses.py` under
  `OSI_APPROVED_LICENSES_SPDX` (SPDX identifiers) and `OSI_APPROVED_LICENSES`
  (classifier strings).

## Step 1 — Identify Changed Packages

Use the GitHub tool to fetch the PR diff. Look for lines that were added (`+`)
or removed (`-`) in **all** of these files:

- `requirements.txt`
- `requirements_all.txt`
- `requirements_test.txt`
- `requirements_test_all.txt`
- `homeassistant/package_constraints.txt`
- `pyproject.toml`

For each changed line that contains a package pin (e.g. `SomePackage==1.2.3`),
classify it as:

- **New package**: the package name appears only in `+` lines, with no
  corresponding `-` line for the same package name.
- **Version bump**: the same package name appears in both `+` lines (new
  version) and `-` lines (old version), with different version numbers.

Record the **old version** and **new version** for every version bump — you
will need these values in Step 4.

Ignore comment lines (starting with `#`), lines that start with `-r ` (file
includes), and lines that don't contain `==`.
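
A sketch of this classification, assuming the added/removed requirement lines
have already been collected from the diff as plain strings (the helper and its
names are hypothetical):

```python
import re

# Matches pins like "SomePackage==1.2.3" or "pkg[extra]==2.0".
PIN = re.compile(r"^([A-Za-z0-9._-]+(?:\[[^]]*\])?)==(\S+)")

def classify(added: list[str], removed: list[str]) -> dict[str, dict]:
    """Classify changed requirement pins as new packages or version bumps."""

    def pins(lines: list[str]) -> dict[str, str]:
        out: dict[str, str] = {}
        for line in lines:
            line = line.strip()
            if line.startswith("#") or line.startswith("-r "):
                continue  # comments and file includes are ignored
            if m := PIN.match(line):
                out[m.group(1).lower()] = m.group(2)
        return out

    new, old = pins(added), pins(removed)
    result: dict[str, dict] = {}
    for name, version in new.items():
        if name in old:
            result[name] = {"type": "bump", "old": old[name], "new": version}
        else:
            result[name] = {"type": "new", "new": version}
    return result
```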

## Step 2 — Check License via PyPI

For each new or bumped package:

1. Fetch `https://pypi.org/pypi/{package_name}/json` (use the exact
   package name as it appears on PyPI).
2. From the JSON response, extract:
   - `info.license` — free-text license field
   - `info.license_expression` — SPDX expression (if present)
   - `info.classifiers` — filter for entries starting with `"License ::"`.
3. Determine if the license is in the approved list from `script/licenses.py`:
   - SPDX identifiers: compare against `OSI_APPROVED_LICENSES_SPDX`
   - Classifier strings: compare against `OSI_APPROVED_LICENSES`
4. Flag a package as ❌ if the license is unknown, missing, or not in the
   approved list. Flag as ⚠️ if the license information is ambiguous or cannot
   be definitively determined.
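
A sketch of the license lookup; the two approved-license sets here are small
stand-ins for the real lists in `script/licenses.py`, while the JSON fields are
the ones named above:

```python
import json
import urllib.request

OSI_APPROVED_LICENSES_SPDX = {"MIT", "Apache-2.0", "BSD-3-Clause"}  # stand-in
OSI_APPROVED_LICENSES = {"License :: OSI Approved :: MIT License"}  # stand-in

def license_status(package: str) -> str:
    """Return "ok", "warn", or "fail" for a package's PyPI license metadata."""
    with urllib.request.urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
        info = json.load(resp)["info"]
    expr = info.get("license_expression")
    if expr and expr in OSI_APPROVED_LICENSES_SPDX:
        return "ok"
    classifiers = [c for c in info.get("classifiers", []) if c.startswith("License ::")]
    if any(c in OSI_APPROVED_LICENSES for c in classifiers):
        return "ok"
    if not expr and not classifiers and not info.get("license"):
        return "fail"  # no license information at all
    return "warn"  # license present but not definitively matched
```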

## Step 2b — Verify PyPI Release Was Uploaded by CI

For each new or bumped package, verify that the release on PyPI was published
automatically by a CI pipeline (via OIDC Trusted Publisher), not uploaded
manually.

1. Fetch the PyPI JSON for the specific version being introduced or bumped:
   `https://pypi.org/pypi/{package_name}/{version}/json`
2. Inspect the `urls` array in the response. For each distribution file (wheel
   or sdist), note the filename.
3. For each filename, attempt to fetch the PyPI provenance attestation:
   `https://pypi.org/integrity/{package_name}/{version}/{filename}/provenance`
   - If the response is HTTP 200 and contains a valid attestation object,
     inspect `attestation_bundles[*].publisher`. A Trusted Publisher attestation
     will have a `kind` identifying the CI system (e.g. `"GitHub Actions"`,
     `"GitLab"`) and a `repository` or `project` field matching the source
     repository.
   - If at least one distribution file has a valid Trusted Publisher attestation,
     mark ✅ CI-uploaded.
   - If no attestation is found for any file (404 for all), mark ❌ — "Release
     has no provenance attestation; it may have been uploaded manually".
   - If an attestation exists but the `publisher` does not identify a recognized
     CI system or Trusted Publisher, mark ⚠️ — "Attestation present but
     publisher cannot be verified as automated CI".

Note: if PyPI returns an error fetching the per-version JSON, fall back to the
latest JSON (`https://pypi.org/pypi/{package_name}/json`) and look up the
specific version in the `releases` dict.
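
A sketch of the attestation probe; the integrity URL and the
`attestation_bundles`/`publisher` fields are quoted from the step above, and
the accepted `kind` values are an assumption:

```python
import json
import urllib.error
import urllib.request

def provenance_status(package: str, version: str) -> str:
    """Return "ok", "warn", or "fail" for a release's provenance attestation."""
    with urllib.request.urlopen(f"https://pypi.org/pypi/{package}/{version}/json") as resp:
        files = [f["filename"] for f in json.load(resp)["urls"]]
    found_any = False
    for filename in files:
        url = f"https://pypi.org/integrity/{package}/{version}/{filename}/provenance"
        try:
            with urllib.request.urlopen(url) as resp:
                bundles = json.load(resp).get("attestation_bundles", [])
        except urllib.error.HTTPError:
            continue  # 404: no attestation for this file
        for bundle in bundles:
            found_any = True
            kind = bundle.get("publisher", {}).get("kind", "")
            if kind in ("GitHub", "GitHub Actions", "GitLab"):
                return "ok"  # Trusted Publisher attestation from a known CI
    return "warn" if found_any else "fail"
```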

## Step 3 — Check Repository Availability

For each new or bumped package:

1. From the PyPI JSON at `info.project_urls`, find the source repository URL
   (keys such as `"Source"`, `"Homepage"`, `"Repository"`, or `"Source Code"`).
2. Use web-fetch to perform a GET request to the repository URL.
3. If the response returns HTTP 200 and the page is publicly accessible, mark ✅.
4. If the URL is missing, returns a non-200 status, or redirects to a login
   page, mark ❌ with a note that the repository could not be verified as public.
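
A small sketch of the URL selection in step 1, assuming the `project_urls`
dict from the PyPI JSON (key spellings vary between projects, hence the
ordered candidates):

```python
def source_repo_url(project_urls: dict[str, str] | None) -> str | None:
    """Pick the most likely source repository URL from PyPI metadata."""
    if not project_urls:
        return None
    for key in ("source", "source code", "repository", "homepage"):
        for name, url in project_urls.items():
            if name.lower() == key:
                return url
    return None
```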

## Step 4 — Check PR Description

Read the PR body from the GitHub API using the PR number `${{ github.event.pull_request.number }}`.
Extract all URLs present in the PR body.

### 4a — New packages: repository link required

For **new packages** (brand-new dependency not previously in any requirements
file): the PR description must contain a link that points to the package's
**source repository** as identified in Step 3 (the URL recorded from
`info.project_urls`). A PyPI page link alone is **not** acceptable — the link
must point directly to the source repository (e.g. a GitHub or GitLab URL).

- If a URL in the PR body matches (or is a sub-path of) the source repository
  URL identified via PyPI, mark ✅.
- If the PR body contains a source repository URL that does **not** match the
  repository URL found in the package's PyPI metadata (`info.project_urls`),
  mark ❌ — "PR description links to `<pr_url>` but PyPI reports the source
  repository as `<pypi_repo_url>`; please use the correct repository URL."
- If no source repository URL is present in the PR body at all, mark ❌ —
  "PR description must link to the source repository at `<repo_url>` (found
  via PyPI). A PyPI page link is not sufficient."

### 4b — Version bumps: changelog or diff link required

For **version bumps**: the PR description must contain a link to a changelog,
release notes page, or a diff/comparison URL that references the **correct
versions** being bumped (old → new).

Checks to perform for each bumped package (old version = X, new version = Y):

1. Extract all URLs from the PR body that contain the repository's domain or
   path (as identified in Step 3).
2. Verify that at least one such URL includes both the old version string and
   new version string in some form — e.g. a GitHub compare URL like
   `compare/vX...vY`, a releases URL mentioning version Y, or a
   `CHANGELOG.md` anchor referencing Y.
3. If no URL matches, check if the PR body contains any changelog/diff link at
   all for this package.

Outcome:

- ✅ — a URL pointing to the correct repo with version references covering the
  exact bump (X → Y).
- ⚠️ — a changelog/diff link exists but does not clearly reference the correct
  versions or the correct repository; explain what was found and what is
  expected.
- ❌ — no changelog or diff link found at all in the PR description for this
  package.
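
A sketch of the 4b matching, assuming the PR-body URLs and the Step 3
repository URL are already in hand (helper name hypothetical):

```python
def bump_link_status(body_urls: list[str], repo_url: str, old: str, new: str) -> str:
    """Classify the changelog/diff link for one version bump (old=X, new=Y)."""
    repo_urls = [u for u in body_urls if repo_url.rstrip("/") in u]
    for url in repo_urls:
        if old in url and new in url:  # e.g. .../compare/v1.2.3...v1.3.0
            return "ok"
        if new in url:  # e.g. a releases page or CHANGELOG anchor mentioning Y
            return "ok"
    return "warn" if repo_urls else "fail"
```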

### 4c — Diff consistency check

For each **version bump**, verify that the version change recorded in the diff
(Step 1) is internally consistent:

- The `-` line must contain the old version and the `+` line must contain the
  new version for the same package name.
- Flag ❌ if the diff shows a downgrade (new version < old version) without an
  explanation, or if the version strings cannot be parsed.
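
Downgrades have to be detected semantically, not by string comparison
("2.10.0" sorts before "2.9.0" as a string). A sketch using the `packaging`
library, which the workflow text itself does not name:

```python
from packaging.version import InvalidVersion, Version

def diff_consistency(old: str, new: str) -> str:
    """Flag unparsable versions and unexplained downgrades."""
    try:
        old_v, new_v = Version(old), Version(new)
    except InvalidVersion:
        return "fail"  # version strings cannot be parsed
    if new_v < old_v:
        return "fail"  # downgrade without an explanation
    return "ok"
```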

## Step 5 — Verify Source Repository is Publicly Accessible

Before inspecting the release pipeline, confirm that the source repository
identified in Step 3 is publicly reachable.

For each new or bumped package:

1. Use the source repository URL recorded in Step 3.
2. If no repository URL was found in `info.project_urls`, mark ❌ — "No source
   repository URL found in PyPI metadata; a public source repository is
   required."
3. If a repository URL was found, perform a GET request to that URL (using
   web-fetch). If the response is HTTP 200 and returns a publicly accessible
   page (not a login redirect or error page), mark ✅.
4. If the response is non-200, the URL redirects to a login/authentication page,
   or the repository appears private or unavailable, mark ❌ — "Source
   repository at `<repo_url>` is not publicly accessible. Home Assistant
   requires all dependencies to have publicly available source code." **Do not
   proceed with the release pipeline check (Step 6) for this package.**

## Step 6 — Check Release Pipeline Sanity

For each new or bumped package, determine the source repository host from the
URL identified in Step 3, then inspect whether the project's release/publish CI
workflow is sane. The checks differ by hosting provider.

### GitHub repositories (`github.com`)

1. Using the GitHub API, list the workflows in the source repository:
   `GET /repos/{owner}/{repo}/actions/workflows`
2. Identify any workflow whose name or filename suggests publishing to PyPI
   (e.g., contains "release", "publish", "pypi", or "deploy").
3. Fetch the workflow file content and check the following:
   a. **Trigger sanity**: The publish job should be triggered by `push` to tags,
      `release: published`, or `workflow_run` on a release job — **not** solely
      by `workflow_dispatch` with no additional guards. A `workflow_dispatch`
      trigger alongside other triggers is acceptable. Mark ❌ if the only trigger
      is manual `workflow_dispatch` with no environment protection rules.
   b. **OIDC / Trusted Publisher**: The workflow should use OIDC-based publishing.
      Look for `id-token: write` permission and one of:
      - `pypa/gh-action-pypi-publish` action
      - `actions/attest-build-provenance` action
      - Any step that sets `TWINE_PASSWORD` from `secrets.PYPI_TOKEN` directly
        (flag ❌ if a long-lived API token is used instead of OIDC).
      Mark ✅ if OIDC is used, ⚠️ if the publish method cannot be determined,
      ❌ if a static secret token is the only credential.
   c. **No manual upload bypass**: Verify there is no step that calls
      `twine upload` or `pip upload` outside of a properly gated job (e.g., one
      that requires an environment approval). Flag ⚠️ if such steps exist.
4. If no publish workflow is found in the repository, mark ⚠️ — "No publish
   workflow found; it is unclear how this package is released to PyPI."
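
A rough sketch of the trigger and credential heuristics in step 3, operating
on the raw workflow text (plain substring checks stand in for real YAML
parsing; the thresholds here are assumptions):

```python
def workflow_findings(workflow_text: str) -> dict[str, str]:
    """Heuristic sanity checks on a publish workflow's YAML source."""
    findings: dict[str, str] = {}

    has_auto = "release:" in workflow_text or "tags:" in workflow_text
    only_manual = "workflow_dispatch" in workflow_text and not has_auto
    findings["trigger"] = "fail" if only_manual else "ok"

    uses_oidc = "id-token: write" in workflow_text and (
        "pypa/gh-action-pypi-publish" in workflow_text
        or "actions/attest-build-provenance" in workflow_text
    )
    static_token = "TWINE_PASSWORD" in workflow_text and "secrets." in workflow_text
    if uses_oidc:
        findings["credentials"] = "ok"
    elif static_token:
        findings["credentials"] = "fail"  # long-lived token instead of OIDC
    else:
        findings["credentials"] = "warn"  # publish method cannot be determined
    return findings
```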

### GitLab repositories (`gitlab.com` or self-hosted GitLab)

1. Use the GitLab REST API to list CI/CD pipeline configuration files. First
   resolve the project ID via
   `GET https://gitlab.com/api/v4/projects/{url-encoded-namespace-and-name}`
   and note the `id` field.
2. Fetch the repository's `.gitlab-ci.yml` (and any included files) using
   `GET https://gitlab.com/api/v4/projects/{id}/repository/files/.gitlab-ci.yml/raw?ref=HEAD`
   (use web-fetch for public repos).
3. Identify any job whose name or `stage` suggests publishing to PyPI
   (e.g., "publish", "deploy", "release", "pypi").
4. For each such job, check:
   a. **Trigger sanity**: The job should run only on tag pipelines (`only: tags`
      or `rules: - if: $CI_COMMIT_TAG`) or on protected branches — **not**
      solely on manual triggers (`when: manual`) with no additional protection.
      Mark ❌ if the only trigger is manual with no environment or protected-branch
      guard.
   b. **Automated credentials**: The job should use GitLab's OIDC ID token
      (`id_tokens:` block) with an equivalent of `pypa/gh-action-pypi-publish`,
      or reference `secrets.PYPI_TOKEN` / `$PYPI_TOKEN` injected from GitLab CI/CD
      protected variables (flag ❌ if the token is hard-coded or unprotected).
      Mark ✅ if OIDC or protected CI variables are used, ⚠️ if the method
      cannot be determined, ❌ if credentials appear to be insecure.
   c. **No manual upload bypass**: Flag ⚠️ if any job calls `twine upload`
      without being behind a protected-variable or environment guard.
5. If no publish job is found, mark ⚠️ — "No publish job found in .gitlab-ci.yml;
   it is unclear how this package is released to PyPI."
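
A sketch of the GitLab lookup in steps 1-2; both endpoints are quoted from the
list above, and `urllib.parse.quote` supplies the URL-encoded namespace:

```python
import json
import urllib.parse
import urllib.request

def fetch_gitlab_ci(namespace_and_name: str) -> str:
    """Resolve a GitLab project ID and return its .gitlab-ci.yml content."""
    encoded = urllib.parse.quote(namespace_and_name, safe="")
    with urllib.request.urlopen(
        f"https://gitlab.com/api/v4/projects/{encoded}"
    ) as resp:
        project_id = json.load(resp)["id"]
    raw_url = (
        f"https://gitlab.com/api/v4/projects/{project_id}"
        "/repository/files/.gitlab-ci.yml/raw?ref=HEAD"
    )
    with urllib.request.urlopen(raw_url) as resp:
        return resp.read().decode()
```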

### Other code hosting providers

For repositories hosted on platforms other than GitHub or GitLab (e.g.,
Bitbucket, Codeberg, Gitea, Sourcehut):

1. Use web-fetch to retrieve the repository's root page and look for any
   publicly visible CI configuration files (e.g., `.circleci/config.yml`,
   `Jenkinsfile`, `azure-pipelines.yml`, `bitbucket-pipelines.yml`,
   `.builds/*.yml` for Sourcehut).
2. Apply the same conceptual checks as above:
   - Does publishing run on automated triggers (tags/releases), not solely
     manual ones?
   - Are credentials injected by the CI system (not hard-coded)?
   - Is there a `twine upload` or equivalent step that could be run manually?
3. If no CI configuration can be retrieved, mark ⚠️ — "Release pipeline could
   not be inspected; hosting provider is not GitHub or GitLab."

## Step 7 — Post a Review Comment

**Always** post a review comment using `add-comment`, regardless of whether
packages pass or fail. Use the following structure:

> **Note on deduplication**: The workflow automatically updates any previous
> requirements-check comment on the PR in place (preserving its position in the
> thread). If no previous comment exists, the newly created comment is kept as-is.
> You do not need to search for or update previous comments yourself.

### Comment structure

Begin every comment with the HTML marker `<!-- requirements-check -->` on its
own line (this is used by the workflow to find the previous comment and update
it on the next run).

### 7a — Overall summary line

Begin the comment with a single summary line, before anything else:

- If everything passed: `All requirements checks passed. ✅`
- If there are failures or warnings: `⚠️ Some checks require attention — see the details below.`

### 7b — Summary table

Render a compact table where every check column contains **only the status
icon** (✅, ⚠️, or ❌). No explanatory text belongs inside the table cells —
all detail goes in the per-package sections below.

Use `—` (em dash) when a check was skipped (e.g. Release Pipeline is skipped
when the repository is not publicly accessible).

```
<!-- requirements-check -->
## Requirements Check

| Package | Type | Old→New | License | Repo Public | CI Upload | Release Pipeline | PR Link | Diff Consistent |
|---------|------|---------|---------|-------------|-----------|------------------|---------|-----------------|
| PackageA | bump | 1.2.3→1.3.0 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| PackageB | new | —→4.5.6 | ❌ | ✅ | ❌ | ⚠️ | ❌ | ✅ |
| PackageC | bump | 2.0.0→2.1.0 | ✅ | ❌ | — | — | ⚠️ | ✅ |
```

### 7c — Per-package detail sections

After the table, add one collapsible `<details>` block per package.

- If **all checks passed** for that package, render the block **collapsed**
  (no `open` attribute) so the comment stays concise.
- If **any check failed or produced a warning**, render the block **open**
  (`<details open>`) so the contributor sees the issues immediately.

Each block must include the full detail for every check: the license found, the
repository URL, whether a provenance attestation was found, the release
pipeline findings, the PR link found (or missing), and whether the diff is
consistent. For failed or warned checks, explain exactly what the contributor
must fix, including the expected source repository URL, expected version range,
etc.

Template (repeat for each package):

```
<details open>
<summary><strong>PackageB 📦 new —→4.5.6</strong></summary>

- **License**: ❌ License is `UNKNOWN` — not in the approved list. Check PyPI metadata and `script/licenses.py`.
- **Repository Public**: ✅ https://github.com/example/packageb is publicly accessible.
- **CI Upload**: ❌ No provenance attestation found for any distribution file. The release may have been uploaded manually.
- **Release Pipeline**: ⚠️ No publish workflow found in the repository; it is unclear how this package is released to PyPI.
- **PR Link**: ❌ PR description must link to the source repository at https://github.com/example/packageb (a PyPI page link is not sufficient).
- **Diff Consistent**: ✅

</details>
```

Collapsed example (all checks passed):

```
<details>
<summary><strong>PackageA 📦 bump 1.2.3→1.3.0</strong></summary>

- **License**: ✅ MIT
- **Repository Public**: ✅ https://github.com/example/packagea
- **CI Upload**: ✅ Trusted Publisher attestation found (GitHub Actions).
- **Release Pipeline**: ✅ OIDC via `pypa/gh-action-pypi-publish`; triggered on `release: published`; `environment: release` gate.
- **PR Link**: ✅ https://github.com/example/packagea/compare/v1.2.3...v1.3.0
- **Diff Consistent**: ✅

</details>
```

## Notes

- Be constructive and helpful. Provide direct links where possible so the
  contributor can quickly fix the issue.
- If PyPI returns an error for a package, mention that it could not be found and
  suggest the contributor verify the package name.
- For packages that only appear in `homeassistant/package_constraints.txt` or
  `pyproject.toml` without being tied to a specific integration, the PR
  description link requirement still applies.
- When checking test-only packages (from `requirements_test.txt` or
  `requirements_test_all.txt`), apply the same license, repository, and PR
  description checks as for production dependencies.
- A package that appears in both a production file and a test file should only
  be reported once; use the production file entry as the canonical one.
- This workflow is only triggered when a commit actually changes one of the
  tracked requirements files (for `synchronize` events GitHub compares the
  before/after SHAs of the push, not the entire PR diff). Members can manually
  retrigger the workflow via `workflow_dispatch` with the PR number to re-run
  the check after updating the PR description or fixing issues without changing
  any requirements files. On a retrigger the existing comment is updated in
  place so there is always exactly one requirements-check comment in the PR.

@@ -23,7 +23,6 @@ repos:
      - id: zizmor
        args:
          - --pedantic
-        exclude: ^\.github/workflows/check-requirements\.lock\.yml$
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v6.0.0
    hooks:

@@ -47,7 +46,6 @@ repos:
        additional_dependencies:
          - prettier@3.6.2
          - prettier-plugin-sort-json@4.2.0
-        exclude: ^\.github/workflows/check-requirements\.lock\.yml$
  - repo: https://github.com/cdce8p/python-typing-update
    rev: v0.6.0
    hooks:
@@ -1,6 +1,5 @@
ignore: |
  tests/fixtures/core/config/yaml_errors/
-  .github/workflows/check-requirements.lock.yml
rules:
  braces:
    level: error
@@ -945,7 +945,10 @@ class PipelineRun:
        try:
            # Transcribe audio stream
            stt_vad: VoiceCommandSegmenter | None = None
-            if self.audio_settings.is_vad_enabled:
+            if (
+                self.audio_settings.is_vad_enabled
+                and self.stt_provider.audio_processing.requires_external_vad
+            ):
                stt_vad = VoiceCommandSegmenter(
                    silence_seconds=self.audio_settings.silence_seconds
                )
@@ -1,4 +1,5 @@
"""The Broadlink integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -34,6 +34,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Broadlink climate entities."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]

    if device.api.type in DOMAINS_AND_TYPES[Platform.CLIMATE]:
@@ -133,6 +133,8 @@ class BroadlinkDevice[_ApiT: blk.Device = blk.Device]:
        await coordinator.async_config_entry_first_refresh()

        self.update_manager = update_manager
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        self.hass.data[DOMAIN].devices[config.entry_id] = self
        self.reset_jobs.append(config.add_update_listener(self.async_update))

@@ -32,6 +32,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Broadlink light."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]
    lights = []

@@ -95,6 +95,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up a Broadlink remote."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]
    remote = BroadlinkRemote(
        device,

@@ -31,6 +31,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Broadlink select."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]
    async_add_entities([BroadlinkDayOfWeek(device)])

@@ -108,6 +108,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Broadlink sensor."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]
    sensor_data = device.update_manager.coordinator.data
    sensors = [

@@ -1,4 +1,5 @@
"""Support for Broadlink switches."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -22,6 +22,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Broadlink time."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN].devices[config_entry.entry_id]
    async_add_entities([BroadlinkTime(device)])

@@ -1,4 +1,5 @@
"""Component to embed Google Cast."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -65,6 +65,8 @@ class ChromecastInfo:
        """
        cast_info = self.cast_info
        if self.cast_info.cast_type is None or self.cast_info.manufacturer is None:
+            # Uses legacy hass.data[DOMAIN] pattern
+            # pylint: disable-next=hass-use-runtime-data
            unknown_models = hass.data[DOMAIN]["unknown_models"]
            if self.cast_info.model_name not in unknown_models:
                # Manufacturer and cast type is not available in mDNS data,

@@ -1,4 +1,5 @@
"""Provide functionality to interact with Cast devices on the network."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Data used by this integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Wrapper for media_source around async_upnp_client's DmsDevice ."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""The EARN-E P1 Meter integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -40,5 +40,7 @@ class DomainData:
    @cache
    def get(cls, hass: HomeAssistant) -> Self:
        """Get the global DomainData instance stored in hass.data."""
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        ret = hass.data[DOMAIN] = cls()
        return ret

@@ -1,4 +1,5 @@
"""The Flux LED/MagicLight integration discovery."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Support for Actions on Google Assistant Smart Home Control."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -21,6 +21,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the platform."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    yaml_config: ConfigType = hass.data[DOMAIN][DATA_CONFIG]
    google_config = config_entry.runtime_data
@@ -54,6 +54,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoogleMailConfigEntry) -
            Platform.NOTIFY,
            DOMAIN,
            {DATA_AUTH: auth, CONF_NAME: entry.title},
+            # Uses legacy hass.data[DOMAIN] pattern
+            # pylint: disable-next=hass-use-runtime-data
            hass.data[DOMAIN][DATA_HASS_CONFIG],
        )
    )

@@ -1,4 +1,5 @@
"""The Hisense AEH-W4A1 integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import ipaddress
import logging

@@ -1,4 +1,5 @@
"""Pyaehw4a1 platform to control of Hisense AEH-W4A1 Climate Devices."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Support for Huawei LTE routers."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -34,6 +34,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up from config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    entities: list[Entity] = []

@@ -28,6 +28,8 @@ async def async_setup_entry(
    async_add_entities: entity_platform.AddConfigEntryEntitiesCallback,
) -> None:
    """Set up Huawei LTE buttons."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    buttons = [
        ClearTrafficStatisticsButton(router),

@@ -58,6 +58,8 @@ async def async_setup_entry(
    # Grab hosts list once to examine whether the initial fetch has got some data for
    # us, i.e. if wlan host list is supported. Only set up a subscription and proceed
    # with adding and tracking entities if it is.
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    if (hosts := _get_hosts(router, True)) is None:
        return

@@ -27,6 +27,8 @@ async def async_get_service(
    if discovery_info is None:
        return None

+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[discovery_info[ATTR_CONFIG_ENTRY_ID]]
    default_targets = discovery_info[CONF_RECIPIENT] or []

@@ -40,6 +40,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up from config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    selects: list[Entity] = []
@@ -799,6 +799,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up from config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    sensors: list[Entity] = []
    for key in SENSOR_KEYS:

@@ -31,6 +31,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up from config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    router = hass.data[DOMAIN].routers[config_entry.entry_id]
    switches: list[Entity] = []

@@ -1,4 +1,5 @@
"""Support for INSTEON Modems (PLM and Hub)."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from contextlib import suppress
import logging

@@ -1,4 +1,5 @@
"""Native Home Assistant iOS app component."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import datetime
from http import HTTPStatus
@@ -88,6 +88,8 @@ async def async_unload_entry(
def async_add_defaults(hass: HomeAssistant, entry: KeeneticConfigEntry):
    """Populate default options."""
    host: str = entry.data[CONF_HOST]
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    imported_options: dict = hass.data[DOMAIN].get(f"imported_options_{host}", {})
    options = {
        CONF_SCAN_INTERVAL: DEFAULT_SCAN_INTERVAL,

@@ -1,4 +1,5 @@
"""Support for Konnected devices."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import copy
import hmac

@@ -24,6 +24,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up binary sensors attached to a Konnected device from a config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    data = hass.data[DOMAIN]
    device_id = config_entry.data["id"]
    sensors = [

@@ -1,4 +1,5 @@
"""Support for Konnected devices."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import asyncio
import logging

@@ -46,6 +46,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up sensors attached to a Konnected device from a config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    data = hass.data[DOMAIN]
    device_id = config_entry.data["id"]

@@ -1,4 +1,5 @@
"""Support for wired switches attached to a Konnected device."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import logging
from typing import Any

@@ -1,4 +1,5 @@
"""The kraken integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -149,6 +149,8 @@ async def async_setup_entry(
        entities.extend(
            [
                KrakenSensor(
+                    # Uses legacy hass.data[DOMAIN] pattern
+                    # pylint: disable-next=hass-use-runtime-data
                    hass.data[DOMAIN],
                    tracked_asset_pair,
                    description,

@@ -1,4 +1,5 @@
"""Support for LinkPlay devices."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from dataclasses import dataclass

@@ -1,4 +1,5 @@
"""Support for LinkPlay media players."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Utilities for the LinkPlay component."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from aiohttp import ClientSession
from linkplay.utils import async_create_unverified_client_session
@@ -113,6 +113,8 @@ async def handle_webhook(
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Configure based on config entry."""
    if DOMAIN not in hass.data:
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        hass.data[DOMAIN] = {"devices": set(), "unsub_device_tracker": {}}
    webhook.async_register(
        hass, DOMAIN, "Locative", entry.data[CONF_WEBHOOK_ID], handle_webhook

@@ -1,4 +1,5 @@
"""Support for the Locative platform."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from homeassistant.components.device_tracker import TrackerEntity
from homeassistant.config_entries import ConfigEntry

@@ -1,4 +1,5 @@
"""Support for Mailgun."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import hashlib
import hmac

@@ -44,6 +44,8 @@ def get_service(
    discovery_info: DiscoveryInfoType | None = None,
) -> MailgunNotificationService | None:
    """Get the Mailgun notification service."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    data = hass.data[DOMAIN]
    mailgun_service = MailgunNotificationService(
        data.get(CONF_DOMAIN),

@@ -1,4 +1,5 @@
"""The Matter integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -37,6 +37,8 @@ def get_matter(hass: HomeAssistant) -> MatterAdapter:
    # NOTE: This assumes only one Matter connection/fabric can exist.
    # Shall we support connecting to multiple servers in the client or by
    # config entries? In case of the config entry we need to fix this.
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    matter_entry_data: MatterEntryData = next(iter(hass.data[DOMAIN].values()))
    return matter_entry_data.adapter

@@ -1,4 +1,5 @@
"""Support for Meteo-France weather data."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import logging
@@ -58,6 +58,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    )

    await data_coordinator.async_config_entry_first_refresh()
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    hass.data[DOMAIN][conn_type][key] = data_coordinator

    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

@@ -1,4 +1,5 @@
"""Support for mill wifi-enabled home heaters."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from typing import Any

@@ -22,6 +22,8 @@ async def async_setup_entry(
) -> None:
    """Set up the Mill Number."""
    if entry.data.get(CONNECTION_TYPE) == CLOUD:
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        mill_data_coordinator: MillDataUpdateCoordinator = hass.data[DOMAIN][CLOUD][
            entry.data[CONF_USERNAME]
        ]

@@ -1,4 +1,5 @@
"""Support for mill wifi-enabled home heaters."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Integrates Native Apps to Home Assistant."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from contextlib import suppress
from functools import partial
@@ -110,6 +110,8 @@ class MobileAppEntity(RestoreEntity):
    def _apply_pending_update(self) -> None:
        """Restore any pending update for this entity."""
        entity_type = self._config[ATTR_SENSOR_TYPE]
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        pending_updates = self.hass.data[DOMAIN][DATA_PENDING_UPDATES][entity_type]
        if update := pending_updates.pop(self._attr_unique_id, None):
            _LOGGER.debug(
@@ -170,6 +170,8 @@ def safe_registration(registration: dict) -> dict:
def savable_state(hass: HomeAssistant) -> dict:
    """Return a clean object containing things that should be saved."""
    return {
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        DATA_DELETED_IDS: hass.data[DOMAIN][DATA_DELETED_IDS],
    }

@@ -1,4 +1,5 @@
"""Support for mobile_app push notifications."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Mobile app utility functions."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Webhook handlers for mobile_app."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Mobile app websocket API."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""The motion_blinds component."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import asyncio
import logging
@@ -15,6 +15,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    coordinator = MullvadCoordinator(hass, entry)
    await coordinator.async_config_entry_first_refresh()

+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    hass.data[DOMAIN] = coordinator

    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

@@ -29,6 +29,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Defer sensor setup to the shared sensor module."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    coordinator = hass.data[DOMAIN]

    async_add_entities(

@@ -1,4 +1,5 @@
"""Connect to a MySensors gateway via pymysensors API."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Handle MySensors devices."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -284,6 +284,8 @@ async def _gw_start(

    gateway.on_conn_made = gateway_connected
    # Don't use hass.async_create_task to avoid holding up setup indefinitely.
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    hass.data[DOMAIN][MYSENSORS_GATEWAY_START_TASK.format(entry.entry_id)] = (
        asyncio.create_task(gateway.start())
    ) # store the connect task so it can be cancelled in gw_stop

@@ -62,6 +62,8 @@ def discover_mysensors_node(
    hass: HomeAssistant, gateway_id: GatewayId, node_id: int
) -> None:
    """Discover a MySensors node."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    discovered_nodes = hass.data[DOMAIN].setdefault(
        MYSENSORS_DISCOVERED_NODES.format(gateway_id), set()
    )

@@ -230,6 +230,8 @@ async def async_setup_entry(
    """Add battery sensor for each MySensors node."""
    gateway_id = discovery_info[ATTR_GATEWAY_ID]
    node_id = discovery_info[ATTR_NODE_ID]
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    gateway: BaseAsyncGateway = hass.data[DOMAIN][MYSENSORS_GATEWAYS][gateway_id]
    async_add_entities([MyBatterySensor(gateway_id, gateway, node_id)])
@@ -61,6 +61,8 @@ MAX_WEBHOOK_RETRIES = 3

async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up the Netatmo component."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    hass.data[DOMAIN] = {
        DATA_PERSONS: {},
        DATA_DEVICE_IDS: {},

@@ -1,4 +1,5 @@
"""Support for the Netatmo cameras."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Support for Netatmo Smart thermostats."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""The Netatmo data handler."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations
@@ -140,6 +140,8 @@ class NetatmoRoomEntity(NetatmoDeviceEntity):
        if device := registry.async_get_device(
            identifiers={(DOMAIN, self.device.entity_id)}
        ):
+            # Uses legacy hass.data[DOMAIN] pattern
+            # pylint: disable-next=hass-use-runtime-data
            self.hass.data[DOMAIN][DATA_DEVICE_IDS][self.device.entity_id] = device.id

    @property

@@ -1,4 +1,5 @@
"""Netatmo Media Source Implementation."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""Support for the Netatmo climate schedule selector."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from __future__ import annotations

@@ -1,4 +1,5 @@
"""The Netatmo integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import logging
@@ -24,6 +24,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    )
    if coordinator is None:
        coordinator = NextBusDataUpdateCoordinator(hass, entry_agency)
+        # Uses legacy hass.data[DOMAIN] pattern
+        # pylint: disable-next=hass-use-runtime-data
        hass.data[DOMAIN][coordinator_key] = coordinator

    coordinator.add_stop_route(entry_stop, entry.data[CONF_ROUTE])

@@ -31,6 +31,8 @@ async def async_setup_entry(
    entry_stop = config.data[CONF_STOP]
    coordinator_key = f"{entry_agency}-{entry_stop}"

+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    coordinator: NextBusDataUpdateCoordinator = hass.data[DOMAIN].get(coordinator_key)

    async_add_entities(
@@ -147,6 +147,8 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
@callback
def _async_untrack_devices(hass: HomeAssistant, entry: ConfigEntry) -> None:
    """Remove tracking for devices owned by this config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    devices = hass.data[DOMAIN][NMAP_TRACKED_DEVICES]
    remove_mac_addresses = [
        mac_address

@@ -29,6 +29,8 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    station_response = await api_client.get_stations()
    if station_response is None:
        return False
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    hass.data[DOMAIN] = station_response.stations

    return True
@@ -65,11 +65,16 @@ class NtfyNotifyEntity(NtfyBaseEntity, NotifyEntity):
    _attr_supported_features = NotifyEntityFeature.TITLE

    async def async_send_message(self, message: str, title: str | None = None) -> None:
-        """Publish a message to a topic."""
-        await self.publish(message=message, title=title)
+        """Publish a message to a topic via notify.send_message action."""
+        await self._publish(message=message, title=title)

    async def publish(self, **kwargs: Any) -> None:
-        """Publish a message to a topic."""
+        """Publish a message to a topic via ntfy.publish action."""
+        await self._publish(**kwargs)
+        self._async_record_notification()
+
+    async def _publish(self, **kwargs: Any) -> None:
+        """Shared internal helper to publish a message to a topic."""
        attachment = None
        params: dict[str, Any] = kwargs
        delay: timedelta | None = params.get("delay")

@@ -1,4 +1,5 @@
"""The ONVIF integration."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

import asyncio
from contextlib import AsyncExitStack, suppress

@@ -26,6 +26,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up ONVIF binary sensor platform."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device: ONVIFDevice = hass.data[DOMAIN][config_entry.unique_id]

    events = device.events.get_platform("binary_sensor")

@@ -17,6 +17,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up ONVIF button based on a config entry."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN][config_entry.unique_id]
    async_add_entities([RebootButton(device), SetSystemDateAndTimeButton(device)])
@@ -86,6 +86,8 @@ async def async_setup_entry(
        "async_perform_ptz",
    )

+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN][config_entry.unique_id]
    async_add_entities(
        [ONVIFCameraEntity(device, profile) for profile in device.profiles]

@@ -25,6 +25,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up ONVIF sensor platform."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device: ONVIFDevice = hass.data[DOMAIN][config_entry.unique_id]

    events = device.events.get_platform("sensor")

@@ -69,6 +69,8 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up a ONVIF switch platform."""
+    # Uses legacy hass.data[DOMAIN] pattern
+    # pylint: disable-next=hass-use-runtime-data
    device = hass.data[DOMAIN][config_entry.unique_id]

    async_add_entities(

@@ -1,4 +1,5 @@
"""Support for OwnTracks."""
+# pylint: disable=hass-use-runtime-data # Uses legacy hass.data[DOMAIN] pattern

from collections import defaultdict
import json
Some files were not shown because too many files have changed in this diff.