Compare commits

..

1 Commits

Author SHA1 Message Date
Josef Zweck
aeebf67575 Revert "Add device trackers to enabled_by_default fixture (#134446)"
This reverts commit a47fa08a9b.
2025-01-09 13:30:23 +01:00
10861 changed files with 158661 additions and 647330 deletions

View File

@@ -62,7 +62,7 @@
"json.schemas": [ "json.schemas": [
{ {
"fileMatch": ["homeassistant/components/*/manifest.json"], "fileMatch": ["homeassistant/components/*/manifest.json"],
"url": "${containerWorkspaceFolder}/script/json_schemas/manifest_schema.json" "url": "./script/json_schemas/manifest_schema.json"
} }
] ]
} }

11
.gitattributes vendored
View File

@@ -11,14 +11,3 @@
*.pcm binary *.pcm binary
Dockerfile.dev linguist-language=Dockerfile Dockerfile.dev linguist-language=Dockerfile
# Generated files
CODEOWNERS linguist-generated=true
Dockerfile linguist-generated=true
homeassistant/generated/*.py linguist-generated=true
mypy.ini linguist-generated=true
requirements.txt linguist-generated=true
requirements_all.txt linguist-generated=true
requirements_test_all.txt linguist-generated=true
requirements_test_pre_commit.txt linguist-generated=true
script/hassfest/docker/Dockerfile linguist-generated=true

View File

@@ -6,9 +6,9 @@ body:
value: | value: |
This issue form is for reporting bugs only! This issue form is for reporting bugs only!
If you have a feature or enhancement request, please [request them here instead][fr]. If you have a feature or enhancement request, please use the [feature request][fr] section of our [Community Forum][fr].
[fr]: https://github.com/orgs/home-assistant/discussions [fr]: https://community.home-assistant.io/c/feature-requests
- type: textarea - type: textarea
validations: validations:
required: true required: true

View File

@@ -10,8 +10,8 @@ contact_links:
url: https://www.home-assistant.io/help url: https://www.home-assistant.io/help
about: We use GitHub for tracking bugs, check our website for resources on getting help. about: We use GitHub for tracking bugs, check our website for resources on getting help.
- name: Feature Request - name: Feature Request
url: https://github.com/orgs/home-assistant/discussions url: https://community.home-assistant.io/c/feature-requests
about: Please use this link to request new features or enhancements to existing features. about: Please use our Community Forum for making feature requests.
- name: I'm unsure where to go - name: I'm unsure where to go
url: https://www.home-assistant.io/join-chat url: https://www.home-assistant.io/join-chat
about: If you are unsure where to go, then joining our chat is recommended; Just ask! about: If you are unsure where to go, then joining our chat is recommended; Just ask!

View File

@@ -1,51 +0,0 @@
name: Task
description: For staff only - Create a task
type: Task
body:
- type: markdown
attributes:
value: |
## ⚠️ RESTRICTED ACCESS
**This form is restricted to Open Home Foundation staff, authorized contributors, and integration code owners only.**
If you are a community member wanting to contribute, please:
- For bug reports: Use the [bug report form](https://github.com/home-assistant/core/issues/new?template=bug_report.yml)
- For feature requests: Submit to [Feature Requests](https://github.com/orgs/home-assistant/discussions)
---
### For authorized contributors
Use this form to create tasks for development work, improvements, or other actionable items that need to be tracked.
- type: textarea
id: description
attributes:
label: Task description
description: |
Provide a clear and detailed description of the task that needs to be accomplished.
Be specific about what needs to be done, why it's important, and any constraints or requirements.
placeholder: |
Describe the task, including:
- What needs to be done
- Why this task is needed
- Expected outcome
- Any constraints or requirements
validations:
required: true
- type: textarea
id: additional_context
attributes:
label: Additional context
description: |
Any additional information, links, research, or context that would be helpful.
Include links to related issues, research, prototypes, roadmap opportunities etc.
placeholder: |
- Roadmap opportunity: [links]
- Feature request: [link]
- Technical design documents: [link]
- Prototype/mockup: [link]
validations:
required: false

View File

@@ -46,8 +46,6 @@
- This PR fixes or closes issue: fixes # - This PR fixes or closes issue: fixes #
- This PR is related to issue: - This PR is related to issue:
- Link to documentation pull request: - Link to documentation pull request:
- Link to developer documentation pull request:
- Link to frontend pull request:
## Checklist ## Checklist
<!-- <!--

Binary file not shown.

Before

Width:  |  Height:  |  Size: 99 KiB

After

Width:  |  Height:  |  Size: 65 KiB

File diff suppressed because it is too large Load Diff

View File

@@ -32,7 +32,7 @@ jobs:
fetch-depth: 0 fetch-depth: 0
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
@@ -69,7 +69,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T - run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations - name: Upload translations
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: translations name: translations
path: translations.tar.gz path: translations.tar.gz
@@ -94,7 +94,7 @@ jobs:
- name: Download nightly wheels of frontend - name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev' if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@v11 uses: dawidd6/action-download-artifact@v7
with: with:
github_token: ${{secrets.GITHUB_TOKEN}} github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend repo: home-assistant/frontend
@@ -105,10 +105,10 @@ jobs:
- name: Download nightly wheels of intents - name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev' if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@v11 uses: dawidd6/action-download-artifact@v7
with: with:
github_token: ${{secrets.GITHUB_TOKEN}} github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package repo: home-assistant/intents-package
branch: main branch: main
workflow: nightly.yaml workflow: nightly.yaml
workflow_conclusion: success workflow_conclusion: success
@@ -116,7 +116,7 @@ jobs:
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev' if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
@@ -175,7 +175,7 @@ jobs:
sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt
- name: Download translations - name: Download translations
uses: actions/download-artifact@v4.3.0 uses: actions/download-artifact@v4.1.8
with: with:
name: translations name: translations
@@ -190,14 +190,14 @@ jobs:
echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@v3.4.0 uses: docker/login-action@v3.3.0
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image - name: Build base image
uses: home-assistant/builder@2025.03.0 uses: home-assistant/builder@2024.08.2
with: with:
args: | args: |
$BUILD_ARGS \ $BUILD_ARGS \
@@ -256,14 +256,14 @@ jobs:
fi fi
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@v3.4.0 uses: docker/login-action@v3.3.0
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image - name: Build base image
uses: home-assistant/builder@2025.03.0 uses: home-assistant/builder@2024.08.2
with: with:
args: | args: |
$BUILD_ARGS \ $BUILD_ARGS \
@@ -324,20 +324,20 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Install Cosign - name: Install Cosign
uses: sigstore/cosign-installer@v3.9.1 uses: sigstore/cosign-installer@v3.7.0
with: with:
cosign-release: "v2.2.3" cosign-release: "v2.2.3"
- name: Login to DockerHub - name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant' if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@v3.4.0 uses: docker/login-action@v3.3.0
with: with:
username: ${{ secrets.DOCKERHUB_USERNAME }} username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }} password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
if: matrix.registry == 'ghcr.io/home-assistant' if: matrix.registry == 'ghcr.io/home-assistant'
uses: docker/login-action@v3.4.0 uses: docker/login-action@v3.3.0
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
@@ -448,21 +448,18 @@ jobs:
environment: ${{ needs.init.outputs.channel }} environment: ${{ needs.init.outputs.channel }}
needs: ["init", "build_base"] needs: ["init", "build_base"]
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions:
contents: read
id-token: write
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true' if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps: steps:
- name: Checkout the repository - name: Checkout the repository
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations - name: Download translations
uses: actions/download-artifact@v4.3.0 uses: actions/download-artifact@v4.1.8
with: with:
name: translations name: translations
@@ -476,13 +473,16 @@ jobs:
run: | run: |
# Remove dist, build, and homeassistant.egg-info # Remove dist, build, and homeassistant.egg-info
# when build locally for testing! # when build locally for testing!
pip install build pip install twine build
python -m build python -m build
- name: Upload package to PyPI - name: Upload package
uses: pypa/gh-action-pypi-publish@v1.12.4 shell: bash
with: run: |
skip-existing: true export TWINE_USERNAME="__token__"
export TWINE_PASSWORD="${{ secrets.TWINE_TOKEN }}"
twine upload dist/* --skip-existing
hassfest-image: hassfest-image:
name: Build and test hassfest image name: Build and test hassfest image
@@ -502,14 +502,14 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0 uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker image - name: Build Docker image
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0 uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355 # v6.10.0
with: with:
context: . # So action will not pull the repository again context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile file: ./script/hassfest/docker/Dockerfile
@@ -522,7 +522,7 @@ jobs:
- name: Push Docker image - name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true' if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
id: push id: push
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0 uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355 # v6.10.0
with: with:
context: . # So action will not pull the repository again context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile file: ./script/hassfest/docker/Dockerfile
@@ -531,7 +531,7 @@ jobs:
- name: Generate artifact attestation - name: Generate artifact attestation
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true' if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0 uses: actions/attest-build-provenance@7668571508540a607bdfd90a87a560489fe372eb # v2.1.0
with: with:
subject-name: ${{ env.HASSFEST_IMAGE_NAME }} subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
subject-digest: ${{ steps.push.outputs.digest }} subject-digest: ${{ steps.push.outputs.digest }}

View File

@@ -37,12 +37,12 @@ on:
type: boolean type: boolean
env: env:
CACHE_VERSION: 4 CACHE_VERSION: 11
UV_CACHE_VERSION: 1 UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1 MYPY_CACHE_VERSION: 9
HA_SHORT_VERSION: "2025.8" HA_SHORT_VERSION: "2025.2"
DEFAULT_PYTHON: "3.13" DEFAULT_PYTHON: "3.12"
ALL_PYTHON_VERSIONS: "['3.13']" ALL_PYTHON_VERSIONS: "['3.12', '3.13']"
# 10.3 is the oldest supported version # 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022) # - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support # 10.6 is the current long-term-support
@@ -89,7 +89,6 @@ jobs:
test_groups: ${{ steps.info.outputs.test_groups }} test_groups: ${{ steps.info.outputs.test_groups }}
tests_glob: ${{ steps.info.outputs.tests_glob }} tests_glob: ${{ steps.info.outputs.tests_glob }}
tests: ${{ steps.info.outputs.tests }} tests: ${{ steps.info.outputs.tests }}
lint_only: ${{ steps.info.outputs.lint_only }}
skip_coverage: ${{ steps.info.outputs.skip_coverage }} skip_coverage: ${{ steps.info.outputs.skip_coverage }}
runs-on: ubuntu-24.04 runs-on: ubuntu-24.04
steps: steps:
@@ -143,7 +142,6 @@ jobs:
test_group_count=10 test_group_count=10
tests="[]" tests="[]"
tests_glob="" tests_glob=""
lint_only=""
skip_coverage="" skip_coverage=""
if [[ "${{ steps.integrations.outputs.changes }}" != "[]" ]]; if [[ "${{ steps.integrations.outputs.changes }}" != "[]" ]];
@@ -194,17 +192,6 @@ jobs:
test_full_suite="true" test_full_suite="true"
fi fi
if [[ "${{ github.event.inputs.lint-only }}" == "true" ]] \
|| [[ "${{ github.event.inputs.pylint-only }}" == "true" ]] \
|| [[ "${{ github.event.inputs.mypy-only }}" == "true" ]] \
|| [[ "${{ github.event.inputs.audit-licenses-only }}" == "true" ]] \
|| [[ "${{ github.event_name }}" == "push" \
&& "${{ github.event.repository.full_name }}" != "home-assistant/core" ]];
then
lint_only="true"
skip_coverage="true"
fi
if [[ "${{ github.event.inputs.skip-coverage }}" == "true" ]] \ if [[ "${{ github.event.inputs.skip-coverage }}" == "true" ]] \
|| [[ "${{ contains(github.event.pull_request.labels.*.name, 'ci-skip-coverage') }}" == "true" ]]; || [[ "${{ contains(github.event.pull_request.labels.*.name, 'ci-skip-coverage') }}" == "true" ]];
then then
@@ -230,8 +217,6 @@ jobs:
echo "tests=${tests}" >> $GITHUB_OUTPUT echo "tests=${tests}" >> $GITHUB_OUTPUT
echo "tests_glob: ${tests_glob}" echo "tests_glob: ${tests_glob}"
echo "tests_glob=${tests_glob}" >> $GITHUB_OUTPUT echo "tests_glob=${tests_glob}" >> $GITHUB_OUTPUT
echo "lint_only": ${lint_only}
echo "lint_only=${lint_only}" >> $GITHUB_OUTPUT
echo "skip_coverage: ${skip_coverage}" echo "skip_coverage: ${skip_coverage}"
echo "skip_coverage=${skip_coverage}" >> $GITHUB_OUTPUT echo "skip_coverage=${skip_coverage}" >> $GITHUB_OUTPUT
@@ -249,17 +234,17 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache@v4.2.3 uses: actions/cache@v4.2.0
with: with:
path: venv path: venv
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Create Python virtual environment - name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true' if: steps.cache-venv.outputs.cache-hit != 'true'
@@ -271,12 +256,12 @@ jobs:
uv pip install "$(cat requirements_test.txt | grep pre-commit)" uv pip install "$(cat requirements_test.txt | grep pre-commit)"
- name: Restore pre-commit environment from cache - name: Restore pre-commit environment from cache
id: cache-precommit id: cache-precommit
uses: actions/cache@v4.2.3 uses: actions/cache@v4.2.0
with: with:
path: ${{ env.PRE_COMMIT_CACHE }} path: ${{ env.PRE_COMMIT_CACHE }}
lookup-only: true lookup-only: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Install pre-commit dependencies - name: Install pre-commit dependencies
if: steps.cache-precommit.outputs.cache-hit != 'true' if: steps.cache-precommit.outputs.cache-hit != 'true'
@@ -294,28 +279,28 @@ jobs:
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
id: python id: python
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache - name: Restore pre-commit environment from cache
id: cache-precommit id: cache-precommit
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: ${{ env.PRE_COMMIT_CACHE }} path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Run ruff-format - name: Run ruff-format
run: | run: |
@@ -334,33 +319,33 @@ jobs:
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
id: python id: python
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache - name: Restore pre-commit environment from cache
id: cache-precommit id: cache-precommit
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: ${{ env.PRE_COMMIT_CACHE }} path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Run ruff - name: Run ruff
run: | run: |
. venv/bin/activate . venv/bin/activate
pre-commit run --hook-stage manual ruff-check --all-files --show-diff-on-failure pre-commit run --hook-stage manual ruff --all-files --show-diff-on-failure
env: env:
RUFF_OUTPUT_FORMAT: github RUFF_OUTPUT_FORMAT: github
@@ -374,28 +359,28 @@ jobs:
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
id: python id: python
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache - name: Restore pre-commit environment from cache
id: cache-precommit id: cache-precommit
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: ${{ env.PRE_COMMIT_CACHE }} path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }} needs.info.outputs.pre-commit_cache_key }}
- name: Register yamllint problem matcher - name: Register yamllint problem matcher
@@ -484,7 +469,7 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
@@ -497,22 +482,22 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache@v4.2.3 uses: actions/cache@v4.2.0
with: with:
path: venv path: venv
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Restore uv wheel cache - name: Restore uv wheel cache
if: steps.cache-venv.outputs.cache-hit != 'true' if: steps.cache-venv.outputs.cache-hit != 'true'
uses: actions/cache@v4.2.3 uses: actions/cache@v4.2.0
with: with:
path: ${{ env.UV_CACHE_DIR }} path: ${{ env.UV_CACHE_DIR }}
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
steps.generate-uv-key.outputs.key }} steps.generate-uv-key.outputs.key }}
restore-keys: | restore-keys: |
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-uv-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-uv-${{
env.UV_CACHE_VERSION }}-${{ steps.generate-uv-key.outputs.version }}-${{ env.UV_CACHE_VERSION }}-${{ steps.generate-uv-key.outputs.version }}-${{
env.HA_SHORT_VERSION }}- env.HA_SHORT_VERSION }}-
- name: Install additional OS dependencies - name: Install additional OS dependencies
@@ -552,7 +537,7 @@ jobs:
python --version python --version
uv pip freeze >> pip_freeze.txt uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact - name: Upload pip_freeze artifact
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pip-freeze-${{ matrix.python-version }} name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt path: pip_freeze.txt
@@ -587,18 +572,18 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Run hassfest - name: Run hassfest
run: | run: |
@@ -620,43 +605,24 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Run gen_requirements_all.py - name: Run gen_requirements_all.py
run: | run: |
. venv/bin/activate . venv/bin/activate
python -m script.gen_requirements_all validate python -m script.gen_requirements_all validate
dependency-review:
name: Dependency review
runs-on: ubuntu-24.04
needs:
- info
- base
if: |
github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& needs.info.outputs.requirements == 'true'
&& github.event_name == 'pull_request'
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.2.2
- name: Dependency review
uses: actions/dependency-review-action@v4.7.1
with:
license-check: false # We use our own license audit checks
audit-licenses: audit-licenses:
name: Audit licenses name: Audit licenses
runs-on: ubuntu-24.04 runs-on: ubuntu-24.04
@@ -677,25 +643,25 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment - name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Extract license data - name: Extract license data
run: | run: |
. venv/bin/activate . venv/bin/activate
python -m script.licenses extract --output-file=licenses-${{ matrix.python-version }}.json python -m script.licenses extract --output-file=licenses-${{ matrix.python-version }}.json
- name: Upload licenses - name: Upload licenses
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }} name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json path: licenses-${{ matrix.python-version }}.json
@@ -720,18 +686,18 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register pylint problem matcher - name: Register pylint problem matcher
run: | run: |
@@ -767,18 +733,18 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register pylint problem matcher - name: Register pylint problem matcher
run: | run: |
@@ -812,7 +778,7 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
@@ -825,22 +791,22 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Restore mypy cache - name: Restore mypy cache
uses: actions/cache@v4.2.3 uses: actions/cache@v4.2.0
with: with:
path: .mypy_cache path: .mypy_cache
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
steps.generate-mypy-key.outputs.key }} steps.generate-mypy-key.outputs.key }}
restore-keys: | restore-keys: |
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-mypy-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-mypy-${{
env.MYPY_CACHE_VERSION }}-${{ steps.generate-mypy-key.outputs.version }}-${{ env.MYPY_CACHE_VERSION }}-${{ steps.generate-mypy-key.outputs.version }}-${{
env.HA_SHORT_VERSION }}- env.HA_SHORT_VERSION }}-
- name: Register mypy problem matcher - name: Register mypy problem matcher
@@ -863,7 +829,11 @@ jobs:
prepare-pytest-full: prepare-pytest-full:
runs-on: ubuntu-24.04 runs-on: ubuntu-24.04
if: | if: |
needs.info.outputs.lint_only != 'true' (github.event_name != 'push' || github.event.repository.full_name == 'home-assistant/core')
&& github.event.inputs.lint-only != 'true'
&& github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
&& needs.info.outputs.test_full_suite == 'true' && needs.info.outputs.test_full_suite == 'true'
needs: needs:
- info - info
@@ -889,25 +859,25 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
- name: Restore base Python virtual environment - name: Restore base Python virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{ ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Run split_tests.py - name: Run split_tests.py
run: | run: |
. venv/bin/activate . venv/bin/activate
python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests
- name: Upload pytest_buckets - name: Upload pytest_buckets
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pytest_buckets name: pytest_buckets
path: pytest_buckets.txt path: pytest_buckets.txt
@@ -916,7 +886,11 @@ jobs:
pytest-full: pytest-full:
runs-on: ubuntu-24.04 runs-on: ubuntu-24.04
if: | if: |
needs.info.outputs.lint_only != 'true' (github.event_name != 'push' || github.event.repository.full_name == 'home-assistant/core')
&& github.event.inputs.lint-only != 'true'
&& github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
&& needs.info.outputs.test_full_suite == 'true' && needs.info.outputs.test_full_suite == 'true'
needs: needs:
- info - info
@@ -944,24 +918,22 @@ jobs:
bluez \ bluez \
ffmpeg \ ffmpeg \
libturbojpeg \ libturbojpeg \
libgammu-dev \ libgammu-dev
libxml2-utils
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment - name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register Python problem matcher - name: Register Python problem matcher
run: | run: |
@@ -970,7 +942,7 @@ jobs:
run: | run: |
echo "::add-matcher::.github/workflows/matchers/pytest-slow.json" echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
- name: Download pytest_buckets - name: Download pytest_buckets
uses: actions/download-artifact@v4.3.0 uses: actions/download-artifact@v4.1.8
with: with:
name: pytest_buckets name: pytest_buckets
- name: Compile English translations - name: Compile English translations
@@ -990,7 +962,6 @@ jobs:
if [[ "${{ needs.info.outputs.skip_coverage }}" != "true" ]]; then if [[ "${{ needs.info.outputs.skip_coverage }}" != "true" ]]; then
cov_params+=(--cov="homeassistant") cov_params+=(--cov="homeassistant")
cov_params+=(--cov-report=xml) cov_params+=(--cov-report=xml)
cov_params+=(--junitxml=junit.xml -o junit_family=legacy)
fi fi
echo "Test group ${{ matrix.group }}: $(sed -n "${{ matrix.group }},1p" pytest_buckets.txt)" echo "Test group ${{ matrix.group }}: $(sed -n "${{ matrix.group }},1p" pytest_buckets.txt)"
@@ -1004,35 +975,22 @@ jobs:
${cov_params[@]} \ ${cov_params[@]} \
-o console_output_style=count \ -o console_output_style=count \
-p no:sugar \ -p no:sugar \
--exclude-warning-annotations \
$(sed -n "${{ matrix.group }},1p" pytest_buckets.txt) \ $(sed -n "${{ matrix.group }},1p" pytest_buckets.txt) \
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt 2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output - name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure' if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }} name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt path: pytest-*.txt
overwrite: true overwrite: true
- name: Upload coverage artifact - name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true' if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }} name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml path: coverage.xml
overwrite: true overwrite: true
- name: Beautify test results
# For easier identification of parsing errors
if: needs.info.outputs.skip_coverage != 'true'
run: |
xmllint --format "junit.xml" > "junit.xml-tmp"
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
- name: Remove pytest_buckets - name: Remove pytest_buckets
run: rm pytest_buckets.txt run: rm pytest_buckets.txt
- name: Check dirty - name: Check dirty
@@ -1050,7 +1008,11 @@ jobs:
MYSQL_ROOT_PASSWORD: password MYSQL_ROOT_PASSWORD: password
options: --health-cmd="mysqladmin ping -uroot -ppassword" --health-interval=5s --health-timeout=2s --health-retries=3 options: --health-cmd="mysqladmin ping -uroot -ppassword" --health-interval=5s --health-timeout=2s --health-retries=3
if: | if: |
needs.info.outputs.lint_only != 'true' (github.event_name != 'push' || github.event.repository.full_name == 'home-assistant/core')
&& github.event.inputs.lint-only != 'true'
&& github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
&& needs.info.outputs.mariadb_groups != '[]' && needs.info.outputs.mariadb_groups != '[]'
needs: needs:
- info - info
@@ -1077,24 +1039,22 @@ jobs:
bluez \ bluez \
ffmpeg \ ffmpeg \
libturbojpeg \ libturbojpeg \
libmariadb-dev-compat \ libmariadb-dev-compat
libxml2-utils
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment - name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register Python problem matcher - name: Register Python problem matcher
run: | run: |
@@ -1127,7 +1087,6 @@ jobs:
cov_params+=(--cov="homeassistant.components.recorder") cov_params+=(--cov="homeassistant.components.recorder")
cov_params+=(--cov-report=xml) cov_params+=(--cov-report=xml)
cov_params+=(--cov-report=term-missing) cov_params+=(--cov-report=term-missing)
cov_params+=(--junitxml=junit.xml -o junit_family=legacy)
fi fi
python3 -b -X dev -m pytest \ python3 -b -X dev -m pytest \
@@ -1139,7 +1098,6 @@ jobs:
-o console_output_style=count \ -o console_output_style=count \
--durations=10 \ --durations=10 \
-p no:sugar \ -p no:sugar \
--exclude-warning-annotations \
--dburl=mysql://root:password@127.0.0.1/homeassistant-test \ --dburl=mysql://root:password@127.0.0.1/homeassistant-test \
tests/components/history \ tests/components/history \
tests/components/logbook \ tests/components/logbook \
@@ -1148,7 +1106,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${mariadb}.txt 2>&1 | tee pytest-${{ matrix.python-version }}-${mariadb}.txt
- name: Upload pytest output - name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure' if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }} steps.pytest-partial.outputs.mariadb }}
@@ -1156,25 +1114,12 @@ jobs:
overwrite: true overwrite: true
- name: Upload coverage artifact - name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true' if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: coverage-${{ matrix.python-version }}-${{ name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }} steps.pytest-partial.outputs.mariadb }}
path: coverage.xml path: coverage.xml
overwrite: true overwrite: true
- name: Beautify test results
# For easier identification of parsing errors
if: needs.info.outputs.skip_coverage != 'true'
run: |
xmllint --format "junit.xml" > "junit.xml-tmp"
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
path: junit.xml
- name: Check dirty - name: Check dirty
run: | run: |
./script/check_dirty ./script/check_dirty
@@ -1190,7 +1135,11 @@ jobs:
POSTGRES_PASSWORD: password POSTGRES_PASSWORD: password
options: --health-cmd="pg_isready -hlocalhost -Upostgres" --health-interval=5s --health-timeout=2s --health-retries=3 options: --health-cmd="pg_isready -hlocalhost -Upostgres" --health-interval=5s --health-timeout=2s --health-retries=3
if: | if: |
needs.info.outputs.lint_only != 'true' (github.event_name != 'push' || github.event.repository.full_name == 'home-assistant/core')
&& github.event.inputs.lint-only != 'true'
&& github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
&& needs.info.outputs.postgresql_groups != '[]' && needs.info.outputs.postgresql_groups != '[]'
needs: needs:
- info - info
@@ -1216,8 +1165,7 @@ jobs:
sudo apt-get -y install \ sudo apt-get -y install \
bluez \ bluez \
ffmpeg \ ffmpeg \
libturbojpeg \ libturbojpeg
libxml2-utils
sudo /usr/share/postgresql-common/pgdg/apt.postgresql.org.sh -y sudo /usr/share/postgresql-common/pgdg/apt.postgresql.org.sh -y
sudo apt-get -y install \ sudo apt-get -y install \
postgresql-server-dev-14 postgresql-server-dev-14
@@ -1225,18 +1173,17 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment - name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register Python problem matcher - name: Register Python problem matcher
run: | run: |
@@ -1269,7 +1216,6 @@ jobs:
cov_params+=(--cov="homeassistant.components.recorder") cov_params+=(--cov="homeassistant.components.recorder")
cov_params+=(--cov-report=xml) cov_params+=(--cov-report=xml)
cov_params+=(--cov-report=term-missing) cov_params+=(--cov-report=term-missing)
cov_params+=(--junitxml=junit.xml -o junit_family=legacy)
fi fi
python3 -b -X dev -m pytest \ python3 -b -X dev -m pytest \
@@ -1282,7 +1228,6 @@ jobs:
--durations=0 \ --durations=0 \
--durations-min=10 \ --durations-min=10 \
-p no:sugar \ -p no:sugar \
--exclude-warning-annotations \
--dburl=postgresql://postgres:password@127.0.0.1/homeassistant-test \ --dburl=postgresql://postgres:password@127.0.0.1/homeassistant-test \
tests/components/history \ tests/components/history \
tests/components/logbook \ tests/components/logbook \
@@ -1291,7 +1236,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${postgresql}.txt 2>&1 | tee pytest-${{ matrix.python-version }}-${postgresql}.txt
- name: Upload pytest output - name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure' if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }} steps.pytest-partial.outputs.postgresql }}
@@ -1299,25 +1244,12 @@ jobs:
overwrite: true overwrite: true
- name: Upload coverage artifact - name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true' if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: coverage-${{ matrix.python-version }}-${{ name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }} steps.pytest-partial.outputs.postgresql }}
path: coverage.xml path: coverage.xml
overwrite: true overwrite: true
- name: Beautify test results
# For easier identification of parsing errors
if: needs.info.outputs.skip_coverage != 'true'
run: |
xmllint --format "junit.xml" > "junit.xml-tmp"
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
path: junit.xml
- name: Check dirty - name: Check dirty
run: | run: |
./script/check_dirty ./script/check_dirty
@@ -1336,12 +1268,12 @@ jobs:
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Download all coverage artifacts - name: Download all coverage artifacts
uses: actions/download-artifact@v4.3.0 uses: actions/download-artifact@v4.1.8
with: with:
pattern: coverage-* pattern: coverage-*
- name: Upload coverage to Codecov - name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'true' if: needs.info.outputs.test_full_suite == 'true'
uses: codecov/codecov-action@v5.4.3 uses: codecov/codecov-action@v5.1.2
with: with:
fail_ci_if_error: true fail_ci_if_error: true
flags: full-suite flags: full-suite
@@ -1350,7 +1282,11 @@ jobs:
pytest-partial: pytest-partial:
runs-on: ubuntu-24.04 runs-on: ubuntu-24.04
if: | if: |
needs.info.outputs.lint_only != 'true' (github.event_name != 'push' || github.event.repository.full_name == 'home-assistant/core')
&& github.event.inputs.lint-only != 'true'
&& github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
&& needs.info.outputs.tests_glob && needs.info.outputs.tests_glob
&& needs.info.outputs.test_full_suite == 'false' && needs.info.outputs.test_full_suite == 'false'
needs: needs:
@@ -1378,24 +1314,22 @@ jobs:
bluez \ bluez \
ffmpeg \ ffmpeg \
libturbojpeg \ libturbojpeg \
libgammu-dev \ libgammu-dev
libxml2-utils
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ matrix.python-version }} - name: Set up Python ${{ matrix.python-version }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ matrix.python-version }} python-version: ${{ matrix.python-version }}
check-latest: true check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment - name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv id: cache-venv
uses: actions/cache/restore@v4.2.3 uses: actions/cache/restore@v4.2.0
with: with:
path: venv path: venv
fail-on-cache-miss: true fail-on-cache-miss: true
key: >- key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }} needs.info.outputs.python_cache_key }}
- name: Register Python problem matcher - name: Register Python problem matcher
run: | run: |
@@ -1428,7 +1362,6 @@ jobs:
cov_params+=(--cov="homeassistant.components.${{ matrix.group }}") cov_params+=(--cov="homeassistant.components.${{ matrix.group }}")
cov_params+=(--cov-report=xml) cov_params+=(--cov-report=xml)
cov_params+=(--cov-report=term-missing) cov_params+=(--cov-report=term-missing)
cov_params+=(--junitxml=junit.xml -o junit_family=legacy)
fi fi
python3 -b -X dev -m pytest \ python3 -b -X dev -m pytest \
@@ -1441,35 +1374,22 @@ jobs:
--durations=0 \ --durations=0 \
--durations-min=1 \ --durations-min=1 \
-p no:sugar \ -p no:sugar \
--exclude-warning-annotations \
tests/components/${{ matrix.group }} \ tests/components/${{ matrix.group }} \
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt 2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output - name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure' if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }} name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt path: pytest-*.txt
overwrite: true overwrite: true
- name: Upload coverage artifact - name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true' if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@v4.6.2 uses: actions/upload-artifact@v4.5.0
with: with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }} name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml path: coverage.xml
overwrite: true overwrite: true
- name: Beautify test results
# For easier identification of parsing errors
if: needs.info.outputs.skip_coverage != 'true'
run: |
xmllint --format "junit.xml" > "junit.xml-tmp"
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
- name: Check dirty - name: Check dirty
run: | run: |
./script/check_dirty ./script/check_dirty
@@ -1486,37 +1406,12 @@ jobs:
- name: Check out code from GitHub - name: Check out code from GitHub
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Download all coverage artifacts - name: Download all coverage artifacts
uses: actions/download-artifact@v4.3.0 uses: actions/download-artifact@v4.1.8
with: with:
pattern: coverage-* pattern: coverage-*
- name: Upload coverage to Codecov - name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'false' if: needs.info.outputs.test_full_suite == 'false'
uses: codecov/codecov-action@v5.4.3 uses: codecov/codecov-action@v5.1.2
with: with:
fail_ci_if_error: true fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }} token: ${{ secrets.CODECOV_TOKEN }}
upload-test-results:
name: Upload test results to Codecov
# codecov/test-results-action currently doesn't support tokenless uploads
# therefore we can't run it on forks
if: ${{ (github.event_name != 'pull_request' || !github.event.pull_request.head.repo.fork) && needs.info.outputs.skip_coverage != 'true' && !cancelled() }}
runs-on: ubuntu-24.04
needs:
- info
- pytest-partial
- pytest-full
- pytest-postgres
- pytest-mariadb
timeout-minutes: 10
steps:
- name: Download all coverage artifacts
uses: actions/download-artifact@v4.3.0
with:
pattern: test-results-*
- name: Upload test results to Codecov
uses: codecov/test-results-action@v1
with:
fail_ci_if_error: true
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}


@@ -24,11 +24,11 @@ jobs:
uses: actions/checkout@v4.2.2
- name: Initialize CodeQL
-uses: github/codeql-action/init@v3.29.2
+uses: github/codeql-action/init@v3.28.0
with:
languages: python
- name: Perform CodeQL Analysis
-uses: github/codeql-action/analyze@v3.29.2
+uses: github/codeql-action/analyze@v3.28.0
with:
category: "/language:python"


@@ -1,385 +0,0 @@
name: Auto-detect duplicate issues
# yamllint disable-line rule:truthy
on:
issues:
types: [labeled]
permissions:
issues: write
models: read
jobs:
detect-duplicates:
runs-on: ubuntu-latest
steps:
- name: Check if integration label was added and extract details
id: extract
uses: actions/github-script@v7.0.1
with:
script: |
// Debug: Log the event payload
console.log('Event name:', context.eventName);
console.log('Event action:', context.payload.action);
console.log('Event payload keys:', Object.keys(context.payload));
// Check the specific label that was added
const addedLabel = context.payload.label;
if (!addedLabel) {
console.log('No label found in labeled event payload');
core.setOutput('should_continue', 'false');
return;
}
console.log(`Label added: ${addedLabel.name}`);
if (!addedLabel.name.startsWith('integration:')) {
console.log('Added label is not an integration label, skipping duplicate detection');
core.setOutput('should_continue', 'false');
return;
}
console.log(`Integration label added: ${addedLabel.name}`);
let currentIssue;
let integrationLabels = [];
try {
const issue = await github.rest.issues.get({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.issue.number
});
currentIssue = issue.data;
// Check if potential-duplicate label already exists
const hasPotentialDuplicateLabel = currentIssue.labels
.some(label => label.name === 'potential-duplicate');
if (hasPotentialDuplicateLabel) {
console.log('Issue already has potential-duplicate label, skipping duplicate detection');
core.setOutput('should_continue', 'false');
return;
}
integrationLabels = currentIssue.labels
.filter(label => label.name.startsWith('integration:'))
.map(label => label.name);
} catch (error) {
core.error(`Failed to fetch issue #${context.payload.issue.number}:`, error.message);
core.setOutput('should_continue', 'false');
return;
}
// Check if we've already posted a duplicate detection comment recently
let comments;
try {
comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.issue.number,
per_page: 10
});
} catch (error) {
core.error('Failed to fetch comments:', error.message);
// Continue anyway, worst case we might post a duplicate comment
comments = { data: [] };
}
// Check if we've already posted a duplicate detection comment
const recentDuplicateComment = comments.data.find(comment =>
comment.user && comment.user.login === 'github-actions[bot]' &&
comment.body.includes('<!-- workflow: detect-duplicate-issues -->')
);
if (recentDuplicateComment) {
console.log('Already posted duplicate detection comment, skipping');
core.setOutput('should_continue', 'false');
return;
}
core.setOutput('should_continue', 'true');
core.setOutput('current_number', currentIssue.number);
core.setOutput('current_title', currentIssue.title);
core.setOutput('current_body', currentIssue.body);
core.setOutput('current_url', currentIssue.html_url);
core.setOutput('integration_labels', JSON.stringify(integrationLabels));
console.log(`Current issue: #${currentIssue.number}`);
console.log(`Integration labels: ${integrationLabels.join(', ')}`);
- name: Fetch similar issues
id: fetch_similar
if: steps.extract.outputs.should_continue == 'true'
uses: actions/github-script@v7.0.1
env:
INTEGRATION_LABELS: ${{ steps.extract.outputs.integration_labels }}
CURRENT_NUMBER: ${{ steps.extract.outputs.current_number }}
with:
script: |
const integrationLabels = JSON.parse(process.env.INTEGRATION_LABELS);
const currentNumber = parseInt(process.env.CURRENT_NUMBER);
if (integrationLabels.length === 0) {
console.log('No integration labels found, skipping duplicate detection');
core.setOutput('has_similar', 'false');
return;
}
// Use GitHub search API to find issues with matching integration labels
console.log(`Searching for issues with integration labels: ${integrationLabels.join(', ')}`);
// Build search query for issues with any of the current integration labels
const labelQueries = integrationLabels.map(label => `label:"${label}"`);
// Calculate date 6 months ago
const sixMonthsAgo = new Date();
sixMonthsAgo.setMonth(sixMonthsAgo.getMonth() - 6);
const dateFilter = `created:>=${sixMonthsAgo.toISOString().split('T')[0]}`;
let searchQuery;
if (labelQueries.length === 1) {
searchQuery = `repo:${context.repo.owner}/${context.repo.repo} is:issue ${labelQueries[0]} ${dateFilter}`;
} else {
searchQuery = `repo:${context.repo.owner}/${context.repo.repo} is:issue (${labelQueries.join(' OR ')}) ${dateFilter}`;
}
console.log(`Search query: ${searchQuery}`);
let result;
try {
result = await github.rest.search.issuesAndPullRequests({
q: searchQuery,
per_page: 15,
sort: 'updated',
order: 'desc'
});
} catch (error) {
core.error('Failed to search for similar issues:', error.message);
if (error.status === 403 && error.message.includes('rate limit')) {
core.error('GitHub API rate limit exceeded');
}
core.setOutput('has_similar', 'false');
return;
}
// Filter out the current issue, pull requests, and newer issues (higher numbers)
const similarIssues = result.data.items
.filter(item =>
item.number !== currentNumber &&
!item.pull_request &&
item.number < currentNumber // Only include older issues (lower numbers)
)
.map(item => ({
number: item.number,
title: item.title,
body: item.body,
url: item.html_url,
state: item.state,
createdAt: item.created_at,
updatedAt: item.updated_at,
comments: item.comments,
labels: item.labels.map(l => l.name)
}));
console.log(`Found ${similarIssues.length} issues with matching integration labels`);
console.log('Raw similar issues:', JSON.stringify(similarIssues.slice(0, 3), null, 2));
if (similarIssues.length === 0) {
console.log('No similar issues found, setting has_similar to false');
core.setOutput('has_similar', 'false');
return;
}
console.log('Similar issues found, setting has_similar to true');
core.setOutput('has_similar', 'true');
// Clean the issue data to prevent JSON parsing issues
const cleanedIssues = similarIssues.slice(0, 15).map(item => {
// Handle body with improved truncation and null handling
let cleanBody = '';
if (item.body && typeof item.body === 'string') {
// Remove control characters
const cleaned = item.body.replace(/[\u0000-\u001F\u007F-\u009F]/g, '');
// Truncate to 1000 characters and add ellipsis if needed
cleanBody = cleaned.length > 1000
? cleaned.substring(0, 1000) + '...'
: cleaned;
}
return {
number: item.number,
title: item.title.replace(/[\u0000-\u001F\u007F-\u009F]/g, ''), // Remove control characters
body: cleanBody,
url: item.url,
state: item.state,
createdAt: item.createdAt,
updatedAt: item.updatedAt,
comments: item.comments,
labels: item.labels
};
});
console.log(`Cleaned issues count: ${cleanedIssues.length}`);
console.log('First cleaned issue:', JSON.stringify(cleanedIssues[0], null, 2));
core.setOutput('similar_issues', JSON.stringify(cleanedIssues));
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@v1.1.0
with:
model: openai/gpt-4o
system-prompt: |
You are a Home Assistant issue duplicate detector. Your task is to identify TRUE DUPLICATES - issues that report the EXACT SAME problem, not just similar or related issues.
CRITICAL: An issue is ONLY a duplicate if:
- It describes the SAME problem with the SAME root cause
- Issues about the same integration but different problems are NOT duplicates
- Issues with similar symptoms but different causes are NOT duplicates
Important considerations:
- Open issues are more relevant than closed ones for duplicate detection
- Recently updated issues may indicate ongoing work or discussion
- Issues with more comments are generally more relevant and active
- Older closed issues might be resolved differently than newer approaches
- Consider the time between issues - very old issues may have different contexts
Rules:
1. ONLY mark as duplicate if the issues describe IDENTICAL problems
2. Look for issues that report the same problem or request the same functionality
3. Different error messages = NOT a duplicate (even if same integration)
4. For CLOSED issues, only mark as duplicate if they describe the EXACT same problem
5. For OPEN issues, use a lower threshold (90%+ similarity)
6. Prioritize issues with higher comment counts as they indicate more activity/relevance
7. When in doubt, do NOT mark as duplicate
8. Return ONLY a JSON array of issue numbers that are duplicates
9. If no duplicates are found, return an empty array: []
10. Maximum 5 potential duplicates, prioritize open issues with comments
11. Consider the age of issues - prefer recent duplicates over very old ones
Example response format:
[1234, 5678, 9012]
prompt: |
Current issue (just created):
Title: ${{ steps.extract.outputs.current_title }}
Body: ${{ steps.extract.outputs.current_body }}
Other issues to compare against (each includes state, creation date, last update, and comment count):
${{ steps.fetch_similar.outputs.similar_issues }}
Analyze these issues and identify which ones describe IDENTICAL problems and thus are duplicates of the current issue. When sorting them, consider their state (open/closed), how recently they were updated, and their comment count (higher = more relevant).
max-tokens: 100
- name: Post duplicate detection results
id: post_results
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/github-script@v7.0.1
env:
AI_RESPONSE: ${{ steps.ai_detection.outputs.response }}
SIMILAR_ISSUES: ${{ steps.fetch_similar.outputs.similar_issues }}
with:
script: |
const aiResponse = process.env.AI_RESPONSE;
console.log('Raw AI response:', JSON.stringify(aiResponse));
let duplicateNumbers = [];
try {
// Clean the response of any potential control characters
const cleanResponse = aiResponse.trim().replace(/[\u0000-\u001F\u007F-\u009F]/g, '');
console.log('Cleaned AI response:', cleanResponse);
duplicateNumbers = JSON.parse(cleanResponse);
// Ensure it's an array and contains only numbers
if (!Array.isArray(duplicateNumbers)) {
console.log('AI response is not an array, trying to extract numbers');
const numberMatches = cleanResponse.match(/\d+/g);
duplicateNumbers = numberMatches ? numberMatches.map(n => parseInt(n)) : [];
}
// Filter to only valid numbers
duplicateNumbers = duplicateNumbers.filter(n => typeof n === 'number' && !isNaN(n));
} catch (error) {
console.log('Failed to parse AI response as JSON:', error.message);
console.log('Raw response:', aiResponse);
// Fallback: try to extract numbers from the response
const numberMatches = aiResponse.match(/\d+/g);
duplicateNumbers = numberMatches ? numberMatches.map(n => parseInt(n)) : [];
console.log('Extracted numbers as fallback:', duplicateNumbers);
}
if (!Array.isArray(duplicateNumbers) || duplicateNumbers.length === 0) {
console.log('No duplicates detected by AI');
return;
}
console.log(`AI detected ${duplicateNumbers.length} potential duplicates: ${duplicateNumbers.join(', ')}`);
// Get details of detected duplicates
const similarIssues = JSON.parse(process.env.SIMILAR_ISSUES);
const duplicates = similarIssues.filter(issue => duplicateNumbers.includes(issue.number));
if (duplicates.length === 0) {
console.log('No matching issues found for detected numbers');
return;
}
// Create comment with duplicate detection results
const duplicateLinks = duplicates.map(issue => `- [#${issue.number}: ${issue.title}](${issue.url})`).join('\n');
const commentBody = [
'<!-- workflow: detect-duplicate-issues -->',
'### 🔍 **Potential duplicate detection**',
'',
'I\'ve analyzed similar issues and found the following potential duplicates:',
'',
duplicateLinks,
'',
'**What to do next:**',
'1. Please review these issues to see if they match your issue',
'2. If you find an existing issue that covers your problem:',
' - Consider closing this issue',
' - Add your findings or 👍 on the existing issue instead',
'3. If your issue is different or adds new aspects, please clarify how it differs',
'',
'This helps keep our issues organized and ensures similar issues are consolidated for better visibility.',
'',
'*This message was generated automatically by our duplicate detection system.*'
].join('\n');
try {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.issue.number,
body: commentBody
});
console.log(`Posted duplicate detection comment with ${duplicates.length} potential duplicates`);
// Add the potential-duplicate label
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.issue.number,
labels: ['potential-duplicate']
});
console.log('Added potential-duplicate label to the issue');
} catch (error) {
core.error('Failed to post duplicate detection comment or add label:', error.message);
if (error.status === 403) {
core.error('Permission denied or rate limit exceeded');
}
// Don't throw - we've done the analysis, just couldn't post the result
}


@@ -1,193 +0,0 @@
name: Auto-detect non-English issues
# yamllint disable-line rule:truthy
on:
issues:
types: [opened]
permissions:
issues: write
models: read
jobs:
detect-language:
runs-on: ubuntu-latest
steps:
- name: Check issue language
id: detect_language
uses: actions/github-script@v7.0.1
env:
ISSUE_NUMBER: ${{ github.event.issue.number }}
ISSUE_TITLE: ${{ github.event.issue.title }}
ISSUE_BODY: ${{ github.event.issue.body }}
ISSUE_USER_TYPE: ${{ github.event.issue.user.type }}
with:
script: |
// Get the issue details from environment variables
const issueNumber = process.env.ISSUE_NUMBER;
const issueTitle = process.env.ISSUE_TITLE || '';
const issueBody = process.env.ISSUE_BODY || '';
const userType = process.env.ISSUE_USER_TYPE;
// Skip language detection for bot users
if (userType === 'Bot') {
console.log('Skipping language detection for bot user');
core.setOutput('should_continue', 'false');
return;
}
console.log(`Checking language for issue #${issueNumber}`);
console.log(`Title: ${issueTitle}`);
// Combine title and body for language detection
const fullText = `${issueTitle}\n\n${issueBody}`;
// Check if the text is too short to reliably detect language
if (fullText.trim().length < 20) {
console.log('Text too short for reliable language detection');
core.setOutput('should_continue', 'false'); // Skip processing for very short text
return;
}
core.setOutput('issue_number', issueNumber);
core.setOutput('issue_text', fullText);
core.setOutput('should_continue', 'true');
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@v1.1.0
with:
model: openai/gpt-4o-mini
system-prompt: |
You are a language detection system. Your task is to determine if the provided text is written in English or another language.
Rules:
1. Analyze the text and determine the primary language of the USER'S DESCRIPTION only
2. IGNORE markdown headers (lines starting with #, ##, ###, etc.) as these are from issue templates, not user input
3. IGNORE all code blocks (text between ``` or ` markers) as they may contain system-generated error messages in other languages
4. IGNORE error messages, logs, and system output even if not in code blocks - these often appear in the user's system language
5. Consider technical terms, code snippets, URLs, and file paths as neutral (they don't indicate non-English)
6. Focus ONLY on the actual sentences and descriptions written by the user explaining their issue
7. If the user's explanation/description is in English but includes non-English error messages or logs, consider it ENGLISH
8. Return ONLY a JSON object with two fields:
- "is_english": boolean (true if the user's description is primarily in English, false otherwise)
- "detected_language": string (the name of the detected language, e.g., "English", "Spanish", "Chinese", etc.)
9. Be lenient - if the user's explanation is in English with non-English system output, it's still English
10. Common programming terms, error messages, and technical jargon should not be considered as non-English
11. If you cannot reliably determine the language, set detected_language to "undefined"
Example response:
{"is_english": false, "detected_language": "Spanish"}
prompt: |
Please analyze the following issue text and determine if it is written in English:
${{ steps.detect_language.outputs.issue_text }}
max-tokens: 50
- name: Process non-English issues
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/github-script@v7.0.1
env:
AI_RESPONSE: ${{ steps.ai_language_detection.outputs.response }}
ISSUE_NUMBER: ${{ steps.detect_language.outputs.issue_number }}
with:
script: |
const issueNumber = parseInt(process.env.ISSUE_NUMBER);
const aiResponse = process.env.AI_RESPONSE;
console.log('AI language detection response:', aiResponse);
let languageResult;
try {
languageResult = JSON.parse(aiResponse.trim());
// Validate the response structure
if (!languageResult || typeof languageResult.is_english !== 'boolean') {
throw new Error('Invalid response structure');
}
} catch (error) {
core.error(`Failed to parse AI response: ${error.message}`);
console.log('Raw AI response:', aiResponse);
// Log more details for debugging
core.warning('Defaulting to English due to parsing error');
// Default to English if we can't parse the response
return;
}
if (languageResult.is_english) {
console.log('Issue is in English, no action needed');
return;
}
// If language is undefined or not detected, skip processing
if (!languageResult.detected_language || languageResult.detected_language === 'undefined') {
console.log('Language could not be determined, skipping processing');
return;
}
console.log(`Issue detected as non-English: ${languageResult.detected_language}`);
// Post comment explaining the language requirement
const commentBody = [
'<!-- workflow: detect-non-english-issues -->',
'### 🌐 Non-English issue detected',
'',
`This issue appears to be written in **${languageResult.detected_language}** rather than English.`,
'',
'The Home Assistant project uses English as the primary language for issues to ensure that everyone in our international community can participate and help resolve issues. This allows any of our thousands of contributors to jump in and provide assistance.',
'',
'**What to do:**',
'1. Re-create the issue using the English language',
'2. If you need help with translation, consider using:',
' - Translation tools like Google Translate',
' - AI assistants like ChatGPT or Claude',
'',
'This helps our community provide the best possible support and ensures your issue gets the attention it deserves from our global contributor base.',
'',
'Thank you for your understanding! 🙏'
].join('\n');
try {
// Add comment
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issueNumber,
body: commentBody
});
console.log('Posted language requirement comment');
// Add non-english label
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issueNumber,
labels: ['non-english']
});
console.log('Added non-english label');
// Close the issue
await github.rest.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issueNumber,
state: 'closed',
state_reason: 'not_planned'
});
console.log('Closed the issue');
} catch (error) {
core.error('Failed to process non-English issue:', error.message);
if (error.status === 403) {
core.error('Permission denied or rate limit exceeded');
}
}


@@ -1,84 +0,0 @@
name: Restrict task creation
# yamllint disable-line rule:truthy
on:
issues:
types: [opened]
jobs:
check-authorization:
runs-on: ubuntu-latest
# Only run if this is a Task issue type (from the issue form)
if: github.event.issue.issue_type == 'Task'
steps:
- name: Check if user is authorized
uses: actions/github-script@v7
with:
script: |
const issueAuthor = context.payload.issue.user.login;
// First check if user is an organization member
try {
await github.rest.orgs.checkMembershipForUser({
org: 'home-assistant',
username: issueAuthor
});
console.log(`✅ ${issueAuthor} is an organization member`);
return; // Authorized, no need to check further
} catch (error) {
console.log(` ${issueAuthor} is not an organization member, checking codeowners...`);
}
// If not an org member, check if they're a codeowner
try {
// Fetch CODEOWNERS file from the repository
const { data: codeownersFile } = await github.rest.repos.getContent({
owner: context.repo.owner,
repo: context.repo.repo,
path: 'CODEOWNERS',
ref: 'dev'
});
// Decode the content (it's base64 encoded)
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf-8');
// Check if the issue author is mentioned in CODEOWNERS
// GitHub usernames in CODEOWNERS are prefixed with @
if (codeownersContent.includes(`@${issueAuthor}`)) {
console.log(`✅ ${issueAuthor} is a integration code owner`);
return; // Authorized
}
} catch (error) {
console.error('Error checking CODEOWNERS:', error);
}
// If we reach here, user is not authorized
console.log(`❌ ${issueAuthor} is not authorized to create Task issues`);
// Close the issue with a comment
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: `Hi @${issueAuthor}, thank you for your contribution!\n\n` +
`Task issues are restricted to Open Home Foundation staff, authorized contributors, and integration code owners.\n\n` +
`If you would like to:\n` +
`- Report a bug: Please use the [bug report form](https://github.com/home-assistant/core/issues/new?template=bug_report.yml)\n` +
`- Request a feature: Please submit to [Feature Requests](https://github.com/orgs/home-assistant/discussions)\n\n` +
`If you believe you should have access to create Task issues, please contact the maintainers.`
});
await github.rest.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
state: 'closed'
});
// Add a label to indicate this was auto-closed
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
labels: ['auto-closed']
});


@@ -17,7 +17,7 @@ jobs:
# - No PRs marked as no-stale
# - No issues (-1)
- name: 60 days stale PRs policy
-uses: actions/stale@v9.1.0
+uses: actions/stale@v9.0.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
# - No issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: 90 days stale issues
-uses: actions/stale@v9.1.0
+uses: actions/stale@v9.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
# - No Issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: Needs more information stale issues policy
-uses: actions/stale@v9.1.0
+uses: actions/stale@v9.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
only-labels: "needs-more-information"


@@ -10,7 +10,7 @@ on:
- "**strings.json" - "**strings.json"
env: env:
DEFAULT_PYTHON: "3.13" DEFAULT_PYTHON: "3.12"
jobs: jobs:
upload: upload:
@@ -22,7 +22,7 @@ jobs:
uses: actions/checkout@v4.2.2 uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}


@@ -17,7 +17,7 @@ on:
- "script/gen_requirements_all.py" - "script/gen_requirements_all.py"
env: env:
DEFAULT_PYTHON: "3.13" DEFAULT_PYTHON: "3.12"
concurrency: concurrency:
group: ${{ github.workflow }}-${{ github.ref_name}} group: ${{ github.workflow }}-${{ github.ref_name}}
@@ -36,7 +36,7 @@ jobs:
- name: Set up Python ${{ env.DEFAULT_PYTHON }} - name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python id: python
uses: actions/setup-python@v5.6.0 uses: actions/setup-python@v5.3.0
with: with:
python-version: ${{ env.DEFAULT_PYTHON }} python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true check-latest: true
@@ -91,7 +91,7 @@ jobs:
) > build_constraints.txt
- name: Upload env_file
-uses: actions/upload-artifact@v4.6.2
+uses: actions/upload-artifact@v4.5.0
with:
name: env_file
path: ./.env_file
@@ -99,14 +99,14 @@ jobs:
overwrite: true
- name: Upload build_constraints
-uses: actions/upload-artifact@v4.6.2
+uses: actions/upload-artifact@v4.5.0
with:
name: build_constraints
path: ./build_constraints.txt
overwrite: true
- name: Upload requirements_diff
-uses: actions/upload-artifact@v4.6.2
+uses: actions/upload-artifact@v4.5.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -118,7 +118,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
-uses: actions/upload-artifact@v4.6.2
+uses: actions/upload-artifact@v4.5.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -131,24 +131,24 @@ jobs:
strategy:
fail-fast: false
matrix:
-abi: ["cp313"]
+abi: ["cp312", "cp313"]
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@v4.2.2
- name: Download env_file
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: env_file
- name: Download build_constraints
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: build_constraints
- name: Download requirements_diff
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: requirements_diff
@@ -159,7 +159,7 @@ jobs:
sed -i "/uv/d" requirements_diff.txt sed -i "/uv/d" requirements_diff.txt
- name: Build wheels - name: Build wheels
uses: home-assistant/wheels@2025.03.0 uses: home-assistant/wheels@2024.11.0
with: with:
abi: ${{ matrix.abi }} abi: ${{ matrix.abi }}
tag: musllinux_1_2 tag: musllinux_1_2
@@ -180,29 +180,29 @@ jobs:
strategy:
fail-fast: false
matrix:
-abi: ["cp313"]
+abi: ["cp312", "cp313"]
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@v4.2.2
- name: Download env_file
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: env_file
- name: Download build_constraints
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: build_constraints
- name: Download requirements_diff
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: requirements_diff
- name: Download requirements_all_wheels
-uses: actions/download-artifact@v4.3.0
+uses: actions/download-artifact@v4.1.8
with:
name: requirements_all_wheels
@@ -218,8 +218,16 @@ jobs:
sed -i "/uv/d" requirements.txt sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt sed -i "/uv/d" requirements_diff.txt
- name: Build wheels - name: Split requirements all
uses: home-assistant/wheels@2025.03.0 run: |
# We split requirements all into multiple files.
# This is to prevent the build from running out of memory when
# resolving packages on 32-bits systems (like armhf, armv7).
split -l $(expr $(expr $(cat requirements_all.txt | wc -l) + 1) / 3) requirements_all_wheels_${{ matrix.arch }}.txt requirements_all.txt
- name: Build wheels (part 1)
uses: home-assistant/wheels@2024.11.0
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
@@ -230,4 +238,32 @@ jobs:
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
-requirements: "requirements_all.txt"
+requirements: "requirements_all.txtaa"
- name: Build wheels (part 2)
uses: home-assistant/wheels@2024.11.0
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
arch: ${{ matrix.arch }}
wheels-key: ${{ secrets.WHEELS_KEY }}
env-file: true
apk: "bluez-dev;libffi-dev;openssl-dev;glib-dev;eudev-dev;libxml2-dev;libxslt-dev;libpng-dev;libjpeg-turbo-dev;tiff-dev;cups-dev;gmp-dev;mpfr-dev;mpc1-dev;ffmpeg-dev;gammu-dev;yaml-dev;openblas-dev;fftw-dev;lapack-dev;gfortran;blas-dev;eigen-dev;freetype-dev;glew-dev;harfbuzz-dev;hdf5-dev;libdc1394-dev;libtbb-dev;mesa-dev;openexr-dev;openjpeg-dev;uchardet-dev;nasm;zlib-ng-dev"
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_all.txtab"
- name: Build wheels (part 3)
uses: home-assistant/wheels@2024.11.0
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
arch: ${{ matrix.arch }}
wheels-key: ${{ secrets.WHEELS_KEY }}
env-file: true
apk: "bluez-dev;libffi-dev;openssl-dev;glib-dev;eudev-dev;libxml2-dev;libxslt-dev;libpng-dev;libjpeg-turbo-dev;tiff-dev;cups-dev;gmp-dev;mpfr-dev;mpc1-dev;ffmpeg-dev;gammu-dev;yaml-dev;openblas-dev;fftw-dev;lapack-dev;gfortran;blas-dev;eigen-dev;freetype-dev;glew-dev;harfbuzz-dev;hdf5-dev;libdc1394-dev;libtbb-dev;mesa-dev;openexr-dev;openjpeg-dev;uchardet-dev;nasm;zlib-ng-dev"
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_all.txtac"

.gitignore

@@ -69,7 +69,6 @@ test-reports/
test-results.xml
test-output.xml
pytest-*.txt
-junit.xml
# Translations
*.mo
@@ -137,8 +136,4 @@ tmp_cache
.ropeproject
# Will be created from script/split_tests.py
pytest_buckets.txt
# AI tooling
.claude


@@ -1,14 +1,14 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
-rev: v0.12.1
+rev: v0.8.6
hooks:
-- id: ruff-check
+- id: ruff
args:
- --fix
- id: ruff-format
files: ^((homeassistant|pylint|script|tests)/.+)?[^/]+\.(py|pyi)$
- repo: https://github.com/codespell-project/codespell
-rev: v2.4.1
+rev: v2.3.0
hooks:
- id: codespell
args:
@@ -30,7 +30,7 @@ repos:
- --branch=master
- --branch=rc
- repo: https://github.com/adrienverge/yamllint.git
-rev: v1.37.1
+rev: v1.35.1
hooks:
- id: yamllint
- repo: https://github.com/pre-commit/mirrors-prettier
@@ -61,14 +61,13 @@ repos:
name: mypy
entry: script/run-in-env.sh mypy
language: script
-require_serial: true
types_or: [python, pyi]
+require_serial: true
files: ^(homeassistant|pylint)/.+\.(py|pyi)$
- id: pylint
name: pylint
-entry: script/run-in-env.sh pylint --ignore-missing-annotations=y
+entry: script/run-in-env.sh pylint -j 0 --ignore-missing-annotations=y
language: script
-require_serial: true
types_or: [python, pyi]
files: ^(homeassistant|tests)/.+\.(py|pyi)$
- id: gen_requirements_all


@@ -65,9 +65,7 @@ homeassistant.components.aladdin_connect.*
homeassistant.components.alarm_control_panel.*
homeassistant.components.alert.*
homeassistant.components.alexa.*
-homeassistant.components.alexa_devices.*
homeassistant.components.alpha_vantage.*
-homeassistant.components.altruist.*
homeassistant.components.amazon_polly.*
homeassistant.components.amberelectric.*
homeassistant.components.ambient_network.*
@@ -105,7 +103,6 @@ homeassistant.components.auth.*
homeassistant.components.automation.*
homeassistant.components.awair.*
homeassistant.components.axis.*
-homeassistant.components.azure_storage.*
homeassistant.components.backup.*
homeassistant.components.baf.*
homeassistant.components.bang_olufsen.*
@@ -121,9 +118,7 @@ homeassistant.components.bluetooth_adapters.*
homeassistant.components.bluetooth_tracker.*
homeassistant.components.bmw_connected_drive.*
homeassistant.components.bond.*
-homeassistant.components.bosch_alarm.*
homeassistant.components.braviatv.*
-homeassistant.components.bring.*
homeassistant.components.brother.*
homeassistant.components.browser.*
homeassistant.components.bryant_evolution.*
@@ -139,7 +134,6 @@ homeassistant.components.clicksend.*
homeassistant.components.climate.*
homeassistant.components.cloud.*
homeassistant.components.co2signal.*
-homeassistant.components.comelit.*
homeassistant.components.command_line.*
homeassistant.components.config.*
homeassistant.components.configurator.*
@@ -223,7 +217,6 @@ homeassistant.components.goalzero.*
homeassistant.components.google.*
homeassistant.components.google_assistant_sdk.*
homeassistant.components.google_cloud.*
-homeassistant.components.google_drive.*
homeassistant.components.google_photos.*
homeassistant.components.google_sheets.*
homeassistant.components.govee_ble.*
@@ -231,22 +224,18 @@ homeassistant.components.gpsd.*
homeassistant.components.greeneye_monitor.*
homeassistant.components.group.*
homeassistant.components.guardian.*
-homeassistant.components.habitica.*
homeassistant.components.hardkernel.*
homeassistant.components.hardware.*
-homeassistant.components.heos.*
homeassistant.components.here_travel_time.*
homeassistant.components.history.*
homeassistant.components.history_stats.*
homeassistant.components.holiday.*
-homeassistant.components.home_connect.*
homeassistant.components.homeassistant.*
homeassistant.components.homeassistant_alerts.*
homeassistant.components.homeassistant_green.*
homeassistant.components.homeassistant_hardware.*
homeassistant.components.homeassistant_sky_connect.*
homeassistant.components.homeassistant_yellow.*
-homeassistant.components.homee.*
homeassistant.components.homekit.*
homeassistant.components.homekit_controller
homeassistant.components.homekit_controller.alarm_control_panel
@@ -272,8 +261,6 @@ homeassistant.components.image_processing.*
homeassistant.components.image_upload.*
homeassistant.components.imap.*
homeassistant.components.imgw_pib.*
-homeassistant.components.immich.*
-homeassistant.components.incomfort.*
homeassistant.components.input_button.*
homeassistant.components.input_select.*
homeassistant.components.input_text.*
@@ -294,7 +281,6 @@ homeassistant.components.kaleidescape.*
homeassistant.components.knocki.*
homeassistant.components.knx.*
homeassistant.components.kraken.*
-homeassistant.components.kulersky.*
homeassistant.components.lacrosse.*
homeassistant.components.lacrosse_view.*
homeassistant.components.lamarzocco.*
@@ -320,14 +306,12 @@ homeassistant.components.logbook.*
homeassistant.components.logger.*
homeassistant.components.london_underground.*
homeassistant.components.lookin.*
-homeassistant.components.lovelace.*
homeassistant.components.luftdaten.*
homeassistant.components.madvr.*
homeassistant.components.manual.*
homeassistant.components.mastodon.*
homeassistant.components.matrix.*
homeassistant.components.matter.*
-homeassistant.components.mcp.*
homeassistant.components.mcp_server.*
homeassistant.components.mealie.*
homeassistant.components.media_extractor.*
@@ -335,7 +319,6 @@ homeassistant.components.media_player.*
homeassistant.components.media_source.*
homeassistant.components.met_eireann.*
homeassistant.components.metoffice.*
-homeassistant.components.miele.*
homeassistant.components.mikrotik.*
homeassistant.components.min_max.*
homeassistant.components.minecraft_server.*
@@ -367,13 +350,10 @@ homeassistant.components.no_ip.*
homeassistant.components.nordpool.*
homeassistant.components.notify.*
homeassistant.components.notion.*
-homeassistant.components.ntfy.*
homeassistant.components.number.*
homeassistant.components.nut.*
-homeassistant.components.ohme.*
homeassistant.components.onboarding.*
homeassistant.components.oncue.*
-homeassistant.components.onedrive.*
homeassistant.components.onewire.*
homeassistant.components.onkyo.*
homeassistant.components.open_meteo.*
@@ -381,7 +361,6 @@ homeassistant.components.openai_conversation.*
homeassistant.components.openexchangerates.*
homeassistant.components.opensky.*
homeassistant.components.openuv.*
-homeassistant.components.opower.*
homeassistant.components.oralb.*
homeassistant.components.otbr.*
homeassistant.components.overkiz.*
@@ -389,12 +368,9 @@ homeassistant.components.overseerr.*
homeassistant.components.p1_monitor.*
homeassistant.components.pandora.*
homeassistant.components.panel_custom.*
-homeassistant.components.paperless_ngx.*
homeassistant.components.peblar.*
homeassistant.components.peco.*
-homeassistant.components.pegel_online.*
homeassistant.components.persistent_notification.*
-homeassistant.components.person.*
homeassistant.components.pi_hole.*
homeassistant.components.ping.*
homeassistant.components.plugwise.*
@@ -408,9 +384,7 @@ homeassistant.components.pure_energie.*
homeassistant.components.purpleair.*
homeassistant.components.pushbullet.*
homeassistant.components.pvoutput.*
-homeassistant.components.pyload.*
homeassistant.components.python_script.*
-homeassistant.components.qbus.*
homeassistant.components.qnap_qsw.*
homeassistant.components.rabbitair.*
homeassistant.components.radarr.*
@@ -421,9 +395,7 @@ homeassistant.components.raspberry_pi.*
homeassistant.components.rdw.*
homeassistant.components.recollect_waste.*
homeassistant.components.recorder.*
-homeassistant.components.remember_the_milk.*
homeassistant.components.remote.*
-homeassistant.components.remote_calendar.*
homeassistant.components.renault.*
homeassistant.components.reolink.*
homeassistant.components.repairs.*
@@ -439,6 +411,7 @@ homeassistant.components.roku.*
homeassistant.components.romy.*
homeassistant.components.rpi_power.*
homeassistant.components.rss_feed_template.*
+homeassistant.components.rtsp_to_webrtc.*
homeassistant.components.russound_rio.*
homeassistant.components.ruuvi_gateway.*
homeassistant.components.ruuvitag_ble.*
@@ -453,7 +426,6 @@ homeassistant.components.select.*
homeassistant.components.sensibo.*
homeassistant.components.sensirion_ble.*
homeassistant.components.sensor.*
-homeassistant.components.sensorpush_cloud.*
homeassistant.components.sensoterra.*
homeassistant.components.senz.*
homeassistant.components.sfr_box.*
@@ -468,7 +440,6 @@ homeassistant.components.slack.*
homeassistant.components.sleepiq.*
homeassistant.components.smhi.*
homeassistant.components.smlight.*
-homeassistant.components.smtp.*
homeassistant.components.snooz.*
homeassistant.components.solarlog.*
homeassistant.components.sonarr.*
@@ -504,7 +475,6 @@ homeassistant.components.tautulli.*
homeassistant.components.tcp.*
homeassistant.components.technove.*
homeassistant.components.tedee.*
-homeassistant.components.telegram_bot.*
homeassistant.components.text.*
homeassistant.components.thethingsnetwork.*
homeassistant.components.threshold.*
@@ -543,7 +513,6 @@ homeassistant.components.vallox.*
homeassistant.components.valve.*
homeassistant.components.velbus.*
homeassistant.components.vlc_telnet.*
-homeassistant.components.vodafone_station.*
homeassistant.components.wake_on_lan.*
homeassistant.components.wake_word.*
homeassistant.components.wallbox.*

.vscode/launch.json

@@ -38,17 +38,10 @@
"module": "pytest", "module": "pytest",
"justMyCode": false, "justMyCode": false,
"args": [ "args": [
"--timeout=10",
"--picked" "--picked"
], ],
}, },
{
"name": "Home Assistant: Debug Current Test File",
"type": "debugpy",
"request": "launch",
"module": "pytest",
"console": "integratedTerminal",
"args": ["-vv", "${file}"]
},
{ {
// Debug by attaching to local Home Assistant server using Remote Python Debugger. // Debug by attaching to local Home Assistant server using Remote Python Debugger.
// See https://www.home-assistant.io/integrations/debugpy/ // See https://www.home-assistant.io/integrations/debugpy/
@@ -84,4 +77,4 @@
]
}
]
}


@@ -1,5 +1,5 @@
{
-// Please keep this file (mostly!) in sync with settings in home-assistant/.devcontainer/devcontainer.json
+// Please keep this file in sync with settings in home-assistant/.devcontainer/devcontainer.json
// Added --no-cov to work around TypeError: message must be set
// https://github.com/microsoft/vscode-python/issues/14067
"python.testing.pytestArgs": ["--no-cov"],
@@ -12,7 +12,6 @@
"fileMatch": [ "fileMatch": [
"homeassistant/components/*/manifest.json" "homeassistant/components/*/manifest.json"
], ],
// This value differs between working with devcontainer and locally, therefor this value should NOT be in sync!
"url": "./script/json_schemas/manifest_schema.json" "url": "./script/json_schemas/manifest_schema.json"
} }
] ]

.vscode/tasks.json

@@ -4,7 +4,7 @@
{
"label": "Run Home Assistant Core",
"type": "shell",
-"command": "${command:python.interpreterPath} -m homeassistant -c ./config",
+"command": "hass -c ./config",
"group": "test",
"presentation": {
"reveal": "always",
@@ -45,7 +45,7 @@
{
"label": "Ruff",
"type": "shell",
-"command": "pre-commit run ruff-check --all-files",
+"command": "pre-commit run ruff --all-files",
"group": {
"kind": "test",
"isDefault": true
@@ -148,7 +148,7 @@
{
"label": "Install all Test Requirements",
"type": "shell",
-"command": "uv pip install -r requirements.txt -r requirements_test_all.txt",
+"command": "uv pip install -r requirements_test_all.txt",
"group": {
"kind": "build",
"isDefault": true


@@ -1 +0,0 @@
.github/copilot-instructions.md


@@ -46,8 +46,8 @@ build.json @home-assistant/supervisor
/tests/components/accuweather/ @bieniu
/homeassistant/components/acmeda/ @atmurray
/tests/components/acmeda/ @atmurray
-/homeassistant/components/adax/ @danielhiversen @lazytarget
+/homeassistant/components/adax/ @danielhiversen
-/tests/components/adax/ @danielhiversen @lazytarget
+/tests/components/adax/ @danielhiversen
/homeassistant/components/adguard/ @frenck
/tests/components/adguard/ @frenck
/homeassistant/components/ads/ @mrpasztoradam
@@ -57,8 +57,6 @@ build.json @home-assistant/supervisor
/tests/components/aemet/ @Noltari
/homeassistant/components/agent_dvr/ @ispysoftware
/tests/components/agent_dvr/ @ispysoftware
-/homeassistant/components/ai_task/ @home-assistant/core
-/tests/components/ai_task/ @home-assistant/core
/homeassistant/components/air_quality/ @home-assistant/core
/tests/components/air_quality/ @home-assistant/core
/homeassistant/components/airgradient/ @airgradienthq @joostlek
@@ -91,10 +89,6 @@ build.json @home-assistant/supervisor
/tests/components/alert/ @home-assistant/core @frenck
/homeassistant/components/alexa/ @home-assistant/cloud @ochlocracy @jbouwh
/tests/components/alexa/ @home-assistant/cloud @ochlocracy @jbouwh
-/homeassistant/components/alexa_devices/ @chemelli74
-/tests/components/alexa_devices/ @chemelli74
-/homeassistant/components/altruist/ @airalab @LoSk-p
-/tests/components/altruist/ @airalab @LoSk-p
/homeassistant/components/amazon_polly/ @jschlyter
/homeassistant/components/amberelectric/ @madpilot
/tests/components/amberelectric/ @madpilot
@@ -177,8 +171,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/avea/ @pattyland
/homeassistant/components/awair/ @ahayworth @danielsjf
/tests/components/awair/ @ahayworth @danielsjf
-/homeassistant/components/aws_s3/ @tomasbedrich
-/tests/components/aws_s3/ @tomasbedrich
/homeassistant/components/axis/ @Kane610
/tests/components/axis/ @Kane610
/homeassistant/components/azure_data_explorer/ @kaareseras
@@ -188,8 +180,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/azure_event_hub/ @eavanvalkenburg
/tests/components/azure_event_hub/ @eavanvalkenburg
/homeassistant/components/azure_service_bus/ @hfurubotten
-/homeassistant/components/azure_storage/ @zweckj
-/tests/components/azure_storage/ @zweckj
/homeassistant/components/backup/ @home-assistant/core
/tests/components/backup/ @home-assistant/core
/homeassistant/components/baf/ @bdraco @jfroy
@@ -208,8 +198,8 @@ build.json @home-assistant/supervisor
/tests/components/blebox/ @bbx-a @swistakm
/homeassistant/components/blink/ @fronzbot @mkmer
/tests/components/blink/ @fronzbot @mkmer
-/homeassistant/components/blue_current/ @gleeuwen @NickKoepr @jtodorova23
+/homeassistant/components/blue_current/ @Floris272 @gleeuwen
-/tests/components/blue_current/ @gleeuwen @NickKoepr @jtodorova23
+/tests/components/blue_current/ @Floris272 @gleeuwen
/homeassistant/components/bluemaestro/ @bdraco
/tests/components/bluemaestro/ @bdraco
/homeassistant/components/blueprint/ @home-assistant/core
@@ -224,8 +214,6 @@ build.json @home-assistant/supervisor
/tests/components/bmw_connected_drive/ @gerard33 @rikroe
/homeassistant/components/bond/ @bdraco @prystupa @joshs85 @marciogranzotto
/tests/components/bond/ @bdraco @prystupa @joshs85 @marciogranzotto
-/homeassistant/components/bosch_alarm/ @mag1024 @sanjay900
-/tests/components/bosch_alarm/ @mag1024 @sanjay900
/homeassistant/components/bosch_shc/ @tschamm
/tests/components/bosch_shc/ @tschamm
/homeassistant/components/braviatv/ @bieniu @Drafteed
@@ -309,7 +297,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/crownstone/ @Crownstone @RicArch97
/tests/components/crownstone/ @Crownstone @RicArch97
/homeassistant/components/cups/ @fabaff
-/tests/components/cups/ @fabaff
/homeassistant/components/daikin/ @fredrike
/tests/components/daikin/ @fredrike
/homeassistant/components/date/ @home-assistant/core
@@ -331,8 +318,8 @@ build.json @home-assistant/supervisor
/tests/components/demo/ @home-assistant/core /tests/components/demo/ @home-assistant/core
/homeassistant/components/denonavr/ @ol-iver @starkillerOG /homeassistant/components/denonavr/ @ol-iver @starkillerOG
/tests/components/denonavr/ @ol-iver @starkillerOG /tests/components/denonavr/ @ol-iver @starkillerOG
-/homeassistant/components/derivative/ @afaucogney @karwosts
-/tests/components/derivative/ @afaucogney @karwosts
+/homeassistant/components/derivative/ @afaucogney
+/tests/components/derivative/ @afaucogney
/homeassistant/components/devialet/ @fwestenberg /homeassistant/components/devialet/ @fwestenberg
/tests/components/devialet/ @fwestenberg /tests/components/devialet/ @fwestenberg
/homeassistant/components/device_automation/ @home-assistant/core /homeassistant/components/device_automation/ @home-assistant/core
@@ -441,7 +428,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/entur_public_transport/ @hfurubotten /homeassistant/components/entur_public_transport/ @hfurubotten
/homeassistant/components/environment_canada/ @gwww @michaeldavie /homeassistant/components/environment_canada/ @gwww @michaeldavie
/tests/components/environment_canada/ @gwww @michaeldavie /tests/components/environment_canada/ @gwww @michaeldavie
-/homeassistant/components/ephember/ @ttroy50 @roberty99
+/homeassistant/components/ephember/ @ttroy50
/homeassistant/components/epic_games_store/ @hacf-fr @Quentame /homeassistant/components/epic_games_store/ @hacf-fr @Quentame
/tests/components/epic_games_store/ @hacf-fr @Quentame /tests/components/epic_games_store/ @hacf-fr @Quentame
/homeassistant/components/epion/ @lhgravendeel /homeassistant/components/epion/ @lhgravendeel
@@ -452,8 +439,8 @@ build.json @home-assistant/supervisor
/tests/components/eq3btsmart/ @eulemitkeule @dbuezas /tests/components/eq3btsmart/ @eulemitkeule @dbuezas
/homeassistant/components/escea/ @lazdavila /homeassistant/components/escea/ @lazdavila
/tests/components/escea/ @lazdavila /tests/components/escea/ @lazdavila
-/homeassistant/components/esphome/ @jesserockz @kbx81 @bdraco
-/tests/components/esphome/ @jesserockz @kbx81 @bdraco
+/homeassistant/components/esphome/ @OttoWinter @jesserockz @kbx81 @bdraco
+/tests/components/esphome/ @OttoWinter @jesserockz @kbx81 @bdraco
/homeassistant/components/eufylife_ble/ @bdr99 /homeassistant/components/eufylife_ble/ @bdr99
/tests/components/eufylife_ble/ @bdr99 /tests/components/eufylife_ble/ @bdr99
/homeassistant/components/event/ @home-assistant/core /homeassistant/components/event/ @home-assistant/core
@@ -462,8 +449,8 @@ build.json @home-assistant/supervisor
/tests/components/evil_genius_labs/ @balloob /tests/components/evil_genius_labs/ @balloob
/homeassistant/components/evohome/ @zxdavb /homeassistant/components/evohome/ @zxdavb
/tests/components/evohome/ @zxdavb /tests/components/evohome/ @zxdavb
-/homeassistant/components/ezviz/ @RenierM26
-/tests/components/ezviz/ @RenierM26
+/homeassistant/components/ezviz/ @RenierM26 @baqs
+/tests/components/ezviz/ @RenierM26 @baqs
/homeassistant/components/faa_delays/ @ntilley905 /homeassistant/components/faa_delays/ @ntilley905
/tests/components/faa_delays/ @ntilley905 /tests/components/faa_delays/ @ntilley905
/homeassistant/components/fan/ @home-assistant/core /homeassistant/components/fan/ @home-assistant/core
@@ -579,10 +566,8 @@ build.json @home-assistant/supervisor
/tests/components/google_assistant_sdk/ @tronikos /tests/components/google_assistant_sdk/ @tronikos
/homeassistant/components/google_cloud/ @lufton @tronikos /homeassistant/components/google_cloud/ @lufton @tronikos
/tests/components/google_cloud/ @lufton @tronikos /tests/components/google_cloud/ @lufton @tronikos
-/homeassistant/components/google_drive/ @tronikos
-/tests/components/google_drive/ @tronikos
-/homeassistant/components/google_generative_ai_conversation/ @tronikos @ivanlh
-/tests/components/google_generative_ai_conversation/ @tronikos @ivanlh
+/homeassistant/components/google_generative_ai_conversation/ @tronikos
+/tests/components/google_generative_ai_conversation/ @tronikos
/homeassistant/components/google_mail/ @tkdrob /homeassistant/components/google_mail/ @tkdrob
/tests/components/google_mail/ @tkdrob /tests/components/google_mail/ @tkdrob
/homeassistant/components/google_photos/ @allenporter /homeassistant/components/google_photos/ @allenporter
@@ -638,8 +623,8 @@ build.json @home-assistant/supervisor
/tests/components/hlk_sw16/ @jameshilliard /tests/components/hlk_sw16/ @jameshilliard
/homeassistant/components/holiday/ @jrieger @gjohansson-ST /homeassistant/components/holiday/ @jrieger @gjohansson-ST
/tests/components/holiday/ @jrieger @gjohansson-ST /tests/components/holiday/ @jrieger @gjohansson-ST
-/homeassistant/components/home_connect/ @DavidMStraub @Diegorro98 @MartinHjelmare
-/tests/components/home_connect/ @DavidMStraub @Diegorro98 @MartinHjelmare
+/homeassistant/components/home_connect/ @DavidMStraub @Diegorro98
+/tests/components/home_connect/ @DavidMStraub @Diegorro98
/homeassistant/components/homeassistant/ @home-assistant/core /homeassistant/components/homeassistant/ @home-assistant/core
/tests/components/homeassistant/ @home-assistant/core /tests/components/homeassistant/ @home-assistant/core
/homeassistant/components/homeassistant_alerts/ @home-assistant/core /homeassistant/components/homeassistant_alerts/ @home-assistant/core
@@ -697,6 +682,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/iammeter/ @lewei50 /homeassistant/components/iammeter/ @lewei50
/homeassistant/components/iaqualink/ @flz /homeassistant/components/iaqualink/ @flz
/tests/components/iaqualink/ @flz /tests/components/iaqualink/ @flz
/homeassistant/components/ibeacon/ @bdraco
/tests/components/ibeacon/ @bdraco
/homeassistant/components/icloud/ @Quentame @nzapponi /homeassistant/components/icloud/ @Quentame @nzapponi
/tests/components/icloud/ @Quentame @nzapponi /tests/components/icloud/ @Quentame @nzapponi
/homeassistant/components/idasen_desk/ @abmantis /homeassistant/components/idasen_desk/ @abmantis
@@ -713,12 +700,8 @@ build.json @home-assistant/supervisor
/tests/components/image_upload/ @home-assistant/core /tests/components/image_upload/ @home-assistant/core
/homeassistant/components/imap/ @jbouwh /homeassistant/components/imap/ @jbouwh
/tests/components/imap/ @jbouwh /tests/components/imap/ @jbouwh
/homeassistant/components/imeon_inverter/ @Imeon-Energy
/tests/components/imeon_inverter/ @Imeon-Energy
/homeassistant/components/imgw_pib/ @bieniu /homeassistant/components/imgw_pib/ @bieniu
/tests/components/imgw_pib/ @bieniu /tests/components/imgw_pib/ @bieniu
/homeassistant/components/immich/ @mib1185
/tests/components/immich/ @mib1185
/homeassistant/components/improv_ble/ @emontnemery /homeassistant/components/improv_ble/ @emontnemery
/tests/components/improv_ble/ @emontnemery /tests/components/improv_ble/ @emontnemery
/homeassistant/components/incomfort/ @jbouwh /homeassistant/components/incomfort/ @jbouwh
@@ -748,8 +731,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/intent/ @home-assistant/core @synesthesiam /homeassistant/components/intent/ @home-assistant/core @synesthesiam
/tests/components/intent/ @home-assistant/core @synesthesiam /tests/components/intent/ @home-assistant/core @synesthesiam
/homeassistant/components/intesishome/ @jnimmo /homeassistant/components/intesishome/ @jnimmo
/homeassistant/components/iometer/ @MaestroOnICe
/tests/components/iometer/ @MaestroOnICe
/homeassistant/components/ios/ @robbiet480 /homeassistant/components/ios/ @robbiet480
/tests/components/ios/ @robbiet480 /tests/components/ios/ @robbiet480
/homeassistant/components/iotawatt/ @gtdiehl @jyavenard /homeassistant/components/iotawatt/ @gtdiehl @jyavenard
@@ -784,10 +765,12 @@ build.json @home-assistant/supervisor
/tests/components/ituran/ @shmuelzon /tests/components/ituran/ @shmuelzon
/homeassistant/components/izone/ @Swamp-Ig /homeassistant/components/izone/ @Swamp-Ig
/tests/components/izone/ @Swamp-Ig /tests/components/izone/ @Swamp-Ig
-/homeassistant/components/jellyfin/ @RunC0deRun @ctalkington
-/tests/components/jellyfin/ @RunC0deRun @ctalkington
+/homeassistant/components/jellyfin/ @j-stienstra @ctalkington
+/tests/components/jellyfin/ @j-stienstra @ctalkington
/homeassistant/components/jewish_calendar/ @tsvi /homeassistant/components/jewish_calendar/ @tsvi
/tests/components/jewish_calendar/ @tsvi /tests/components/jewish_calendar/ @tsvi
/homeassistant/components/juicenet/ @jesserockz
/tests/components/juicenet/ @jesserockz
/homeassistant/components/justnimbus/ @kvanzuijlen /homeassistant/components/justnimbus/ @kvanzuijlen
/tests/components/justnimbus/ @kvanzuijlen /tests/components/justnimbus/ @kvanzuijlen
/homeassistant/components/jvc_projector/ @SteveEasley @msavazzi /homeassistant/components/jvc_projector/ @SteveEasley @msavazzi
@@ -910,8 +893,6 @@ build.json @home-assistant/supervisor
/tests/components/matrix/ @PaarthShah /tests/components/matrix/ @PaarthShah
/homeassistant/components/matter/ @home-assistant/matter /homeassistant/components/matter/ @home-assistant/matter
/tests/components/matter/ @home-assistant/matter /tests/components/matter/ @home-assistant/matter
/homeassistant/components/mcp/ @allenporter
/tests/components/mcp/ @allenporter
/homeassistant/components/mcp_server/ @allenporter /homeassistant/components/mcp_server/ @allenporter
/tests/components/mcp_server/ @allenporter /tests/components/mcp_server/ @allenporter
/homeassistant/components/mealie/ @joostlek @andrew-codechimp /homeassistant/components/mealie/ @joostlek @andrew-codechimp
@@ -946,8 +927,6 @@ build.json @home-assistant/supervisor
/tests/components/metoffice/ @MrHarcombe @avee87 /tests/components/metoffice/ @MrHarcombe @avee87
/homeassistant/components/microbees/ @microBeesTech /homeassistant/components/microbees/ @microBeesTech
/tests/components/microbees/ @microBeesTech /tests/components/microbees/ @microBeesTech
/homeassistant/components/miele/ @astrandb
/tests/components/miele/ @astrandb
/homeassistant/components/mikrotik/ @engrbm87 /homeassistant/components/mikrotik/ @engrbm87
/tests/components/mikrotik/ @engrbm87 /tests/components/mikrotik/ @engrbm87
/homeassistant/components/mill/ @danielhiversen /homeassistant/components/mill/ @danielhiversen
@@ -984,8 +963,8 @@ build.json @home-assistant/supervisor
/tests/components/motionblinds_ble/ @LennP @jerrybboy /tests/components/motionblinds_ble/ @LennP @jerrybboy
/homeassistant/components/motioneye/ @dermotduffy /homeassistant/components/motioneye/ @dermotduffy
/tests/components/motioneye/ @dermotduffy /tests/components/motioneye/ @dermotduffy
-/homeassistant/components/motionmount/ @laiho-vogels
-/tests/components/motionmount/ @laiho-vogels
+/homeassistant/components/motionmount/ @RJPoelstra
+/tests/components/motionmount/ @RJPoelstra
/homeassistant/components/mqtt/ @emontnemery @jbouwh @bdraco /homeassistant/components/mqtt/ @emontnemery @jbouwh @bdraco
/tests/components/mqtt/ @emontnemery @jbouwh @bdraco /tests/components/mqtt/ @emontnemery @jbouwh @bdraco
/homeassistant/components/msteams/ @peroyvind /homeassistant/components/msteams/ @peroyvind
@@ -1045,6 +1024,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/nina/ @DeerMaximum /homeassistant/components/nina/ @DeerMaximum
/tests/components/nina/ @DeerMaximum /tests/components/nina/ @DeerMaximum
/homeassistant/components/nissan_leaf/ @filcole /homeassistant/components/nissan_leaf/ @filcole
/homeassistant/components/nmbs/ @thibmaek
/homeassistant/components/noaa_tides/ @jdelaney72 /homeassistant/components/noaa_tides/ @jdelaney72
/homeassistant/components/nobo_hub/ @echoromeo @oyvindwe /homeassistant/components/nobo_hub/ @echoromeo @oyvindwe
/tests/components/nobo_hub/ @echoromeo @oyvindwe /tests/components/nobo_hub/ @echoromeo @oyvindwe
@@ -1060,8 +1040,6 @@ build.json @home-assistant/supervisor
/tests/components/nsw_fuel_station/ @nickw444 /tests/components/nsw_fuel_station/ @nickw444
/homeassistant/components/nsw_rural_fire_service_feed/ @exxamalte /homeassistant/components/nsw_rural_fire_service_feed/ @exxamalte
/tests/components/nsw_rural_fire_service_feed/ @exxamalte /tests/components/nsw_rural_fire_service_feed/ @exxamalte
/homeassistant/components/ntfy/ @tr4nt0r
/tests/components/ntfy/ @tr4nt0r
/homeassistant/components/nuheat/ @tstabrawa /homeassistant/components/nuheat/ @tstabrawa
/tests/components/nuheat/ @tstabrawa /tests/components/nuheat/ @tstabrawa
/homeassistant/components/nuki/ @pschmitt @pvizeli @pree /homeassistant/components/nuki/ @pschmitt @pvizeli @pree
@@ -1070,8 +1048,8 @@ build.json @home-assistant/supervisor
/tests/components/numato/ @clssn /tests/components/numato/ @clssn
/homeassistant/components/number/ @home-assistant/core @Shulyaka /homeassistant/components/number/ @home-assistant/core @Shulyaka
/tests/components/number/ @home-assistant/core @Shulyaka /tests/components/number/ @home-assistant/core @Shulyaka
-/homeassistant/components/nut/ @bdraco @ollo69 @pestevez @tdfountain
-/tests/components/nut/ @bdraco @ollo69 @pestevez @tdfountain
+/homeassistant/components/nut/ @bdraco @ollo69 @pestevez
+/tests/components/nut/ @bdraco @ollo69 @pestevez
/homeassistant/components/nws/ @MatthewFlamm @kamiyo /homeassistant/components/nws/ @MatthewFlamm @kamiyo
/tests/components/nws/ @MatthewFlamm @kamiyo /tests/components/nws/ @MatthewFlamm @kamiyo
/homeassistant/components/nyt_games/ @joostlek /homeassistant/components/nyt_games/ @joostlek
@@ -1090,16 +1068,16 @@ build.json @home-assistant/supervisor
/homeassistant/components/ombi/ @larssont /homeassistant/components/ombi/ @larssont
/homeassistant/components/onboarding/ @home-assistant/core /homeassistant/components/onboarding/ @home-assistant/core
/tests/components/onboarding/ @home-assistant/core /tests/components/onboarding/ @home-assistant/core
/homeassistant/components/oncue/ @bdraco @peterager
/tests/components/oncue/ @bdraco @peterager
/homeassistant/components/ondilo_ico/ @JeromeHXP /homeassistant/components/ondilo_ico/ @JeromeHXP
/tests/components/ondilo_ico/ @JeromeHXP /tests/components/ondilo_ico/ @JeromeHXP
/homeassistant/components/onedrive/ @zweckj
/tests/components/onedrive/ @zweckj
/homeassistant/components/onewire/ @garbled1 @epenet /homeassistant/components/onewire/ @garbled1 @epenet
/tests/components/onewire/ @garbled1 @epenet /tests/components/onewire/ @garbled1 @epenet
/homeassistant/components/onkyo/ @arturpragacz @eclair4151 /homeassistant/components/onkyo/ @arturpragacz @eclair4151
/tests/components/onkyo/ @arturpragacz @eclair4151 /tests/components/onkyo/ @arturpragacz @eclair4151
-/homeassistant/components/onvif/ @hunterjm @jterrace
-/tests/components/onvif/ @hunterjm @jterrace
+/homeassistant/components/onvif/ @hunterjm
+/tests/components/onvif/ @hunterjm
/homeassistant/components/open_meteo/ @frenck /homeassistant/components/open_meteo/ @frenck
/tests/components/open_meteo/ @frenck /tests/components/open_meteo/ @frenck
/homeassistant/components/openai_conversation/ @balloob /homeassistant/components/openai_conversation/ @balloob
@@ -1118,8 +1096,8 @@ build.json @home-assistant/supervisor
/tests/components/opentherm_gw/ @mvn23 /tests/components/opentherm_gw/ @mvn23
/homeassistant/components/openuv/ @bachya /homeassistant/components/openuv/ @bachya
/tests/components/openuv/ @bachya /tests/components/openuv/ @bachya
-/homeassistant/components/openweathermap/ @fabaff @freekode @nzapponi @wittypluck
-/tests/components/openweathermap/ @fabaff @freekode @nzapponi @wittypluck
+/homeassistant/components/openweathermap/ @fabaff @freekode @nzapponi
+/tests/components/openweathermap/ @fabaff @freekode @nzapponi
/homeassistant/components/opnsense/ @mtreinish /homeassistant/components/opnsense/ @mtreinish
/tests/components/opnsense/ @mtreinish /tests/components/opnsense/ @mtreinish
/homeassistant/components/opower/ @tronikos /homeassistant/components/opower/ @tronikos
@@ -1145,8 +1123,6 @@ build.json @home-assistant/supervisor
/tests/components/palazzetti/ @dotvav /tests/components/palazzetti/ @dotvav
/homeassistant/components/panel_custom/ @home-assistant/frontend /homeassistant/components/panel_custom/ @home-assistant/frontend
/tests/components/panel_custom/ @home-assistant/frontend /tests/components/panel_custom/ @home-assistant/frontend
/homeassistant/components/paperless_ngx/ @fvgarrel
/tests/components/paperless_ngx/ @fvgarrel
/homeassistant/components/peblar/ @frenck /homeassistant/components/peblar/ @frenck
/tests/components/peblar/ @frenck /tests/components/peblar/ @frenck
/homeassistant/components/peco/ @IceBotYT /homeassistant/components/peco/ @IceBotYT
@@ -1157,20 +1133,16 @@ build.json @home-assistant/supervisor
/tests/components/permobil/ @IsakNyberg /tests/components/permobil/ @IsakNyberg
/homeassistant/components/persistent_notification/ @home-assistant/core /homeassistant/components/persistent_notification/ @home-assistant/core
/tests/components/persistent_notification/ @home-assistant/core /tests/components/persistent_notification/ @home-assistant/core
/homeassistant/components/pglab/ @pglab-electronics
/tests/components/pglab/ @pglab-electronics
/homeassistant/components/philips_js/ @elupus /homeassistant/components/philips_js/ @elupus
/tests/components/philips_js/ @elupus /tests/components/philips_js/ @elupus
/homeassistant/components/pi_hole/ @shenxn /homeassistant/components/pi_hole/ @shenxn
/tests/components/pi_hole/ @shenxn /tests/components/pi_hole/ @shenxn
-/homeassistant/components/picnic/ @corneyl @codesalatdev
-/tests/components/picnic/ @corneyl @codesalatdev
+/homeassistant/components/picnic/ @corneyl
+/tests/components/picnic/ @corneyl
/homeassistant/components/ping/ @jpbede /homeassistant/components/ping/ @jpbede
/tests/components/ping/ @jpbede /tests/components/ping/ @jpbede
/homeassistant/components/plaato/ @JohNan /homeassistant/components/plaato/ @JohNan
/tests/components/plaato/ @JohNan /tests/components/plaato/ @JohNan
/homeassistant/components/playstation_network/ @jackjpowell @tr4nt0r
/tests/components/playstation_network/ @jackjpowell @tr4nt0r
/homeassistant/components/plex/ @jjlawren /homeassistant/components/plex/ @jjlawren
/tests/components/plex/ @jjlawren /tests/components/plex/ @jjlawren
/homeassistant/components/plugwise/ @CoMPaTech @bouwew /homeassistant/components/plugwise/ @CoMPaTech @bouwew
@@ -1187,8 +1159,6 @@ build.json @home-assistant/supervisor
/tests/components/powerwall/ @bdraco @jrester @daniel-simpson /tests/components/powerwall/ @bdraco @jrester @daniel-simpson
/homeassistant/components/private_ble_device/ @Jc2k /homeassistant/components/private_ble_device/ @Jc2k
/tests/components/private_ble_device/ @Jc2k /tests/components/private_ble_device/ @Jc2k
/homeassistant/components/probe_plus/ @pantherale0
/tests/components/probe_plus/ @pantherale0
/homeassistant/components/profiler/ @bdraco /homeassistant/components/profiler/ @bdraco
/tests/components/profiler/ @bdraco /tests/components/profiler/ @bdraco
/homeassistant/components/progettihwsw/ @ardaseremet /homeassistant/components/progettihwsw/ @ardaseremet
@@ -1204,8 +1174,6 @@ build.json @home-assistant/supervisor
/tests/components/prusalink/ @balloob /tests/components/prusalink/ @balloob
/homeassistant/components/ps4/ @ktnrg45 /homeassistant/components/ps4/ @ktnrg45
/tests/components/ps4/ @ktnrg45 /tests/components/ps4/ @ktnrg45
/homeassistant/components/pterodactyl/ @elmurato
/tests/components/pterodactyl/ @elmurato
/homeassistant/components/pure_energie/ @klaasnicolaas /homeassistant/components/pure_energie/ @klaasnicolaas
/tests/components/pure_energie/ @klaasnicolaas /tests/components/pure_energie/ @klaasnicolaas
/homeassistant/components/purpleair/ @bachya /homeassistant/components/purpleair/ @bachya
@@ -1224,8 +1192,6 @@ build.json @home-assistant/supervisor
/tests/components/pyload/ @tr4nt0r /tests/components/pyload/ @tr4nt0r
/homeassistant/components/qbittorrent/ @geoffreylagaisse @finder39 /homeassistant/components/qbittorrent/ @geoffreylagaisse @finder39
/tests/components/qbittorrent/ @geoffreylagaisse @finder39 /tests/components/qbittorrent/ @geoffreylagaisse @finder39
/homeassistant/components/qbus/ @Qbus-iot @thomasddn
/tests/components/qbus/ @Qbus-iot @thomasddn
/homeassistant/components/qingping/ @bdraco /homeassistant/components/qingping/ @bdraco
/tests/components/qingping/ @bdraco /tests/components/qingping/ @bdraco
/homeassistant/components/qld_bushfire/ @exxamalte /homeassistant/components/qld_bushfire/ @exxamalte
@@ -1235,7 +1201,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/qnap_qsw/ @Noltari /homeassistant/components/qnap_qsw/ @Noltari
/tests/components/qnap_qsw/ @Noltari /tests/components/qnap_qsw/ @Noltari
/homeassistant/components/quantum_gateway/ @cisasteelersfan /homeassistant/components/quantum_gateway/ @cisasteelersfan
/tests/components/quantum_gateway/ @cisasteelersfan
/homeassistant/components/qvr_pro/ @oblogic7 /homeassistant/components/qvr_pro/ @oblogic7
/homeassistant/components/qwikswitch/ @kellerza /homeassistant/components/qwikswitch/ @kellerza
/tests/components/qwikswitch/ @kellerza /tests/components/qwikswitch/ @kellerza
@@ -1274,12 +1239,8 @@ build.json @home-assistant/supervisor
/tests/components/recovery_mode/ @home-assistant/core /tests/components/recovery_mode/ @home-assistant/core
/homeassistant/components/refoss/ @ashionky /homeassistant/components/refoss/ @ashionky
/tests/components/refoss/ @ashionky /tests/components/refoss/ @ashionky
/homeassistant/components/rehlko/ @bdraco @peterager
/tests/components/rehlko/ @bdraco @peterager
/homeassistant/components/remote/ @home-assistant/core /homeassistant/components/remote/ @home-assistant/core
/tests/components/remote/ @home-assistant/core /tests/components/remote/ @home-assistant/core
/homeassistant/components/remote_calendar/ @Thomas55555 @allenporter
/tests/components/remote_calendar/ @Thomas55555 @allenporter
/homeassistant/components/renault/ @epenet /homeassistant/components/renault/ @epenet
/tests/components/renault/ @epenet /tests/components/renault/ @epenet
/homeassistant/components/renson/ @jimmyd-be /homeassistant/components/renson/ @jimmyd-be
@@ -1307,8 +1268,8 @@ build.json @home-assistant/supervisor
/tests/components/rituals_perfume_genie/ @milanmeu @frenck /tests/components/rituals_perfume_genie/ @milanmeu @frenck
/homeassistant/components/rmvtransport/ @cgtobi /homeassistant/components/rmvtransport/ @cgtobi
/tests/components/rmvtransport/ @cgtobi /tests/components/rmvtransport/ @cgtobi
-/homeassistant/components/roborock/ @Lash-L @allenporter
-/tests/components/roborock/ @Lash-L @allenporter
+/homeassistant/components/roborock/ @Lash-L
+/tests/components/roborock/ @Lash-L
/homeassistant/components/roku/ @ctalkington /homeassistant/components/roku/ @ctalkington
/tests/components/roku/ @ctalkington /tests/components/roku/ @ctalkington
/homeassistant/components/romy/ @xeniter /homeassistant/components/romy/ @xeniter
@@ -1321,11 +1282,12 @@ build.json @home-assistant/supervisor
/tests/components/rpi_power/ @shenxn @swetoast /tests/components/rpi_power/ @shenxn @swetoast
/homeassistant/components/rss_feed_template/ @home-assistant/core /homeassistant/components/rss_feed_template/ @home-assistant/core
/tests/components/rss_feed_template/ @home-assistant/core /tests/components/rss_feed_template/ @home-assistant/core
/homeassistant/components/rtsp_to_webrtc/ @allenporter
/tests/components/rtsp_to_webrtc/ @allenporter
/homeassistant/components/ruckus_unleashed/ @lanrat @ms264556 @gabe565 /homeassistant/components/ruckus_unleashed/ @lanrat @ms264556 @gabe565
/tests/components/ruckus_unleashed/ @lanrat @ms264556 @gabe565 /tests/components/ruckus_unleashed/ @lanrat @ms264556 @gabe565
/homeassistant/components/russound_rio/ @noahhusby /homeassistant/components/russound_rio/ @noahhusby
/tests/components/russound_rio/ @noahhusby /tests/components/russound_rio/ @noahhusby
/homeassistant/components/russound_rnet/ @noahhusby
/homeassistant/components/ruuvi_gateway/ @akx /homeassistant/components/ruuvi_gateway/ @akx
/tests/components/ruuvi_gateway/ @akx /tests/components/ruuvi_gateway/ @akx
/homeassistant/components/ruuvitag_ble/ @akx /homeassistant/components/ruuvitag_ble/ @akx
@@ -1370,8 +1332,6 @@ build.json @home-assistant/supervisor
/tests/components/sensorpro/ @bdraco /tests/components/sensorpro/ @bdraco
/homeassistant/components/sensorpush/ @bdraco /homeassistant/components/sensorpush/ @bdraco
/tests/components/sensorpush/ @bdraco /tests/components/sensorpush/ @bdraco
/homeassistant/components/sensorpush_cloud/ @sstallion
/tests/components/sensorpush_cloud/ @sstallion
/homeassistant/components/sensoterra/ @markruys /homeassistant/components/sensoterra/ @markruys
/tests/components/sensoterra/ @markruys /tests/components/sensoterra/ @markruys
/homeassistant/components/sentry/ @dcramer @frenck /homeassistant/components/sentry/ @dcramer @frenck
@@ -1407,6 +1367,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/siren/ @home-assistant/core @raman325 /homeassistant/components/siren/ @home-assistant/core @raman325
/tests/components/siren/ @home-assistant/core @raman325 /tests/components/siren/ @home-assistant/core @raman325
/homeassistant/components/sisyphus/ @jkeljo /homeassistant/components/sisyphus/ @jkeljo
/homeassistant/components/sky_hub/ @rogerselwyn
/homeassistant/components/sky_remote/ @dunnmj @saty9 /homeassistant/components/sky_remote/ @dunnmj @saty9
/tests/components/sky_remote/ @dunnmj @saty9 /tests/components/sky_remote/ @dunnmj @saty9
/homeassistant/components/skybell/ @tkdrob /homeassistant/components/skybell/ @tkdrob
@@ -1420,16 +1381,12 @@ build.json @home-assistant/supervisor
/tests/components/slide_local/ @dontinelli /tests/components/slide_local/ @dontinelli
/homeassistant/components/slimproto/ @marcelveldt /homeassistant/components/slimproto/ @marcelveldt
/tests/components/slimproto/ @marcelveldt /tests/components/slimproto/ @marcelveldt
-/homeassistant/components/sma/ @kellerza @rklomp @erwindouna
-/tests/components/sma/ @kellerza @rklomp @erwindouna
+/homeassistant/components/sma/ @kellerza @rklomp
+/tests/components/sma/ @kellerza @rklomp
/homeassistant/components/smappee/ @bsmappee /homeassistant/components/smappee/ @bsmappee
/tests/components/smappee/ @bsmappee /tests/components/smappee/ @bsmappee
/homeassistant/components/smarla/ @explicatis @rlint-explicatis
/tests/components/smarla/ @explicatis @rlint-explicatis
/homeassistant/components/smart_meter_texas/ @grahamwetzler /homeassistant/components/smart_meter_texas/ @grahamwetzler
/tests/components/smart_meter_texas/ @grahamwetzler /tests/components/smart_meter_texas/ @grahamwetzler
/homeassistant/components/smartthings/ @joostlek
/tests/components/smartthings/ @joostlek
/homeassistant/components/smarttub/ @mdz /homeassistant/components/smarttub/ @mdz
/tests/components/smarttub/ @mdz /tests/components/smarttub/ @mdz
/homeassistant/components/smarty/ @z0mbieprocess /homeassistant/components/smarty/ @z0mbieprocess
@@ -1444,8 +1401,6 @@ build.json @home-assistant/supervisor
/tests/components/snapcast/ @luar123 /tests/components/snapcast/ @luar123
/homeassistant/components/snmp/ @nmaggioni /homeassistant/components/snmp/ @nmaggioni
/tests/components/snmp/ @nmaggioni /tests/components/snmp/ @nmaggioni
/homeassistant/components/snoo/ @Lash-L
/tests/components/snoo/ @Lash-L
/homeassistant/components/snooz/ @AustinBrunkhorst /homeassistant/components/snooz/ @AustinBrunkhorst
/tests/components/snooz/ @AustinBrunkhorst /tests/components/snooz/ @AustinBrunkhorst
/homeassistant/components/solaredge/ @frenck @bdraco /homeassistant/components/solaredge/ @frenck @bdraco
@@ -1453,10 +1408,10 @@ build.json @home-assistant/supervisor
/homeassistant/components/solaredge_local/ @drobtravels @scheric /homeassistant/components/solaredge_local/ @drobtravels @scheric
/homeassistant/components/solarlog/ @Ernst79 @dontinelli /homeassistant/components/solarlog/ @Ernst79 @dontinelli
/tests/components/solarlog/ @Ernst79 @dontinelli /tests/components/solarlog/ @Ernst79 @dontinelli
-/homeassistant/components/solax/ @squishykid @Darsstar
-/tests/components/solax/ @squishykid @Darsstar
+/homeassistant/components/solax/ @squishykid
+/tests/components/solax/ @squishykid
-/homeassistant/components/soma/ @ratsept
-/tests/components/soma/ @ratsept
+/homeassistant/components/soma/ @ratsept @sebfortier2288
+/tests/components/soma/ @ratsept @sebfortier2288
/homeassistant/components/sonarr/ @ctalkington /homeassistant/components/sonarr/ @ctalkington
/tests/components/sonarr/ @ctalkington /tests/components/sonarr/ @ctalkington
/homeassistant/components/songpal/ @rytilahti @shenxn /homeassistant/components/songpal/ @rytilahti @shenxn
@@ -1488,8 +1443,7 @@ build.json @home-assistant/supervisor
/tests/components/steam_online/ @tkdrob /tests/components/steam_online/ @tkdrob
/homeassistant/components/steamist/ @bdraco /homeassistant/components/steamist/ @bdraco
/tests/components/steamist/ @bdraco /tests/components/steamist/ @bdraco
-/homeassistant/components/stiebel_eltron/ @fucm @ThyMYthOS
-/tests/components/stiebel_eltron/ @fucm @ThyMYthOS
+/homeassistant/components/stiebel_eltron/ @fucm
/homeassistant/components/stookwijzer/ @fwestenberg /homeassistant/components/stookwijzer/ @fwestenberg
/tests/components/stookwijzer/ @fwestenberg /tests/components/stookwijzer/ @fwestenberg
/homeassistant/components/stream/ @hunterjm @uvjustin @allenporter /homeassistant/components/stream/ @hunterjm @uvjustin @allenporter
@@ -1500,8 +1454,10 @@ build.json @home-assistant/supervisor
/tests/components/subaru/ @G-Two /tests/components/subaru/ @G-Two
/homeassistant/components/suez_water/ @ooii @jb101010-2 /homeassistant/components/suez_water/ @ooii @jb101010-2
/tests/components/suez_water/ @ooii @jb101010-2 /tests/components/suez_water/ @ooii @jb101010-2
-/homeassistant/components/sun/ @home-assistant/core
-/tests/components/sun/ @home-assistant/core
+/homeassistant/components/sun/ @Swamp-Ig
+/tests/components/sun/ @Swamp-Ig
+/homeassistant/components/sunweg/ @rokam
+/tests/components/sunweg/ @rokam
/homeassistant/components/supla/ @mwegrzynek /homeassistant/components/supla/ @mwegrzynek
/homeassistant/components/surepetcare/ @benleb @danielhiversen /homeassistant/components/surepetcare/ @benleb @danielhiversen
/tests/components/surepetcare/ @benleb @danielhiversen /tests/components/surepetcare/ @benleb @danielhiversen
@@ -1514,8 +1470,8 @@ build.json @home-assistant/supervisor
/tests/components/switch_as_x/ @home-assistant/core /tests/components/switch_as_x/ @home-assistant/core
/homeassistant/components/switchbee/ @jafar-atili /homeassistant/components/switchbee/ @jafar-atili
/tests/components/switchbee/ @jafar-atili /tests/components/switchbee/ @jafar-atili
-/homeassistant/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
-/tests/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
+/homeassistant/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski
+/tests/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski
/homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur /homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur /tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/homeassistant/components/switcher_kis/ @thecode @YogevBokobza /homeassistant/components/switcher_kis/ @thecode @YogevBokobza
@@ -1553,12 +1509,10 @@ build.json @home-assistant/supervisor
/tests/components/technove/ @Moustachauve /tests/components/technove/ @Moustachauve
/homeassistant/components/tedee/ @patrickhilker @zweckj /homeassistant/components/tedee/ @patrickhilker @zweckj
/tests/components/tedee/ @patrickhilker @zweckj /tests/components/tedee/ @patrickhilker @zweckj
/homeassistant/components/telegram_bot/ @hanwg
/tests/components/telegram_bot/ @hanwg
/homeassistant/components/tellduslive/ @fredrike /homeassistant/components/tellduslive/ @fredrike
/tests/components/tellduslive/ @fredrike /tests/components/tellduslive/ @fredrike
-/homeassistant/components/template/ @Petro31 @home-assistant/core
-/tests/components/template/ @Petro31 @home-assistant/core
+/homeassistant/components/template/ @PhracturedBlue @home-assistant/core
+/tests/components/template/ @PhracturedBlue @home-assistant/core
/homeassistant/components/tesla_fleet/ @Bre77 /homeassistant/components/tesla_fleet/ @Bre77
/tests/components/tesla_fleet/ @Bre77 /tests/components/tesla_fleet/ @Bre77
/homeassistant/components/tesla_wall_connector/ @einarhauks /homeassistant/components/tesla_wall_connector/ @einarhauks
@@ -1584,8 +1538,6 @@ build.json @home-assistant/supervisor
/tests/components/tile/ @bachya /tests/components/tile/ @bachya
/homeassistant/components/tilt_ble/ @apt-itude /homeassistant/components/tilt_ble/ @apt-itude
/tests/components/tilt_ble/ @apt-itude /tests/components/tilt_ble/ @apt-itude
/homeassistant/components/tilt_pi/ @michaelheyman
/tests/components/tilt_pi/ @michaelheyman
/homeassistant/components/time/ @home-assistant/core /homeassistant/components/time/ @home-assistant/core
/tests/components/time/ @home-assistant/core /tests/components/time/ @home-assistant/core
/homeassistant/components/time_date/ @fabaff /homeassistant/components/time_date/ @fabaff
@@ -1674,19 +1626,17 @@ build.json @home-assistant/supervisor
/tests/components/vallox/ @andre-richter @slovdahl @viiru- @yozik04 /tests/components/vallox/ @andre-richter @slovdahl @viiru- @yozik04
/homeassistant/components/valve/ @home-assistant/core /homeassistant/components/valve/ @home-assistant/core
/tests/components/valve/ @home-assistant/core /tests/components/valve/ @home-assistant/core
/homeassistant/components/vegehub/ @ghowevege
/tests/components/vegehub/ @ghowevege
/homeassistant/components/velbus/ @Cereal2nd @brefra /homeassistant/components/velbus/ @Cereal2nd @brefra
/tests/components/velbus/ @Cereal2nd @brefra /tests/components/velbus/ @Cereal2nd @brefra
-/homeassistant/components/velux/ @Julius2342 @DeerMaximum @pawlizio
-/tests/components/velux/ @Julius2342 @DeerMaximum @pawlizio
+/homeassistant/components/velux/ @Julius2342 @DeerMaximum
+/tests/components/velux/ @Julius2342 @DeerMaximum
/homeassistant/components/venstar/ @garbled1 @jhollowe /homeassistant/components/venstar/ @garbled1 @jhollowe
/tests/components/venstar/ @garbled1 @jhollowe /tests/components/venstar/ @garbled1 @jhollowe
/homeassistant/components/versasense/ @imstevenxyz /homeassistant/components/versasense/ @imstevenxyz
/homeassistant/components/version/ @ludeeus /homeassistant/components/version/ @ludeeus
/tests/components/version/ @ludeeus /tests/components/version/ @ludeeus
-/homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak
-/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak
+/homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja
+/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja
/homeassistant/components/vicare/ @CFenner /homeassistant/components/vicare/ @CFenner
/tests/components/vicare/ @CFenner /tests/components/vicare/ @CFenner
/homeassistant/components/vilfo/ @ManneW /homeassistant/components/vilfo/ @ManneW
@@ -1698,8 +1648,8 @@ build.json @home-assistant/supervisor
/tests/components/vlc_telnet/ @rodripf @MartinHjelmare /tests/components/vlc_telnet/ @rodripf @MartinHjelmare
/homeassistant/components/vodafone_station/ @paoloantinori @chemelli74 /homeassistant/components/vodafone_station/ @paoloantinori @chemelli74
/tests/components/vodafone_station/ @paoloantinori @chemelli74 /tests/components/vodafone_station/ @paoloantinori @chemelli74
-/homeassistant/components/voip/ @balloob @synesthesiam @jaminh
-/tests/components/voip/ @balloob @synesthesiam @jaminh
+/homeassistant/components/voip/ @balloob @synesthesiam
+/tests/components/voip/ @balloob @synesthesiam
/homeassistant/components/volumio/ @OnFreund /homeassistant/components/volumio/ @OnFreund
/tests/components/volumio/ @OnFreund /tests/components/volumio/ @OnFreund
/homeassistant/components/volvooncall/ @molobrakos /homeassistant/components/volvooncall/ @molobrakos
@@ -1731,8 +1681,6 @@ build.json @home-assistant/supervisor
/tests/components/weatherflow_cloud/ @jeeftor /tests/components/weatherflow_cloud/ @jeeftor
/homeassistant/components/weatherkit/ @tjhorner /homeassistant/components/weatherkit/ @tjhorner
/tests/components/weatherkit/ @tjhorner /tests/components/weatherkit/ @tjhorner
/homeassistant/components/webdav/ @jpbede
/tests/components/webdav/ @jpbede
/homeassistant/components/webhook/ @home-assistant/core /homeassistant/components/webhook/ @home-assistant/core
/tests/components/webhook/ @home-assistant/core /tests/components/webhook/ @home-assistant/core
/homeassistant/components/webmin/ @autinerd /homeassistant/components/webmin/ @autinerd
@@ -1816,8 +1764,6 @@ build.json @home-assistant/supervisor
/tests/components/zeversolar/ @kvanzuijlen /tests/components/zeversolar/ @kvanzuijlen
/homeassistant/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES /homeassistant/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES
/tests/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES /tests/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES
/homeassistant/components/zimi/ @markhannon
/tests/components/zimi/ @markhannon
/homeassistant/components/zodiac/ @JulienTant /homeassistant/components/zodiac/ @JulienTant
/tests/components/zodiac/ @JulienTant /tests/components/zodiac/ @JulienTant
/homeassistant/components/zone/ @home-assistant/core /homeassistant/components/zone/ @home-assistant/core


@@ -12,26 +12,8 @@ ENV \
ARG QEMU_CPU ARG QEMU_CPU
# Home Assistant S6-Overlay
COPY rootfs /
# Needs to be redefined inside the FROM statement to be set for RUN commands
ARG BUILD_ARCH
# Get go2rtc binary
RUN \
case "${BUILD_ARCH}" in \
"aarch64") go2rtc_suffix='arm64' ;; \
"armhf") go2rtc_suffix='armv6' ;; \
"armv7") go2rtc_suffix='arm' ;; \
*) go2rtc_suffix=${BUILD_ARCH} ;; \
esac \
&& curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.9/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \
&& chmod +x /bin/go2rtc \
# Verify go2rtc can be executed
&& go2rtc --version
# Install uv # Install uv
-RUN pip3 install uv==0.7.1
+RUN pip3 install uv==0.5.8
WORKDIR /usr/src WORKDIR /usr/src
@@ -60,4 +42,22 @@ RUN \
&& python3 -m compileall \ && python3 -m compileall \
homeassistant/homeassistant homeassistant/homeassistant
# Home Assistant S6-Overlay
COPY rootfs /
# Needs to be redefined inside the FROM statement to be set for RUN commands
ARG BUILD_ARCH
# Get go2rtc binary
RUN \
case "${BUILD_ARCH}" in \
"aarch64") go2rtc_suffix='arm64' ;; \
"armhf") go2rtc_suffix='armv6' ;; \
"armv7") go2rtc_suffix='arm' ;; \
*) go2rtc_suffix=${BUILD_ARCH} ;; \
esac \
&& curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.7/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \
&& chmod +x /bin/go2rtc \
# Verify go2rtc can be executed
&& go2rtc --version
WORKDIR /config WORKDIR /config


@@ -1,7 +1,15 @@
-FROM mcr.microsoft.com/vscode/devcontainers/base:debian
+FROM mcr.microsoft.com/devcontainers/python:1-3.13
SHELL ["/bin/bash", "-o", "pipefail", "-c"] SHELL ["/bin/bash", "-o", "pipefail", "-c"]
# Uninstall pre-installed formatting and linting tools
# They would conflict with our pinned versions
RUN \
pipx uninstall pydocstyle \
&& pipx uninstall pycodestyle \
&& pipx uninstall mypy \
&& pipx uninstall pylint
RUN \ RUN \
curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \ curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
&& apt-get update \ && apt-get update \
@@ -24,18 +32,21 @@ RUN \
libxml2 \ libxml2 \
git \ git \
cmake \ cmake \
autoconf \
&& apt-get clean \ && apt-get clean \
&& rm -rf /var/lib/apt/lists/* && rm -rf /var/lib/apt/lists/*
# Add go2rtc binary # Add go2rtc binary
COPY --from=ghcr.io/alexxit/go2rtc:latest /usr/local/bin/go2rtc /bin/go2rtc COPY --from=ghcr.io/alexxit/go2rtc:latest /usr/local/bin/go2rtc /bin/go2rtc
+# Install uv
+RUN pip3 install uv
WORKDIR /usr/src
-COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
-RUN uv python install 3.13.2
+# Setup hass-release
+RUN git clone --depth 1 https://github.com/home-assistant/hass-release \
+&& uv pip install --system -e hass-release/ \
+&& chown -R vscode /usr/src/hass-release/data
USER vscode USER vscode
ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv" ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv"
@@ -44,10 +55,6 @@ ENV PATH="$VIRTUAL_ENV/bin:$PATH"
WORKDIR /tmp WORKDIR /tmp
# Setup hass-release
RUN git clone --depth 1 https://github.com/home-assistant/hass-release ~/hass-release \
&& uv pip install -e ~/hass-release/
# Install Python dependencies from requirements # Install Python dependencies from requirements
COPY requirements.txt ./ COPY requirements.txt ./
COPY homeassistant/package_constraints.txt homeassistant/package_constraints.txt COPY homeassistant/package_constraints.txt homeassistant/package_constraints.txt
@@ -58,4 +65,4 @@ RUN uv pip install -r requirements_test.txt
WORKDIR /workspaces WORKDIR /workspaces
# Set the default shell to bash instead of sh # Set the default shell to bash instead of sh
-ENV SHELL=/bin/bash
+ENV SHELL /bin/bash


@@ -1,10 +1,10 @@
image: ghcr.io/home-assistant/{arch}-homeassistant image: ghcr.io/home-assistant/{arch}-homeassistant
build_from: build_from:
-aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.05.0
-armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.05.0
-armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.05.0
-amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.05.0
-i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.05.0
+aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2024.12.0
+armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2024.12.0
+armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2024.12.0
+amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2024.12.0
+i386: ghcr.io/home-assistant/i386-homeassistant-base:2024.12.0
codenotary: codenotary:
signer: notary@home-assistant.io signer: notary@home-assistant.io
base_image: notary@home-assistant.io base_image: notary@home-assistant.io
@@ -19,4 +19,4 @@ labels:
org.opencontainers.image.authors: The Home Assistant Authors org.opencontainers.image.authors: The Home Assistant Authors
org.opencontainers.image.url: https://www.home-assistant.io/ org.opencontainers.image.url: https://www.home-assistant.io/
org.opencontainers.image.documentation: https://www.home-assistant.io/docs/ org.opencontainers.image.documentation: https://www.home-assistant.io/docs/
-org.opencontainers.image.licenses: Apache-2.0
+org.opencontainers.image.licenses: Apache License 2.0


@@ -38,7 +38,8 @@ def validate_python() -> None:
def ensure_config_path(config_dir: str) -> None: def ensure_config_path(config_dir: str) -> None:
"""Validate the configuration directory.""" """Validate the configuration directory."""
-from . import config as config_util # noqa: PLC0415
+# pylint: disable-next=import-outside-toplevel
+from . import config as config_util
lib_dir = os.path.join(config_dir, "deps") lib_dir = os.path.join(config_dir, "deps")
@@ -79,7 +80,8 @@ def ensure_config_path(config_dir: str) -> None:
def get_arguments() -> argparse.Namespace: def get_arguments() -> argparse.Namespace:
"""Get parsed passed in arguments.""" """Get parsed passed in arguments."""
-from . import config as config_util # noqa: PLC0415
+# pylint: disable-next=import-outside-toplevel
+from . import config as config_util
parser = argparse.ArgumentParser( parser = argparse.ArgumentParser(
description="Home Assistant: Observe, Control, Automate.", description="Home Assistant: Observe, Control, Automate.",
@@ -175,7 +177,8 @@ def main() -> int:
validate_os() validate_os()
if args.script is not None: if args.script is not None:
-from . import scripts # noqa: PLC0415
+# pylint: disable-next=import-outside-toplevel
+from . import scripts
return scripts.run(args.script) return scripts.run(args.script)
@@ -185,7 +188,8 @@ def main() -> int:
ensure_config_path(config_dir) ensure_config_path(config_dir)
-from . import config, runner # noqa: PLC0415
+# pylint: disable-next=import-outside-toplevel
+from . import config, runner
safe_mode = config.safe_mode_enabled(config_dir) safe_mode = config.safe_mode_enabled(config_dir)


@@ -308,7 +308,7 @@ class AuthStore:
credentials.data = data credentials.data = data
self._async_schedule_save() self._async_schedule_save()
-async def async_load(self) -> None:
+async def async_load(self) -> None: # noqa: C901
"""Load the users.""" """Load the users."""
if self._loaded: if self._loaded:
raise RuntimeError("Auth storage is already loaded") raise RuntimeError("Auth storage is already loaded")


@@ -4,8 +4,9 @@ from __future__ import annotations
import logging import logging
import types import types
-from typing import Any
+from typing import Any, Generic
+from typing_extensions import TypeVar
import voluptuous as vol import voluptuous as vol
from voluptuous.humanize import humanize_error from voluptuous.humanize import humanize_error
@@ -34,6 +35,12 @@ DATA_REQS: HassKey[set[str]] = HassKey("mfa_auth_module_reqs_processed")
_LOGGER = logging.getLogger(__name__) _LOGGER = logging.getLogger(__name__)
_MultiFactorAuthModuleT = TypeVar(
"_MultiFactorAuthModuleT",
bound="MultiFactorAuthModule",
default="MultiFactorAuthModule",
)
class MultiFactorAuthModule: class MultiFactorAuthModule:
"""Multi-factor Auth Module of validation function.""" """Multi-factor Auth Module of validation function."""
@@ -95,9 +102,7 @@ class MultiFactorAuthModule:
raise NotImplementedError raise NotImplementedError
-class SetupFlow[_MultiFactorAuthModuleT: MultiFactorAuthModule = MultiFactorAuthModule](
-data_entry_flow.FlowHandler
-):
+class SetupFlow(data_entry_flow.FlowHandler, Generic[_MultiFactorAuthModuleT]):
"""Handler for the setup flow.""" """Handler for the setup flow."""
def __init__( def __init__(


@@ -52,28 +52,28 @@ _LOGGER = logging.getLogger(__name__)
def _generate_secret() -> str: def _generate_secret() -> str:
"""Generate a secret.""" """Generate a secret."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
return str(pyotp.random_base32()) return str(pyotp.random_base32())
def _generate_random() -> int: def _generate_random() -> int:
"""Generate a 32 digit number.""" """Generate a 32 digit number."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
return int(pyotp.random_base32(length=32, chars=list("1234567890"))) return int(pyotp.random_base32(length=32, chars=list("1234567890")))
def _generate_otp(secret: str, count: int) -> str: def _generate_otp(secret: str, count: int) -> str:
"""Generate one time password.""" """Generate one time password."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
return str(pyotp.HOTP(secret).at(count)) return str(pyotp.HOTP(secret).at(count))
def _verify_otp(secret: str, otp: str, count: int) -> bool: def _verify_otp(secret: str, otp: str, count: int) -> bool:
"""Verify one time password.""" """Verify one time password."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
return bool(pyotp.HOTP(secret).verify(otp, count)) return bool(pyotp.HOTP(secret).verify(otp, count))


@@ -37,7 +37,7 @@ DUMMY_SECRET = "FPPTH34D4E3MI2HG"
def _generate_qr_code(data: str) -> str: def _generate_qr_code(data: str) -> str:
"""Generate a base64 PNG string represent QR Code image of data.""" """Generate a base64 PNG string represent QR Code image of data."""
-import pyqrcode # noqa: PLC0415
+import pyqrcode # pylint: disable=import-outside-toplevel
qr_code = pyqrcode.create(data) qr_code = pyqrcode.create(data)
@@ -59,7 +59,7 @@ def _generate_qr_code(data: str) -> str:
def _generate_secret_and_qr_code(username: str) -> tuple[str, str, str]: def _generate_secret_and_qr_code(username: str) -> tuple[str, str, str]:
"""Generate a secret, url, and QR code.""" """Generate a secret, url, and QR code."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
ota_secret = pyotp.random_base32() ota_secret = pyotp.random_base32()
url = pyotp.totp.TOTP(ota_secret).provisioning_uri( url = pyotp.totp.TOTP(ota_secret).provisioning_uri(
@@ -107,7 +107,7 @@ class TotpAuthModule(MultiFactorAuthModule):
def _add_ota_secret(self, user_id: str, secret: str | None = None) -> str: def _add_ota_secret(self, user_id: str, secret: str | None = None) -> str:
"""Create a ota_secret for user.""" """Create a ota_secret for user."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
ota_secret: str = secret or pyotp.random_base32() ota_secret: str = secret or pyotp.random_base32()
@@ -163,7 +163,7 @@ class TotpAuthModule(MultiFactorAuthModule):
def _validate_2fa(self, user_id: str, code: str) -> bool: def _validate_2fa(self, user_id: str, code: str) -> bool:
"""Validate two factor authentication code.""" """Validate two factor authentication code."""
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
if (ota_secret := self._users.get(user_id)) is None: # type: ignore[union-attr] if (ota_secret := self._users.get(user_id)) is None: # type: ignore[union-attr]
# even we cannot find user, we still do verify # even we cannot find user, we still do verify
@@ -196,7 +196,7 @@ class TotpSetupFlow(SetupFlow[TotpAuthModule]):
Return self.async_show_form(step_id='init') if user_input is None. Return self.async_show_form(step_id='init') if user_input is None.
Return self.async_create_entry(data={'result': result}) if finish. Return self.async_create_entry(data={'result': result}) if finish.
""" """
-import pyotp # noqa: PLC0415
+import pyotp # pylint: disable=import-outside-toplevel
errors: dict[str, str] = {} errors: dict[str, str] = {}


@@ -11,7 +11,7 @@ import uuid
import attr import attr
from attr import Attribute from attr import Attribute
from attr.setters import validate from attr.setters import validate
-from propcache.api import cached_property
+from propcache import cached_property
from homeassistant.const import __version__ from homeassistant.const import __version__
from homeassistant.data_entry_flow import FlowContext, FlowResult from homeassistant.data_entry_flow import FlowContext, FlowResult


@@ -17,12 +17,12 @@ POLICY_SCHEMA = vol.Schema({vol.Optional(CAT_ENTITIES): ENTITY_POLICY_SCHEMA})
__all__ = [ __all__ = [
"POLICY_SCHEMA", "POLICY_SCHEMA",
"AbstractPermissions",
"OwnerPermissions",
"PermissionLookup",
"PolicyPermissions",
"PolicyType",
"merge_policies", "merge_policies",
"PermissionLookup",
"PolicyType",
"AbstractPermissions",
"PolicyPermissions",
"OwnerPermissions",
] ]


@@ -5,8 +5,9 @@ from __future__ import annotations
from collections.abc import Mapping from collections.abc import Mapping
import logging import logging
import types import types
-from typing import Any
+from typing import Any, Generic
+from typing_extensions import TypeVar
import voluptuous as vol import voluptuous as vol
from voluptuous.humanize import humanize_error from voluptuous.humanize import humanize_error
@@ -46,6 +47,8 @@ AUTH_PROVIDER_SCHEMA = vol.Schema(
extra=vol.ALLOW_EXTRA, extra=vol.ALLOW_EXTRA,
) )
_AuthProviderT = TypeVar("_AuthProviderT", bound="AuthProvider", default="AuthProvider")
class AuthProvider: class AuthProvider:
"""Provider of user authentication.""" """Provider of user authentication."""
@@ -192,8 +195,9 @@ async def load_auth_provider_module(
return module return module
-class LoginFlow[_AuthProviderT: AuthProvider = AuthProvider](
+class LoginFlow(
FlowHandler[AuthFlowContext, AuthFlowResult, tuple[str, str]],
+Generic[_AuthProviderT],
):
"""Handler for the login flow.""" """Handler for the login flow."""


@@ -21,7 +21,7 @@ import voluptuous as vol
from homeassistant.core import callback from homeassistant.core import callback
from homeassistant.exceptions import HomeAssistantError from homeassistant.exceptions import HomeAssistantError
-from homeassistant.helpers import config_validation as cv
+import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.network import is_cloud_connection from homeassistant.helpers.network import is_cloud_connection
from .. import InvalidAuthError from .. import InvalidAuthError


@@ -0,0 +1,29 @@
"""Enum backports from standard lib.
This file contained the backport of the StrEnum of Python 3.11.
Since we have dropped support for Python 3.10, we can remove this backport.
This file is kept for now to avoid breaking custom components that might
import it.
"""
from __future__ import annotations
from enum import StrEnum as _StrEnum
from functools import partial
from homeassistant.helpers.deprecation import (
DeprecatedAlias,
all_with_deprecated_constants,
check_if_deprecated_constant,
dir_with_deprecated_constants,
)
# StrEnum deprecated as of 2024.5 use enum.StrEnum instead.
_DEPRECATED_StrEnum = DeprecatedAlias(_StrEnum, "enum.StrEnum", "2025.5")
__getattr__ = partial(check_if_deprecated_constant, module_globals=globals())
__dir__ = partial(
dir_with_deprecated_constants, module_globals_keys=[*globals().keys()]
)
__all__ = all_with_deprecated_constants(globals())


@@ -0,0 +1,31 @@
"""Functools backports from standard lib.
This file contained the backport of the cached_property implementation of Python 3.12.
Since we have dropped support for Python 3.11, we can remove this backport.
This file is kept for now to avoid breaking custom components that might
import it.
"""
from __future__ import annotations
# pylint: disable-next=hass-deprecated-import
from functools import cached_property as _cached_property, partial
from homeassistant.helpers.deprecation import (
DeprecatedAlias,
all_with_deprecated_constants,
check_if_deprecated_constant,
dir_with_deprecated_constants,
)
# cached_property deprecated as of 2024.5 use functools.cached_property instead.
_DEPRECATED_cached_property = DeprecatedAlias(
_cached_property, "functools.cached_property", "2025.5"
)
__getattr__ = partial(check_if_deprecated_constant, module_globals=globals())
__dir__ = partial(
dir_with_deprecated_constants, module_globals_keys=[*globals().keys()]
)
__all__ = all_with_deprecated_constants(globals())


@@ -18,7 +18,6 @@ import securetar
from .const import __version__ as HA_VERSION from .const import __version__ as HA_VERSION
RESTORE_BACKUP_FILE = ".HA_RESTORE" RESTORE_BACKUP_FILE = ".HA_RESTORE"
RESTORE_BACKUP_RESULT_FILE = ".HA_RESTORE_RESULT"
KEEP_BACKUPS = ("backups",) KEEP_BACKUPS = ("backups",)
KEEP_DATABASE = ( KEEP_DATABASE = (
"home-assistant_v2.db", "home-assistant_v2.db",
@@ -63,10 +62,7 @@ def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent |
restore_database=instruction_content["restore_database"], restore_database=instruction_content["restore_database"],
restore_homeassistant=instruction_content["restore_homeassistant"], restore_homeassistant=instruction_content["restore_homeassistant"],
) )
-except FileNotFoundError:
-return None
-except (KeyError, json.JSONDecodeError) as err:
-_write_restore_result_file(config_dir, False, err)
+except (FileNotFoundError, KeyError, json.JSONDecodeError):
return None
finally: finally:
# Always remove the backup instruction file to prevent a boot loop # Always remove the backup instruction file to prevent a boot loop
@@ -123,7 +119,7 @@ def _extract_backup(
Path( Path(
tempdir, tempdir,
"extracted", "extracted",
f"homeassistant.tar{'.gz' if backup_meta['compressed'] else ''}", f"homeassistant.tar{'.gz' if backup_meta["compressed"] else ''}",
), ),
gzip=backup_meta["compressed"], gzip=backup_meta["compressed"],
key=password_to_key(restore_content.password) key=password_to_key(restore_content.password)
@@ -146,7 +142,6 @@ def _extract_backup(
config_dir,
dirs_exist_ok=True,
ignore=shutil.ignore_patterns(*(keep)),
- ignore_dangling_symlinks=True,
)
elif restore_content.restore_database:
for entry in KEEP_DATABASE:
@@ -164,23 +159,6 @@
)
)
def _write_restore_result_file(
config_dir: Path, success: bool, error: Exception | None
) -> None:
"""Write the restore result file."""
result_path = config_dir.joinpath(RESTORE_BACKUP_RESULT_FILE)
result_path.write_text(
json.dumps(
{
"success": success,
"error": str(error) if error else None,
"error_type": str(type(error).__name__) if error else None,
}
),
encoding="utf-8",
)
def restore_backup(config_dir_path: str) -> bool:
"""Restore the backup file if any.
@@ -199,14 +177,7 @@ def restore_backup(config_dir_path: str) -> bool:
restore_content=restore_content,
)
except FileNotFoundError as err:
- file_not_found = ValueError(f"Backup file {backup_file_path} does not exist")
- _write_restore_result_file(config_dir, False, file_not_found)
- raise file_not_found from err
- except Exception as err:
- _write_restore_result_file(config_dir, False, err)
- raise
- else:
- _write_restore_result_file(config_dir, True, None)
+ raise ValueError(f"Backup file {backup_file_path} does not exist") from err
if restore_content.remove_after_restore:
backup_file_path.unlink(missing_ok=True)
_LOGGER.info("Restore complete, restarting")

View File

@@ -31,7 +31,7 @@ def _check_import_call_allowed(mapped_args: dict[str, Any]) -> bool:
def _check_file_allowed(mapped_args: dict[str, Any]) -> bool:
# If the file is in /proc we can ignore it.
args = mapped_args["args"]
- path = args[0] if type(args[0]) is str else str(args[0])
+ path = args[0] if type(args[0]) is str else str(args[0])  # noqa: E721
return path.startswith(ALLOWED_FILE_PREFIXES)
@@ -178,15 +178,6 @@ _BLOCKING_CALLS: tuple[BlockingCall, ...] = (
strict_core=False,
skip_for_tests=True,
),
BlockingCall(
original_func=SSLContext.set_default_verify_paths,
object=SSLContext,
function="set_default_verify_paths",
check_allowed=None,
strict=False,
strict_core=False,
skip_for_tests=True,
),
BlockingCall(
original_func=Path.open,
object=Path,
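The BlockingCall entries above feed a generic patching mechanism; the sketch below illustrates the underlying idea with made-up helper names rather than the actual block_async_io internals: wrap a known blocking function and warn when it is invoked while an event loop is running.
import asyncio
import functools
import logging

_LOGGER = logging.getLogger(__name__)


def protect_loop(func):
    """Return func wrapped so calls from inside a running event loop are reported."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            asyncio.get_running_loop()
        except RuntimeError:
            # No running loop in this thread, so blocking here is acceptable.
            pass
        else:
            _LOGGER.warning("Detected blocking call to %s inside the event loop", func.__name__)
        return func(*args, **kwargs)

    return wrapper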

View File

@@ -53,7 +53,6 @@ from .components import (
logbook as logbook_pre_import, # noqa: F401
lovelace as lovelace_pre_import, # noqa: F401
onboarding as onboarding_pre_import, # noqa: F401
- person as person_pre_import, # noqa: F401
recorder as recorder_import, # noqa: F401 - not named pre_import since it has requirements
repairs as repairs_pre_import, # noqa: F401
search as search_pre_import, # noqa: F401
@@ -76,26 +75,22 @@ from .exceptions import HomeAssistantError
from .helpers import (
area_registry,
category_registry,
- condition,
config_validation as cv,
device_registry,
entity,
entity_registry,
floor_registry,
- frame,
issue_registry,
label_registry,
recorder,
restore_state,
template,
translation,
- trigger,
)
from .helpers.dispatcher import async_dispatcher_send_internal
from .helpers.storage import get_internal_store_manager
from .helpers.system_info import async_get_system_info
from .helpers.typing import ConfigType
- from .loader import Integration
from .setup import (
# _setup_started is marked as protected to make it clear
# that it is not part of the public API and should not be used
@@ -117,11 +112,6 @@ with contextlib.suppress(ImportError):
# Ensure anyio backend is imported to avoid it being imported in the event loop
from anyio._backends import _asyncio # noqa: F401
- with contextlib.suppress(ImportError):
- # httpx will import trio if it is installed which does
- # blocking I/O in the event loop. We want to avoid that.
- import trio # noqa: F401
if TYPE_CHECKING:
from .runner import RuntimeConfig
@@ -139,12 +129,14 @@ DATA_REGISTRIES_LOADED: HassKey[None] = HassKey("bootstrap_registries_loaded")
LOG_SLOW_STARTUP_INTERVAL = 60
SLOW_STARTUP_CHECK_INTERVAL = 1
- STAGE_0_SUBSTAGE_TIMEOUT = 60
STAGE_1_TIMEOUT = 120
STAGE_2_TIMEOUT = 300
WRAP_UP_TIMEOUT = 300
COOLDOWN_TIME = 60
+ DEBUGGER_INTEGRATIONS = {"debugpy"}
# Core integrations are unconditionally loaded
CORE_INTEGRATIONS = {"homeassistant", "persistent_notification"}
@@ -155,10 +147,6 @@ LOGGING_AND_HTTP_DEPS_INTEGRATIONS = {
"isal", "isal",
# Set log levels # Set log levels
"logger", "logger",
# Ensure network config is available
# before hassio or any other integration is
# loaded that might create an aiohttp client session
"network",
# Error logging # Error logging
"system_log", "system_log",
"sentry", "sentry",
@@ -169,25 +157,12 @@ FRONTEND_INTEGRATIONS = {
# visible in frontend
"frontend",
}
- # Stage 0 is divided into substages. Each substage has a name, a set of integrations and a timeout.
- # The substage containing recorder should have no timeout, as it could cancel a database migration.
- # Recorder freezes "recorder" timeout during a migration, but it does not freeze other timeouts.
- # If we add timeouts to the frontend substages, we should make sure they don't apply in recovery mode.
- STAGE_0_INTEGRATIONS = (
- # Load logging and http deps as soon as possible
- ("logging, http deps", LOGGING_AND_HTTP_DEPS_INTEGRATIONS, None),
- # Setup frontend
- ("frontend", FRONTEND_INTEGRATIONS, None),
- # Setup recorder
- ("recorder", {"recorder"}, None),
- # Start up debuggers. Start these first in case they want to wait.
- ("debugger", {"debugpy"}, STAGE_0_SUBSTAGE_TIMEOUT),
- # Zeroconf is used for mdns resolution in aiohttp client helper.
- ("zeroconf", {"zeroconf"}, STAGE_0_SUBSTAGE_TIMEOUT),
- )
- DISCOVERY_INTEGRATIONS = ("bluetooth", "dhcp", "ssdp", "usb")
- # Stage 1 integrations are not to be preimported in bootstrap.
+ RECORDER_INTEGRATIONS = {
+ # Setup after frontend
+ # To record data
+ "recorder",
+ }
+ DISCOVERY_INTEGRATIONS = ("bluetooth", "dhcp", "ssdp", "usb", "zeroconf")
STAGE_1_INTEGRATIONS = {
# We need to make sure discovery integrations
# update their deps before stage 2 integrations
@@ -202,7 +177,6 @@ STAGE_1_INTEGRATIONS = {
# Ensure supervisor is available
"hassio",
}
DEFAULT_INTEGRATIONS = {
# These integrations are set up unless recovery mode is activated.
#
@@ -243,12 +217,22 @@ DEFAULT_INTEGRATIONS_SUPERVISOR = {
# These integrations are set up if using the Supervisor
"hassio",
}
CRITICAL_INTEGRATIONS = {
# Recovery mode is activated if these integrations fail to set up
"frontend",
}
+ SETUP_ORDER = (
+ # Load logging and http deps as soon as possible
+ ("logging, http deps", LOGGING_AND_HTTP_DEPS_INTEGRATIONS),
+ # Setup frontend
+ ("frontend", FRONTEND_INTEGRATIONS),
+ # Setup recorder
+ ("recorder", RECORDER_INTEGRATIONS),
+ # Start up debuggers. Start these first in case they want to wait.
+ ("debugger", DEBUGGER_INTEGRATIONS),
+ )
#
# Storage keys we are likely to load during startup
# in order of when we expect to load them.
@@ -300,6 +284,14 @@ async def async_setup_hass(
return hass
+ async def stop_hass(hass: core.HomeAssistant) -> None:
+ """Stop hass."""
+ # Ask integrations to shut down. It's messy but we can't
+ # do a clean stop without knowing what is broken
+ with contextlib.suppress(TimeoutError):
+ async with hass.timeout.async_timeout(10):
+ await hass.async_stop()
hass = await create_hass()
if runtime_config.skip_pip or runtime_config.skip_pip_packages:
@@ -315,10 +307,10 @@ async def async_setup_hass(
block_async_io.enable()
- if not (recovery_mode := runtime_config.recovery_mode):
- config_dict = None
- basic_setup_success = False
+ config_dict = None
+ basic_setup_success = False
+ if not (recovery_mode := runtime_config.recovery_mode):
await hass.async_add_executor_job(conf_util.process_ha_config_upgrade, hass)
try:
@@ -336,43 +328,39 @@ async def async_setup_hass(
await async_from_config_dict(config_dict, hass) is not None
)
if config_dict is None:
recovery_mode = True
- await hass.async_stop(force=True)
+ await stop_hass(hass)
hass = await create_hass()
elif not basic_setup_success:
- _LOGGER.warning(
- "Unable to set up core integrations. Activating recovery mode"
- )
+ _LOGGER.warning("Unable to set up core integrations. Activating recovery mode")
recovery_mode = True
- await hass.async_stop(force=True)
+ await stop_hass(hass)
hass = await create_hass()
- elif any(
- domain not in hass.config.components for domain in CRITICAL_INTEGRATIONS
- ):
+ elif any(domain not in hass.config.components for domain in CRITICAL_INTEGRATIONS):
_LOGGER.warning(
"Detected that %s did not load. Activating recovery mode",
",".join(CRITICAL_INTEGRATIONS),
)
old_config = hass.config
old_logging = hass.data.get(DATA_LOGGING)
recovery_mode = True
- await hass.async_stop(force=True)
+ await stop_hass(hass)
hass = await create_hass()
if old_logging:
hass.data[DATA_LOGGING] = old_logging
hass.config.debug = old_config.debug
hass.config.skip_pip = old_config.skip_pip
hass.config.skip_pip_packages = old_config.skip_pip_packages
hass.config.internal_url = old_config.internal_url
hass.config.external_url = old_config.external_url
# Setup loader cache after the config dir has been set
loader.async_setup(hass)
if recovery_mode:
_LOGGER.info("Starting in recovery mode")
@@ -395,7 +383,7 @@ async def async_setup_hass(
def open_hass_ui(hass: core.HomeAssistant) -> None:
"""Open the UI."""
- import webbrowser # noqa: PLC0415
+ import webbrowser # pylint: disable=import-outside-toplevel
if hass.config.api is None or "frontend" not in hass.config.components:
_LOGGER.warning("Cannot launch the UI because frontend not loaded")
@@ -435,10 +423,9 @@ async def async_load_base_functionality(hass: core.HomeAssistant) -> None:
if DATA_REGISTRIES_LOADED in hass.data:
return
hass.data[DATA_REGISTRIES_LOADED] = None
- entity.async_setup(hass)
- frame.async_setup(hass)
- template.async_setup(hass)
translation.async_setup(hass)
+ entity.async_setup(hass)
+ template.async_setup(hass)
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass)),
@@ -453,8 +440,6 @@ async def async_load_base_functionality(hass: core.HomeAssistant) -> None:
create_eager_task(restore_state.async_load(hass)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
- create_eager_task(condition.async_setup(hass)),
- create_eager_task(trigger.async_setup(hass)),
)
@@ -564,7 +549,8 @@ async def async_enable_logging(
if not log_no_color:
try:
- from colorlog import ColoredFormatter # noqa: PLC0415
+ # pylint: disable-next=import-outside-toplevel
+ from colorlog import ColoredFormatter
# basicConfig must be called after importing colorlog in order to
# ensure that the handlers it sets up wraps the correct streams.
@@ -660,10 +646,11 @@ def _create_log_file(
err_handler = _RotatingFileHandlerWithoutShouldRollOver(
err_log_path, backupCount=1
)
try:
err_handler.doRollover()
except OSError as err:
_LOGGER.error("Error rolling over log file: %s", err)
return err_handler
@@ -692,6 +679,7 @@ async def async_mount_local_lib_path(config_dir: str) -> str:
return deps_dir
+ @core.callback
def _get_domains(hass: core.HomeAssistant, config: dict[str, Any]) -> set[str]:
"""Get domains of components to set up."""
# Filter out the repeating and common config section [homeassistant]
@@ -713,256 +701,6 @@ def _get_domains(hass: core.HomeAssistant, config: dict[str, Any]) -> set[str]:
return domains
async def _async_resolve_domains_and_preload(
hass: core.HomeAssistant, config: dict[str, Any]
) -> tuple[dict[str, Integration], dict[str, Integration]]:
"""Resolve all dependencies and return integrations to set up.
The return value is a tuple of two dictionaries:
- The first dictionary contains integrations
specified by the configuration (including config entries).
- The second dictionary contains the same integrations as the first dictionary
together with all their dependencies.
"""
domains_to_setup = _get_domains(hass, config)
platform_integrations = conf_util.extract_platform_integrations(
config, BASE_PLATFORMS
)
# Ensure base platforms that have platform integrations are added to `domains`,
# so they can be setup first instead of discovering them later when a config
# entry setup task notices that it's needed and there is already a long line
# to use the import executor.
#
# For example if we have
# sensor:
# - platform: template
#
# `template` has to be loaded to validate the config for sensor
# so we want to start loading `sensor` as soon as we know
# it will be needed. The more platforms under `sensor:`, the longer
# it will take to finish setup for `sensor` because each of these
# platforms has to be imported before we can validate the config.
#
# Thankfully we are migrating away from the platform pattern
# so this will be less of a problem in the future.
domains_to_setup.update(platform_integrations)
# Additionally process base platforms since we do not require the manifest
# to list them as dependencies.
# We want to later avoid lock contention when multiple integrations try to load
# their manifests at once.
# Also process integrations that are defined under base platforms
# to speed things up.
additional_domains_to_process = {
*BASE_PLATFORMS,
*chain.from_iterable(platform_integrations.values()),
}
# Resolve all dependencies so we know all integrations
# that will have to be loaded and start right-away
integrations_or_excs = await loader.async_get_integrations(
hass, {*domains_to_setup, *additional_domains_to_process}
)
# Eliminate those missing or with invalid manifest
integrations_to_process = {
domain: itg
for domain, itg in integrations_or_excs.items()
if isinstance(itg, Integration)
}
integrations_dependencies = await loader.resolve_integrations_dependencies(
hass, integrations_to_process.values()
)
# Eliminate those without valid dependencies
integrations_to_process = {
domain: integrations_to_process[domain] for domain in integrations_dependencies
}
integrations_to_setup = {
domain: itg
for domain, itg in integrations_to_process.items()
if domain in domains_to_setup
}
all_integrations_to_setup = integrations_to_setup.copy()
all_integrations_to_setup.update(
(dep, loader.async_get_loaded_integration(hass, dep))
for domain in integrations_to_setup
for dep in integrations_dependencies[domain].difference(
all_integrations_to_setup
)
)
# Gather requirements for all integrations,
# their dependencies and after dependencies.
# To gather all the requirements we must ignore exceptions here.
# The exceptions will be detected and handled later in the bootstrap process.
integrations_after_dependencies = (
await loader.resolve_integrations_after_dependencies(
hass, integrations_to_process.values(), ignore_exceptions=True
)
)
integrations_requirements = {
domain: itg.requirements for domain, itg in integrations_to_process.items()
}
integrations_requirements.update(
(dep, loader.async_get_loaded_integration(hass, dep).requirements)
for deps in integrations_after_dependencies.values()
for dep in deps.difference(integrations_requirements)
)
all_requirements = set(chain.from_iterable(integrations_requirements.values()))
# Optimistically check if requirements are already installed
# ahead of setting up the integrations so we can prime the cache
# We do not wait for this since it's an optimization only
hass.async_create_background_task(
requirements.async_load_installed_versions(hass, all_requirements),
"check installed requirements",
eager_start=True,
)
# Start loading translations for all integrations we are going to set up
# in the background so they are ready when we need them. This avoids a
# lot of waiting for the translation load lock and a thundering herd of
# tasks trying to load the same translations at the same time as each
# integration is loaded.
#
# We do not wait for this since as soon as the task runs it will
# hold the translation load lock and if anything is fast enough to
# wait for the translation load lock, loading will be done by the
# time it gets to it.
translations_to_load = {*all_integrations_to_setup, *additional_domains_to_process}
hass.async_create_background_task(
translation.async_load_integrations(hass, translations_to_load),
"load translations",
eager_start=True,
)
# Preload storage for all integrations we are going to set up
# so we do not have to wait for it to be loaded when we need it
# in the setup process.
hass.async_create_background_task(
get_internal_store_manager(hass).async_preload(
[*PRELOAD_STORAGE, *all_integrations_to_setup]
),
"preload storage",
eager_start=True,
)
return integrations_to_setup, all_integrations_to_setup
async def _async_set_up_integrations(
hass: core.HomeAssistant, config: dict[str, Any]
) -> None:
"""Set up all the integrations."""
watcher = _WatchPendingSetups(hass, _setup_started(hass))
watcher.async_start()
integrations, all_integrations = await _async_resolve_domains_and_preload(
hass, config
)
# Detect all cycles
integrations_after_dependencies = (
await loader.resolve_integrations_after_dependencies(
hass, all_integrations.values(), set(all_integrations)
)
)
all_domains = set(integrations_after_dependencies)
domains = set(integrations) & all_domains
_LOGGER.info(
"Domains to be set up: %s | %s",
domains,
all_domains - domains,
)
async_set_domains_to_be_loaded(hass, all_domains)
# Initialize recorder
if "recorder" in all_domains:
recorder.async_initialize_recorder(hass)
stages: list[tuple[str, set[str], int | None]] = [
*(
(name, domain_group, timeout)
for name, domain_group, timeout in STAGE_0_INTEGRATIONS
),
("1", STAGE_1_INTEGRATIONS, STAGE_1_TIMEOUT),
("2", domains, STAGE_2_TIMEOUT),
]
_LOGGER.info("Setting up stage 0")
for name, domain_group, timeout in stages:
stage_domains_unfiltered = domain_group & all_domains
if not stage_domains_unfiltered:
_LOGGER.info("Nothing to set up in stage %s: %s", name, domain_group)
continue
stage_domains = stage_domains_unfiltered - hass.config.components
if not stage_domains:
_LOGGER.info("Already set up stage %s: %s", name, stage_domains_unfiltered)
continue
stage_dep_domains_unfiltered = {
dep
for domain in stage_domains
for dep in integrations_after_dependencies[domain]
if dep not in stage_domains
}
stage_dep_domains = stage_dep_domains_unfiltered - hass.config.components
stage_all_domains = stage_domains | stage_dep_domains
_LOGGER.info(
"Setting up stage %s: %s | %s\nDependencies: %s | %s",
name,
stage_domains,
stage_domains_unfiltered - stage_domains,
stage_dep_domains,
stage_dep_domains_unfiltered - stage_dep_domains,
)
if timeout is None:
await _async_setup_multi_components(hass, stage_all_domains, config)
continue
try:
async with hass.timeout.async_timeout(
timeout,
cool_down=COOLDOWN_TIME,
cancel_message=f"Bootstrap stage {name} timeout",
):
await _async_setup_multi_components(hass, stage_all_domains, config)
except TimeoutError:
_LOGGER.warning(
"Setup timed out for stage %s waiting on %s - moving forward",
name,
hass._active_tasks, # noqa: SLF001
)
# Wrap up startup
_LOGGER.debug("Waiting for startup to wrap up")
try:
async with hass.timeout.async_timeout(
WRAP_UP_TIMEOUT,
cool_down=COOLDOWN_TIME,
cancel_message="Bootstrap startup wrap up timeout",
):
await hass.async_block_till_done()
except TimeoutError:
_LOGGER.warning(
"Setup timed out for bootstrap waiting on %s - moving forward",
hass._active_tasks, # noqa: SLF001
)
watcher.async_stop()
if _LOGGER.isEnabledFor(logging.DEBUG):
setup_time = async_get_setup_timings(hass)
_LOGGER.debug(
"Integration setup times: %s",
dict(sorted(setup_time.items(), key=itemgetter(1), reverse=True)),
)
class _WatchPendingSetups:
"""Periodic log and dispatch of setups that are pending."""
@@ -1034,12 +772,14 @@ class _WatchPendingSetups:
self._handle = None
- async def _async_setup_multi_components(
+ async def async_setup_multi_components(
hass: core.HomeAssistant,
domains: set[str],
config: dict[str, Any],
) -> None:
"""Set up multiple domains. Log on failure."""
+ # Avoid creating tasks for domains that were setup in a previous stage
+ domains_not_yet_setup = domains - hass.config.components
# Create setup tasks for base platforms first since everything will have
# to wait to be imported, and the sooner we can get the base platforms
# loaded the sooner we can start loading the rest of the integrations.
@@ -1049,7 +789,9 @@ async def _async_setup_multi_components(
f"setup component {domain}", f"setup component {domain}",
eager_start=True, eager_start=True,
) )
for domain in sorted(domains, key=SETUP_ORDER_SORT_KEY, reverse=True) for domain in sorted(
domains_not_yet_setup, key=SETUP_ORDER_SORT_KEY, reverse=True
)
} }
results = await asyncio.gather(*futures.values(), return_exceptions=True) results = await asyncio.gather(*futures.values(), return_exceptions=True)
for idx, domain in enumerate(futures): for idx, domain in enumerate(futures):
@@ -1060,3 +802,278 @@ async def _async_setup_multi_components(
domain,
exc_info=(type(result), result, result.__traceback__),
)
async def _async_resolve_domains_to_setup(
hass: core.HomeAssistant, config: dict[str, Any]
) -> tuple[set[str], dict[str, loader.Integration]]:
"""Resolve all dependencies and return list of domains to set up."""
domains_to_setup = _get_domains(hass, config)
needed_requirements: set[str] = set()
platform_integrations = conf_util.extract_platform_integrations(
config, BASE_PLATFORMS
)
# Ensure base platforms that have platform integrations are added to
# to `domains_to_setup so they can be setup first instead of
# discovering them when later when a config entry setup task
# notices its needed and there is already a long line to use
# the import executor.
#
# For example if we have
# sensor:
# - platform: template
#
# `template` has to be loaded to validate the config for sensor
# so we want to start loading `sensor` as soon as we know
# it will be needed. The more platforms under `sensor:`, the longer
# it will take to finish setup for `sensor` because each of these
# platforms has to be imported before we can validate the config.
#
# Thankfully we are migrating away from the platform pattern
# so this will be less of a problem in the future.
domains_to_setup.update(platform_integrations)
# Load manifests for base platforms and platform based integrations
# that are defined under base platforms right away since we do not require
# the manifest to list them as dependencies and we want to avoid the lock
# contention when multiple integrations try to load them at once
additional_manifests_to_load = {
*BASE_PLATFORMS,
*chain.from_iterable(platform_integrations.values()),
}
translations_to_load = additional_manifests_to_load.copy()
# Resolve all dependencies so we know all integrations
# that will have to be loaded and start right-away
integration_cache: dict[str, loader.Integration] = {}
to_resolve: set[str] = domains_to_setup
while to_resolve or additional_manifests_to_load:
old_to_resolve: set[str] = to_resolve
to_resolve = set()
if additional_manifests_to_load:
to_get = {*old_to_resolve, *additional_manifests_to_load}
additional_manifests_to_load.clear()
else:
to_get = old_to_resolve
manifest_deps: set[str] = set()
resolve_dependencies_tasks: list[asyncio.Task[bool]] = []
integrations_to_process: list[loader.Integration] = []
for domain, itg in (await loader.async_get_integrations(hass, to_get)).items():
if not isinstance(itg, loader.Integration):
continue
integration_cache[domain] = itg
needed_requirements.update(itg.requirements)
# Make sure manifests for dependencies are loaded in the next
# loop to try to group as many as manifest loads in a single
# call to avoid the creating one-off executor jobs later in
# the setup process
additional_manifests_to_load.update(
dep
for dep in chain(itg.dependencies, itg.after_dependencies)
if dep not in integration_cache
)
if domain not in old_to_resolve:
continue
integrations_to_process.append(itg)
manifest_deps.update(itg.dependencies)
manifest_deps.update(itg.after_dependencies)
if not itg.all_dependencies_resolved:
resolve_dependencies_tasks.append(
create_eager_task(
itg.resolve_dependencies(),
name=f"resolve dependencies {domain}",
loop=hass.loop,
)
)
if unseen_deps := manifest_deps - integration_cache.keys():
# If there are dependencies, try to preload all
# the integrations manifest at once and add them
# to the list of requirements we need to install
# so we can try to check if they are already installed
# in a single call below which avoids each integration
# having to wait for the lock to do it individually
deps = await loader.async_get_integrations(hass, unseen_deps)
for dependant_domain, dependant_itg in deps.items():
if isinstance(dependant_itg, loader.Integration):
integration_cache[dependant_domain] = dependant_itg
needed_requirements.update(dependant_itg.requirements)
if resolve_dependencies_tasks:
await asyncio.gather(*resolve_dependencies_tasks)
for itg in integrations_to_process:
try:
all_deps = itg.all_dependencies
except RuntimeError:
# Integration.all_dependencies raises RuntimeError if
# dependencies could not be resolved
continue
for dep in all_deps:
if dep in domains_to_setup:
continue
domains_to_setup.add(dep)
to_resolve.add(dep)
_LOGGER.info("Domains to be set up: %s", domains_to_setup)
# Optimistically check if requirements are already installed
# ahead of setting up the integrations so we can prime the cache
# We do not wait for this since its an optimization only
hass.async_create_background_task(
requirements.async_load_installed_versions(hass, needed_requirements),
"check installed requirements",
eager_start=True,
)
#
# Only add the domains_to_setup after we finish resolving
# as new domains are likely to added in the process
#
translations_to_load.update(domains_to_setup)
# Start loading translations for all integrations we are going to set up
# in the background so they are ready when we need them. This avoids a
# lot of waiting for the translation load lock and a thundering herd of
# tasks trying to load the same translations at the same time as each
# integration is loaded.
#
# We do not wait for this since as soon as the task runs it will
# hold the translation load lock and if anything is fast enough to
# wait for the translation load lock, loading will be done by the
# time it gets to it.
hass.async_create_background_task(
translation.async_load_integrations(hass, translations_to_load),
"load translations",
eager_start=True,
)
# Preload storage for all integrations we are going to set up
# so we do not have to wait for it to be loaded when we need it
# in the setup process.
hass.async_create_background_task(
get_internal_store_manager(hass).async_preload(
[*PRELOAD_STORAGE, *domains_to_setup]
),
"preload storage",
eager_start=True,
)
return domains_to_setup, integration_cache
async def _async_set_up_integrations(
hass: core.HomeAssistant, config: dict[str, Any]
) -> None:
"""Set up all the integrations."""
watcher = _WatchPendingSetups(hass, _setup_started(hass))
watcher.async_start()
domains_to_setup, integration_cache = await _async_resolve_domains_to_setup(
hass, config
)
# Initialize recorder
if "recorder" in domains_to_setup:
recorder.async_initialize_recorder(hass)
pre_stage_domains = [
(name, domains_to_setup & domain_group) for name, domain_group in SETUP_ORDER
]
# calculate what components to setup in what stage
stage_1_domains: set[str] = set()
# Find all dependencies of any dependency of any stage 1 integration that
# we plan on loading and promote them to stage 1. This is done only to not
# get misleading log messages
deps_promotion: set[str] = STAGE_1_INTEGRATIONS
while deps_promotion:
old_deps_promotion = deps_promotion
deps_promotion = set()
for domain in old_deps_promotion:
if domain not in domains_to_setup or domain in stage_1_domains:
continue
stage_1_domains.add(domain)
if (dep_itg := integration_cache.get(domain)) is None:
continue
deps_promotion.update(dep_itg.all_dependencies)
stage_2_domains = domains_to_setup - stage_1_domains
for name, domain_group in pre_stage_domains:
if domain_group:
stage_2_domains -= domain_group
_LOGGER.info("Setting up %s: %s", name, domain_group)
to_be_loaded = domain_group.copy()
to_be_loaded.update(
dep
for domain in domain_group
if (integration := integration_cache.get(domain)) is not None
for dep in integration.all_dependencies
)
async_set_domains_to_be_loaded(hass, to_be_loaded)
await async_setup_multi_components(hass, domain_group, config)
# Enables after dependencies when setting up stage 1 domains
async_set_domains_to_be_loaded(hass, stage_1_domains)
# Start setup
if stage_1_domains:
_LOGGER.info("Setting up stage 1: %s", stage_1_domains)
try:
async with hass.timeout.async_timeout(
STAGE_1_TIMEOUT, cool_down=COOLDOWN_TIME
):
await async_setup_multi_components(hass, stage_1_domains, config)
except TimeoutError:
_LOGGER.warning(
"Setup timed out for stage 1 waiting on %s - moving forward",
hass._active_tasks, # noqa: SLF001
)
# Add after dependencies when setting up stage 2 domains
async_set_domains_to_be_loaded(hass, stage_2_domains)
if stage_2_domains:
_LOGGER.info("Setting up stage 2: %s", stage_2_domains)
try:
async with hass.timeout.async_timeout(
STAGE_2_TIMEOUT, cool_down=COOLDOWN_TIME
):
await async_setup_multi_components(hass, stage_2_domains, config)
except TimeoutError:
_LOGGER.warning(
"Setup timed out for stage 2 waiting on %s - moving forward",
hass._active_tasks, # noqa: SLF001
)
# Wrap up startup
_LOGGER.debug("Waiting for startup to wrap up")
try:
async with hass.timeout.async_timeout(WRAP_UP_TIMEOUT, cool_down=COOLDOWN_TIME):
await hass.async_block_till_done()
except TimeoutError:
_LOGGER.warning(
"Setup timed out for bootstrap waiting on %s - moving forward",
hass._active_tasks, # noqa: SLF001
)
watcher.async_stop()
if _LOGGER.isEnabledFor(logging.DEBUG):
setup_time = async_get_setup_timings(hass)
_LOGGER.debug(
"Integration setup times: %s",
dict(sorted(setup_time.items(), key=itemgetter(1), reverse=True)),
)
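Both versions of the bootstrap shown in this file share the same control flow: iterate over named groups of domains and bound each group with a cooperative timeout, leaving the recorder group unbounded so a database migration is never cancelled. A reduced, self-contained sketch of that loop, using asyncio.timeout as a stand-in for Home Assistant's own timeout manager:
import asyncio
import logging

_LOGGER = logging.getLogger(__name__)


async def setup_domain(domain: str) -> None:
    # Placeholder for async_setup_component(hass, domain, config).
    await asyncio.sleep(0)


async def setup_stage(name: str, domains: set[str], timeout: float | None) -> None:
    """Set up one named group of domains, optionally bounded by a timeout."""

    async def _setup_all() -> None:
        await asyncio.gather(*(setup_domain(domain) for domain in domains))

    if timeout is None:
        # No timeout: a stage such as recorder must never be cancelled mid-migration.
        await _setup_all()
        return
    try:
        async with asyncio.timeout(timeout):
            await _setup_all()
    except TimeoutError:
        _LOGGER.warning("Setup timed out for stage %s - moving forward", name)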

View File

@@ -1,13 +1,5 @@
{
"domain": "amazon",
"name": "Amazon",
- "integrations": [
- "alexa",
- "alexa_devices",
- "amazon_polly",
- "aws",
- "aws_s3",
- "fire_tv",
- "route53"
- ]
+ "integrations": ["alexa", "amazon_polly", "aws", "fire_tv", "route53"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "bosch",
"name": "Bosch",
"integrations": ["bosch_alarm", "bosch_shc", "home_connect"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "eve",
"name": "Eve",
"iot_standards": ["matter"]
}

View File

@@ -5,8 +5,6 @@
"google_assistant", "google_assistant",
"google_assistant_sdk", "google_assistant_sdk",
"google_cloud", "google_cloud",
"google_drive",
"google_gemini",
"google_generative_ai_conversation", "google_generative_ai_conversation",
"google_mail", "google_mail",
"google_maps", "google_maps",

View File

@@ -6,13 +6,11 @@
"azure_devops", "azure_devops",
"azure_event_hub", "azure_event_hub",
"azure_service_bus", "azure_service_bus",
"azure_storage",
"microsoft_face_detect", "microsoft_face_detect",
"microsoft_face_identify", "microsoft_face_identify",
"microsoft_face", "microsoft_face",
"microsoft", "microsoft",
"msteams", "msteams",
"onedrive",
"xbox" "xbox"
] ]
} }

View File

@@ -1,6 +1,5 @@
{
"domain": "motionblinds",
"name": "Motionblinds",
- "integrations": ["motion_blinds", "motionblinds_ble"],
- "iot_standards": ["matter"]
+ "integrations": ["motion_blinds", "motionblinds_ble"]
}

View File

@@ -1,6 +0,0 @@
{
"domain": "nuki",
"name": "Nuki",
"integrations": ["nuki"],
"iot_standards": ["matter"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "sensorpush",
"name": "SensorPush",
"integrations": ["sensorpush", "sensorpush_cloud"]
}

View File

@@ -1,6 +0,0 @@
{
"domain": "shelly",
"name": "shelly",
"integrations": ["shelly"],
"iot_standards": ["zwave"]
}

View File

@@ -1,11 +1,5 @@
{
"domain": "sony",
"name": "Sony",
- "integrations": [
- "braviatv",
- "ps4",
- "sony_projector",
- "songpal",
- "playstation_network"
- ]
+ "integrations": ["braviatv", "ps4", "sony_projector", "songpal"]
}

View File

@@ -1,6 +1,5 @@
{
"domain": "switchbot",
"name": "SwitchBot",
- "integrations": ["switchbot", "switchbot_cloud"],
- "iot_standards": ["matter"]
+ "integrations": ["switchbot", "switchbot_cloud"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "tilt",
"name": "Tilt",
"integrations": ["tilt_ble", "tilt_pi"]
}

View File

@@ -14,24 +14,30 @@ from jaraco.abode.exceptions import (
)
from jaraco.abode.helpers.timeline import Groups as GROUPS
from requests.exceptions import ConnectTimeout, HTTPError
+ import voluptuous as vol
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
ATTR_DATE,
ATTR_DEVICE_ID,
+ ATTR_ENTITY_ID,
ATTR_TIME,
CONF_PASSWORD,
CONF_USERNAME,
EVENT_HOMEASSISTANT_STOP,
Platform,
)
- from homeassistant.core import CALLBACK_TYPE, Event, HomeAssistant
+ from homeassistant.core import CALLBACK_TYPE, Event, HomeAssistant, ServiceCall
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
+ from homeassistant.helpers.dispatcher import dispatcher_send
from homeassistant.helpers.typing import ConfigType
from .const import CONF_POLLING, DOMAIN, LOGGER
- from .services import async_setup_services
+ SERVICE_SETTINGS = "change_setting"
+ SERVICE_CAPTURE_IMAGE = "capture_image"
+ SERVICE_TRIGGER_AUTOMATION = "trigger_automation"
ATTR_DEVICE_NAME = "device_name"
ATTR_DEVICE_TYPE = "device_type"
@@ -39,12 +45,22 @@ ATTR_EVENT_CODE = "event_code"
ATTR_EVENT_NAME = "event_name" ATTR_EVENT_NAME = "event_name"
ATTR_EVENT_TYPE = "event_type" ATTR_EVENT_TYPE = "event_type"
ATTR_EVENT_UTC = "event_utc" ATTR_EVENT_UTC = "event_utc"
ATTR_SETTING = "setting"
ATTR_USER_NAME = "user_name" ATTR_USER_NAME = "user_name"
ATTR_APP_TYPE = "app_type" ATTR_APP_TYPE = "app_type"
ATTR_EVENT_BY = "event_by" ATTR_EVENT_BY = "event_by"
ATTR_VALUE = "value"
CONFIG_SCHEMA = cv.removed(DOMAIN, raise_if_present=False) CONFIG_SCHEMA = cv.removed(DOMAIN, raise_if_present=False)
CHANGE_SETTING_SCHEMA = vol.Schema(
{vol.Required(ATTR_SETTING): cv.string, vol.Required(ATTR_VALUE): cv.string}
)
CAPTURE_IMAGE_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
AUTOMATION_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
PLATFORMS = [ PLATFORMS = [
Platform.ALARM_CONTROL_PANEL, Platform.ALARM_CONTROL_PANEL,
Platform.BINARY_SENSOR, Platform.BINARY_SENSOR,
@@ -69,7 +85,7 @@ class AbodeSystem:
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Abode component."""
- async_setup_services(hass)
+ setup_hass_services(hass)
return True
@@ -122,6 +138,60 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return unload_ok
def setup_hass_services(hass: HomeAssistant) -> None:
"""Home Assistant services."""
def change_setting(call: ServiceCall) -> None:
"""Change an Abode system setting."""
setting = call.data[ATTR_SETTING]
value = call.data[ATTR_VALUE]
try:
hass.data[DOMAIN].abode.set_setting(setting, value)
except AbodeException as ex:
LOGGER.warning(ex)
def capture_image(call: ServiceCall) -> None:
"""Capture a new image."""
entity_ids = call.data[ATTR_ENTITY_ID]
target_entities = [
entity_id
for entity_id in hass.data[DOMAIN].entity_ids
if entity_id in entity_ids
]
for entity_id in target_entities:
signal = f"abode_camera_capture_{entity_id}"
dispatcher_send(hass, signal)
def trigger_automation(call: ServiceCall) -> None:
"""Trigger an Abode automation."""
entity_ids = call.data[ATTR_ENTITY_ID]
target_entities = [
entity_id
for entity_id in hass.data[DOMAIN].entity_ids
if entity_id in entity_ids
]
for entity_id in target_entities:
signal = f"abode_trigger_automation_{entity_id}"
dispatcher_send(hass, signal)
hass.services.async_register(
DOMAIN, SERVICE_SETTINGS, change_setting, schema=CHANGE_SETTING_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_CAPTURE_IMAGE, capture_image, schema=CAPTURE_IMAGE_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_TRIGGER_AUTOMATION, trigger_automation, schema=AUTOMATION_SCHEMA
)
async def setup_hass_events(hass: HomeAssistant) -> None:
"""Home Assistant start and stop callbacks."""

View File

@@ -11,7 +11,7 @@ from homeassistant.components.alarm_control_panel import (
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -19,9 +19,7 @@ from .entity import AbodeDevice
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode alarm control panel device."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -12,7 +12,7 @@ from homeassistant.components.binary_sensor import (
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.util.enum import try_parse_enum
from . import AbodeSystem
@@ -21,9 +21,7 @@ from .entity import AbodeDevice
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode binary sensor devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -15,7 +15,7 @@ from homeassistant.components.camera import Camera
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import Event, HomeAssistant
from homeassistant.helpers.dispatcher import async_dispatcher_connect
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.util import Throttle
from . import AbodeSystem
@@ -26,9 +26,7 @@ MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=90)
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode camera devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -7,7 +7,7 @@ from jaraco.abode.devices.cover import Cover
from homeassistant.components.cover import CoverEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -15,9 +15,7 @@ from .entity import AbodeDevice
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode cover devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -18,7 +18,7 @@ from homeassistant.components.light import (
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -26,9 +26,7 @@ from .entity import AbodeDevice
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode light devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -7,7 +7,7 @@ from jaraco.abode.devices.lock import Lock
from homeassistant.components.lock import LockEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -15,9 +15,7 @@ from .entity import AbodeDevice
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode lock devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -16,7 +16,7 @@ from homeassistant.components.sensor import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import LIGHT_LUX, PERCENTAGE, UnitOfTemperature
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -61,9 +61,7 @@ SENSOR_TYPES: tuple[AbodeSensorDescription, ...] = (
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode sensor devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -1,90 +0,0 @@
"""Support for the Abode Security System."""
from __future__ import annotations
from jaraco.abode.exceptions import Exception as AbodeException
import voluptuous as vol
from homeassistant.const import ATTR_ENTITY_ID
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.dispatcher import dispatcher_send
from .const import DOMAIN, LOGGER
SERVICE_SETTINGS = "change_setting"
SERVICE_CAPTURE_IMAGE = "capture_image"
SERVICE_TRIGGER_AUTOMATION = "trigger_automation"
ATTR_SETTING = "setting"
ATTR_VALUE = "value"
CHANGE_SETTING_SCHEMA = vol.Schema(
{vol.Required(ATTR_SETTING): cv.string, vol.Required(ATTR_VALUE): cv.string}
)
CAPTURE_IMAGE_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
AUTOMATION_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
def _change_setting(call: ServiceCall) -> None:
"""Change an Abode system setting."""
setting = call.data[ATTR_SETTING]
value = call.data[ATTR_VALUE]
try:
call.hass.data[DOMAIN].abode.set_setting(setting, value)
except AbodeException as ex:
LOGGER.warning(ex)
def _capture_image(call: ServiceCall) -> None:
"""Capture a new image."""
entity_ids = call.data[ATTR_ENTITY_ID]
target_entities = [
entity_id
for entity_id in call.hass.data[DOMAIN].entity_ids
if entity_id in entity_ids
]
for entity_id in target_entities:
signal = f"abode_camera_capture_{entity_id}"
dispatcher_send(call.hass, signal)
def _trigger_automation(call: ServiceCall) -> None:
"""Trigger an Abode automation."""
entity_ids = call.data[ATTR_ENTITY_ID]
target_entities = [
entity_id
for entity_id in call.hass.data[DOMAIN].entity_ids
if entity_id in entity_ids
]
for entity_id in target_entities:
signal = f"abode_trigger_automation_{entity_id}"
dispatcher_send(call.hass, signal)
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Home Assistant services."""
hass.services.async_register(
DOMAIN, SERVICE_SETTINGS, _change_setting, schema=CHANGE_SETTING_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_CAPTURE_IMAGE, _capture_image, schema=CAPTURE_IMAGE_SCHEMA
)
hass.services.async_register(
DOMAIN,
SERVICE_TRIGGER_AUTOMATION,
_trigger_automation,
schema=AUTOMATION_SCHEMA,
)

View File

@@ -10,7 +10,7 @@ from homeassistant.components.switch import SwitchEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.dispatcher import async_dispatcher_connect
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN
@@ -20,9 +20,7 @@ DEVICE_TYPES = ["switch", "valve"]
async def async_setup_entry(
- hass: HomeAssistant,
- entry: ConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up Abode switch devices."""
data: AbodeSystem = hass.data[DOMAIN]

View File

@@ -11,7 +11,7 @@ from homeassistant.components.binary_sensor import (
BinarySensorEntityDescription,
)
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .coordinator import AcaiaConfigEntry
from .entity import AcaiaEntity
@@ -40,7 +40,7 @@ BINARY_SENSORS: tuple[AcaiaBinarySensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: AcaiaConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up binary sensors."""

View File

@@ -8,7 +8,7 @@ from aioacaia.acaiascale import AcaiaScale
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.core import HomeAssistant
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .coordinator import AcaiaConfigEntry
from .entity import AcaiaEntity
@@ -45,7 +45,7 @@ BUTTONS: tuple[AcaiaButtonEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: AcaiaConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up button entities and services."""

View File

@@ -26,5 +26,5 @@
"iot_class": "local_push", "iot_class": "local_push",
"loggers": ["aioacaia"], "loggers": ["aioacaia"],
"quality_scale": "platinum", "quality_scale": "platinum",
"requirements": ["aioacaia==0.1.14"] "requirements": ["aioacaia==0.1.13"]
} }

View File

@@ -16,7 +16,7 @@ from homeassistant.components.sensor import (
)
from homeassistant.const import PERCENTAGE, UnitOfMass, UnitOfVolumeFlowRate
from homeassistant.core import HomeAssistant, callback
- from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
+ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .coordinator import AcaiaConfigEntry
from .entity import AcaiaEntity
@@ -77,7 +77,7 @@ RESTORE_SENSORS: tuple[AcaiaSensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: AcaiaConfigEntry,
- async_add_entities: AddConfigEntryEntitiesCallback,
+ async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up sensors."""

View File

@@ -12,8 +12,8 @@ import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_NAME
- from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
+ import homeassistant.helpers.config_validation as cv
from .const import DOMAIN

View File

@@ -24,7 +24,7 @@ from homeassistant.components.weather import (
API_METRIC: Final = "Metric" API_METRIC: Final = "Metric"
ATTRIBUTION: Final = "Data provided by AccuWeather" ATTRIBUTION: Final = "Data provided by AccuWeather"
ATTR_CATEGORY_VALUE = "CategoryValue" ATTR_CATEGORY: Final = "Category"
ATTR_DIRECTION: Final = "Direction" ATTR_DIRECTION: Final = "Direction"
ATTR_ENGLISH: Final = "English" ATTR_ENGLISH: Final = "English"
ATTR_LEVEL: Final = "level" ATTR_LEVEL: Final = "level"
@@ -55,19 +55,5 @@ CONDITION_MAP = {
for cond_ha, cond_codes in CONDITION_CLASSES.items() for cond_ha, cond_codes in CONDITION_CLASSES.items()
for cond_code in cond_codes for cond_code in cond_codes
} }
AIR_QUALITY_CATEGORY_MAP = {
1: "good",
2: "moderate",
3: "unhealthy",
4: "very_unhealthy",
5: "hazardous",
}
POLLEN_CATEGORY_MAP = {
1: "low",
2: "moderate",
3: "high",
4: "very_high",
5: "extreme",
}
UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=40) UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=40)
UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6) UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6)

View File

@@ -75,11 +75,7 @@ class AccuWeatherObservationDataUpdateCoordinator(
async with timeout(10): async with timeout(10):
result = await self.accuweather.async_get_current_conditions() result = await self.accuweather.async_get_current_conditions()
except EXCEPTIONS as error: except EXCEPTIONS as error:
raise UpdateFailed( raise UpdateFailed(error) from error
translation_domain=DOMAIN,
translation_key="current_conditions_update_error",
translation_placeholders={"error": repr(error)},
) from error
_LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining) _LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)
@@ -121,15 +117,9 @@ class AccuWeatherDailyForecastDataUpdateCoordinator(
"""Update data via library.""" """Update data via library."""
try: try:
async with timeout(10): async with timeout(10):
result = await self.accuweather.async_get_daily_forecast( result = await self.accuweather.async_get_daily_forecast()
language=self.hass.config.language
)
except EXCEPTIONS as error: except EXCEPTIONS as error:
raise UpdateFailed( raise UpdateFailed(error) from error
translation_domain=DOMAIN,
translation_key="forecast_update_error",
translation_placeholders={"error": repr(error)},
) from error
_LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining) _LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)

View File

@@ -7,6 +7,6 @@
"integration_type": "service", "integration_type": "service",
"iot_class": "cloud_polling", "iot_class": "cloud_polling",
"loggers": ["accuweather"], "loggers": ["accuweather"],
"requirements": ["accuweather==4.2.0"], "requirements": ["accuweather==4.0.0"],
"single_config_entry": true "single_config_entry": true
} }

View File

@@ -25,13 +25,12 @@ from homeassistant.const import (
UnitOfVolumetricFlux, UnitOfVolumetricFlux,
) )
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import ( from .const import (
AIR_QUALITY_CATEGORY_MAP,
API_METRIC, API_METRIC,
ATTR_CATEGORY_VALUE, ATTR_CATEGORY,
ATTR_DIRECTION, ATTR_DIRECTION,
ATTR_ENGLISH, ATTR_ENGLISH,
ATTR_LEVEL, ATTR_LEVEL,
@@ -39,7 +38,6 @@ from .const import (
ATTR_VALUE, ATTR_VALUE,
ATTRIBUTION, ATTRIBUTION,
MAX_FORECAST_DAYS, MAX_FORECAST_DAYS,
POLLEN_CATEGORY_MAP,
) )
from .coordinator import ( from .coordinator import (
AccuWeatherConfigEntry, AccuWeatherConfigEntry,
@@ -61,9 +59,9 @@ class AccuWeatherSensorDescription(SensorEntityDescription):
FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = ( FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
key="AirQuality", key="AirQuality",
value_fn=lambda data: AIR_QUALITY_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]], value_fn=lambda data: cast(str, data[ATTR_CATEGORY]),
device_class=SensorDeviceClass.ENUM, device_class=SensorDeviceClass.ENUM,
options=list(AIR_QUALITY_CATEGORY_MAP.values()), options=["good", "hazardous", "high", "low", "moderate", "unhealthy"],
translation_key="air_quality", translation_key="air_quality",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
@@ -85,9 +83,7 @@ FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
entity_registry_enabled_default=False, entity_registry_enabled_default=False,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER, native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER,
value_fn=lambda data: cast(int, data[ATTR_VALUE]), value_fn=lambda data: cast(int, data[ATTR_VALUE]),
attr_fn=lambda data: { attr_fn=lambda data: {ATTR_LEVEL: data[ATTR_CATEGORY]},
ATTR_LEVEL: POLLEN_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]]
},
translation_key="grass_pollen", translation_key="grass_pollen",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
@@ -111,9 +107,7 @@ FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
entity_registry_enabled_default=False, entity_registry_enabled_default=False,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER, native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER,
value_fn=lambda data: cast(int, data[ATTR_VALUE]), value_fn=lambda data: cast(int, data[ATTR_VALUE]),
attr_fn=lambda data: { attr_fn=lambda data: {ATTR_LEVEL: data[ATTR_CATEGORY]},
ATTR_LEVEL: POLLEN_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]]
},
translation_key="mold_pollen", translation_key="mold_pollen",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
@@ -121,9 +115,7 @@ FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER, native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER,
entity_registry_enabled_default=False, entity_registry_enabled_default=False,
value_fn=lambda data: cast(int, data[ATTR_VALUE]), value_fn=lambda data: cast(int, data[ATTR_VALUE]),
attr_fn=lambda data: { attr_fn=lambda data: {ATTR_LEVEL: data[ATTR_CATEGORY]},
ATTR_LEVEL: POLLEN_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]]
},
translation_key="ragweed_pollen", translation_key="ragweed_pollen",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
@@ -189,18 +181,14 @@ FORECAST_SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER, native_unit_of_measurement=CONCENTRATION_PARTS_PER_CUBIC_METER,
entity_registry_enabled_default=False, entity_registry_enabled_default=False,
value_fn=lambda data: cast(int, data[ATTR_VALUE]), value_fn=lambda data: cast(int, data[ATTR_VALUE]),
attr_fn=lambda data: { attr_fn=lambda data: {ATTR_LEVEL: data[ATTR_CATEGORY]},
ATTR_LEVEL: POLLEN_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]]
},
translation_key="tree_pollen", translation_key="tree_pollen",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
key="UVIndex", key="UVIndex",
native_unit_of_measurement=UV_INDEX, native_unit_of_measurement=UV_INDEX,
value_fn=lambda data: cast(int, data[ATTR_VALUE]), value_fn=lambda data: cast(int, data[ATTR_VALUE]),
attr_fn=lambda data: { attr_fn=lambda data: {ATTR_LEVEL: data[ATTR_CATEGORY]},
ATTR_LEVEL: POLLEN_CATEGORY_MAP[data[ATTR_CATEGORY_VALUE]]
},
translation_key="uv_index_forecast", translation_key="uv_index_forecast",
), ),
AccuWeatherSensorDescription( AccuWeatherSensorDescription(
@@ -387,7 +375,7 @@ SENSOR_TYPES: tuple[AccuWeatherSensorDescription, ...] = (
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
entry: AccuWeatherConfigEntry, entry: AccuWeatherConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Add AccuWeather entities from a config_entry.""" """Add AccuWeather entities from a config_entry."""
observation_coordinator: AccuWeatherObservationDataUpdateCoordinator = ( observation_coordinator: AccuWeatherObservationDataUpdateCoordinator = (
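
In the sensor hunk above, the left-hand column reads the numeric CategoryValue field and maps it onto a fixed set of enum options, while the right-hand column passes the raw Category string through. A small sketch using the AIR_QUALITY_CATEGORY_MAP values from the const.py hunk earlier in this diff:

# Values copied from the AIR_QUALITY_CATEGORY_MAP shown in the const.py hunk.
AIR_QUALITY_CATEGORY_MAP = {
    1: "good",
    2: "moderate",
    3: "unhealthy",
    4: "very_unhealthy",
    5: "hazardous",
}


def air_quality_option(forecast_entry: dict[str, int]) -> str:
    """Translate a forecast entry such as {"CategoryValue": 2} into an enum option."""
    return AIR_QUALITY_CATEGORY_MAP[forecast_entry["CategoryValue"]]


print(air_quality_option({"CategoryValue": 2}))  # -> "moderate"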

View File

@@ -16,7 +16,7 @@
"error": { "error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]", "invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]",
"requests_exceeded": "The allowed number of requests to the AccuWeather API has been exceeded. You have to wait or change the API key." "requests_exceeded": "The allowed number of requests to Accuweather API has been exceeded. You have to wait or change API Key."
} }
}, },
"entity": { "entity": {
@@ -26,20 +26,10 @@
"state": { "state": {
"good": "Good", "good": "Good",
"hazardous": "Hazardous", "hazardous": "Hazardous",
"high": "High",
"low": "Low",
"moderate": "Moderate", "moderate": "Moderate",
"unhealthy": "Unhealthy", "unhealthy": "Unhealthy"
"very_unhealthy": "Very unhealthy"
},
"state_attributes": {
"options": {
"state": {
"good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]",
"very_unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::very_unhealthy%]"
}
}
} }
}, },
"apparent_temperature": { "apparent_temperature": {
@@ -72,11 +62,12 @@
"level": { "level": {
"name": "Level", "name": "Level",
"state": { "state": {
"extreme": "Extreme", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "Moderate", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -90,11 +81,12 @@
"level": { "level": {
"name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]", "name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]",
"state": { "state": {
"extreme": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::extreme%]", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::moderate%]", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -108,15 +100,6 @@
"steady": "Steady", "steady": "Steady",
"rising": "Rising", "rising": "Rising",
"falling": "Falling" "falling": "Falling"
},
"state_attributes": {
"options": {
"state": {
"falling": "[%key:component::accuweather::entity::sensor::pressure_tendency::state::falling%]",
"rising": "[%key:component::accuweather::entity::sensor::pressure_tendency::state::rising%]",
"steady": "[%key:component::accuweather::entity::sensor::pressure_tendency::state::steady%]"
}
}
} }
}, },
"ragweed_pollen": { "ragweed_pollen": {
@@ -125,11 +108,12 @@
"level": { "level": {
"name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]", "name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]",
"state": { "state": {
"extreme": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::extreme%]", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::moderate%]", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -170,11 +154,12 @@
"level": { "level": {
"name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]", "name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]",
"state": { "state": {
"extreme": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::extreme%]", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::moderate%]", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -185,11 +170,12 @@
"level": { "level": {
"name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]", "name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]",
"state": { "state": {
"extreme": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::extreme%]", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::moderate%]", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -200,11 +186,12 @@
"level": { "level": {
"name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]", "name": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::name%]",
"state": { "state": {
"extreme": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::extreme%]", "good": "[%key:component::accuweather::entity::sensor::air_quality::state::good%]",
"high": "[%key:common::state::high%]", "hazardous": "[%key:component::accuweather::entity::sensor::air_quality::state::hazardous%]",
"low": "[%key:common::state::low%]", "high": "[%key:component::accuweather::entity::sensor::air_quality::state::high%]",
"moderate": "[%key:component::accuweather::entity::sensor::grass_pollen::state_attributes::level::state::moderate%]", "low": "[%key:component::accuweather::entity::sensor::air_quality::state::low%]",
"very_high": "[%key:common::state::very_high%]" "moderate": "[%key:component::accuweather::entity::sensor::air_quality::state::moderate%]",
"unhealthy": "[%key:component::accuweather::entity::sensor::air_quality::state::unhealthy%]"
} }
} }
} }
@@ -235,14 +222,6 @@
} }
} }
}, },
"exceptions": {
"current_conditions_update_error": {
"message": "An error occurred while retrieving weather current conditions data from the AccuWeather API: {error}"
},
"forecast_update_error": {
"message": "An error occurred while retrieving weather forecast data from the AccuWeather API: {error}"
}
},
"system_health": { "system_health": {
"info": { "info": {
"can_reach_server": "Reach AccuWeather server", "can_reach_server": "Reach AccuWeather server",

View File

@@ -30,7 +30,7 @@ from homeassistant.const import (
UnitOfTemperature, UnitOfTemperature,
) )
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.util.dt import utc_from_timestamp from homeassistant.util.dt import utc_from_timestamp
from .const import ( from .const import (
@@ -54,7 +54,7 @@ PARALLEL_UPDATES = 1
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
entry: AccuWeatherConfigEntry, entry: AccuWeatherConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Add a AccuWeather weather entity from a config_entry.""" """Add a AccuWeather weather entity from a config_entry."""
async_add_entities([AccuWeatherEntity(entry.runtime_data)]) async_add_entities([AccuWeatherEntity(entry.runtime_data)])

View File

@@ -22,7 +22,7 @@ from homeassistant.const import (
STATE_UNKNOWN, STATE_UNKNOWN,
) )
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType

View File

@@ -3,7 +3,7 @@
from homeassistant.config_entries import ConfigEntry from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform from homeassistant.const import Platform
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er import homeassistant.helpers.entity_registry as er
from .hub import PulseHub from .hub import PulseHub

View File

@@ -40,10 +40,9 @@ class AcmedaFlowHandler(ConfigFlow, domain=DOMAIN):
entry.unique_id for entry in self._async_current_entries() entry.unique_id for entry in self._async_current_entries()
} }
hubs: list[aiopulse.Hub] = []
with suppress(TimeoutError): with suppress(TimeoutError):
async with timeout(5): async with timeout(5):
hubs = [ hubs: list[aiopulse.Hub] = [
hub hub
async for hub in aiopulse.Hub.discover() async for hub in aiopulse.Hub.discover()
if hub.id not in already_configured if hub.id not in already_configured
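
The config-flow hunk above binds hubs to an empty list before the suppressed timeout, so the name is defined even when discovery times out. A condensed, self-contained sketch of that shape, assuming a hypothetical async generator in place of aiopulse.Hub.discover():

from asyncio import timeout
from collections.abc import AsyncIterator
from contextlib import suppress


async def fake_discover() -> AsyncIterator[str]:
    """Hypothetical stand-in for aiopulse.Hub.discover()."""
    for hub_id in ("hub-1", "hub-2"):
        yield hub_id


async def discover_hubs(already_configured: set[str]) -> list[str]:
    """Return newly discovered hub ids; an empty list if discovery times out."""
    hubs: list[str] = []  # bound up front so a timeout still leaves a value
    with suppress(TimeoutError):
        async with timeout(5):
            hubs = [
                hub
                async for hub in fake_discover()
                if hub not in already_configured
            ]
    return hubs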

View File

@@ -11,7 +11,7 @@ from homeassistant.components.cover import (
) )
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AcmedaConfigEntry from . import AcmedaConfigEntry
from .const import ACMEDA_HUB_UPDATE from .const import ACMEDA_HUB_UPDATE
@@ -22,7 +22,7 @@ from .helpers import async_add_acmeda_entities
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
config_entry: AcmedaConfigEntry, config_entry: AcmedaConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Set up the Acmeda Rollers from a config entry.""" """Set up the Acmeda Rollers from a config entry."""
hub = config_entry.runtime_data hub = config_entry.runtime_data

View File

@@ -9,7 +9,7 @@ from aiopulse import Roller
from homeassistant.config_entries import ConfigEntry from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .const import DOMAIN, LOGGER from .const import DOMAIN, LOGGER
@@ -23,7 +23,7 @@ def async_add_acmeda_entities(
entity_class: type, entity_class: type,
config_entry: AcmedaConfigEntry, config_entry: AcmedaConfigEntry,
current: set[int], current: set[int],
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Add any new entities.""" """Add any new entities."""
hub = config_entry.runtime_data hub = config_entry.runtime_data

View File

@@ -70,7 +70,7 @@ class PulseHub:
async def async_notify_update(self, update_type: aiopulse.UpdateType) -> None: async def async_notify_update(self, update_type: aiopulse.UpdateType) -> None:
"""Evaluate entities when hub reports that update has occurred.""" """Evaluate entities when hub reports that update has occurred."""
LOGGER.debug("Hub %s updated", update_type.name) LOGGER.debug("Hub {update_type.name} updated")
if update_type == aiopulse.UpdateType.rollers: if update_type == aiopulse.UpdateType.rollers:
await update_devices(self.hass, self.config_entry, self.api.rollers) await update_devices(self.hass, self.config_entry, self.api.rollers)
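
The one-line hub change above swaps a plain string literal ("Hub {update_type.name} updated" is not an f-string, so nothing was ever interpolated) for lazy %-formatting, which also defers the substitution until DEBUG logging is enabled. A minimal sketch:

import logging

LOGGER = logging.getLogger(__name__)


def log_hub_update(update_type_name: str) -> None:
    """Log the reported update type using lazy %-formatting."""
    LOGGER.debug("Hub %s updated", update_type_name)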

View File

@@ -6,7 +6,7 @@ from homeassistant.components.sensor import SensorDeviceClass, SensorEntity
from homeassistant.const import PERCENTAGE from homeassistant.const import PERCENTAGE
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AcmedaConfigEntry from . import AcmedaConfigEntry
from .const import ACMEDA_HUB_UPDATE from .const import ACMEDA_HUB_UPDATE
@@ -17,7 +17,7 @@ from .helpers import async_add_acmeda_entities
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
config_entry: AcmedaConfigEntry, config_entry: AcmedaConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Set up the Acmeda Rollers from a config entry.""" """Set up the Acmeda Rollers from a config entry."""
hub = config_entry.runtime_data hub = config_entry.runtime_data

View File

@@ -3,9 +3,9 @@
from __future__ import annotations from __future__ import annotations
import logging import logging
import telnetlib # pylint: disable=deprecated-module
from typing import Final from typing import Final
import telnetlib # pylint: disable=deprecated-module
import voluptuous as vol import voluptuous as vol
from homeassistant.components.device_tracker import ( from homeassistant.components.device_tracker import (
@@ -15,7 +15,7 @@ from homeassistant.components.device_tracker import (
) )
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.typing import ConfigType from homeassistant.helpers.typing import ConfigType
from .const import LEASES_REGEX from .const import LEASES_REGEX

View File

@@ -2,38 +2,25 @@
from __future__ import annotations from __future__ import annotations
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform from homeassistant.const import Platform
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from .const import CONNECTION_TYPE, LOCAL PLATFORMS = [Platform.CLIMATE]
from .coordinator import AdaxCloudCoordinator, AdaxConfigEntry, AdaxLocalCoordinator
PLATFORMS = [Platform.CLIMATE, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: AdaxConfigEntry) -> bool: async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Adax from a config entry.""" """Set up Adax from a config entry."""
if entry.data.get(CONNECTION_TYPE) == LOCAL:
local_coordinator = AdaxLocalCoordinator(hass, entry)
entry.runtime_data = local_coordinator
else:
cloud_coordinator = AdaxCloudCoordinator(hass, entry)
entry.runtime_data = cloud_coordinator
await entry.runtime_data.async_config_entry_first_refresh()
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS) await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True return True
async def async_unload_entry(hass: HomeAssistant, entry: AdaxConfigEntry) -> bool: async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry.""" """Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS) return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def async_migrate_entry( async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
hass: HomeAssistant, config_entry: AdaxConfigEntry
) -> bool:
"""Migrate old entry.""" """Migrate old entry."""
# convert title and unique_id to string # convert title and unique_id to string
if config_entry.version == 1: if config_entry.version == 1:
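
In the __init__.py hunk above, the left-hand column builds a coordinator, stores it on entry.runtime_data behind a typed ConfigEntry alias, and only then forwards the platform setups. A stripped-down sketch of that wiring, assuming placeholder names and a plain dict payload instead of the real coordinator:

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant

type ExampleConfigEntry = ConfigEntry[dict[str, int]]  # hypothetical alias

PLATFORMS = [Platform.CLIMATE, Platform.SENSOR]


async def async_setup_entry(hass: HomeAssistant, entry: ExampleConfigEntry) -> bool:
    """Set up the integration and hand runtime data to its platforms."""
    entry.runtime_data = {"rooms": 2}  # placeholder for the real coordinator
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True


async def async_unload_entry(hass: HomeAssistant, entry: ExampleConfigEntry) -> bool:
    """Unload all platforms for this entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)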

View File

@@ -12,42 +12,57 @@ from homeassistant.components.climate import (
ClimateEntityFeature, ClimateEntityFeature,
HVACMode, HVACMode,
) )
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ( from homeassistant.const import (
ATTR_TEMPERATURE, ATTR_TEMPERATURE,
CONF_IP_ADDRESS,
CONF_PASSWORD,
CONF_TOKEN,
CONF_UNIQUE_ID, CONF_UNIQUE_ID,
PRECISION_WHOLE, PRECISION_WHOLE,
UnitOfTemperature, UnitOfTemperature,
) )
from homeassistant.core import HomeAssistant, callback from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import DeviceInfo from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import AdaxConfigEntry from .const import ACCOUNT_ID, CONNECTION_TYPE, DOMAIN, LOCAL
from .const import CONNECTION_TYPE, DOMAIN, LOCAL
from .coordinator import AdaxCloudCoordinator, AdaxLocalCoordinator
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
entry: AdaxConfigEntry, entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Set up the Adax thermostat with config flow.""" """Set up the Adax thermostat with config flow."""
if entry.data.get(CONNECTION_TYPE) == LOCAL: if entry.data.get(CONNECTION_TYPE) == LOCAL:
local_coordinator = cast(AdaxLocalCoordinator, entry.runtime_data) adax_data_handler = AdaxLocal(
async_add_entities( entry.data[CONF_IP_ADDRESS],
[LocalAdaxDevice(local_coordinator, entry.data[CONF_UNIQUE_ID])], entry.data[CONF_TOKEN],
websession=async_get_clientsession(hass, verify_ssl=False),
) )
else:
cloud_coordinator = cast(AdaxCloudCoordinator, entry.runtime_data)
async_add_entities( async_add_entities(
AdaxDevice(cloud_coordinator, device_id) [LocalAdaxDevice(adax_data_handler, entry.data[CONF_UNIQUE_ID])], True
for device_id in cloud_coordinator.data
) )
return
adax_data_handler = Adax(
entry.data[ACCOUNT_ID],
entry.data[CONF_PASSWORD],
websession=async_get_clientsession(hass),
)
async_add_entities(
(
AdaxDevice(room, adax_data_handler)
for room in await adax_data_handler.get_rooms()
),
True,
)
class AdaxDevice(CoordinatorEntity[AdaxCloudCoordinator], ClimateEntity): class AdaxDevice(ClimateEntity):
"""Representation of a heater.""" """Representation of a heater."""
_attr_hvac_modes = [HVACMode.HEAT, HVACMode.OFF] _attr_hvac_modes = [HVACMode.HEAT, HVACMode.OFF]
@@ -61,37 +76,20 @@ class AdaxDevice(CoordinatorEntity[AdaxCloudCoordinator], ClimateEntity):
_attr_target_temperature_step = PRECISION_WHOLE _attr_target_temperature_step = PRECISION_WHOLE
_attr_temperature_unit = UnitOfTemperature.CELSIUS _attr_temperature_unit = UnitOfTemperature.CELSIUS
def __init__( def __init__(self, heater_data: dict[str, Any], adax_data_handler: Adax) -> None:
self,
coordinator: AdaxCloudCoordinator,
device_id: str,
) -> None:
"""Initialize the heater.""" """Initialize the heater."""
super().__init__(coordinator) self._device_id = heater_data["id"]
self._adax_data_handler: Adax = coordinator.adax_data_handler self._adax_data_handler = adax_data_handler
self._device_id = device_id
self._attr_name = self.room["name"] self._attr_unique_id = f"{heater_data['homeId']}_{heater_data['id']}"
self._attr_unique_id = f"{self.room['homeId']}_{self._device_id}"
self._attr_device_info = DeviceInfo( self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device_id)}, identifiers={(DOMAIN, heater_data["id"])},
# Instead of setting the device name to the entity name, adax # Instead of setting the device name to the entity name, adax
# should be updated to set has_entity_name = True, and set the entity # should be updated to set has_entity_name = True, and set the entity
# name to None # name to None
name=cast(str | None, self.name), name=cast(str | None, self.name),
manufacturer="Adax", manufacturer="Adax",
) )
self._apply_data(self.room)
@property
def available(self) -> bool:
"""Whether the entity is available or not."""
return super().available and self._device_id in self.coordinator.data
@property
def room(self) -> dict[str, Any]:
"""Gets the data for this particular device."""
return self.coordinator.data[self._device_id]
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None: async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set hvac mode.""" """Set hvac mode."""
@@ -106,9 +104,7 @@ class AdaxDevice(CoordinatorEntity[AdaxCloudCoordinator], ClimateEntity):
) )
else: else:
return return
await self._adax_data_handler.update()
# Request data refresh from source to verify that update was successful
await self.coordinator.async_request_refresh()
async def async_set_temperature(self, **kwargs: Any) -> None: async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set new target temperature.""" """Set new target temperature."""
@@ -118,31 +114,28 @@ class AdaxDevice(CoordinatorEntity[AdaxCloudCoordinator], ClimateEntity):
self._device_id, temperature, True self._device_id, temperature, True
) )
@callback async def async_update(self) -> None:
def _handle_coordinator_update(self) -> None: """Get the latest data."""
"""Handle updated data from the coordinator.""" for room in await self._adax_data_handler.get_rooms():
if room := self.room: if room["id"] != self._device_id:
self._apply_data(room) continue
super()._handle_coordinator_update() self._attr_name = room["name"]
self._attr_current_temperature = room.get("temperature")
def _apply_data(self, room: dict[str, Any]) -> None: self._attr_target_temperature = room.get("targetTemperature")
"""Update the appropriate attributues based on received data.""" if room["heatingEnabled"]:
self._attr_current_temperature = room.get("temperature") self._attr_hvac_mode = HVACMode.HEAT
self._attr_target_temperature = room.get("targetTemperature") self._attr_icon = "mdi:radiator"
if room["heatingEnabled"]: else:
self._attr_hvac_mode = HVACMode.HEAT self._attr_hvac_mode = HVACMode.OFF
self._attr_icon = "mdi:radiator" self._attr_icon = "mdi:radiator-off"
else: return
self._attr_hvac_mode = HVACMode.OFF
self._attr_icon = "mdi:radiator-off"
class LocalAdaxDevice(CoordinatorEntity[AdaxLocalCoordinator], ClimateEntity): class LocalAdaxDevice(ClimateEntity):
"""Representation of a heater.""" """Representation of a heater."""
_attr_hvac_modes = [HVACMode.HEAT, HVACMode.OFF] _attr_hvac_modes = [HVACMode.HEAT, HVACMode.OFF]
_attr_hvac_mode = HVACMode.OFF _attr_hvac_mode = HVACMode.HEAT
_attr_icon = "mdi:radiator-off"
_attr_max_temp = 35 _attr_max_temp = 35
_attr_min_temp = 5 _attr_min_temp = 5
_attr_supported_features = ( _attr_supported_features = (
@@ -153,10 +146,9 @@ class LocalAdaxDevice(CoordinatorEntity[AdaxLocalCoordinator], ClimateEntity):
_attr_target_temperature_step = PRECISION_WHOLE _attr_target_temperature_step = PRECISION_WHOLE
_attr_temperature_unit = UnitOfTemperature.CELSIUS _attr_temperature_unit = UnitOfTemperature.CELSIUS
def __init__(self, coordinator: AdaxLocalCoordinator, unique_id: str) -> None: def __init__(self, adax_data_handler: AdaxLocal, unique_id: str) -> None:
"""Initialize the heater.""" """Initialize the heater."""
super().__init__(coordinator) self._adax_data_handler = adax_data_handler
self._adax_data_handler: AdaxLocal = coordinator.adax_data_handler
self._attr_unique_id = unique_id self._attr_unique_id = unique_id
self._attr_device_info = DeviceInfo( self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, unique_id)}, identifiers={(DOMAIN, unique_id)},
@@ -177,20 +169,17 @@ class LocalAdaxDevice(CoordinatorEntity[AdaxLocalCoordinator], ClimateEntity):
return return
await self._adax_data_handler.set_target_temperature(temperature) await self._adax_data_handler.set_target_temperature(temperature)
@callback async def async_update(self) -> None:
def _handle_coordinator_update(self) -> None: """Get the latest data."""
"""Handle updated data from the coordinator.""" data = await self._adax_data_handler.get_status()
if data := self.coordinator.data: self._attr_current_temperature = data["current_temperature"]
self._attr_current_temperature = data["current_temperature"] self._attr_available = self._attr_current_temperature is not None
self._attr_available = self._attr_current_temperature is not None if (target_temp := data["target_temperature"]) == 0:
if (target_temp := data["target_temperature"]) == 0: self._attr_hvac_mode = HVACMode.OFF
self._attr_hvac_mode = HVACMode.OFF self._attr_icon = "mdi:radiator-off"
self._attr_icon = "mdi:radiator-off" if target_temp == 0:
if target_temp == 0: self._attr_target_temperature = self._attr_min_temp
self._attr_target_temperature = self._attr_min_temp else:
else: self._attr_hvac_mode = HVACMode.HEAT
self._attr_hvac_mode = HVACMode.HEAT self._attr_icon = "mdi:radiator"
self._attr_icon = "mdi:radiator" self._attr_target_temperature = target_temp
self._attr_target_temperature = target_temp
super()._handle_coordinator_update()
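
In the climate hunk above, the left-hand column turns both device classes into CoordinatorEntity subclasses: state is copied out of coordinator.data in _handle_coordinator_update, and after a write the entity calls coordinator.async_request_refresh() instead of polling in async_update. A condensed sketch of that shape, assuming placeholder names and a dict-of-dicts coordinator payload:

from typing import Any

from homeassistant.components.climate import ClimateEntity, HVACMode
from homeassistant.const import UnitOfTemperature
from homeassistant.core import callback
from homeassistant.helpers.update_coordinator import (
    CoordinatorEntity,
    DataUpdateCoordinator,
)


class ExampleHeater(CoordinatorEntity, ClimateEntity):
    """Heater driven entirely by coordinator data keyed on device id."""

    _attr_hvac_modes = [HVACMode.HEAT, HVACMode.OFF]
    _attr_temperature_unit = UnitOfTemperature.CELSIUS

    def __init__(
        self,
        coordinator: DataUpdateCoordinator[dict[str, dict[str, Any]]],
        device_id: str,
    ) -> None:
        """Initialize from the shared coordinator and apply the first snapshot."""
        super().__init__(coordinator)
        self._device_id = device_id
        self._apply_data(coordinator.data[device_id])

    @callback
    def _handle_coordinator_update(self) -> None:
        """Copy fresh coordinator data into entity attributes."""
        if (room := self.coordinator.data.get(self._device_id)) is not None:
            self._apply_data(room)
        super()._handle_coordinator_update()

    def _apply_data(self, room: dict[str, Any]) -> None:
        """Map one room record onto the climate attributes."""
        self._attr_current_temperature = room.get("temperature")
        self._attr_target_temperature = room.get("targetTemperature")
        self._attr_hvac_mode = (
            HVACMode.HEAT if room.get("heatingEnabled") else HVACMode.OFF
        )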

View File

@@ -1,6 +1,5 @@
"""Constants for the Adax integration.""" """Constants for the Adax integration."""
import datetime
from typing import Final from typing import Final
ACCOUNT_ID: Final = "account_id" ACCOUNT_ID: Final = "account_id"
@@ -10,5 +9,3 @@ DOMAIN: Final = "adax"
LOCAL = "Local" LOCAL = "Local"
WIFI_SSID = "wifi_ssid" WIFI_SSID = "wifi_ssid"
WIFI_PSWD = "wifi_pswd" WIFI_PSWD = "wifi_pswd"
SCAN_INTERVAL = datetime.timedelta(seconds=60)

View File

@@ -1,94 +0,0 @@
"""DataUpdateCoordinator for the Adax component."""
import logging
from typing import Any, cast
from adax import Adax
from adax_local import Adax as AdaxLocal
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_IP_ADDRESS, CONF_PASSWORD, CONF_TOKEN
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import ACCOUNT_ID, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
type AdaxConfigEntry = ConfigEntry[AdaxCloudCoordinator | AdaxLocalCoordinator]
class AdaxCloudCoordinator(DataUpdateCoordinator[dict[str, dict[str, Any]]]):
"""Coordinator for updating data to and from Adax (cloud)."""
def __init__(self, hass: HomeAssistant, entry: AdaxConfigEntry) -> None:
"""Initialize the Adax coordinator used for Cloud mode."""
super().__init__(
hass,
config_entry=entry,
logger=_LOGGER,
name="AdaxCloud",
update_interval=SCAN_INTERVAL,
)
self.adax_data_handler = Adax(
entry.data[ACCOUNT_ID],
entry.data[CONF_PASSWORD],
websession=async_get_clientsession(hass),
)
async def _async_update_data(self) -> dict[str, dict[str, Any]]:
"""Fetch data from the Adax."""
try:
if hasattr(self.adax_data_handler, "fetch_rooms_info"):
rooms = await self.adax_data_handler.fetch_rooms_info() or []
_LOGGER.debug("fetch_rooms_info returned: %s", rooms)
else:
_LOGGER.debug("fetch_rooms_info method not available, using get_rooms")
rooms = []
if not rooms:
_LOGGER.debug(
"No rooms from fetch_rooms_info, trying get_rooms as fallback"
)
rooms = await self.adax_data_handler.get_rooms() or []
_LOGGER.debug("get_rooms fallback returned: %s", rooms)
if not rooms:
raise UpdateFailed("No rooms available from Adax API")
except OSError as e:
raise UpdateFailed(f"Error communicating with API: {e}") from e
for room in rooms:
room["energyWh"] = int(room.get("energyWh", 0))
return {r["id"]: r for r in rooms}
class AdaxLocalCoordinator(DataUpdateCoordinator[dict[str, Any] | None]):
"""Coordinator for updating data to and from Adax (local)."""
def __init__(self, hass: HomeAssistant, entry: AdaxConfigEntry) -> None:
"""Initialize the Adax coordinator used for Local mode."""
super().__init__(
hass,
config_entry=entry,
logger=_LOGGER,
name="AdaxLocal",
update_interval=SCAN_INTERVAL,
)
self.adax_data_handler = AdaxLocal(
entry.data[CONF_IP_ADDRESS],
entry.data[CONF_TOKEN],
websession=async_get_clientsession(hass, verify_ssl=False),
)
async def _async_update_data(self) -> dict[str, Any]:
"""Fetch data from the Adax."""
if result := await self.adax_data_handler.get_status():
return cast(dict[str, Any], result)
raise UpdateFailed("Got invalid status from device")

View File

@@ -1,7 +1,7 @@
{ {
"domain": "adax", "domain": "adax",
"name": "Adax", "name": "Adax",
"codeowners": ["@danielhiversen", "@lazytarget"], "codeowners": ["@danielhiversen"],
"config_flow": true, "config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/adax", "documentation": "https://www.home-assistant.io/integrations/adax",
"iot_class": "local_polling", "iot_class": "local_polling",

View File

@@ -1,77 +0,0 @@
"""Support for Adax energy sensors."""
from __future__ import annotations
from typing import cast
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorStateClass,
)
from homeassistant.const import UnitOfEnergy
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import AdaxConfigEntry
from .const import CONNECTION_TYPE, DOMAIN, LOCAL
from .coordinator import AdaxCloudCoordinator
async def async_setup_entry(
hass: HomeAssistant,
entry: AdaxConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Adax energy sensors with config flow."""
if entry.data.get(CONNECTION_TYPE) != LOCAL:
cloud_coordinator = cast(AdaxCloudCoordinator, entry.runtime_data)
# Create individual energy sensors for each device
async_add_entities(
AdaxEnergySensor(cloud_coordinator, device_id)
for device_id in cloud_coordinator.data
)
class AdaxEnergySensor(CoordinatorEntity[AdaxCloudCoordinator], SensorEntity):
"""Representation of an Adax energy sensor."""
_attr_has_entity_name = True
_attr_translation_key = "energy"
_attr_device_class = SensorDeviceClass.ENERGY
_attr_native_unit_of_measurement = UnitOfEnergy.WATT_HOUR
_attr_suggested_unit_of_measurement = UnitOfEnergy.KILO_WATT_HOUR
_attr_state_class = SensorStateClass.TOTAL_INCREASING
_attr_suggested_display_precision = 3
def __init__(
self,
coordinator: AdaxCloudCoordinator,
device_id: str,
) -> None:
"""Initialize the energy sensor."""
super().__init__(coordinator)
self._device_id = device_id
room = coordinator.data[device_id]
self._attr_unique_id = f"{room['homeId']}_{device_id}_energy"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device_id)},
name=room["name"],
manufacturer="Adax",
)
@property
def available(self) -> bool:
"""Return True if entity is available."""
return (
super().available and "energyWh" in self.coordinator.data[self._device_id]
)
@property
def native_value(self) -> int:
"""Return the native value of the sensor."""
return int(self.coordinator.data[self._device_id]["energyWh"])

View File

@@ -5,14 +5,14 @@
"data": { "data": {
"connection_type": "Select connection type" "connection_type": "Select connection type"
}, },
"description": "Select connection type. Local requires heaters with Bluetooth" "description": "Select connection type. Local requires heaters with bluetooth"
}, },
"local": { "local": {
"data": { "data": {
"wifi_ssid": "Wi-Fi SSID", "wifi_ssid": "Wi-Fi SSID",
"wifi_pswd": "Wi-Fi password" "wifi_pswd": "Wi-Fi Password"
}, },
"description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue LED starts blinking before pressing Submit. Configuring heater might take some minutes." "description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue led starts blinking before pressing Submit. Configuring heater might take some minutes."
}, },
"cloud": { "cloud": {
"data": { "data": {

View File

@@ -7,7 +7,7 @@ from dataclasses import dataclass
from adguardhome import AdGuardHome, AdGuardHomeConnectionError from adguardhome import AdGuardHome, AdGuardHomeConnectionError
import voluptuous as vol import voluptuous as vol
from homeassistant.config_entries import ConfigEntry from homeassistant.config_entries import ConfigEntry, ConfigEntryState
from homeassistant.const import ( from homeassistant.const import (
CONF_HOST, CONF_HOST,
CONF_NAME, CONF_NAME,
@@ -34,12 +34,9 @@ from .const import (
SERVICE_REMOVE_URL, SERVICE_REMOVE_URL,
) )
SERVICE_URL_SCHEMA = vol.Schema({vol.Required(CONF_URL): vol.Any(cv.url, cv.path)}) SERVICE_URL_SCHEMA = vol.Schema({vol.Required(CONF_URL): cv.url})
SERVICE_ADD_URL_SCHEMA = vol.Schema( SERVICE_ADD_URL_SCHEMA = vol.Schema(
{ {vol.Required(CONF_NAME): cv.string, vol.Required(CONF_URL): cv.url}
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_URL): vol.Any(cv.url, cv.path),
}
) )
SERVICE_REFRESH_SCHEMA = vol.Schema( SERVICE_REFRESH_SCHEMA = vol.Schema(
{vol.Optional(CONF_FORCE, default=False): cv.boolean} {vol.Optional(CONF_FORCE, default=False): cv.boolean}
@@ -123,7 +120,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> b
async def async_unload_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> bool: async def async_unload_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> bool:
"""Unload AdGuard Home config entry.""" """Unload AdGuard Home config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS) unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if not hass.config_entries.async_loaded_entries(DOMAIN): loaded_entries = [
entry
for entry in hass.config_entries.async_entries(DOMAIN)
if entry.state == ConfigEntryState.LOADED
]
if len(loaded_entries) == 1:
# This is the last loaded instance of AdGuard, deregister any services # This is the last loaded instance of AdGuard, deregister any services
hass.services.async_remove(DOMAIN, SERVICE_ADD_URL) hass.services.async_remove(DOMAIN, SERVICE_ADD_URL)
hass.services.async_remove(DOMAIN, SERVICE_REMOVE_URL) hass.services.async_remove(DOMAIN, SERVICE_REMOVE_URL)
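
In the unload hunk above, the left-hand column replaces the manual scan over async_entries(DOMAIN) with hass.config_entries.async_loaded_entries(DOMAIN), deregistering the shared services only when no loaded entry of the domain remains. A compact sketch, assuming placeholder domain and service names:

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant

DOMAIN = "example"  # placeholder, not AdGuard Home's real domain constant
PLATFORMS = [Platform.SENSOR, Platform.SWITCH]
SERVICE_ADD_URL = "add_url"


async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Unload a config entry and drop shared services with the last instance."""
    unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
    if not hass.config_entries.async_loaded_entries(DOMAIN):
        # No other loaded entry left for this domain: remove the shared services.
        hass.services.async_remove(DOMAIN, SERVICE_ADD_URL)
    return unload_ok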

View File

@@ -12,7 +12,7 @@ from adguardhome import AdGuardHome
from homeassistant.components.sensor import SensorEntity, SensorEntityDescription from homeassistant.components.sensor import SensorEntity, SensorEntityDescription
from homeassistant.const import PERCENTAGE, UnitOfTime from homeassistant.const import PERCENTAGE, UnitOfTime
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AdGuardConfigEntry, AdGuardData from . import AdGuardConfigEntry, AdGuardData
from .const import DOMAIN from .const import DOMAIN
@@ -85,7 +85,7 @@ SENSORS: tuple[AdGuardHomeEntityDescription, ...] = (
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
entry: AdGuardConfigEntry, entry: AdGuardConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Set up AdGuard Home sensor based on a config entry.""" """Set up AdGuard Home sensor based on a config entry."""
data = entry.runtime_data data = entry.runtime_data

View File

@@ -79,7 +79,7 @@
"services": { "services": {
"add_url": { "add_url": {
"name": "Add URL", "name": "Add URL",
"description": "Adds a new filter subscription to AdGuard Home.", "description": "Add a new filter subscription to AdGuard Home.",
"fields": { "fields": {
"name": { "name": {
"name": "[%key:common::config_flow::data::name%]", "name": "[%key:common::config_flow::data::name%]",
@@ -123,11 +123,11 @@
}, },
"refresh": { "refresh": {
"name": "Refresh", "name": "Refresh",
"description": "Refreshes all filter subscriptions in AdGuard Home.", "description": "Refresh all filter subscriptions in AdGuard Home.",
"fields": { "fields": {
"force": { "force": {
"name": "Force", "name": "Force",
"description": "Force update (bypasses AdGuard Home throttling), omit for a regular refresh." "description": "Force update (bypasses AdGuard Home throttling). \"true\" to force, or \"false\" to omit for a regular refresh."
} }
} }
} }

View File

@@ -11,7 +11,7 @@ from adguardhome import AdGuardHome, AdGuardHomeError
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import AdGuardConfigEntry, AdGuardData from . import AdGuardConfigEntry, AdGuardData
from .const import DOMAIN, LOGGER from .const import DOMAIN, LOGGER
@@ -79,7 +79,7 @@ SWITCHES: tuple[AdGuardHomeSwitchEntityDescription, ...] = (
async def async_setup_entry( async def async_setup_entry(
hass: HomeAssistant, hass: HomeAssistant,
entry: AdGuardConfigEntry, entry: AdGuardConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback, async_add_entities: AddEntitiesCallback,
) -> None: ) -> None:
"""Set up AdGuard Home switch based on a config entry.""" """Set up AdGuard Home switch based on a config entry."""
data = entry.runtime_data data = entry.runtime_data

View File

@@ -12,7 +12,7 @@ from homeassistant.const import (
EVENT_HOMEASSISTANT_STOP, EVENT_HOMEASSISTANT_STOP,
) )
from homeassistant.core import HomeAssistant, ServiceCall from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.typing import ConfigType from homeassistant.helpers.typing import ConfigType
from .const import CONF_ADS_VAR, DATA_ADS, DOMAIN, AdsType from .const import CONF_ADS_VAR, DATA_ADS, DOMAIN, AdsType

View File

@@ -13,7 +13,7 @@ from homeassistant.components.binary_sensor import (
) )
from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType

View File

@@ -17,7 +17,7 @@ from homeassistant.components.cover import (
) )
from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType

View File

@@ -15,7 +15,7 @@ from homeassistant.components.light import (
) )
from homeassistant.const import CONF_NAME from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType

View File

@@ -11,7 +11,7 @@ from homeassistant.components.select import (
) )
from homeassistant.const import CONF_NAME from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType

View File

@@ -15,7 +15,7 @@ from homeassistant.components.sensor import (
) )
from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME, CONF_UNIT_OF_MEASUREMENT from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME, CONF_UNIT_OF_MEASUREMENT
from homeassistant.core import HomeAssistant from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType

Some files were not shown because too many files have changed in this diff