Compare commits


1 commit

Author: Paulus Schoutsen
SHA1: 6c2fc12b6a
Message: Validate Platform constant up to date
Date: 2025-09-12 11:04:25 -04:00
556 changed files with 5888 additions and 30780 deletions

View File

@@ -1,77 +0,0 @@
---
name: quality-scale-rule-verifier
description: |
Use this agent when you need to verify that a Home Assistant integration follows a specific quality scale rule. This includes checking if the integration implements required patterns, configurations, or code structures defined by the quality scale system.
<example>
Context: The user wants to verify if an integration follows a specific quality scale rule.
user: "Check if the peblar integration follows the config-flow rule"
assistant: "I'll use the quality scale rule verifier to check if the peblar integration properly implements the config-flow rule."
<commentary>
Since the user is asking to verify a quality scale rule implementation, use the quality-scale-rule-verifier agent.
</commentary>
</example>
<example>
Context: The user is reviewing if an integration reaches a specific quality scale level.
user: "Verify that this integration reaches the bronze quality scale"
assistant: "Let me use the quality scale rule verifier to check the bronze quality scale implementation."
<commentary>
The user wants to verify the integration has reached a certain quality level, so use multiple quality-scale-rule-verifier agents to verify each bronze rule.
</commentary>
</example>
model: inherit
color: yellow
tools: Read, Bash, Grep, Glob, WebFetch
---
You are an expert Home Assistant integration quality scale auditor specializing in verifying compliance with specific quality scale rules. You have deep knowledge of Home Assistant's architecture, best practices, and the quality scale system that ensures integration consistency and reliability.
You will verify if an integration follows a specific quality scale rule by:
1. **Fetching Rule Documentation**: Retrieve the official rule documentation from:
`https://raw.githubusercontent.com/home-assistant/developers.home-assistant/refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md`
where `{rule_name}` is the rule identifier (e.g., 'config-flow', 'entity-unique-id', 'parallel-updates'); see the fetch sketch after this list
2. **Understanding Rule Requirements**: Parse the rule documentation to identify:
- Core requirements and mandatory implementations
- Specific code patterns or configurations required
- Common violations and anti-patterns
- Exemption criteria (when a rule might not apply)
- The quality tier this rule belongs to (Bronze, Silver, Gold, Platinum)
3. **Analyzing Integration Code**: Examine the integration's codebase at `homeassistant/components/<integration domain>` focusing on:
- `manifest.json` for quality scale declaration and configuration
- `quality_scale.yaml` for rule status (done, todo, exempt)
- Relevant Python modules based on the rule requirements
- Configuration files and service definitions as needed
4. **Verification Process**:
- Check if the rule is marked as 'done', 'todo', or 'exempt' in quality_scale.yaml (see the status-lookup sketch after this list)
- If marked 'exempt', verify the exemption reason is valid
- If marked 'done', verify the actual implementation matches requirements
- Identify specific files and code sections that demonstrate compliance or violations
- Consider the integration's declared quality tier when applying rules
- To fetch the integration docs, use WebFetch with `https://raw.githubusercontent.com/home-assistant/home-assistant.io/refs/heads/current/source/_integrations/<integration domain>.markdown`
- To fetch information about a PyPI package, use the URL `https://pypi.org/pypi/<package>/json`
5. **Reporting Findings**: Provide a comprehensive verification report that includes:
- **Rule Summary**: Brief description of what the rule requires
- **Compliance Status**: Clear pass/fail/exempt determination
- **Evidence**: Specific code examples showing compliance or violations
- **Issues Found**: Detailed list of any non-compliance issues with file locations
- **Recommendations**: Actionable steps to achieve compliance if needed
- **Exemption Analysis**: If applicable, whether the exemption is justified
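As a minimal sketch of step 1 (illustrative only, not part of the agent definition; `fetch_rule_doc` is a hypothetical helper):

```python
# Minimal sketch of step 1: download one rule's markdown documentation.
from urllib.request import urlopen

RULE_DOC_URL = (
    "https://raw.githubusercontent.com/home-assistant/developers.home-assistant/"
    "refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md"
)

def fetch_rule_doc(rule_name: str) -> str:
    """Return the raw markdown for a single quality scale rule."""
    with urlopen(RULE_DOC_URL.format(rule_name=rule_name)) as response:
        return response.read().decode("utf-8")

# Example: fetch_rule_doc("config-flow")
```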
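And a sketch of the status lookup from step 4, assuming the usual quality_scale.yaml layout in which each rule maps either to a status string or to a mapping with `status` and `comment` keys (`rule_status` is a hypothetical helper):

```python
# Sketch of the step 4 status check against quality_scale.yaml.
from pathlib import Path

import yaml  # PyPI: PyYAML

def rule_status(domain: str, rule_name: str) -> tuple[str, str | None]:
    """Return (status, exemption_comment) for one rule of an integration."""
    path = Path("homeassistant/components") / domain / "quality_scale.yaml"
    rules = yaml.safe_load(path.read_text())["rules"]
    entry = rules.get(rule_name, "todo")
    if isinstance(entry, dict):  # exempt entries usually carry a comment
        return entry.get("status", "todo"), entry.get("comment")
    return entry, None

# Example: rule_status("peblar", "config-flow") -> ("done", None)
```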
When examining code, you will:
- Look for exact implementation patterns specified in the rule
- Verify all required components are present and properly configured
- Check for common mistakes and anti-patterns
- Consider edge cases and error handling requirements
- Validate that implementations follow Home Assistant conventions
You will be thorough but focused, examining only the aspects relevant to the specific rule being verified. You will provide clear, actionable feedback that helps developers understand both what needs to be fixed and why it matters for integration quality.
If you cannot access the rule documentation or find the integration code, clearly state what information is missing and what you would need to complete the verification.
Remember that quality scale rules are cumulative - Bronze rules apply to all integrations with a quality scale, Silver rules apply to Silver+ integrations, and so on. Always consider the integration's target quality level when determining which rules should be enforced.
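That cumulative logic can be expressed in a few lines (hypothetical helper, tiers as described above):

```python
# Cumulative tiers: every tier up to and including the target applies.
TIERS = ["bronze", "silver", "gold", "platinum"]

def applicable_tiers(target_tier: str) -> list[str]:
    """Return the tiers whose rules must be verified for a target level."""
    return TIERS[: TIERS.index(target_tier.lower()) + 1]

assert applicable_tiers("gold") == ["bronze", "silver", "gold"]
```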

View File

@@ -55,12 +55,8 @@
creating the PR. If you're unsure about any of them, don't hesitate to ask.
We're here to help! This is simply a reminder of what we are going to look
for before merging your code.
AI tools are welcome, but contributors are responsible for *fully*
understanding the code before submitting a PR.
-->
- [ ] I understand the code I am submitting and can explain how it works.
- [ ] The code change is tested and works locally.
- [ ] Local tests pass. **Your PR cannot be merged unless tests pass**
- [ ] There is no commented out code in this PR.
@@ -68,7 +64,6 @@
- [ ] I have followed the [perfect PR recommendations][perfect-pr]
- [ ] The code has been formatted using Ruff (`ruff format homeassistant tests`)
- [ ] Tests have been added to verify that the new code works.
- [ ] Any generated code has been carefully reviewed for correctness and compliance with project standards.
If user exposed functionality or configuration variables are added/changed:

View File

@@ -27,12 +27,12 @@ jobs:
publish: ${{ steps.version.outputs.publish }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
with:
fetch-depth: 0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -69,7 +69,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: translations
path: translations.tar.gz
@@ -90,11 +90,11 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -105,7 +105,7 @@ jobs:
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -116,7 +116,7 @@ jobs:
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -175,7 +175,7 @@ jobs:
sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt
- name: Download translations
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: translations
@@ -190,15 +190,14 @@ jobs:
echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# home-assistant/builder doesn't support sha pinning
- name: Build base image
uses: home-assistant/builder@2025.09.0
uses: home-assistant/builder@2025.03.0
with:
args: |
$BUILD_ARGS \
@@ -243,7 +242,7 @@ jobs:
- green
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set build additional args
run: |
@@ -257,15 +256,14 @@ jobs:
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# home-assistant/builder doesn't support sha pinning
- name: Build base image
uses: home-assistant/builder@2025.09.0
uses: home-assistant/builder@2025.03.0
with:
args: |
$BUILD_ARGS \
@@ -281,7 +279,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master
@@ -323,23 +321,23 @@ jobs:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Install Cosign
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
uses: sigstore/cosign-installer@v3.9.2
with:
cosign-release: "v2.2.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@v3.5.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: matrix.registry == 'ghcr.io/home-assistant'
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -456,15 +454,15 @@ jobs:
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: translations
@@ -482,7 +480,7 @@ jobs:
python -m build
- name: Upload package to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
uses: pypa/gh-action-pypi-publish@v1.13.0
with:
skip-existing: true

View File

@@ -98,7 +98,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Generate partial Python venv restore key
id: generate_python_cache_key
run: |
@@ -120,7 +120,7 @@ jobs:
run: |
echo "key=$(lsb_release -rs)-apt-${{ env.CACHE_VERSION }}-${{ env.HA_SHORT_VERSION }}" >> $GITHUB_OUTPUT
- name: Filter for core changes
uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
uses: dorny/paths-filter@v3.0.2
id: core
with:
filters: .core_files.yaml
@@ -135,7 +135,7 @@ jobs:
echo "Result:"
cat .integration_paths.yaml
- name: Filter for integration changes
uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
uses: dorny/paths-filter@v3.0.2
id: integrations
with:
filters: .integration_paths.yaml
@@ -254,16 +254,16 @@ jobs:
- info
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache@v4.2.4
with:
path: venv
key: >-
@@ -279,7 +279,7 @@ jobs:
uv pip install "$(cat requirements_test.txt | grep pre-commit)"
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache@v4.2.4
with:
path: ${{ env.PRE_COMMIT_CACHE }}
lookup-only: true
@@ -300,16 +300,16 @@ jobs:
- pre-commit
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
id: python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -318,7 +318,7 @@ jobs:
needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true
@@ -340,16 +340,16 @@ jobs:
- pre-commit
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
id: python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -358,7 +358,7 @@ jobs:
needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true
@@ -380,16 +380,16 @@ jobs:
- pre-commit
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
id: python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -398,7 +398,7 @@ jobs:
needs.info.outputs.pre-commit_cache_key }}
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true
@@ -470,7 +470,7 @@ jobs:
- script/hassfest/docker/Dockerfile
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Register hadolint problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/hadolint.json"
@@ -489,10 +489,10 @@ jobs:
python-version: ${{ fromJSON(needs.info.outputs.python_versions) }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
@@ -505,7 +505,7 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache@v4.2.4
with:
path: venv
key: >-
@@ -513,7 +513,7 @@ jobs:
needs.info.outputs.python_cache_key }}
- name: Restore uv wheel cache
if: steps.cache-venv.outputs.cache-hit != 'true'
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache@v4.2.4
with:
path: ${{ env.UV_CACHE_DIR }}
key: >-
@@ -523,24 +523,22 @@ jobs:
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-uv-${{
env.UV_CACHE_VERSION }}-${{ steps.generate-uv-key.outputs.version }}-${{
env.HA_SHORT_VERSION }}-
- name: Check if apt cache exists
id: cache-apt-check
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
- name: Restore apt cache
if: steps.cache-venv.outputs.cache-hit != 'true'
id: cache-apt
uses: actions/cache@v4.2.4
with:
lookup-only: ${{ steps.cache-venv.outputs.cache-hit == 'true' }}
path: |
${{ env.APT_CACHE_DIR }}
${{ env.APT_LIST_CACHE_DIR }}
key: >-
${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
- name: Install additional OS dependencies
if: |
steps.cache-venv.outputs.cache-hit != 'true'
|| steps.cache-apt-check.outputs.cache-hit != 'true'
if: steps.cache-venv.outputs.cache-hit != 'true'
timeout-minutes: 10
run: |
sudo rm /etc/apt/sources.list.d/microsoft-prod.list
if [[ "${{ steps.cache-apt-check.outputs.cache-hit }}" != 'true' ]]; then
if [[ "${{ steps.cache-apt.outputs.cache-hit }}" != 'true' ]]; then
mkdir -p ${{ env.APT_CACHE_DIR }}
mkdir -p ${{ env.APT_LIST_CACHE_DIR }}
fi
@@ -565,18 +563,9 @@ jobs:
libswscale-dev \
libudev-dev
if [[ "${{ steps.cache-apt-check.outputs.cache-hit }}" != 'true' ]]; then
if [[ "${{ steps.cache-apt.outputs.cache-hit }}" != 'true' ]]; then
sudo chmod -R 755 ${{ env.APT_CACHE_BASE }}
fi
- name: Save apt cache
if: steps.cache-apt-check.outputs.cache-hit != 'true'
uses: actions/cache/save@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
with:
path: |
${{ env.APT_CACHE_DIR }}
${{ env.APT_LIST_CACHE_DIR }}
key: >-
${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
- name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
@@ -596,7 +585,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -642,16 +631,16 @@ jobs:
-o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
libturbojpeg
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -675,16 +664,16 @@ jobs:
- base
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -709,9 +698,9 @@ jobs:
&& github.event_name == 'pull_request'
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Dependency review
uses: actions/dependency-review-action@595b5aeba73380359d98a5e087f648dbb0edce1b # v4.7.3
uses: actions/dependency-review-action@v4.7.3
with:
license-check: false # We use our own license audit checks
@@ -732,16 +721,16 @@ jobs:
python-version: ${{ fromJson(needs.info.outputs.python_versions) }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -753,7 +742,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${{ matrix.python-version }}.json
- name: Upload licenses
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -775,16 +764,16 @@ jobs:
- base
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -822,16 +811,16 @@ jobs:
- base
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -867,10 +856,10 @@ jobs:
- base
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
@@ -883,7 +872,7 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -891,7 +880,7 @@ jobs:
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }}
- name: Restore mypy cache
uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache@v4.2.4
with:
path: .mypy_cache
key: >-
@@ -958,16 +947,16 @@ jobs:
libturbojpeg \
libgammu-dev
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -979,7 +968,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -1033,16 +1022,16 @@ jobs:
libgammu-dev \
libxml2-utils
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -1056,7 +1045,7 @@ jobs:
run: |
echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
- name: Download pytest_buckets
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: pytest_buckets
- name: Compile English translations
@@ -1095,14 +1084,14 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1115,7 +1104,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1180,16 +1169,16 @@ jobs:
libmariadb-dev-compat \
libxml2-utils
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -1248,7 +1237,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1256,7 +1245,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1270,7 +1259,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1336,16 +1325,16 @@ jobs:
sudo apt-get -y install \
postgresql-server-dev-14
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -1405,7 +1394,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1413,7 +1402,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1427,7 +1416,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1448,14 +1437,14 @@ jobs:
timeout-minutes: 10
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download all coverage artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@v5.5.1
with:
fail_ci_if_error: true
flags: full-suite
@@ -1509,16 +1498,16 @@ jobs:
libgammu-dev \
libxml2-utils
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ matrix.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ matrix.python-version }}
check-latest: true
- name: Restore full Python ${{ matrix.python-version }} virtual environment
id: cache-venv
uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
uses: actions/cache/restore@v4.2.4
with:
path: venv
fail-on-cache-miss: true
@@ -1574,14 +1563,14 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1594,7 +1583,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1612,14 +1601,14 @@ jobs:
timeout-minutes: 10
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download all coverage artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'false'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@v5.5.1
with:
fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }}
@@ -1639,11 +1628,11 @@ jobs:
timeout-minutes: 10
steps:
- name: Download all coverage artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
pattern: test-results-*
- name: Upload test results to Codecov
uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
uses: codecov/test-results-action@v1
with:
fail_ci_if_error: true
verbose: true

View File

@@ -21,14 +21,14 @@ jobs:
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
uses: github/codeql-action/init@v3.30.3
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
uses: github/codeql-action/analyze@v3.30.3
with:
category: "/language:python"

View File

@@ -16,7 +16,7 @@ jobs:
steps:
- name: Check if integration label was added and extract details
id: extract
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
with:
script: |
// Debug: Log the event payload
@@ -113,7 +113,7 @@ jobs:
- name: Fetch similar issues
id: fetch_similar
if: steps.extract.outputs.should_continue == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
INTEGRATION_LABELS: ${{ steps.extract.outputs.integration_labels }}
CURRENT_NUMBER: ${{ steps.extract.outputs.current_number }}
@@ -231,7 +231,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
uses: actions/ai-inference@v2.0.1
with:
model: openai/gpt-4o
system-prompt: |
@@ -280,7 +280,7 @@ jobs:
- name: Post duplicate detection results
id: post_results
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
AI_RESPONSE: ${{ steps.ai_detection.outputs.response }}
SIMILAR_ISSUES: ${{ steps.fetch_similar.outputs.similar_issues }}

View File

@@ -16,7 +16,7 @@ jobs:
steps:
- name: Check issue language
id: detect_language
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
ISSUE_NUMBER: ${{ github.event.issue.number }}
ISSUE_TITLE: ${{ github.event.issue.title }}
@@ -57,7 +57,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
uses: actions/ai-inference@v2.0.1
with:
model: openai/gpt-4o-mini
system-prompt: |
@@ -90,7 +90,7 @@ jobs:
- name: Process non-English issues
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
AI_RESPONSE: ${{ steps.ai_language_detection.outputs.response }}
ISSUE_NUMBER: ${{ steps.detect_language.outputs.issue_number }}

View File

@@ -10,7 +10,7 @@ jobs:
if: github.repository_owner == 'home-assistant'
runs-on: ubuntu-latest
steps:
- uses: dessant/lock-threads@1bf7ec25051fe7c00bdd17e6a7cf3d7bfb7dc771 # v5.0.1
- uses: dessant/lock-threads@v5.0.1
with:
github-token: ${{ github.token }}
issue-inactive-days: "30"

View File

@@ -12,7 +12,7 @@ jobs:
if: github.event.issue.type.name == 'Task'
steps:
- name: Check if user is authorized
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
with:
script: |
const issueAuthor = context.payload.issue.user.login;

View File

@@ -17,7 +17,7 @@ jobs:
# - No PRs marked as no-stale
# - No issues (-1)
- name: 60 days stale PRs policy
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
# - No issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: 90 days stale issues
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
# - No Issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: Needs more information stale issues policy
uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
only-labels: "needs-more-information"

View File

@@ -19,10 +19,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}

View File

@@ -32,11 +32,11 @@ jobs:
architectures: ${{ steps.info.outputs.architectures }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
@@ -91,7 +91,7 @@ jobs:
) > build_constraints.txt
- name: Upload env_file
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: env_file
path: ./.env_file
@@ -99,14 +99,14 @@ jobs:
overwrite: true
- name: Upload build_constraints
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: build_constraints
path: ./build_constraints.txt
overwrite: true
- name: Upload requirements_diff
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -118,7 +118,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -135,20 +135,20 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download env_file
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: env_file
- name: Download build_constraints
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: build_constraints
- name: Download requirements_diff
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: requirements_diff
@@ -158,7 +158,6 @@ jobs:
sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt
# home-assistant/wheels doesn't support sha pinning
- name: Build wheels
uses: home-assistant/wheels@2025.07.0
with:
@@ -185,25 +184,25 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download env_file
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: env_file
- name: Download build_constraints
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: build_constraints
- name: Download requirements_diff
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: requirements_diff
- name: Download requirements_all_wheels
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v5.0.0
with:
name: requirements_all_wheels
@@ -219,7 +218,6 @@ jobs:
sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt
# home-assistant/wheels doesn't support sha pinning
- name: Build wheels
uses: home-assistant/wheels@2025.07.0
with:

.gitignore (vendored): 2 changed lines
View File

@@ -140,5 +140,5 @@ tmp_cache
pytest_buckets.txt
# AI tooling
.claude/settings.local.json
.claude

CODEOWNERS (generated): 21 changed lines
View File

@@ -107,8 +107,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/ambient_station/ @bachya
/tests/components/ambient_station/ @bachya
/homeassistant/components/amcrest/ @flacjacket
/homeassistant/components/analytics/ @home-assistant/core
/tests/components/analytics/ @home-assistant/core
/homeassistant/components/analytics/ @home-assistant/core @ludeeus
/tests/components/analytics/ @home-assistant/core @ludeeus
/homeassistant/components/analytics_insights/ @joostlek
/tests/components/analytics_insights/ @joostlek
/homeassistant/components/android_ip_webcam/ @engrbm87
@@ -442,6 +442,8 @@ build.json @home-assistant/supervisor
/tests/components/energyzero/ @klaasnicolaas
/homeassistant/components/enigma2/ @autinerd
/tests/components/enigma2/ @autinerd
/homeassistant/components/enocean/ @bdurrer
/tests/components/enocean/ @bdurrer
/homeassistant/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
/tests/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
/homeassistant/components/entur_public_transport/ @hfurubotten
@@ -968,8 +970,6 @@ build.json @home-assistant/supervisor
/tests/components/moat/ @bdraco
/homeassistant/components/mobile_app/ @home-assistant/core
/tests/components/mobile_app/ @home-assistant/core
/homeassistant/components/modbus/ @janiversen
/tests/components/modbus/ @janiversen
/homeassistant/components/modem_callerid/ @tkdrob
/tests/components/modem_callerid/ @tkdrob
/homeassistant/components/modern_forms/ @wonderslug
@@ -1017,8 +1017,7 @@ build.json @home-assistant/supervisor
/tests/components/nanoleaf/ @milanmeu @joostlek
/homeassistant/components/nasweb/ @nasWebio
/tests/components/nasweb/ @nasWebio
/homeassistant/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
/tests/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
/homeassistant/components/nederlandse_spoorwegen/ @YarmoM
/homeassistant/components/ness_alarm/ @nickw444
/tests/components/ness_alarm/ @nickw444
/homeassistant/components/nest/ @allenporter
@@ -1350,8 +1349,6 @@ build.json @home-assistant/supervisor
/tests/components/samsungtv/ @chemelli74 @epenet
/homeassistant/components/sanix/ @tomaszsluszniak
/tests/components/sanix/ @tomaszsluszniak
/homeassistant/components/satel_integra/ @Tommatheussen
/tests/components/satel_integra/ @Tommatheussen
/homeassistant/components/scene/ @home-assistant/core
/tests/components/scene/ @home-assistant/core
/homeassistant/components/schedule/ @home-assistant/core
@@ -1533,8 +1530,8 @@ build.json @home-assistant/supervisor
/tests/components/switchbee/ @jafar-atili
/homeassistant/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
/tests/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
/homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git
/tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git
/homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/homeassistant/components/switcher_kis/ @thecode @YogevBokobza
/tests/components/switcher_kis/ @thecode @YogevBokobza
/homeassistant/components/switchmate/ @danielhiversen @qiz-li
@@ -1679,8 +1676,6 @@ build.json @home-assistant/supervisor
/tests/components/uptime_kuma/ @tr4nt0r
/homeassistant/components/uptimerobot/ @ludeeus @chemelli74
/tests/components/uptimerobot/ @ludeeus @chemelli74
/homeassistant/components/usage_prediction/ @home-assistant/core
/tests/components/usage_prediction/ @home-assistant/core
/homeassistant/components/usb/ @bdraco
/tests/components/usb/ @bdraco
/homeassistant/components/usgs_earthquakes_feed/ @exxamalte
@@ -1710,8 +1705,6 @@ build.json @home-assistant/supervisor
/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven
/homeassistant/components/vicare/ @CFenner
/tests/components/vicare/ @CFenner
/homeassistant/components/victron_remote_monitoring/ @AndyTempel
/tests/components/victron_remote_monitoring/ @AndyTempel
/homeassistant/components/vilfo/ @ManneW
/tests/components/vilfo/ @ManneW
/homeassistant/components/vivotek/ @HarlemSquirrel

View File

@@ -2,23 +2,21 @@
from __future__ import annotations
import asyncio
import logging
from accuweather import AccuWeather
from homeassistant.components.sensor import DOMAIN as SENSOR_PLATFORM
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.const import CONF_API_KEY, CONF_NAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN
from .const import DOMAIN, UPDATE_INTERVAL_DAILY_FORECAST, UPDATE_INTERVAL_OBSERVATION
from .coordinator import (
AccuWeatherConfigEntry,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherData,
AccuWeatherHourlyForecastDataUpdateCoordinator,
AccuWeatherObservationDataUpdateCoordinator,
)
@@ -30,6 +28,7 @@ PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry) -> bool:
"""Set up AccuWeather as config entry."""
api_key: str = entry.data[CONF_API_KEY]
name: str = entry.data[CONF_NAME]
location_key = entry.unique_id
@@ -42,28 +41,26 @@ async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry)
hass,
entry,
accuweather,
name,
"observation",
UPDATE_INTERVAL_OBSERVATION,
)
coordinator_daily_forecast = AccuWeatherDailyForecastDataUpdateCoordinator(
hass,
entry,
accuweather,
)
coordinator_hourly_forecast = AccuWeatherHourlyForecastDataUpdateCoordinator(
hass,
entry,
accuweather,
name,
"daily forecast",
UPDATE_INTERVAL_DAILY_FORECAST,
)
await asyncio.gather(
coordinator_observation.async_config_entry_first_refresh(),
coordinator_daily_forecast.async_config_entry_first_refresh(),
coordinator_hourly_forecast.async_config_entry_first_refresh(),
)
await coordinator_observation.async_config_entry_first_refresh()
await coordinator_daily_forecast.async_config_entry_first_refresh()
entry.runtime_data = AccuWeatherData(
coordinator_observation=coordinator_observation,
coordinator_daily_forecast=coordinator_daily_forecast,
coordinator_hourly_forecast=coordinator_hourly_forecast,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -71,4 +71,3 @@ POLLEN_CATEGORY_MAP = {
}
UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=10)
UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6)
UPDATE_INTERVAL_HOURLY_FORECAST = timedelta(minutes=30)

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
from asyncio import timeout
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from datetime import timedelta
import logging
@@ -13,7 +12,6 @@ from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExcee
from aiohttp.client_exceptions import ClientConnectorError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import (
@@ -22,13 +20,7 @@ from homeassistant.helpers.update_coordinator import (
UpdateFailed,
)
from .const import (
DOMAIN,
MANUFACTURER,
UPDATE_INTERVAL_DAILY_FORECAST,
UPDATE_INTERVAL_HOURLY_FORECAST,
UPDATE_INTERVAL_OBSERVATION,
)
from .const import DOMAIN, MANUFACTURER
EXCEPTIONS = (ApiError, ClientConnectorError, InvalidApiKeyError, RequestsExceededError)
@@ -41,7 +33,6 @@ class AccuWeatherData:
coordinator_observation: AccuWeatherObservationDataUpdateCoordinator
coordinator_daily_forecast: AccuWeatherDailyForecastDataUpdateCoordinator
coordinator_hourly_forecast: AccuWeatherHourlyForecastDataUpdateCoordinator
type AccuWeatherConfigEntry = ConfigEntry[AccuWeatherData]
@@ -57,11 +48,13 @@ class AccuWeatherObservationDataUpdateCoordinator(
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
name: str,
coordinator_type: str,
update_interval: timedelta,
) -> None:
"""Initialize."""
self.accuweather = accuweather
self.location_key = accuweather.location_key
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -72,8 +65,8 @@ class AccuWeatherObservationDataUpdateCoordinator(
hass,
_LOGGER,
config_entry=config_entry,
name=f"{name} (observation)",
update_interval=UPDATE_INTERVAL_OBSERVATION,
name=f"{name} ({coordinator_type})",
update_interval=update_interval,
)
async def _async_update_data(self) -> dict[str, Any]:
@@ -93,25 +86,23 @@ class AccuWeatherObservationDataUpdateCoordinator(
return result
class AccuWeatherForecastDataUpdateCoordinator(
class AccuWeatherDailyForecastDataUpdateCoordinator(
TimestampDataUpdateCoordinator[list[dict[str, Any]]]
):
"""Base class for AccuWeather forecast."""
"""Class to manage fetching AccuWeather data API."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
name: str,
coordinator_type: str,
update_interval: timedelta,
fetch_method: Callable[..., Awaitable[list[dict[str, Any]]]],
) -> None:
"""Initialize."""
self.accuweather = accuweather
self.location_key = accuweather.location_key
self._fetch_method = fetch_method
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -127,10 +118,12 @@ class AccuWeatherForecastDataUpdateCoordinator(
)
async def _async_update_data(self) -> list[dict[str, Any]]:
"""Update forecast data via library."""
"""Update data via library."""
try:
async with timeout(10):
result = await self._fetch_method(language=self.hass.config.language)
result = await self.accuweather.async_get_daily_forecast(
language=self.hass.config.language
)
except EXCEPTIONS as error:
raise UpdateFailed(
translation_domain=DOMAIN,
@@ -139,53 +132,10 @@ class AccuWeatherForecastDataUpdateCoordinator(
) from error
_LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)
return result
class AccuWeatherDailyForecastDataUpdateCoordinator(
AccuWeatherForecastDataUpdateCoordinator
):
"""Coordinator for daily forecast."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
) -> None:
"""Initialize."""
super().__init__(
hass,
config_entry,
accuweather,
"daily forecast",
UPDATE_INTERVAL_DAILY_FORECAST,
fetch_method=accuweather.async_get_daily_forecast,
)
class AccuWeatherHourlyForecastDataUpdateCoordinator(
AccuWeatherForecastDataUpdateCoordinator
):
"""Coordinator for hourly forecast."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
) -> None:
"""Initialize."""
super().__init__(
hass,
config_entry,
accuweather,
"hourly forecast",
UPDATE_INTERVAL_HOURLY_FORECAST,
fetch_method=accuweather.async_get_hourly_forecast,
)
def _get_device_info(location_key: str, name: str) -> DeviceInfo:
"""Get device info."""
return DeviceInfo(

View File

@@ -45,7 +45,6 @@ from .coordinator import (
AccuWeatherConfigEntry,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherData,
AccuWeatherHourlyForecastDataUpdateCoordinator,
AccuWeatherObservationDataUpdateCoordinator,
)
@@ -65,7 +64,6 @@ class AccuWeatherEntity(
CoordinatorWeatherEntity[
AccuWeatherObservationDataUpdateCoordinator,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherHourlyForecastDataUpdateCoordinator,
]
):
"""Define an AccuWeather entity."""
@@ -78,7 +76,6 @@ class AccuWeatherEntity(
super().__init__(
observation_coordinator=accuweather_data.coordinator_observation,
daily_coordinator=accuweather_data.coordinator_daily_forecast,
hourly_coordinator=accuweather_data.coordinator_hourly_forecast,
)
self._attr_native_precipitation_unit = UnitOfPrecipitationDepth.MILLIMETERS
@@ -89,13 +86,10 @@ class AccuWeatherEntity(
self._attr_unique_id = accuweather_data.coordinator_observation.location_key
self._attr_attribution = ATTRIBUTION
self._attr_device_info = accuweather_data.coordinator_observation.device_info
self._attr_supported_features = (
WeatherEntityFeature.FORECAST_DAILY | WeatherEntityFeature.FORECAST_HOURLY
)
self._attr_supported_features = WeatherEntityFeature.FORECAST_DAILY
self.observation_coordinator = accuweather_data.coordinator_observation
self.daily_coordinator = accuweather_data.coordinator_daily_forecast
self.hourly_coordinator = accuweather_data.coordinator_hourly_forecast
@property
def condition(self) -> str | None:
@@ -213,32 +207,3 @@ class AccuWeatherEntity(
}
for item in self.daily_coordinator.data
]
@callback
def _async_forecast_hourly(self) -> list[Forecast] | None:
"""Return the hourly forecast in native units."""
return [
{
ATTR_FORECAST_TIME: utc_from_timestamp(
item["EpochDateTime"]
).isoformat(),
ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCover"],
ATTR_FORECAST_HUMIDITY: item["RelativeHumidity"],
ATTR_FORECAST_NATIVE_TEMP: item["Temperature"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperature"][
ATTR_VALUE
],
ATTR_FORECAST_NATIVE_PRECIPITATION: item["TotalLiquid"][ATTR_VALUE],
ATTR_FORECAST_PRECIPITATION_PROBABILITY: item[
"PrecipitationProbability"
],
ATTR_FORECAST_NATIVE_WIND_SPEED: item["Wind"][ATTR_SPEED][ATTR_VALUE],
ATTR_FORECAST_NATIVE_WIND_GUST_SPEED: item["WindGust"][ATTR_SPEED][
ATTR_VALUE
],
ATTR_FORECAST_UV_INDEX: item["UVIndex"],
ATTR_FORECAST_WIND_BEARING: item["Wind"][ATTR_DIRECTION]["Degrees"],
ATTR_FORECAST_CONDITION: CONDITION_MAP.get(item["WeatherIcon"]),
}
for item in self.hourly_coordinator.data
]

@@ -3,8 +3,10 @@
import logging
from typing import Any
from aiohttp import web
import voluptuous as vol
from homeassistant.components.http import KEY_HASS, HomeAssistantView
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_ENTITY_ID, CONF_DESCRIPTION, CONF_SELECTOR
from homeassistant.core import (
@@ -26,6 +28,7 @@ from .const import (
ATTR_STRUCTURE,
ATTR_TASK_NAME,
DATA_COMPONENT,
DATA_IMAGES,
DATA_PREFERENCES,
DOMAIN,
SERVICE_GENERATE_DATA,
@@ -39,6 +42,7 @@ from .task import (
GenDataTaskResult,
GenImageTask,
GenImageTaskResult,
ImageData,
async_generate_data,
async_generate_image,
)
@@ -51,6 +55,7 @@ __all__ = [
"GenDataTaskResult",
"GenImageTask",
"GenImageTaskResult",
"ImageData",
"async_generate_data",
"async_generate_image",
"async_setup",
@@ -89,8 +94,10 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
entity_component = EntityComponent[AITaskEntity](_LOGGER, DOMAIN, hass)
hass.data[DATA_COMPONENT] = entity_component
hass.data[DATA_PREFERENCES] = AITaskPreferences(hass)
hass.data[DATA_IMAGES] = {}
await hass.data[DATA_PREFERENCES].async_load()
async_setup_http(hass)
hass.http.register_view(ImageView)
hass.services.async_register(
DOMAIN,
SERVICE_GENERATE_DATA,
@@ -202,3 +209,28 @@ class AITaskPreferences:
def as_dict(self) -> dict[str, str | None]:
"""Get the current preferences."""
return {key: getattr(self, key) for key in self.KEYS}
class ImageView(HomeAssistantView):
"""View to generated images."""
url = f"/api/{DOMAIN}/images/{{filename}}"
name = f"api:{DOMAIN}/images"
async def get(
self,
request: web.Request,
filename: str,
) -> web.Response:
"""Serve image."""
hass = request.app[KEY_HASS]
image_storage = hass.data[DATA_IMAGES]
image_data = image_storage.get(filename)
if image_data is None:
raise web.HTTPNotFound
return web.Response(
body=image_data.data,
content_type=image_data.mime_type,
)
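A client-side sketch of consuming this view; the URL shape follows `ImageView.url` above, and `signed_path` is assumed to be a path signed elsewhere (see the `async_sign_path` calls in the media source changes below):

import aiohttp


async def fetch_generated_image(base_url: str, signed_path: str) -> bytes:
    """Fetch one generated image, e.g. signed_path='/api/ai_task/images/x.png?authSig=...'."""
    async with aiohttp.ClientSession() as session:
        async with session.get(f"{base_url}{signed_path}") as resp:
            resp.raise_for_status()  # 404 once the image expires or is evicted
            return await resp.read()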

@@ -8,19 +8,19 @@ from typing import TYPE_CHECKING, Final
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from homeassistant.components.media_source import local_source
from homeassistant.helpers.entity_component import EntityComponent
from . import AITaskPreferences
from .entity import AITaskEntity
from .task import ImageData
DOMAIN = "ai_task"
DATA_COMPONENT: HassKey[EntityComponent[AITaskEntity]] = HassKey(DOMAIN)
DATA_PREFERENCES: HassKey[AITaskPreferences] = HassKey(f"{DOMAIN}_preferences")
DATA_MEDIA_SOURCE: HassKey[local_source.LocalSource] = HassKey(f"{DOMAIN}_media_source")
DATA_IMAGES: HassKey[dict[str, ImageData]] = HassKey(f"{DOMAIN}_images")
IMAGE_DIR: Final = "image"
IMAGE_EXPIRY_TIME = 60 * 60 # 1 hour
MAX_IMAGES = 20
SERVICE_GENERATE_DATA = "generate_data"
SERVICE_GENERATE_IMAGE = "generate_image"
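`DATA_IMAGES` above uses the `HassKey` pattern: a string key that carries a type parameter, so type checkers know what `hass.data` holds under it. A hedged sketch of the idea (the key name and stored type below are illustrative, not part of this integration):

from homeassistant.core import HomeAssistant
from homeassistant.util.hass_dict import HassKey

DATA_EXAMPLE: HassKey[dict[str, bytes]] = HassKey("example_images")


def store_image(hass: HomeAssistant, filename: str, data: bytes) -> None:
    # hass.data[DATA_EXAMPLE] type-checks as dict[str, bytes] rather than Any.
    hass.data.setdefault(DATA_EXAMPLE, {})[filename] = data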

@@ -1,7 +1,7 @@
{
"domain": "ai_task",
"name": "AI Task",
"after_dependencies": ["camera"],
"after_dependencies": ["camera", "http"],
"codeowners": ["@home-assistant/core"],
"dependencies": ["conversation", "media_source"],
"documentation": "https://www.home-assistant.io/integrations/ai_task",

@@ -2,31 +2,89 @@
from __future__ import annotations
from pathlib import Path
from datetime import timedelta
import logging
from homeassistant.components.media_source import MediaSource, local_source
from homeassistant.components.http.auth import async_sign_path
from homeassistant.components.media_player import BrowseError, MediaClass
from homeassistant.components.media_source import (
BrowseMediaSource,
MediaSource,
MediaSourceItem,
PlayMedia,
Unresolvable,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from .const import DATA_MEDIA_SOURCE, DOMAIN, IMAGE_DIR
from .const import DATA_IMAGES, DOMAIN, IMAGE_EXPIRY_TIME
_LOGGER = logging.getLogger(__name__)
async def async_get_media_source(hass: HomeAssistant) -> MediaSource:
"""Set up local media source."""
media_dirs = list(hass.config.media_dirs.values())
async def async_get_media_source(hass: HomeAssistant) -> ImageMediaSource:
"""Set up image media source."""
_LOGGER.debug("Setting up image media source")
return ImageMediaSource(hass)
if not media_dirs:
raise HomeAssistantError(
"AI Task media source requires at least one media directory configured"
class ImageMediaSource(MediaSource):
"""Provide images as media sources."""
name: str = "AI Generated Images"
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize ImageMediaSource."""
super().__init__(DOMAIN)
self.hass = hass
async def async_resolve_media(self, item: MediaSourceItem) -> PlayMedia:
"""Resolve media to a url."""
image_storage = self.hass.data[DATA_IMAGES]
image = image_storage.get(item.identifier)
if image is None:
raise Unresolvable(f"Could not resolve media item: {item.identifier}")
return PlayMedia(
async_sign_path(
self.hass,
f"/api/{DOMAIN}/images/{item.identifier}",
timedelta(seconds=IMAGE_EXPIRY_TIME or 1800),
),
image.mime_type,
)
media_dir = Path(media_dirs[0]) / DOMAIN / IMAGE_DIR
async def async_browse_media(
self,
item: MediaSourceItem,
) -> BrowseMediaSource:
"""Return media."""
if item.identifier:
raise BrowseError("Unknown item")
hass.data[DATA_MEDIA_SOURCE] = source = local_source.LocalSource(
hass,
DOMAIN,
"AI Generated Images",
{IMAGE_DIR: str(media_dir)},
f"/{DOMAIN}",
)
return source
image_storage = self.hass.data[DATA_IMAGES]
children = [
BrowseMediaSource(
domain=DOMAIN,
identifier=filename,
media_class=MediaClass.IMAGE,
media_content_type=image.mime_type,
title=image.title or filename,
can_play=True,
can_expand=False,
)
for filename, image in image_storage.items()
]
return BrowseMediaSource(
domain=DOMAIN,
identifier=None,
media_class=MediaClass.APP,
media_content_type="",
title="AI Generated Images",
can_play=False,
can_expand=True,
children_media_class=MediaClass.IMAGE,
children=children,
)
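The `async_sign_path` call above is what keeps the plain-looking image URL safe: it appends an expiring signature to the path. A minimal sketch of just the signing step (the path and one-hour lifetime are illustrative):

from datetime import timedelta

from homeassistant.components.http.auth import async_sign_path
from homeassistant.core import HomeAssistant


def signed_image_url(hass: HomeAssistant, filename: str) -> str:
    # Returns the path plus an expiring signature query parameter; anyone
    # holding the full URL can fetch the image until the signature lapses.
    return async_sign_path(
        hass, f"/api/ai_task/images/{filename}", timedelta(hours=1)
    )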

@@ -4,7 +4,7 @@ from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
import io
from functools import partial
import mimetypes
from pathlib import Path
import tempfile
@@ -12,33 +12,34 @@ from typing import Any
import voluptuous as vol
from homeassistant.components import camera, conversation, image, media_source
from homeassistant.components import camera, conversation, media_source
from homeassistant.components.http.auth import async_sign_path
from homeassistant.core import HomeAssistant, ServiceResponse, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import llm
from homeassistant.helpers.chat_session import ChatSession, async_get_chat_session
from homeassistant.helpers.event import async_call_later
from homeassistant.util import RE_SANITIZE_FILENAME, slugify
from .const import (
DATA_COMPONENT,
DATA_MEDIA_SOURCE,
DATA_IMAGES,
DATA_PREFERENCES,
DOMAIN,
IMAGE_DIR,
IMAGE_EXPIRY_TIME,
MAX_IMAGES,
AITaskEntityFeature,
)
def _save_camera_snapshot(image_data: camera.Image | image.Image) -> Path:
def _save_camera_snapshot(image: camera.Image) -> Path:
"""Save camera snapshot to temp file."""
with tempfile.NamedTemporaryFile(
mode="wb",
suffix=mimetypes.guess_extension(image_data.content_type, False),
suffix=mimetypes.guess_extension(image.content_type, False),
delete=False,
) as temp_file:
temp_file.write(image_data.content)
temp_file.write(image.content)
return Path(temp_file.name)
@@ -54,31 +55,26 @@ async def _resolve_attachments(
for attachment in attachments or []:
media_content_id = attachment["media_content_id"]
# Special case for certain media sources
for integration in camera, image:
media_source_prefix = f"media-source://{integration.DOMAIN}/"
if not media_content_id.startswith(media_source_prefix):
continue
# Special case for camera media sources
if media_content_id.startswith("media-source://camera/"):
# Extract entity_id from the media content ID
entity_id = media_content_id.removeprefix(media_source_prefix)
entity_id = media_content_id.removeprefix("media-source://camera/")
# Get snapshot from entity
image_data = await integration.async_get_image(hass, entity_id)
# Get snapshot from camera
image = await camera.async_get_image(hass, entity_id)
temp_filename = await hass.async_add_executor_job(
_save_camera_snapshot, image_data
_save_camera_snapshot, image
)
created_files.append(temp_filename)
resolved_attachments.append(
conversation.Attachment(
media_content_id=media_content_id,
mime_type=image_data.content_type,
mime_type=image.content_type,
path=temp_filename,
)
)
break
else:
# Handle regular media sources
media = await media_source.async_resolve_media(hass, media_content_id, None)
@@ -161,6 +157,24 @@ async def async_generate_data(
)
def _cleanup_images(image_storage: dict[str, ImageData], num_to_remove: int) -> None:
"""Remove old images to keep the storage size under the limit."""
if num_to_remove <= 0:
return
if num_to_remove >= len(image_storage):
image_storage.clear()
return
sorted_images = sorted(
image_storage.items(),
key=lambda item: item[1].timestamp,
)
for filename, _ in sorted_images[:num_to_remove]:
image_storage.pop(filename, None)
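To see the eviction order `_cleanup_images` produces, a small usage sketch; `ImageStub` is a stand-in exposing only the `timestamp` field the sort key reads:

from dataclasses import dataclass


@dataclass
class ImageStub:
    timestamp: int


store = {"old.png": ImageStub(100), "mid.png": ImageStub(200), "new.png": ImageStub(300)}
_cleanup_images(store, 1)  # type: ignore[arg-type]  # stub stands in for ImageData
print(sorted(store))  # ['mid.png', 'new.png']: the oldest entry is evicted first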
async def async_generate_image(
hass: HomeAssistant,
*,
@@ -210,34 +224,36 @@ async def async_generate_image(
if service_result.get("revised_prompt") is None:
service_result["revised_prompt"] = instructions
source = hass.data[DATA_MEDIA_SOURCE]
image_storage = hass.data[DATA_IMAGES]
if len(image_storage) + 1 > MAX_IMAGES:
_cleanup_images(image_storage, len(image_storage) + 1 - MAX_IMAGES)
current_time = datetime.now()
ext = mimetypes.guess_extension(task_result.mime_type, False) or ".png"
sanitized_task_name = RE_SANITIZE_FILENAME.sub("", slugify(task_name))
filename = f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}"
image_file = ImageData(
filename=f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}",
file=io.BytesIO(image_data),
content_type=task_result.mime_type,
image_storage[filename] = ImageData(
data=image_data,
timestamp=int(current_time.timestamp()),
mime_type=task_result.mime_type,
title=service_result["revised_prompt"],
)
target_folder = media_source.MediaSourceItem.from_uri(
hass, f"media-source://{DOMAIN}/{IMAGE_DIR}", None
)
def _purge_image(filename: str, now: datetime) -> None:
"""Remove image from storage."""
image_storage.pop(filename, None)
service_result["media_source_id"] = await source.async_upload_media(
target_folder, image_file
)
if IMAGE_EXPIRY_TIME > 0:
async_call_later(hass, IMAGE_EXPIRY_TIME, partial(_purge_image, filename))
item = media_source.MediaSourceItem.from_uri(
hass, service_result["media_source_id"], None
)
service_result["url"] = async_sign_path(
hass,
(await source.async_resolve_media(item)).url,
timedelta(seconds=IMAGE_EXPIRY_TIME),
f"/api/{DOMAIN}/images/{filename}",
timedelta(seconds=IMAGE_EXPIRY_TIME or 1800),
)
service_result["media_source_id"] = f"media-source://{DOMAIN}/images/{filename}"
return service_result
@@ -342,8 +358,20 @@ class GenImageTaskResult:
@dataclass(slots=True)
class ImageData:
"""Implementation of media_source.local_source.UploadedFile protocol."""
"""Image data for stored generated images."""
filename: str
file: io.IOBase
content_type: str
data: bytes
"""Raw image data."""
timestamp: int
"""Timestamp when the image was generated, as a Unix timestamp."""
mime_type: str
"""MIME type of the image."""
title: str
"""Title of the image, usually the prompt used to generate it."""
def __str__(self) -> str:
"""Return image data as a string."""
return f"<ImageData {self.title}: {id(self)}>"

@@ -3,6 +3,7 @@
from __future__ import annotations
from genie_partner_sdk.client import AladdinConnectClient
from genie_partner_sdk.model import GarageDoor
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
@@ -35,7 +36,22 @@ async def async_setup_entry(
api.AsyncConfigEntryAuth(aiohttp_client.async_get_clientsession(hass), session)
)
doors = await client.get_doors()
sdk_doors = await client.get_doors()
# Convert SDK GarageDoor objects to integration GarageDoor objects
doors = [
GarageDoor(
{
"device_id": door.device_id,
"door_number": door.door_number,
"name": door.name,
"status": door.status,
"link_status": door.link_status,
"battery_level": door.battery_level,
}
)
for door in sdk_doors
]
entry.runtime_data = {
door.unique_id: AladdinConnectCoordinator(hass, entry, client, door)

@@ -41,10 +41,4 @@ class AladdinConnectCoordinator(DataUpdateCoordinator[GarageDoor]):
async def _async_update_data(self) -> GarageDoor:
"""Fetch data from the Aladdin Connect API."""
await self.client.update_door(self.data.device_id, self.data.door_number)
self.data.status = self.client.get_door_status(
self.data.device_id, self.data.door_number
)
self.data.battery_level = self.client.get_battery_status(
self.data.device_id, self.data.door_number
)
return self.data

@@ -49,9 +49,7 @@ class AladdinCoverEntity(AladdinConnectEntity, CoverEntity):
@property
def is_closed(self) -> bool | None:
"""Update is closed attribute."""
if (status := self.coordinator.data.status) is None:
return None
return status == "closed"
return self.coordinator.data.status == "closed"
@property
def is_closing(self) -> bool | None:

@@ -12,5 +12,5 @@
"documentation": "https://www.home-assistant.io/integrations/aladdin_connect",
"integration_type": "hub",
"iot_class": "cloud_polling",
"requirements": ["genie-partner-sdk==1.0.11"]
"requirements": ["genie-partner-sdk==1.0.10"]
}

@@ -2,7 +2,7 @@
"domain": "analytics",
"name": "Analytics",
"after_dependencies": ["energy", "hassio", "recorder"],
"codeowners": ["@home-assistant/core"],
"codeowners": ["@home-assistant/core", "@ludeeus"],
"dependencies": ["api", "websocket_api", "http"],
"documentation": "https://www.home-assistant.io/integrations/analytics",
"integration_type": "system",

@@ -33,11 +33,9 @@ from homeassistant.const import (
)
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.storage import STORAGE_DIR
from homeassistant.helpers.typing import ConfigType
from .const import (
CONF_ADB_SERVER_IP,
@@ -48,12 +46,10 @@ from .const import (
DEFAULT_ADB_SERVER_PORT,
DEVICE_ANDROIDTV,
DEVICE_FIRETV,
DOMAIN,
PROP_ETHMAC,
PROP_WIFIMAC,
SIGNAL_CONFIG_ENTITY,
)
from .services import async_setup_services
ADB_PYTHON_EXCEPTIONS: tuple = (
AdbTimeoutError,
@@ -67,8 +63,6 @@ ADB_PYTHON_EXCEPTIONS: tuple = (
)
ADB_TCP_EXCEPTIONS: tuple = (ConnectionResetError, RuntimeError)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
PLATFORMS = [Platform.MEDIA_PLAYER, Platform.REMOTE]
RELOAD_OPTIONS = [CONF_STATE_DETECTION_RULES]
@@ -194,12 +188,6 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Android TV / Fire TV integration."""
async_setup_services(hass)
return True
async def async_setup_entry(hass: HomeAssistant, entry: AndroidTVConfigEntry) -> bool:
"""Set up Android Debug Bridge platform."""

@@ -8,6 +8,7 @@ import logging
from androidtv.constants import APPS, KEYS
from androidtv.setup_async import AndroidTVAsync, FireTVAsync
import voluptuous as vol
from homeassistant.components import persistent_notification
from homeassistant.components.media_player import (
@@ -16,7 +17,9 @@ from homeassistant.components.media_player import (
MediaPlayerEntityFeature,
MediaPlayerState,
)
from homeassistant.const import ATTR_COMMAND
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, entity_platform
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util.dt import utcnow
@@ -36,10 +39,19 @@ from .const import (
SIGNAL_CONFIG_ENTITY,
)
from .entity import AndroidTVEntity, adb_decorator
from .services import ATTR_ADB_RESPONSE, ATTR_HDMI_INPUT, SERVICE_LEARN_SENDEVENT
_LOGGER = logging.getLogger(__name__)
ATTR_ADB_RESPONSE = "adb_response"
ATTR_DEVICE_PATH = "device_path"
ATTR_HDMI_INPUT = "hdmi_input"
ATTR_LOCAL_PATH = "local_path"
SERVICE_ADB_COMMAND = "adb_command"
SERVICE_DOWNLOAD = "download"
SERVICE_LEARN_SENDEVENT = "learn_sendevent"
SERVICE_UPLOAD = "upload"
# Translate from `AndroidTV` / `FireTV` reported state to HA state.
ANDROIDTV_STATES = {
"off": MediaPlayerState.OFF,
@@ -65,6 +77,32 @@ async def async_setup_entry(
]
)
platform = entity_platform.async_get_current_platform()
platform.async_register_entity_service(
SERVICE_ADB_COMMAND,
{vol.Required(ATTR_COMMAND): cv.string},
"adb_command",
)
platform.async_register_entity_service(
SERVICE_LEARN_SENDEVENT, None, "learn_sendevent"
)
platform.async_register_entity_service(
SERVICE_DOWNLOAD,
{
vol.Required(ATTR_DEVICE_PATH): cv.string,
vol.Required(ATTR_LOCAL_PATH): cv.string,
},
"service_download",
)
platform.async_register_entity_service(
SERVICE_UPLOAD,
{
vol.Required(ATTR_DEVICE_PATH): cv.string,
vol.Required(ATTR_LOCAL_PATH): cv.string,
},
"service_upload",
)
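Once registered, these become entity services in the integration's domain. An illustrative call from Python (the entity id is a placeholder, and `hass` is assumed to be a running `HomeAssistant` instance):

async def press_home(hass) -> None:
    # Entity services route to the matching entity's method, here adb_command().
    await hass.services.async_call(
        "androidtv",
        "adb_command",
        {"entity_id": "media_player.fire_tv", "command": "HOME"},
        blocking=True,
    )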
class ADBDevice(AndroidTVEntity, MediaPlayerEntity):
"""Representation of an Android or Fire TV device."""

@@ -1,66 +0,0 @@
"""Services for Android/Fire TV devices."""
from __future__ import annotations
import voluptuous as vol
from homeassistant.components.media_player import DOMAIN as MEDIA_PLAYER_DOMAIN
from homeassistant.const import ATTR_COMMAND
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
ATTR_ADB_RESPONSE = "adb_response"
ATTR_DEVICE_PATH = "device_path"
ATTR_HDMI_INPUT = "hdmi_input"
ATTR_LOCAL_PATH = "local_path"
SERVICE_ADB_COMMAND = "adb_command"
SERVICE_DOWNLOAD = "download"
SERVICE_LEARN_SENDEVENT = "learn_sendevent"
SERVICE_UPLOAD = "upload"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Register the Android TV / Fire TV services."""
service.async_register_platform_entity_service(
hass,
DOMAIN,
SERVICE_ADB_COMMAND,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={vol.Required(ATTR_COMMAND): cv.string},
func="adb_command",
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
SERVICE_LEARN_SENDEVENT,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema=None,
func="learn_sendevent",
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
SERVICE_DOWNLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,
vol.Required(ATTR_LOCAL_PATH): cv.string,
},
func="service_download",
)
service.async_register_platform_entity_service(
hass,
DOMAIN,
SERVICE_UPLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,
vol.Required(ATTR_LOCAL_PATH): cv.string,
},
func="service_upload",
)

@@ -16,7 +16,7 @@ from .coordinator import (
AOSmithStatusCoordinator,
)
PLATFORMS: list[Platform] = [Platform.SELECT, Platform.SENSOR, Platform.WATER_HEATER]
PLATFORMS: list[Platform] = [Platform.SENSOR, Platform.WATER_HEATER]
async def async_setup_entry(hass: HomeAssistant, entry: AOSmithConfigEntry) -> bool:

@@ -1,10 +1,5 @@
{
"entity": {
"select": {
"hot_water_plus_level": {
"default": "mdi:water-plus"
}
},
"sensor": {
"hot_water_availability": {
"default": "mdi:water-thermometer"

@@ -1,70 +0,0 @@
"""The select platform for the A. O. Smith integration."""
from homeassistant.components.select import SelectEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AOSmithConfigEntry
from .coordinator import AOSmithStatusCoordinator
from .entity import AOSmithStatusEntity
HWP_LEVEL_HA_TO_AOSMITH = {
"off": 0,
"level1": 1,
"level2": 2,
"level3": 3,
}
HWP_LEVEL_AOSMITH_TO_HA = {value: key for key, value in HWP_LEVEL_HA_TO_AOSMITH.items()}
async def async_setup_entry(
hass: HomeAssistant,
entry: AOSmithConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up A. O. Smith select platform."""
data = entry.runtime_data
async_add_entities(
AOSmithHotWaterPlusSelectEntity(data.status_coordinator, device.junction_id)
for device in data.status_coordinator.data.values()
if device.supports_hot_water_plus
)
class AOSmithHotWaterPlusSelectEntity(AOSmithStatusEntity, SelectEntity):
"""Class for the Hot Water+ select entity."""
_attr_translation_key = "hot_water_plus_level"
_attr_options = list(HWP_LEVEL_HA_TO_AOSMITH)
def __init__(self, coordinator: AOSmithStatusCoordinator, junction_id: str) -> None:
"""Initialize the entity."""
super().__init__(coordinator, junction_id)
self._attr_unique_id = f"hot_water_plus_level_{junction_id}"
@property
def suggested_object_id(self) -> str | None:
"""Override the suggested object id to make '+' get converted to 'plus' in the entity id."""
return "hot_water_plus_level"
@property
def current_option(self) -> str | None:
"""Return the current Hot Water+ mode."""
hot_water_plus_level = self.device.status.hot_water_plus_level
return (
None
if hot_water_plus_level is None
else HWP_LEVEL_AOSMITH_TO_HA.get(hot_water_plus_level)
)
async def async_select_option(self, option: str) -> None:
"""Set the Hot Water+ mode."""
aosmith_hwp_level = HWP_LEVEL_HA_TO_AOSMITH[option]
await self.client.update_mode(
junction_id=self.junction_id,
mode=self.device.status.current_mode,
hot_water_plus_level=aosmith_hwp_level,
)
await self.coordinator.async_request_refresh()

@@ -26,17 +26,6 @@
}
},
"entity": {
"select": {
"hot_water_plus_level": {
"name": "Hot Water+ level",
"state": {
"off": "[%key:common::state::off%]",
"level1": "Level 1",
"level2": "Level 2",
"level3": "Level 3"
}
}
},
"sensor": {
"hot_water_availability": {
"name": "Hot water availability"

@@ -7,5 +7,5 @@
"iot_class": "local_polling",
"loggers": ["apcaccess"],
"quality_scale": "platinum",
"requirements": ["aioapcaccess==1.0.0"]
"requirements": ["aioapcaccess==0.4.2"]
}

@@ -395,7 +395,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
"upsmode": SensorEntityDescription(
key="upsmode",
translation_key="ups_mode",
entity_category=EntityCategory.DIAGNOSTIC,
),
"upsname": SensorEntityDescription(
key="upsname",
@@ -467,10 +466,7 @@ async def async_setup_entry(
# periodic (or manual) self test since last daemon restart. It might not be available
# when we set up the integration, and we do not know whether it will ever be available. Here we
# add it anyway and mark it as unknown initially.
#
# We also sort the resources to ensure the order of entities created is deterministic since
# "APCMODEL" and "MODEL" resources map to the same "Model" name.
for resource in sorted(available_resources | {LAST_S_TEST}):
for resource in available_resources | {LAST_S_TEST}:
if resource not in SENSORS:
_LOGGER.warning("Invalid resource from APCUPSd: %s", resource.upper())
continue

@@ -1,7 +1,5 @@
"""Constants for the Assist pipeline integration."""
from pathlib import Path
DOMAIN = "assist_pipeline"
DATA_CONFIG = f"{DOMAIN}.config"
@@ -25,5 +23,3 @@ SAMPLES_PER_CHUNK = SAMPLE_RATE // (1000 // MS_PER_CHUNK) # 10 ms @ 16Khz
BYTES_PER_CHUNK = SAMPLES_PER_CHUNK * SAMPLE_WIDTH * SAMPLE_CHANNELS # 16-bit
OPTION_PREFERRED = "preferred"
ACKNOWLEDGE_PATH = Path(__file__).parent / "acknowledge.mp3"

@@ -23,12 +23,7 @@ from homeassistant.components import conversation, stt, tts, wake_word, websocke
from homeassistant.const import ATTR_SUPPORTED_FEATURES, MATCH_ALL
from homeassistant.core import Context, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import (
chat_session,
device_registry as dr,
entity_registry as er,
intent,
)
from homeassistant.helpers import chat_session, intent
from homeassistant.helpers.collection import (
CHANGE_UPDATED,
CollectionError,
@@ -50,7 +45,6 @@ from homeassistant.util.limited_size_dict import LimitedSizeDict
from .audio_enhancer import AudioEnhancer, EnhancedAudioChunk, MicroVadSpeexEnhancer
from .const import (
ACKNOWLEDGE_PATH,
BYTES_PER_CHUNK,
CONF_DEBUG_RECORDING_DIR,
DATA_CONFIG,
@@ -119,7 +113,6 @@ PIPELINE_FIELDS: VolDictType = {
vol.Required("wake_word_entity"): vol.Any(str, None),
vol.Required("wake_word_id"): vol.Any(str, None),
vol.Optional("prefer_local_intents"): bool,
vol.Optional("acknowledge_media_id"): str,
}
STORED_PIPELINE_RUNS = 10
@@ -1073,11 +1066,8 @@ class PipelineRun:
intent_input: str,
conversation_id: str,
conversation_extra_system_prompt: str | None,
) -> tuple[str, bool]:
"""Run intent recognition portion of pipeline.
Returns (speech, all_targets_in_satellite_area).
"""
) -> str:
"""Run intent recognition portion of pipeline. Returns text to speak."""
if self.intent_agent is None or self._conversation_data is None:
raise RuntimeError("Recognize intent was not prepared")
@@ -1126,7 +1116,6 @@ class PipelineRun:
agent_id = self.intent_agent.id
processed_locally = agent_id == conversation.HOME_ASSISTANT_AGENT
all_targets_in_satellite_area = False
intent_response: intent.IntentResponse | None = None
if not processed_locally and not self._intent_agent_only:
# Sentence triggers override conversation agent
@@ -1301,17 +1290,6 @@ class PipelineRun:
if tts_input_stream and self._streamed_response_text:
tts_input_stream.put_nowait(None)
if agent_id == conversation.HOME_ASSISTANT_AGENT:
# Check if all targeted entities were in the same area as
# the satellite device.
# If so, the satellite should respond with an acknowledge beep
# instead of a full response.
all_targets_in_satellite_area = (
self._get_all_targets_in_satellite_area(
conversation_result.response, self._device_id
)
)
except Exception as src_error:
_LOGGER.exception("Unexpected error during intent recognition")
raise IntentRecognitionError(
@@ -1334,45 +1312,7 @@ class PipelineRun:
if conversation_result.continue_conversation:
self._conversation_data.continue_conversation_agent = agent_id
return (speech, all_targets_in_satellite_area)
def _get_all_targets_in_satellite_area(
self, intent_response: intent.IntentResponse, device_id: str | None
) -> bool:
"""Return true if all targeted entities were in the same area as the device."""
if (
(intent_response.response_type != intent.IntentResponseType.ACTION_DONE)
or (not intent_response.matched_states)
or (not device_id)
):
return False
device_registry = dr.async_get(self.hass)
if (not (device := device_registry.async_get(device_id))) or (
not device.area_id
):
return False
entity_registry = er.async_get(self.hass)
for state in intent_response.matched_states:
entity = entity_registry.async_get(state.entity_id)
if not entity:
return False
if (entity_area_id := entity.area_id) is None:
if (entity.device_id is None) or (
(entity_device := device_registry.async_get(entity.device_id))
is None
):
return False
entity_area_id = entity_device.area_id
if entity_area_id != device.area_id:
return False
return True
return speech
async def prepare_text_to_speech(self) -> None:
"""Prepare text-to-speech."""
@@ -1410,9 +1350,7 @@ class PipelineRun:
),
) from err
async def text_to_speech(
self, tts_input: str, override_media_path: Path | None = None
) -> None:
async def text_to_speech(self, tts_input: str) -> None:
"""Run text-to-speech portion of pipeline."""
assert self.tts_stream is not None
@@ -1424,14 +1362,11 @@ class PipelineRun:
"language": self.pipeline.tts_language,
"voice": self.pipeline.tts_voice,
"tts_input": tts_input,
"acknowledge_override": override_media_path is not None,
},
)
)
if override_media_path:
self.tts_stream.async_override_result(override_media_path)
elif not self._streamed_response_text:
if not self._streamed_response_text:
self.tts_stream.async_set_message(tts_input)
tts_output = {
@@ -1729,20 +1664,16 @@ class PipelineInput:
if self.run.end_stage != PipelineStage.STT:
tts_input = self.tts_input
all_targets_in_satellite_area = False
if current_stage == PipelineStage.INTENT:
# intent-recognition
assert intent_input is not None
(
tts_input,
all_targets_in_satellite_area,
) = await self.run.recognize_intent(
tts_input = await self.run.recognize_intent(
intent_input,
self.session.conversation_id,
self.conversation_extra_system_prompt,
)
if all_targets_in_satellite_area or tts_input.strip():
if tts_input.strip():
current_stage = PipelineStage.TTS
else:
# Skip TTS
@@ -1751,14 +1682,8 @@ class PipelineInput:
if self.run.end_stage != PipelineStage.INTENT:
# text-to-speech
if current_stage == PipelineStage.TTS:
if all_targets_in_satellite_area:
# Use acknowledge media instead of full response
await self.run.text_to_speech(
tts_input or "", override_media_path=ACKNOWLEDGE_PATH
)
else:
assert tts_input is not None
await self.run.text_to_speech(tts_input)
assert tts_input is not None
await self.run.text_to_speech(tts_input)
except PipelineError as err:
self.run.process_event(

@@ -3,7 +3,6 @@
from __future__ import annotations
from collections.abc import Iterable
from dataclasses import replace
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.const import EntityCategory, Platform
@@ -65,36 +64,15 @@ class AssistPipelineSelect(SelectEntity, restore_state.RestoreEntity):
translation_key="pipeline",
entity_category=EntityCategory.CONFIG,
)
_attr_should_poll = False
_attr_current_option = OPTION_PREFERRED
_attr_options = [OPTION_PREFERRED]
def __init__(
self,
hass: HomeAssistant,
domain: str,
unique_id_prefix: str,
index: int = 0,
) -> None:
def __init__(self, hass: HomeAssistant, domain: str, unique_id_prefix: str) -> None:
"""Initialize a pipeline selector."""
if index < 1:
# Keep compatibility
key_suffix = ""
placeholder = ""
else:
key_suffix = f"_{index + 1}"
placeholder = f" {index + 1}"
self.entity_description = replace(
self.entity_description,
key=f"pipeline{key_suffix}",
translation_placeholders={"index": placeholder},
)
self._domain = domain
self._unique_id_prefix = unique_id_prefix
self._attr_unique_id = f"{unique_id_prefix}-{self.entity_description.key}"
self._attr_unique_id = f"{unique_id_prefix}-pipeline"
self.hass = hass
self._update_options()

@@ -7,7 +7,7 @@
},
"select": {
"pipeline": {
"name": "Assistant{index}",
"name": "Assistant",
"state": {
"preferred": "Preferred"
}

@@ -120,7 +120,6 @@ class AsusWrtBridge(ABC):
def __init__(self, host: str) -> None:
"""Initialize Bridge."""
self._configuration_url = f"http://{host}"
self._host = host
self._firmware: str | None = None
self._label_mac: str | None = None
@@ -128,11 +127,6 @@ class AsusWrtBridge(ABC):
self._model_id: str | None = None
self._serial_number: str | None = None
@property
def configuration_url(self) -> str:
"""Return configuration URL."""
return self._configuration_url
@property
def host(self) -> str:
"""Return hostname."""
@@ -377,7 +371,6 @@ class AsusWrtHttpBridge(AsusWrtBridge):
# get main router properties
if mac := _identity.mac:
self._label_mac = format_mac(mac)
self._configuration_url = self._api.webpanel
self._firmware = str(_identity.firmware)
self._model = _identity.model
self._model_id = _identity.product_id

@@ -388,13 +388,13 @@ class AsusWrtRouter:
def device_info(self) -> DeviceInfo:
"""Return the device information."""
info = DeviceInfo(
configuration_url=self._api.configuration_url,
identifiers={(DOMAIN, self._entry.unique_id or "AsusWRT")},
name=self.host,
model=self._api.model or "Asus Router",
model_id=self._api.model_id,
serial_number=self._api.serial_number,
manufacturer="Asus",
configuration_url=f"http://{self.host}",
)
if self._api.firmware:
info["sw_version"] = self._api.firmware

@@ -29,5 +29,5 @@
"documentation": "https://www.home-assistant.io/integrations/august",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==9.2.0", "yalexs-ble==3.1.2"]
"requirements": ["yalexs==9.0.1", "yalexs-ble==3.1.2"]
}

@@ -92,11 +92,7 @@ from homeassistant.components.http.ban import (
from homeassistant.components.http.data_validator import RequestDataValidator
from homeassistant.components.http.view import HomeAssistantView
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.network import (
NoURLAvailableError,
get_url,
is_cloud_connection,
)
from homeassistant.helpers.network import is_cloud_connection
from homeassistant.util.network import is_local
from . import indieauth
@@ -129,18 +125,11 @@ class WellKnownOAuthInfoView(HomeAssistantView):
async def get(self, request: web.Request) -> web.Response:
"""Return the well known OAuth2 authorization info."""
hass = request.app[KEY_HASS]
# Some applications require absolute urls, so we prefer using the
# current requests url if possible, with fallback to a relative url.
try:
url_prefix = get_url(hass, require_current_request=True)
except NoURLAvailableError:
url_prefix = ""
return self.json(
{
"authorization_endpoint": f"{url_prefix}/auth/authorize",
"token_endpoint": f"{url_prefix}/auth/token",
"revocation_endpoint": f"{url_prefix}/auth/revoke",
"authorization_endpoint": "/auth/authorize",
"token_endpoint": "/auth/token",
"revocation_endpoint": "/auth/revoke",
"response_types_supported": ["code"],
"service_documentation": (
"https://developers.home-assistant.io/docs/auth_api"

@@ -8,7 +8,7 @@ import threading
from typing import IO, cast
from aiohttp import BodyPartReader
from aiohttp.hdrs import CONTENT_DISPOSITION, CONTENT_TYPE
from aiohttp.hdrs import CONTENT_DISPOSITION
from aiohttp.web import FileResponse, Request, Response, StreamResponse
from multidict import istr
@@ -76,8 +76,7 @@ class DownloadBackupView(HomeAssistantView):
return Response(status=HTTPStatus.NOT_FOUND)
headers = {
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar",
CONTENT_TYPE: "application/x-tar",
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
}
try:

@@ -14,15 +14,15 @@
},
"automatic_backup_failed_addons": {
"title": "Not all add-ons could be included in automatic backup",
"description": "Add-ons {failed_addons} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
"description": "Add-ons {failed_addons} could not be included in automatic backup. Please check the supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
},
"automatic_backup_failed_agents_addons_folders": {
"title": "Automatic backup was created with errors",
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Add-ons which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the Core and Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Add-ons which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the core and supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
},
"automatic_backup_failed_folders": {
"title": "Not all folders could be included in automatic backup",
"description": "Folders {failed_folders} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
"description": "Folders {failed_folders} could not be included in automatic backup. Please check the supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
}
},
"services": {

@@ -18,9 +18,9 @@
"bleak==1.0.1",
"bleak-retry-connector==4.4.3",
"bluetooth-adapters==2.1.0",
"bluetooth-auto-recovery==1.5.3",
"bluetooth-auto-recovery==1.5.2",
"bluetooth-data-tools==1.28.2",
"dbus-fast==2.44.3",
"habluetooth==5.6.4"
"habluetooth==5.6.2"
]
}

View File

@@ -20,5 +20,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/bthome",
"iot_class": "local_push",
"requirements": ["bthome-ble==3.14.2"]
"requirements": ["bthome-ble==3.13.1"]
}

@@ -25,7 +25,6 @@ from homeassistant.const import (
DEGREE,
LIGHT_LUX,
PERCENTAGE,
REVOLUTIONS_PER_MINUTE,
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
EntityCategory,
UnitOfConductivity,
@@ -270,15 +269,6 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT,
),
# Rotational speed (rpm)
(
BTHomeExtendedSensorDeviceClass.ROTATIONAL_SPEED,
Units.REVOLUTIONS_PER_MINUTE,
): SensorEntityDescription(
key=f"{BTHomeExtendedSensorDeviceClass.ROTATIONAL_SPEED}_{Units.REVOLUTIONS_PER_MINUTE}",
native_unit_of_measurement=REVOLUTIONS_PER_MINUTE,
state_class=SensorStateClass.MEASUREMENT,
),
# Signal Strength (RSSI) (dB)
(
BTHomeSensorDeviceClass.SIGNAL_STRENGTH,

@@ -37,10 +37,6 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.loader import (
async_get_custom_components,
async_get_loaded_integration,
)
from homeassistant.util.location import async_detect_location_info
from .alexa_config import entity_supported as entity_supported_by_alexa
@@ -435,79 +431,6 @@ class DownloadSupportPackageView(HomeAssistantView):
url = "/api/cloud/support_package"
name = "api:cloud:support_package"
async def _get_integration_info(self, hass: HomeAssistant) -> dict[str, Any]:
"""Collect information about active and custom integrations."""
# Get loaded components from hass.config.components
loaded_components = hass.config.components.copy()
# Get custom integrations
custom_domains = set()
with suppress(Exception):
custom_domains = set(await async_get_custom_components(hass))
# Separate built-in and custom integrations
builtin_integrations = []
custom_integrations = []
for domain in sorted(loaded_components):
try:
integration = async_get_loaded_integration(hass, domain)
except Exception: # noqa: BLE001
# Broad exception catch for robustness in support package
# generation. If we can't get integration info,
# just add the domain
if domain in custom_domains:
custom_integrations.append(
{
"domain": domain,
"name": "Unknown",
"version": "Unknown",
"documentation": "Unknown",
}
)
else:
builtin_integrations.append(
{
"domain": domain,
"name": "Unknown",
}
)
else:
if domain in custom_domains:
# This is a custom integration
# include version and documentation link
version = (
str(integration.version) if integration.version else "Unknown"
)
if not (documentation := integration.documentation):
documentation = "Unknown"
custom_integrations.append(
{
"domain": domain,
"name": integration.name,
"version": version,
"documentation": documentation,
}
)
else:
# This is a built-in integration.
# No version needed, as it is always the same as the
# Home Assistant version
builtin_integrations.append(
{
"domain": domain,
"name": integration.name,
}
)
return {
"builtin_count": len(builtin_integrations),
"builtin_integrations": builtin_integrations,
"custom_count": len(custom_integrations),
"custom_integrations": custom_integrations,
}
async def _generate_markdown(
self,
hass: HomeAssistant,
@@ -530,38 +453,6 @@ class DownloadSupportPackageView(HomeAssistantView):
markdown = "## System Information\n\n"
markdown += get_domain_table_markdown(hass_info)
# Add integration information
try:
integration_info = await self._get_integration_info(hass)
except Exception: # noqa: BLE001
# Broad exception catch for robustness in support package generation
# If there's any error getting integration info, just note it
markdown += "## Active integrations\n\n"
markdown += "Unable to collect integration information\n\n"
else:
markdown += "## Active Integrations\n\n"
markdown += f"Built-in integrations: {integration_info['builtin_count']}\n"
markdown += f"Custom integrations: {integration_info['custom_count']}\n\n"
# Built-in integrations
if integration_info["builtin_integrations"]:
markdown += "<details><summary>Built-in integrations</summary>\n\n"
markdown += "Domain | Name\n"
markdown += "--- | ---\n"
for integration in integration_info["builtin_integrations"]:
markdown += f"{integration['domain']} | {integration['name']}\n"
markdown += "\n</details>\n\n"
# Custom integrations
if integration_info["custom_integrations"]:
markdown += "<details><summary>Custom integrations</summary>\n\n"
markdown += "Domain | Name | Version | Documentation\n"
markdown += "--- | --- | --- | ---\n"
for integration in integration_info["custom_integrations"]:
doc_url = integration.get("documentation") or "N/A"
markdown += f"{integration['domain']} | {integration['name']} | {integration['version']} | {doc_url}\n"
markdown += "\n</details>\n\n"
for domain, domain_info in domains_info.items():
domain_info_md = get_domain_table_markdown(domain_info)
markdown += (

@@ -2,7 +2,7 @@
from abc import abstractmethod
from datetime import timedelta
from typing import Any, TypeVar
from typing import TypeVar
from aiocomelit.api import (
AlarmDataObject,
@@ -13,16 +13,7 @@ from aiocomelit.api import (
ComelitVedoAreaObject,
ComelitVedoZoneObject,
)
from aiocomelit.const import (
BRIDGE,
CLIMATE,
COVER,
IRRIGATION,
LIGHT,
OTHER,
SCENARIO,
VEDO,
)
from aiocomelit.const import BRIDGE, VEDO
from aiocomelit.exceptions import CannotAuthenticate, CannotConnect, CannotRetrieveData
from aiohttp import ClientSession
@@ -120,32 +111,6 @@ class ComelitBaseCoordinator(DataUpdateCoordinator[T]):
async def _async_update_system_data(self) -> T:
"""Class method for updating data."""
async def _async_remove_stale_devices(
self,
previous_list: dict[int, Any],
current_list: dict[int, Any],
dev_type: str,
) -> None:
"""Remove stale devices."""
device_registry = dr.async_get(self.hass)
for i in previous_list:
if i not in current_list:
_LOGGER.debug(
"Detected change in %s devices: index %s removed",
dev_type,
i,
)
identifier = f"{self.config_entry.entry_id}-{dev_type}-{i}"
device = device_registry.async_get_device(
identifiers={(DOMAIN, identifier)}
)
if device:
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
class ComelitSerialBridge(
ComelitBaseCoordinator[dict[str, dict[int, ComelitSerialBridgeObject]]]
@@ -172,15 +137,7 @@ class ComelitSerialBridge(
self,
) -> dict[str, dict[int, ComelitSerialBridgeObject]]:
"""Specific method for updating data."""
data = await self.api.get_all_devices()
if self.data:
for dev_type in (CLIMATE, COVER, LIGHT, IRRIGATION, OTHER, SCENARIO):
await self._async_remove_stale_devices(
self.data[dev_type], data[dev_type], dev_type
)
return data
return await self.api.get_all_devices()
class ComelitVedoSystem(ComelitBaseCoordinator[AlarmDataObject]):
@@ -206,14 +163,4 @@ class ComelitVedoSystem(ComelitBaseCoordinator[AlarmDataObject]):
self,
) -> AlarmDataObject:
"""Specific method for updating data."""
data = await self.api.get_all_areas_and_zones()
if self.data:
for obj_type in ("alarm_areas", "alarm_zones"):
await self._async_remove_stale_devices(
self.data[obj_type],
data[obj_type],
"area" if obj_type == "alarm_areas" else "zone",
)
return data
return await self.api.get_all_areas_and_zones()

@@ -72,7 +72,9 @@ rules:
repair-issues:
status: exempt
comment: no known use cases for repair issues or flows, yet
stale-devices: done
stale-devices:
status: todo
comment: missing implementation
# Platinum
async-dependency: done

@@ -50,13 +50,14 @@ from .const import (
ATTR_LANGUAGE,
ATTR_TEXT,
DATA_COMPONENT,
DATA_DEFAULT_ENTITY,
DOMAIN,
HOME_ASSISTANT_AGENT,
SERVICE_PROCESS,
SERVICE_RELOAD,
ConversationEntityFeature,
)
from .default_agent import async_setup_default_agent
from .default_agent import DefaultAgent, async_setup_default_agent
from .entity import ConversationEntity
from .http import async_setup as async_setup_conversation_http
from .models import AbstractConversationAgent, ConversationInput, ConversationResult
@@ -141,7 +142,7 @@ def async_unset_agent(
hass: HomeAssistant,
config_entry: ConfigEntry,
) -> None:
"""Unset the agent to handle the conversations."""
"""Set the agent to handle the conversations."""
get_agent_manager(hass).async_unset_agent(config_entry.entry_id)
@@ -240,10 +241,10 @@ async def async_handle_sentence_triggers(
Returns None if no match occurred.
"""
agent = get_agent_manager(hass).default_agent
assert agent is not None
default_agent = async_get_agent(hass)
assert isinstance(default_agent, DefaultAgent)
return await agent.async_handle_sentence_triggers(user_input)
return await default_agent.async_handle_sentence_triggers(user_input)
async def async_handle_intents(
@@ -256,10 +257,12 @@ async def async_handle_intents(
Returns None if no match occurred.
"""
agent = get_agent_manager(hass).default_agent
assert agent is not None
default_agent = async_get_agent(hass)
assert isinstance(default_agent, DefaultAgent)
return await agent.async_handle_intents(user_input, intent_filter=intent_filter)
return await default_agent.async_handle_intents(
user_input, intent_filter=intent_filter
)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
@@ -295,9 +298,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def handle_reload(service: ServiceCall) -> None:
"""Reload intents."""
agent = get_agent_manager(hass).default_agent
if agent is not None:
await agent.async_reload(language=service.data.get(ATTR_LANGUAGE))
await hass.data[DATA_DEFAULT_ENTITY].async_reload(
language=service.data.get(ATTR_LANGUAGE)
)
hass.services.async_register(
DOMAIN,

@@ -4,7 +4,7 @@ from __future__ import annotations
import dataclasses
import logging
from typing import TYPE_CHECKING, Any
from typing import Any
import voluptuous as vol
@@ -12,7 +12,7 @@ from homeassistant.core import Context, HomeAssistant, async_get_hass, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, intent, singleton
from .const import DATA_COMPONENT, HOME_ASSISTANT_AGENT
from .const import DATA_COMPONENT, DATA_DEFAULT_ENTITY, HOME_ASSISTANT_AGENT
from .entity import ConversationEntity
from .models import (
AbstractConversationAgent,
@@ -28,9 +28,6 @@ from .trace import (
_LOGGER = logging.getLogger(__name__)
if TYPE_CHECKING:
from .default_agent import DefaultAgent
@singleton.singleton("conversation_agent")
@callback
@@ -52,10 +49,8 @@ def async_get_agent(
hass: HomeAssistant, agent_id: str | None = None
) -> AbstractConversationAgent | ConversationEntity | None:
"""Get specified agent."""
manager = get_agent_manager(hass)
if agent_id is None or agent_id == HOME_ASSISTANT_AGENT:
return manager.default_agent
return hass.data[DATA_DEFAULT_ENTITY]
if "." in agent_id:
return hass.data[DATA_COMPONENT].get_entity(agent_id)
@@ -139,7 +134,6 @@ class AgentManager:
"""Initialize the conversation agents."""
self.hass = hass
self._agents: dict[str, AbstractConversationAgent] = {}
self.default_agent: DefaultAgent | None = None
@callback
def async_get_agent(self, agent_id: str) -> AbstractConversationAgent | None:
@@ -188,7 +182,3 @@ class AgentManager:
def async_unset_agent(self, agent_id: str) -> None:
"""Unset the agent."""
self._agents.pop(agent_id, None)
async def async_setup_default_agent(self, agent: DefaultAgent) -> None:
"""Set up the default agent."""
self.default_agent = agent

@@ -10,9 +10,11 @@ from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from homeassistant.helpers.entity_component import EntityComponent
from .default_agent import DefaultAgent
from .entity import ConversationEntity
DOMAIN = "conversation"
DEFAULT_EXPOSED_ATTRIBUTES = {"device_class"}
HOME_ASSISTANT_AGENT = "conversation.home_assistant"
ATTR_TEXT = "text"
@@ -24,6 +26,7 @@ SERVICE_PROCESS = "process"
SERVICE_RELOAD = "reload"
DATA_COMPONENT: HassKey[EntityComponent[ConversationEntity]] = HassKey(DOMAIN)
DATA_DEFAULT_ENTITY: HassKey[DefaultAgent] = HassKey(f"{DOMAIN}_default_entity")
class ConversationEntityFeature(IntFlag):

@@ -68,9 +68,13 @@ from homeassistant.helpers.event import async_track_state_added_domain
from homeassistant.util import language as language_util
from homeassistant.util.json import JsonObjectType, json_loads_object
from .agent_manager import get_agent_manager
from .chat_log import AssistantContent, ChatLog
from .const import DOMAIN, ConversationEntityFeature
from .const import (
DATA_DEFAULT_ENTITY,
DEFAULT_EXPOSED_ATTRIBUTES,
DOMAIN,
ConversationEntityFeature,
)
from .entity import ConversationEntity
from .models import ConversationInput, ConversationResult
from .trace import ConversationTraceEventType, async_conversation_trace_append
@@ -79,8 +83,6 @@ _LOGGER = logging.getLogger(__name__)
_DEFAULT_ERROR_TEXT = "Sorry, I couldn't understand that"
_ENTITY_REGISTRY_UPDATE_FIELDS = ["aliases", "name", "original_name"]
_DEFAULT_EXPOSED_ATTRIBUTES = {"device_class"}
REGEX_TYPE = type(re.compile(""))
TRIGGER_CALLBACK_TYPE = Callable[
[ConversationInput, RecognizeResult], Awaitable[str | None]
@@ -207,9 +209,9 @@ async def async_setup_default_agent(
config_intents: dict[str, Any],
) -> None:
"""Set up entity registry listener for the default agent."""
agent = DefaultAgent(hass, config_intents)
await entity_component.async_add_entities([agent])
await get_agent_manager(hass).async_setup_default_agent(agent)
entity = DefaultAgent(hass, config_intents)
await entity_component.async_add_entities([entity])
hass.data[DATA_DEFAULT_ENTITY] = entity
@core.callback
def async_entity_state_listener(
@@ -844,7 +846,7 @@ class DefaultAgent(ConversationEntity):
context = {"domain": state.domain}
if state.attributes:
# Include some attributes
for attr in _DEFAULT_EXPOSED_ATTRIBUTES:
for attr in DEFAULT_EXPOSED_ATTRIBUTES:
if attr not in state.attributes:
continue
context[attr] = state.attributes[attr]

@@ -25,7 +25,7 @@ from .agent_manager import (
async_get_agent,
get_agent_manager,
)
from .const import DATA_COMPONENT
from .const import DATA_COMPONENT, DATA_DEFAULT_ENTITY
from .default_agent import (
METADATA_CUSTOM_FILE,
METADATA_CUSTOM_SENTENCE,
@@ -169,8 +169,7 @@ async def websocket_list_sentences(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict
) -> None:
"""List custom registered sentences."""
agent = get_agent_manager(hass).default_agent
assert agent is not None
agent = hass.data[DATA_DEFAULT_ENTITY]
sentences = []
for trigger_data in agent.trigger_sentences:
@@ -192,8 +191,7 @@ async def websocket_hass_agent_debug(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict
) -> None:
"""Return intents that would be matched by the default agent for a list of sentences."""
agent = get_agent_manager(hass).default_agent
assert agent is not None
agent = hass.data[DATA_DEFAULT_ENTITY]
# Return results for each sentence in the same order as the input.
result_dicts: list[dict[str, Any] | None] = []

@@ -1,9 +1,4 @@
{
"entity_component": {
"_": {
"default": "mdi:forum-outline"
}
},
"services": {
"process": {
"service": "mdi:message-processing"

@@ -4,7 +4,7 @@
"codeowners": ["@home-assistant/core", "@synesthesiam", "@arturpragacz"],
"dependencies": ["http", "intent"],
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["hassil==3.2.0", "home-assistant-intents==2025.9.3"]
}

@@ -20,8 +20,7 @@ from homeassistant.helpers.script import ScriptRunResult
from homeassistant.helpers.trigger import TriggerActionType, TriggerInfo
from homeassistant.helpers.typing import UNDEFINED, ConfigType
from .agent_manager import get_agent_manager
from .const import DOMAIN
from .const import DATA_DEFAULT_ENTITY, DOMAIN
from .models import ConversationInput
@@ -124,6 +123,4 @@ async def async_attach_trigger(
# two trigger copies for who will provide a response.
return None
agent = get_agent_manager(hass).default_agent
assert agent is not None
return agent.register_trigger(sentences, call_action)
return hass.data[DATA_DEFAULT_ENTITY].register_trigger(sentences, call_action)

@@ -19,7 +19,6 @@
"ssdp",
"stream",
"sun",
"usage_prediction",
"usb",
"webhook",
"zeroconf"

@@ -43,5 +43,3 @@ class DelugeSensorType(enum.StrEnum):
UPLOAD_SPEED_SENSOR = "upload_speed"
PROTOCOL_TRAFFIC_UPLOAD_SPEED_SENSOR = "protocol_traffic_upload_speed"
PROTOCOL_TRAFFIC_DOWNLOAD_SPEED_SENSOR = "protocol_traffic_download_speed"
DOWNLOADING_COUNT_SENSOR = "downloading_count"
SEEDING_COUNT_SENSOR = "seeding_count"

@@ -2,7 +2,6 @@
from __future__ import annotations
from collections import Counter
from datetime import timedelta
from ssl import SSLError
from typing import Any
@@ -15,22 +14,11 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import LOGGER, DelugeGetSessionStatusKeys, DelugeSensorType
from .const import LOGGER, DelugeGetSessionStatusKeys
type DelugeConfigEntry = ConfigEntry[DelugeDataUpdateCoordinator]
def count_states(data: dict[str, Any]) -> dict[str, int]:
"""Count the states of the provided torrents."""
counts = Counter(torrent[b"state"].decode() for torrent in data.values())
return {
DelugeSensorType.DOWNLOADING_COUNT_SENSOR.value: counts.get("Downloading", 0),
DelugeSensorType.SEEDING_COUNT_SENSOR.value: counts.get("Seeding", 0),
}
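A quick usage sketch of `count_states`, with an illustrative payload shaped like the byte-keyed mapping `core.get_torrents_status(..., ["state"])` returns:

data = {
    b"torrent_a": {b"state": b"Downloading"},
    b"torrent_b": {b"state": b"Seeding"},
    b"torrent_c": {b"state": b"Seeding"},
}
print(count_states(data))
# -> {'downloading_count': 1, 'seeding_count': 2}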
class DelugeDataUpdateCoordinator(
DataUpdateCoordinator[dict[Platform, dict[str, Any]]]
):
@@ -51,22 +39,19 @@ class DelugeDataUpdateCoordinator(
)
self.api = api
def _get_deluge_data(self):
"""Get the latest data from Deluge."""
async def _async_update_data(self) -> dict[Platform, dict[str, Any]]:
"""Get the latest data from Deluge and updates the state."""
data = {}
try:
data["session_status"] = self.api.call(
_data = await self.hass.async_add_executor_job(
self.api.call,
"core.get_session_status",
[iter_member.value for iter_member in list(DelugeGetSessionStatusKeys)],
)
data["torrents_status_state"] = self.api.call(
"core.get_torrents_status", {}, ["state"]
data[Platform.SENSOR] = {k.decode(): v for k, v in _data.items()}
data[Platform.SWITCH] = await self.hass.async_add_executor_job(
self.api.call, "core.get_torrents_status", {}, ["paused"]
)
data["torrents_status_paused"] = self.api.call(
"core.get_torrents_status", {}, ["paused"]
)
except (
ConnectionRefusedError,
TimeoutError,
@@ -81,18 +66,4 @@ class DelugeDataUpdateCoordinator(
) from ex
LOGGER.error("Unknown error connecting to Deluge: %s", ex)
raise
return data
async def _async_update_data(self) -> dict[Platform, dict[str, Any]]:
"""Get the latest data from Deluge and updates the state."""
deluge_data = await self.hass.async_add_executor_job(self._get_deluge_data)
data = {}
data[Platform.SENSOR] = {
k.decode(): v for k, v in deluge_data["session_status"].items()
}
data[Platform.SENSOR].update(count_states(deluge_data["torrents_status_state"]))
data[Platform.SWITCH] = deluge_data["torrents_status_paused"]
return data
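The `_get_deluge_data` refactor above batches every blocking RPC into one executor job, so each refresh pays a single thread hand-off instead of one per call. A standalone sketch of the same pattern; `hass.async_add_executor_job` is essentially `loop.run_in_executor(None, func)`:

import asyncio
import time


def _blocking_calls() -> tuple[str, str]:
    # Stand-ins for two synchronous RPC round-trips; both run in one worker thread.
    time.sleep(0.1)
    session = "session_status"
    time.sleep(0.1)
    torrents = "torrents_status"
    return session, torrents


async def refresh() -> None:
    loop = asyncio.get_running_loop()
    session, torrents = await loop.run_in_executor(None, _blocking_calls)
    print(session, torrents)


asyncio.run(refresh())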

@@ -1,12 +0,0 @@
{
"entity": {
"sensor": {
"downloading_count": {
"default": "mdi:download"
},
"seeding_count": {
"default": "mdi:upload"
}
}
}
}

@@ -110,18 +110,6 @@ SENSOR_TYPES: tuple[DelugeSensorEntityDescription, ...] = (
data, DelugeSensorType.PROTOCOL_TRAFFIC_DOWNLOAD_SPEED_SENSOR.value
),
),
DelugeSensorEntityDescription(
key=DelugeSensorType.DOWNLOADING_COUNT_SENSOR.value,
translation_key=DelugeSensorType.DOWNLOADING_COUNT_SENSOR.value,
state_class=SensorStateClass.TOTAL,
value=lambda data: data[DelugeSensorType.DOWNLOADING_COUNT_SENSOR.value],
),
DelugeSensorEntityDescription(
key=DelugeSensorType.SEEDING_COUNT_SENSOR.value,
translation_key=DelugeSensorType.SEEDING_COUNT_SENSOR.value,
state_class=SensorStateClass.TOTAL,
value=lambda data: data[DelugeSensorType.SEEDING_COUNT_SENSOR.value],
),
)

View File

@@ -36,10 +36,6 @@
"idle": "[%key:common::state::idle%]"
}
},
"downloading_count": {
"name": "Downloading count",
"unit_of_measurement": "torrents"
},
"download_speed": {
"name": "Download speed"
},
@@ -49,10 +45,6 @@
"protocol_traffic_upload_speed": {
"name": "Protocol traffic upload speed"
},
"seeding_count": {
"name": "Seeding count",
"unit_of_measurement": "[%key:component::deluge::entity::sensor::downloading_count::unit_of_measurement%]"
},
"upload_speed": {
"name": "Upload speed"
}

View File

@@ -1,23 +0,0 @@
"""Diagnostics support for derivative."""
from __future__ import annotations
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: ConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
registry = er.async_get(hass)
entities = registry.entities.get_entries_for_config_entry_id(config_entry.entry_id)
return {
"config_entry": config_entry.as_dict(),
"entity": [entity.extended_dict for entity in entities],
}

View File

@@ -227,28 +227,15 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
weight = calculate_weight(start, end, current_time)
derivative = derivative + (value * Decimal(weight))
_LOGGER.debug(
"%s: Calculated new derivative as %f from %d segments",
self.entity_id,
derivative,
len(self._state_list),
)
return derivative
def _prune_state_list(self, current_time: datetime) -> None:
# filter out all derivatives older than `time_window` from our window list
old_len = len(self._state_list)
self._state_list = [
(time_start, time_end, state)
for time_start, time_end, state in self._state_list
if (current_time - time_end).total_seconds() < self._time_window
]
_LOGGER.debug(
"%s: Pruned %d elements from state list",
self.entity_id,
old_len - len(self._state_list),
)
def _handle_invalid_source_state(self, state: State | None) -> bool:
# Check the source state for unknown/unavailable condition. If unusable, write unknown/unavailable state and return false.
@@ -305,10 +292,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
) -> None:
"""Calculate derivative based on time and reschedule."""
_LOGGER.debug(
"%s: Recalculating derivative due to max_sub_interval time elapsed",
self.entity_id,
)
self._prune_state_list(now)
derivative = self._calc_derivative_from_state_list(now)
self._write_native_value(derivative)
@@ -317,11 +300,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
if derivative != 0:
schedule_max_sub_interval_exceeded(source_state)
_LOGGER.debug(
"%s: Scheduling max_sub_interval_callback in %s",
self.entity_id,
self._max_sub_interval,
)
self._cancel_max_sub_interval_exceeded_callback = async_call_later(
self.hass,
self._max_sub_interval,
@@ -331,9 +309,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
@callback
def on_state_reported(event: Event[EventStateReportedData]) -> None:
"""Handle constant sensor state."""
_LOGGER.debug(
"%s: New state reported event: %s", self.entity_id, event.data
)
self._cancel_max_sub_interval_exceeded_callback()
new_state = event.data["new_state"]
if not self._handle_invalid_source_state(new_state):
@@ -355,7 +330,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
@callback
def on_state_changed(event: Event[EventStateChangedData]) -> None:
"""Handle changed sensor state."""
_LOGGER.debug("%s: New state changed event: %s", self.entity_id, event.data)
self._cancel_max_sub_interval_exceeded_callback()
new_state = event.data["new_state"]
if not self._handle_invalid_source_state(new_state):
@@ -408,32 +382,15 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
/ Decimal(self._unit_prefix)
* Decimal(self._unit_time)
)
_LOGGER.debug(
"%s: Calculated new derivative segment as %f / %f / %f * %f = %f",
self.entity_id,
delta_value,
elapsed_time,
self._unit_prefix,
self._unit_time,
new_derivative,
)
except ValueError as err:
_LOGGER.warning(
"%s: While calculating derivative: %s", self.entity_id, err
)
_LOGGER.warning("While calculating derivative: %s", err)
except DecimalException as err:
_LOGGER.warning(
"%s: Invalid state (%s > %s): %s",
self.entity_id,
old_value,
new_state.state,
err,
"Invalid state (%s > %s): %s", old_value, new_state.state, err
)
except AssertionError as err:
_LOGGER.error(
"%s: Could not calculate derivative: %s", self.entity_id, err
)
_LOGGER.error("Could not calculate derivative: %s", err)
# For total increasing sensors, the value is expected to continuously increase.
# A negative derivative for a total increasing sensor likely indicates the
@@ -443,10 +400,6 @@ class DerivativeSensor(RestoreSensor, SensorEntity):
== SensorStateClass.TOTAL_INCREASING
and new_derivative < 0
):
_LOGGER.debug(
"%s: Dropping sample as source total_increasing sensor decreased",
self.entity_id,
)
return
# add latest derivative to the window list
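
Reviewer note: as a worked example of the segment formula kept above (`delta_value / elapsed_time / unit_prefix * unit_time`), here is a standalone sketch with assumed inputs:

```python
from decimal import Decimal


def derivative_segment(
    delta_value: Decimal, elapsed_time: float, unit_prefix: float, unit_time: int
) -> Decimal:
    """Sketch of one derivative segment, mirroring the formula in the diff above."""
    return (
        delta_value / Decimal(elapsed_time) / Decimal(unit_prefix) * Decimal(unit_time)
    )


# Assumed example: 6 Wh consumed over 60 s, prefix "k" (1000), per-hour unit (3600 s)
# -> 6 / 60 / 1000 * 3600 = 0.36 (kWh/h)
assert derivative_segment(Decimal(6), 60, 1000, 3600) == Decimal("0.36")
```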

View File

@@ -152,28 +152,24 @@ ECOWITT_SENSORS_MAPPING: Final = {
native_unit_of_measurement=UnitOfPrecipitationDepth.MILLIMETERS,
device_class=SensorDeviceClass.PRECIPITATION,
state_class=SensorStateClass.TOTAL_INCREASING,
suggested_display_precision=1,
),
EcoWittSensorTypes.RAIN_COUNT_INCHES: SensorEntityDescription(
key="RAIN_COUNT_INCHES",
native_unit_of_measurement=UnitOfPrecipitationDepth.INCHES,
device_class=SensorDeviceClass.PRECIPITATION,
state_class=SensorStateClass.TOTAL_INCREASING,
suggested_display_precision=2,
),
EcoWittSensorTypes.RAIN_RATE_MM: SensorEntityDescription(
key="RAIN_RATE_MM",
native_unit_of_measurement=UnitOfVolumetricFlux.MILLIMETERS_PER_HOUR,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.PRECIPITATION_INTENSITY,
suggested_display_precision=1,
),
EcoWittSensorTypes.RAIN_RATE_INCHES: SensorEntityDescription(
key="RAIN_RATE_INCHES",
native_unit_of_measurement=UnitOfVolumetricFlux.INCHES_PER_HOUR,
state_class=SensorStateClass.MEASUREMENT,
device_class=SensorDeviceClass.PRECIPITATION_INTENSITY,
suggested_display_precision=2,
),
EcoWittSensorTypes.LIGHTNING_DISTANCE_KM: SensorEntityDescription(
key="LIGHTNING_DISTANCE_KM",
@@ -234,17 +230,6 @@ ECOWITT_SENSORS_MAPPING: Final = {
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
),
EcoWittSensorTypes.DISTANCE_MM: SensorEntityDescription(
key="DISTANCE_MM",
device_class=SensorDeviceClass.DISTANCE,
native_unit_of_measurement=UnitOfLength.MILLIMETERS,
state_class=SensorStateClass.MEASUREMENT,
),
EcoWittSensorTypes.HEAT_COUNT: SensorEntityDescription(
key="HEAT_COUNT",
state_class=SensorStateClass.TOTAL_INCREASING,
entity_category=EntityCategory.DIAGNOSTIC,
),
EcoWittSensorTypes.PM1: SensorEntityDescription(
key="PM1",
device_class=SensorDeviceClass.PM1,

View File

@@ -120,14 +120,6 @@ def _make_url_from_data(data: dict[str, str]) -> str:
return f"{protocol}{address}"
def _get_protocol_from_url(url: str) -> str:
"""Get protocol from URL. Returns the configured protocol from URL or the default secure protocol."""
return next(
(k for k, v in PROTOCOL_MAP.items() if url.startswith(v)),
DEFAULT_SECURE_PROTOCOL,
)
def _placeholders_from_device(device: ElkSystem) -> dict[str, str]:
return {
"mac_address": _short_mac(device.mac_address),
@@ -213,78 +205,6 @@ class Elkm1ConfigFlow(ConfigFlow, domain=DOMAIN):
)
return await self.async_step_discovered_connection()
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reconfiguration of the integration."""
errors: dict[str, str] = {}
reconfigure_entry = self._get_reconfigure_entry()
existing_data = reconfigure_entry.data
if user_input is not None:
validate_input_data = dict(user_input)
validate_input_data[CONF_PREFIX] = existing_data.get(CONF_PREFIX, "")
try:
info = await validate_input(
validate_input_data, reconfigure_entry.unique_id
)
except TimeoutError:
errors["base"] = "cannot_connect"
except InvalidAuth:
errors[CONF_PASSWORD] = "invalid_auth"
except Exception:
_LOGGER.exception("Unexpected exception during reconfiguration")
errors["base"] = "unknown"
else:
# Discover the device at the provided address to obtain its MAC (unique_id)
device = await async_discover_device(
self.hass, validate_input_data[CONF_ADDRESS]
)
if device is not None and device.mac_address:
await self.async_set_unique_id(dr.format_mac(device.mac_address))
self._abort_if_unique_id_mismatch() # aborts if user tried to switch devices
else:
# If we cannot confirm identity, keep existing behavior (don't block reconfigure)
await self.async_set_unique_id(reconfigure_entry.unique_id)
return self.async_update_reload_and_abort(
reconfigure_entry,
data_updates={
**reconfigure_entry.data,
CONF_HOST: info[CONF_HOST],
CONF_USERNAME: validate_input_data[CONF_USERNAME],
CONF_PASSWORD: validate_input_data[CONF_PASSWORD],
CONF_PREFIX: info[CONF_PREFIX],
},
reason="reconfigure_successful",
)
return self.async_show_form(
step_id="reconfigure",
data_schema=vol.Schema(
{
vol.Optional(
CONF_USERNAME,
default=existing_data.get(CONF_USERNAME, ""),
): str,
vol.Optional(
CONF_PASSWORD,
default="",
): str,
vol.Required(
CONF_ADDRESS,
default=hostname_from_url(existing_data[CONF_HOST]),
): str,
vol.Required(
CONF_PROTOCOL,
default=_get_protocol_from_url(existing_data[CONF_HOST]),
): vol.In(ALL_PROTOCOLS),
}
),
errors=errors,
)
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -329,14 +249,12 @@ class Elkm1ConfigFlow(ConfigFlow, domain=DOMAIN):
try:
info = await validate_input(user_input, self.unique_id)
except TimeoutError as ex:
_LOGGER.debug("Connection timed out: %s", ex)
except TimeoutError:
return {"base": "cannot_connect"}, None
except InvalidAuth as ex:
_LOGGER.debug("Invalid auth for %s: %s", user_input.get(CONF_HOST), ex)
except InvalidAuth:
return {CONF_PASSWORD: "invalid_auth"}, None
except Exception:
_LOGGER.exception("Unexpected error validating input")
_LOGGER.exception("Unexpected exception")
return {"base": "unknown"}, None
if importing:
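
Reviewer note: the removed `async_step_reconfigure` above follows the standard reconfigure skeleton: fetch the entry via `_get_reconfigure_entry()`, re-validate the input, guard against device swaps with `_abort_if_unique_id_mismatch()`, and finish with `async_update_reload_and_abort()`. A stripped-down sketch of that skeleton, with a hypothetical `validate_input` helper and `CannotConnect` error:

```python
from typing import Any

import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST


class CannotConnect(Exception):
    """Hypothetical validation failure."""


async def validate_input(data: dict[str, Any]) -> dict[str, Any]:
    """Hypothetical helper: probe the device, return its MAC and host."""
    return {"unique_id": "aa:bb:cc:dd:ee:ff", CONF_HOST: data[CONF_HOST]}


class ExampleConfigFlow(ConfigFlow, domain="example"):
    """Sketch of the reconfigure step pattern shown in the diff above."""

    async def async_step_reconfigure(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        entry = self._get_reconfigure_entry()
        errors: dict[str, str] = {}
        if user_input is not None:
            try:
                info = await validate_input(user_input)
            except CannotConnect:
                errors["base"] = "cannot_connect"
            else:
                # Abort if the user pointed the flow at a different device.
                await self.async_set_unique_id(info["unique_id"])
                self._abort_if_unique_id_mismatch()
                return self.async_update_reload_and_abort(
                    entry, data_updates={CONF_HOST: info[CONF_HOST]}
                )
        return self.async_show_form(
            step_id="reconfigure",
            data_schema=vol.Schema(
                {vol.Required(CONF_HOST, default=entry.data[CONF_HOST]): str}
            ),
            errors=errors,
        )
```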

View File

@@ -14,11 +14,7 @@ from elkm1_lib.util import pretty_const
from elkm1_lib.zones import Zone
import voluptuous as vol
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorStateClass,
)
from homeassistant.components.sensor import SensorEntity
from homeassistant.const import EntityCategory, UnitOfElectricPotential
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
@@ -36,16 +32,6 @@ SERVICE_SENSOR_ZONE_BYPASS = "sensor_zone_bypass"
SERVICE_SENSOR_ZONE_TRIGGER = "sensor_zone_trigger"
UNDEFINED_TEMPERATURE = -40
_DEVICE_CLASS_MAP: dict[ZoneType, SensorDeviceClass] = {
ZoneType.TEMPERATURE: SensorDeviceClass.TEMPERATURE,
ZoneType.ANALOG_ZONE: SensorDeviceClass.VOLTAGE,
}
_STATE_CLASS_MAP: dict[ZoneType, SensorStateClass] = {
ZoneType.TEMPERATURE: SensorStateClass.MEASUREMENT,
ZoneType.ANALOG_ZONE: SensorStateClass.MEASUREMENT,
}
ELK_SET_COUNTER_SERVICE_SCHEMA: VolDictType = {
vol.Required(ATTR_VALUE): vol.All(vol.Coerce(int), vol.Range(0, 65535))
}
@@ -262,16 +248,6 @@ class ElkZone(ElkSensor):
return self._temperature_unit
return None
@property
def device_class(self) -> SensorDeviceClass | None:
"""Return the device class of the sensor."""
return _DEVICE_CLASS_MAP.get(self._element.definition)
@property
def state_class(self) -> SensorStateClass | None:
"""Return the state class of the sensor."""
return _STATE_CLASS_MAP.get(self._element.definition)
@property
def native_unit_of_measurement(self) -> str | None:
"""Return the unit of measurement."""

View File

@@ -17,8 +17,8 @@
"address": "The IP address or domain or serial port if connecting via serial.",
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]",
"prefix": "A unique prefix (leave blank if you only have one Elk-M1).",
"temperature_unit": "The temperature unit Elk-M1 uses."
"prefix": "A unique prefix (leave blank if you only have one ElkM1).",
"temperature_unit": "The temperature unit ElkM1 uses."
}
},
"discovered_connection": {
@@ -30,16 +30,6 @@
"password": "[%key:common::config_flow::data::password%]",
"temperature_unit": "[%key:component::elkm1::config::step::manual_connection::data::temperature_unit%]"
}
},
"reconfigure": {
"title": "Reconfigure Elk-M1 Control",
"description": "[%key:component::elkm1::config::step::manual_connection::description%]",
"data": {
"protocol": "[%key:component::elkm1::config::step::manual_connection::data::protocol%]",
"address": "[%key:component::elkm1::config::step::manual_connection::data::address%]",
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]"
}
}
},
"error": {
@@ -52,10 +42,8 @@
"unknown": "[%key:common::config_flow::error::unknown%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"already_configured": "An Elk-M1 with this prefix is already configured",
"address_already_configured": "An Elk-M1 with this address is already configured",
"reconfigure_successful": "Successfully reconfigured Elk-M1 integration",
"unique_id_mismatch": "Reconfigure should be used for the same device not a new one"
"already_configured": "An ElkM1 with this prefix is already configured",
"address_already_configured": "An ElkM1 with this address is already configured"
}
},
"services": {
@@ -81,7 +69,7 @@
},
"alarm_arm_home_instant": {
"name": "Alarm arm home instant",
"description": "Arms the Elk-M1 in home instant mode.",
"description": "Arms the ElkM1 in home instant mode.",
"fields": {
"code": {
"name": "Code",
@@ -91,7 +79,7 @@
},
"alarm_arm_night_instant": {
"name": "Alarm arm night instant",
"description": "Arms the Elk-M1 in night instant mode.",
"description": "Arms the ElkM1 in night instant mode.",
"fields": {
"code": {
"name": "Code",
@@ -101,7 +89,7 @@
},
"alarm_arm_vacation": {
"name": "Alarm arm vacation",
"description": "Arms the Elk-M1 in vacation mode.",
"description": "Arms the ElkM1 in vacation mode.",
"fields": {
"code": {
"name": "Code",
@@ -111,7 +99,7 @@
},
"alarm_display_message": {
"name": "Alarm display message",
"description": "Displays a message on all of the Elk-M1 keypads for an area.",
"description": "Displays a message on all of the ElkM1 keypads for an area.",
"fields": {
"clear": {
"name": "Clear",
@@ -147,7 +135,7 @@
},
"speak_phrase": {
"name": "Speak phrase",
"description": "Speaks a phrase. See list of phrases in Elk-M1 ASCII Protocol documentation.",
"description": "Speaks a phrase. See list of phrases in ElkM1 ASCII Protocol documentation.",
"fields": {
"number": {
"name": "Phrase number",
@@ -161,7 +149,7 @@
},
"speak_word": {
"name": "Speak word",
"description": "Speaks a word. See list of words in Elk-M1 ASCII Protocol documentation.",
"description": "Speaks a word. See list of words in ElkM1 ASCII Protocol documentation.",
"fields": {
"number": {
"name": "Word number",

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/emoncms",
"iot_class": "local_polling",
"requirements": ["pyemoncms==0.1.3"]
"requirements": ["pyemoncms==0.1.2"]
}

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/emoncms_history",
"iot_class": "local_polling",
"quality_scale": "legacy",
"requirements": ["pyemoncms==0.1.3"]
"requirements": ["pyemoncms==0.1.2"]
}

View File

@@ -1,7 +1,7 @@
{
"domain": "enocean",
"name": "EnOcean",
"codeowners": [],
"codeowners": ["@bdurrer"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/enocean",
"iot_class": "local_push",

View File

@@ -52,7 +52,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: Eq3ConfigEntry) -> bool:
f"[{eq3_config.mac_address}] Device could not be found"
)
thermostat = Thermostat(device)
thermostat = Thermostat(mac_address=device) # type: ignore[arg-type]
entry.runtime_data = Eq3ConfigEntryData(
eq3_config=eq3_config, thermostat=thermostat

View File

@@ -22,5 +22,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["eq3btsmart"],
"requirements": ["eq3btsmart==2.3.0"]
"requirements": ["eq3btsmart==2.1.0", "bleak-esphome==3.3.0"]
}

View File

@@ -127,39 +127,27 @@ class EsphomeAssistSatellite(
available_wake_words=[], active_wake_words=[], max_active_wake_words=1
)
self._active_pipeline_index = 0
def _get_entity_id(self, suffix: str) -> str | None:
"""Return the entity id for pipeline select, etc."""
if self._entry_data.device_info is None:
return None
@property
def pipeline_entity_id(self) -> str | None:
"""Return the entity ID of the pipeline to use for the next conversation."""
assert self._entry_data.device_info is not None
ent_reg = er.async_get(self.hass)
return ent_reg.async_get_entity_id(
Platform.SELECT,
DOMAIN,
f"{self._entry_data.device_info.mac_address}-{suffix}",
f"{self._entry_data.device_info.mac_address}-pipeline",
)
@property
def pipeline_entity_id(self) -> str | None:
"""Return the entity ID of the primary pipeline to use for the next conversation."""
return self.get_pipeline_entity(self._active_pipeline_index)
def get_pipeline_entity(self, index: int) -> str | None:
"""Return the entity ID of a pipeline by index."""
id_suffix = "" if index < 1 else f"_{index + 1}"
return self._get_entity_id(f"pipeline{id_suffix}")
def get_wake_word_entity(self, index: int) -> str | None:
"""Return the entity ID of a wake word by index."""
id_suffix = "" if index < 1 else f"_{index + 1}"
return self._get_entity_id(f"wake_word{id_suffix}")
@property
def vad_sensitivity_entity_id(self) -> str | None:
"""Return the entity ID of the VAD sensitivity to use for the next conversation."""
return self._get_entity_id("vad_sensitivity")
assert self._entry_data.device_info is not None
ent_reg = er.async_get(self.hass)
return ent_reg.async_get_entity_id(
Platform.SELECT,
DOMAIN,
f"{self._entry_data.device_info.mac_address}-vad_sensitivity",
)
@callback
def async_get_configuration(
@@ -247,7 +235,6 @@ class EsphomeAssistSatellite(
)
)
assert self._attr_supported_features is not None
if feature_flags & VoiceAssistantFeature.ANNOUNCE:
# Device supports announcements
self._attr_supported_features |= (
@@ -270,8 +257,8 @@ class EsphomeAssistSatellite(
# Update wake word select when config is updated
self.async_on_remove(
self._entry_data.async_register_assist_satellite_set_wake_words_callback(
self.async_set_wake_words
self._entry_data.async_register_assist_satellite_set_wake_word_callback(
self.async_set_wake_word
)
)
@@ -495,31 +482,8 @@ class EsphomeAssistSatellite(
# ANNOUNCEMENT format from media player
self._update_tts_format()
# Run the appropriate pipeline.
self._active_pipeline_index = 0
maybe_pipeline_index = 0
while True:
if not (ww_entity_id := self.get_wake_word_entity(maybe_pipeline_index)):
break
if not (ww_state := self.hass.states.get(ww_entity_id)):
continue
if ww_state.state == wake_word_phrase:
# First match
self._active_pipeline_index = maybe_pipeline_index
break
# Try next wake word select
maybe_pipeline_index += 1
_LOGGER.debug(
"Running pipeline %s from %s to %s",
self._active_pipeline_index + 1,
start_stage,
end_stage,
)
# Run the pipeline
_LOGGER.debug("Running pipeline from %s to %s", start_stage, end_stage)
self._pipeline_task = self.config_entry.async_create_background_task(
self.hass,
self.async_accept_pipeline_from_satellite(
@@ -550,7 +514,6 @@ class EsphomeAssistSatellite(
def handle_pipeline_finished(self) -> None:
"""Handle when pipeline has finished running."""
self._stop_udp_server()
self._active_pipeline_index = 0
_LOGGER.debug("Pipeline finished")
def handle_timer_event(
@@ -579,15 +542,15 @@ class EsphomeAssistSatellite(
self.tts_response_finished()
@callback
def async_set_wake_words(self, wake_word_ids: list[str]) -> None:
"""Set active wake words and update config on satellite."""
self._satellite_config.active_wake_words = wake_word_ids
def async_set_wake_word(self, wake_word_id: str) -> None:
"""Set active wake word and update config on satellite."""
self._satellite_config.active_wake_words = [wake_word_id]
self.config_entry.async_create_background_task(
self.hass,
self.async_set_configuration(self._satellite_config),
"esphome_voice_assistant_set_config",
)
_LOGGER.debug("Setting active wake word(s): %s", wake_word_ids)
_LOGGER.debug("Setting active wake word: %s", wake_word_id)
def _update_tts_format(self) -> None:
"""Update the TTS format from the first media player."""

View File

@@ -25,5 +25,3 @@ PROJECT_URLS = {
# ESPHome always uses .0 for the changelog URL
STABLE_BLE_URL_VERSION = f"{STABLE_BLE_VERSION.major}.{STABLE_BLE_VERSION.minor}.0"
DEFAULT_URL = f"https://esphome.io/changelog/{STABLE_BLE_URL_VERSION}.html"
NO_WAKE_WORD: Final[str] = "no_wake_word"

View File

@@ -177,10 +177,9 @@ class RuntimeEntryData:
assist_satellite_config_update_callbacks: list[
Callable[[AssistSatelliteConfiguration], None]
] = field(default_factory=list)
assist_satellite_set_wake_words_callbacks: list[Callable[[list[str]], None]] = (
field(default_factory=list)
assist_satellite_set_wake_word_callbacks: list[Callable[[str], None]] = field(
default_factory=list
)
assist_satellite_wake_words: dict[int, str] = field(default_factory=dict)
device_id_to_name: dict[int, str] = field(default_factory=dict)
entity_removal_callbacks: dict[EntityInfoKey, list[CALLBACK_TYPE]] = field(
default_factory=dict
@@ -502,28 +501,19 @@ class RuntimeEntryData:
callback_(config)
@callback
def async_register_assist_satellite_set_wake_words_callback(
def async_register_assist_satellite_set_wake_word_callback(
self,
callback_: Callable[[list[str]], None],
callback_: Callable[[str], None],
) -> CALLBACK_TYPE:
"""Register to receive callbacks when the Assist satellite's wake word is set."""
self.assist_satellite_set_wake_words_callbacks.append(callback_)
return partial(self.assist_satellite_set_wake_words_callbacks.remove, callback_)
self.assist_satellite_set_wake_word_callbacks.append(callback_)
return partial(self.assist_satellite_set_wake_word_callbacks.remove, callback_)
@callback
def async_assist_satellite_set_wake_word(
self, wake_word_index: int, wake_word_id: str | None
) -> None:
"""Notify listeners that the Assist satellite wake words have been set."""
if wake_word_id:
self.assist_satellite_wake_words[wake_word_index] = wake_word_id
else:
self.assist_satellite_wake_words.pop(wake_word_index, None)
wake_word_ids = list(self.assist_satellite_wake_words.values())
for callback_ in self.assist_satellite_set_wake_words_callbacks.copy():
callback_(wake_word_ids)
def async_assist_satellite_set_wake_word(self, wake_word_id: str) -> None:
"""Notify listeners that the Assist satellite wake word has been set."""
for callback_ in self.assist_satellite_set_wake_word_callbacks.copy():
callback_(wake_word_id)
@callback
def async_register_entity_removal_callback(

View File

@@ -9,17 +9,11 @@
"pipeline": {
"default": "mdi:filter-outline"
},
"pipeline_2": {
"default": "mdi:filter-outline"
},
"vad_sensitivity": {
"default": "mdi:volume-high"
},
"wake_word": {
"default": "mdi:microphone"
},
"wake_word_2": {
"default": "mdi:microphone"
}
}
}

View File

@@ -17,7 +17,7 @@
"mqtt": ["esphome/discover/#"],
"quality_scale": "platinum",
"requirements": [
"aioesphomeapi==41.1.0",
"aioesphomeapi==40.1.0",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.3.0"
],

View File

@@ -2,8 +2,6 @@
from __future__ import annotations
from dataclasses import replace
from aioesphomeapi import EntityInfo, SelectInfo, SelectState
from homeassistant.components.assist_pipeline.select import (
@@ -17,7 +15,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import restore_state
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, NO_WAKE_WORD
from .const import DOMAIN
from .entity import (
EsphomeAssistEntity,
EsphomeEntity,
@@ -52,11 +50,9 @@ async def async_setup_entry(
):
async_add_entities(
[
EsphomeAssistPipelineSelect(hass, entry_data, index=0),
EsphomeAssistPipelineSelect(hass, entry_data, index=1),
EsphomeAssistPipelineSelect(hass, entry_data),
EsphomeVadSensitivitySelect(hass, entry_data),
EsphomeAssistSatelliteWakeWordSelect(entry_data, index=0),
EsphomeAssistSatelliteWakeWordSelect(entry_data, index=1),
EsphomeAssistSatelliteWakeWordSelect(entry_data),
]
)
@@ -88,14 +84,10 @@ class EsphomeSelect(EsphomeEntity[SelectInfo, SelectState], SelectEntity):
class EsphomeAssistPipelineSelect(EsphomeAssistEntity, AssistPipelineSelect):
"""Pipeline selector for esphome devices."""
def __init__(
self, hass: HomeAssistant, entry_data: RuntimeEntryData, index: int = 0
) -> None:
def __init__(self, hass: HomeAssistant, entry_data: RuntimeEntryData) -> None:
"""Initialize a pipeline selector."""
EsphomeAssistEntity.__init__(self, entry_data)
AssistPipelineSelect.__init__(
self, hass, DOMAIN, self._device_info.mac_address, index=index
)
AssistPipelineSelect.__init__(self, hass, DOMAIN, self._device_info.mac_address)
class EsphomeVadSensitivitySelect(EsphomeAssistEntity, VadSensitivitySelect):
@@ -117,47 +109,28 @@ class EsphomeAssistSatelliteWakeWordSelect(
translation_key="wake_word",
entity_category=EntityCategory.CONFIG,
)
_attr_current_option: str | None = None
_attr_options: list[str] = [NO_WAKE_WORD]
_attr_options: list[str] = []
def __init__(self, entry_data: RuntimeEntryData, index: int = 0) -> None:
def __init__(self, entry_data: RuntimeEntryData) -> None:
"""Initialize a wake word selector."""
if index < 1:
# Keep compatibility
key_suffix = ""
placeholder = ""
else:
key_suffix = f"_{index + 1}"
placeholder = f" {index + 1}"
self.entity_description = replace(
self.entity_description,
key=f"wake_word{key_suffix}",
translation_placeholders={"index": placeholder},
)
EsphomeAssistEntity.__init__(self, entry_data)
unique_id_prefix = self._device_info.mac_address
self._attr_unique_id = f"{unique_id_prefix}-{self.entity_description.key}"
self._attr_unique_id = f"{unique_id_prefix}-wake_word"
# name -> id
self._wake_words: dict[str, str] = {}
self._wake_word_index = index
@property
def available(self) -> bool:
"""Return if entity is available."""
return len(self._attr_options) > 1 # more than just NO_WAKE_WORD
return bool(self._attr_options)
async def async_added_to_hass(self) -> None:
"""Run when entity about to be added to hass."""
await super().async_added_to_hass()
if last_state := await self.async_get_last_state():
self._attr_current_option = last_state.state
# Update options when config is updated
self.async_on_remove(
self._entry_data.async_register_assist_satellite_config_updated_callback(
@@ -167,49 +140,33 @@ class EsphomeAssistSatelliteWakeWordSelect(
async def async_select_option(self, option: str) -> None:
"""Select an option."""
self._attr_current_option = option
self.async_write_ha_state()
wake_word_id = self._wake_words.get(option)
self._entry_data.async_assist_satellite_set_wake_word(
self._wake_word_index, wake_word_id
)
if wake_word_id := self._wake_words.get(option):
# _attr_current_option will be updated on
# async_satellite_config_updated after the device sets the wake
# word.
self._entry_data.async_assist_satellite_set_wake_word(wake_word_id)
def async_satellite_config_updated(
self, config: AssistSatelliteConfiguration
) -> None:
"""Update options with available wake words."""
if (not config.available_wake_words) or (config.max_active_wake_words < 1):
# No wake words
self._attr_current_option = None
self._wake_words.clear()
self._attr_current_option = NO_WAKE_WORD
self._attr_options = [NO_WAKE_WORD]
self._entry_data.assist_satellite_wake_words.pop(
self._wake_word_index, None
)
self.async_write_ha_state()
return
self._wake_words = {w.wake_word: w.id for w in config.available_wake_words}
self._attr_options = [NO_WAKE_WORD, *sorted(self._wake_words)]
self._attr_options = sorted(self._wake_words)
option = self._attr_current_option
if (
(option is None)
or ((wake_word_id := self._wake_words.get(option)) is None)
or (wake_word_id not in config.active_wake_words)
):
option = NO_WAKE_WORD
self._attr_current_option = option
self.async_write_ha_state()
# Keep entry data in sync
if wake_word_id := self._wake_words.get(option):
self._entry_data.assist_satellite_wake_words[self._wake_word_index] = (
wake_word_id
)
if config.active_wake_words:
# Select first active wake word
wake_word_id = config.active_wake_words[0]
for wake_word in config.available_wake_words:
if wake_word.id == wake_word_id:
self._attr_current_option = wake_word.wake_word
else:
self._entry_data.assist_satellite_wake_words.pop(
self._wake_word_index, None
)
# Select first available wake word
self._attr_current_option = config.available_wake_words[0].wake_word
self.async_write_ha_state()

View File

@@ -12,7 +12,7 @@
"mqtt_missing_mac": "Missing MAC address in MQTT properties.",
"mqtt_missing_api": "Missing API port in MQTT properties.",
"mqtt_missing_ip": "Missing IP address in MQTT properties.",
"mqtt_missing_payload": "Missing MQTT payload.",
"mqtt_missing_payload": "Missing MQTT Payload.",
"name_conflict_migrated": "The configuration for `{name}` has been migrated to a new device with MAC address `{mac}` from `{existing_mac}`.",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"reauth_unique_id_changed": "**Re-authentication of `{name}` was aborted** because the address `{host}` points to a different device: `{unexpected_device_name}` (MAC: `{unexpected_mac}`) instead of the expected one (MAC: `{expected_mac}`).",
@@ -91,7 +91,7 @@
"subscribe_logs": "Subscribe to logs from the device."
},
"data_description": {
"allow_service_calls": "When enabled, ESPHome devices can perform Home Assistant actions or send events. Only enable this if you trust the device.",
"allow_service_calls": "When enabled, ESPHome devices can perform Home Assistant actions, such as calling services or sending events. Only enable this if you trust the device.",
"subscribe_logs": "When enabled, the device will send logs to Home Assistant and you can view them in the logs panel."
}
}
@@ -119,9 +119,8 @@
}
},
"wake_word": {
"name": "Wake word{index}",
"name": "Wake word",
"state": {
"no_wake_word": "No wake word",
"okay_nabu": "Okay Nabu"
}
}
@@ -154,7 +153,7 @@
"description": "To improve Bluetooth reliability and performance, we highly recommend updating {name} with ESPHome {version} or later. When updating the device from ESPHome earlier than 2022.12.0, it is recommended to use a serial cable instead of an over-the-air update to take advantage of the new partition scheme."
},
"api_password_deprecated": {
"title": "API password deprecated on {name}",
"title": "API Password deprecated on {name}",
"description": "The API password for ESPHome is deprecated and the use of an API encryption key is recommended instead.\n\nRemove the API password and add an encryption key to your ESPHome device to resolve this issue."
},
"service_calls_not_allowed": {
@@ -193,10 +192,10 @@
"message": "Error communicating with the device {device_name}: {error}"
},
"error_compiling": {
"message": "Error compiling {configuration}. Try again in ESPHome dashboard for more information."
"message": "Error compiling {configuration}; Try again in ESPHome dashboard for more information."
},
"error_uploading": {
"message": "Error during OTA (Over-The-Air) update of {configuration}. Try again in ESPHome dashboard for more information."
"message": "Error during OTA (Over-The-Air) of {configuration}; Try again in ESPHome dashboard for more information."
},
"ota_in_progress": {
"message": "An OTA (Over-The-Air) update is already in progress for {configuration}."

View File

@@ -14,7 +14,13 @@ from homeassistant.components.climate import (
HVACAction,
HVACMode,
)
from homeassistant.components.modbus import ModbusHub, get_hub
from homeassistant.components.modbus import (
CALL_TYPE_REGISTER_HOLDING,
CALL_TYPE_REGISTER_INPUT,
DEFAULT_HUB,
ModbusHub,
get_hub,
)
from homeassistant.const import (
ATTR_TEMPERATURE,
CONF_NAME,
@@ -27,13 +33,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
# These constants are not offered by modbus, because modbus does not have
# an official API.
CALL_TYPE_REGISTER_HOLDING = "holding"
CALL_TYPE_REGISTER_INPUT = "input"
CALL_TYPE_WRITE_REGISTER = "write_register"
DEFAULT_HUB = "modbus_hub"
CONF_HUB = "hub"
PLATFORM_SCHEMA = CLIMATE_PLATFORM_SCHEMA.extend(

View File

@@ -37,7 +37,6 @@ SENSOR_TYPES: tuple[FlexitSensorEntityDescription, ...] = (
FlexitSensorEntityDescription(
key="outside_air_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
translation_key="outside_air_temperature",
value_fn=lambda data: data.outside_air_temperature,
@@ -45,7 +44,6 @@ SENSOR_TYPES: tuple[FlexitSensorEntityDescription, ...] = (
FlexitSensorEntityDescription(
key="supply_air_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
translation_key="supply_air_temperature",
value_fn=lambda data: data.supply_air_temperature,
@@ -53,7 +51,6 @@ SENSOR_TYPES: tuple[FlexitSensorEntityDescription, ...] = (
FlexitSensorEntityDescription(
key="exhaust_air_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
translation_key="exhaust_air_temperature",
value_fn=lambda data: data.exhaust_air_temperature,
@@ -61,7 +58,6 @@ SENSOR_TYPES: tuple[FlexitSensorEntityDescription, ...] = (
FlexitSensorEntityDescription(
key="extract_air_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
translation_key="extract_air_temperature",
value_fn=lambda data: data.extract_air_temperature,
@@ -69,7 +65,6 @@ SENSOR_TYPES: tuple[FlexitSensorEntityDescription, ...] = (
FlexitSensorEntityDescription(
key="room_temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
translation_key="room_temperature",
value_fn=lambda data: data.room_temperature,

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20250903.5"]
"requirements": ["home-assistant-frontend==20250903.3"]
}

View File

@@ -1,39 +0,0 @@
"""Sensor entities for Geocaching."""
from typing import cast
from geocachingapi.models import GeocachingCache
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import GeocachingDataUpdateCoordinator
# Base class for all platforms
class GeocachingBaseEntity(CoordinatorEntity[GeocachingDataUpdateCoordinator]):
"""Base class for Geocaching sensors."""
_attr_has_entity_name = True
# Base class for cache entities
class GeocachingCacheEntity(GeocachingBaseEntity):
"""Base class for Geocaching cache entities."""
def __init__(
self, coordinator: GeocachingDataUpdateCoordinator, cache: GeocachingCache
) -> None:
"""Initialize the Geocaching cache entity."""
super().__init__(coordinator)
self.cache = cache
# A device can have multiple entities, and for a cache that requires multiple entities we want to group them together.
# Therefore, we create a device for each cache, which holds all related entities.
self._attr_device_info = DeviceInfo(
name=f"Geocache {cache.name}",
identifiers={(DOMAIN, cast(str, cache.reference_code))},
entry_type=DeviceEntryType.SERVICE,
manufacturer=cache.owner.username,
)

View File

@@ -15,24 +15,6 @@
},
"awarded_favorite_points": {
"default": "mdi:heart"
},
"cache_name": {
"default": "mdi:label"
},
"cache_owner": {
"default": "mdi:account"
},
"cache_found_date": {
"default": "mdi:calendar-search"
},
"cache_found": {
"default": "mdi:package-variant-closed-check"
},
"cache_favorite_points": {
"default": "mdi:star-check"
},
"cache_hidden_date": {
"default": "mdi:calendar-badge"
}
}
}

View File

@@ -4,25 +4,18 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import datetime
from typing import cast
from geocachingapi.models import GeocachingCache, GeocachingStatus
from geocachingapi.models import GeocachingStatus
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.components.sensor import SensorEntity, SensorEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import GeocachingConfigEntry, GeocachingDataUpdateCoordinator
from .entity import GeocachingBaseEntity, GeocachingCacheEntity
@dataclass(frozen=True, kw_only=True)
@@ -32,63 +25,43 @@ class GeocachingSensorEntityDescription(SensorEntityDescription):
value_fn: Callable[[GeocachingStatus], str | int | None]
PROFILE_SENSORS: tuple[GeocachingSensorEntityDescription, ...] = (
SENSORS: tuple[GeocachingSensorEntityDescription, ...] = (
GeocachingSensorEntityDescription(
key="find_count",
translation_key="find_count",
native_unit_of_measurement="caches",
value_fn=lambda status: status.user.find_count,
),
GeocachingSensorEntityDescription(
key="hide_count",
translation_key="hide_count",
native_unit_of_measurement="caches",
entity_registry_visible_default=False,
value_fn=lambda status: status.user.hide_count,
),
GeocachingSensorEntityDescription(
key="favorite_points",
translation_key="favorite_points",
native_unit_of_measurement="points",
entity_registry_visible_default=False,
value_fn=lambda status: status.user.favorite_points,
),
GeocachingSensorEntityDescription(
key="souvenir_count",
translation_key="souvenir_count",
native_unit_of_measurement="souvenirs",
value_fn=lambda status: status.user.souvenir_count,
),
GeocachingSensorEntityDescription(
key="awarded_favorite_points",
translation_key="awarded_favorite_points",
native_unit_of_measurement="points",
entity_registry_visible_default=False,
value_fn=lambda status: status.user.awarded_favorite_points,
),
)
@dataclass(frozen=True, kw_only=True)
class GeocachingCacheSensorDescription(SensorEntityDescription):
"""Define Sensor entity description class."""
value_fn: Callable[[GeocachingCache], StateType | datetime.date]
CACHE_SENSORS: tuple[GeocachingCacheSensorDescription, ...] = (
GeocachingCacheSensorDescription(
key="found_date",
device_class=SensorDeviceClass.DATE,
value_fn=lambda cache: cache.found_date_time,
),
GeocachingCacheSensorDescription(
key="favorite_points",
value_fn=lambda cache: cache.favorite_points,
),
GeocachingCacheSensorDescription(
key="hidden_date",
device_class=SensorDeviceClass.DATE,
value_fn=lambda cache: cache.hidden_date,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: GeocachingConfigEntry,
@@ -96,68 +69,14 @@ async def async_setup_entry(
) -> None:
"""Set up a Geocaching sensor entry."""
coordinator = entry.runtime_data
entities: list[Entity] = []
entities.extend(
GeocachingProfileSensor(coordinator, description)
for description in PROFILE_SENSORS
async_add_entities(
GeocachingSensor(coordinator, description) for description in SENSORS
)
status = coordinator.data
# Add entities for tracked caches
entities.extend(
GeoEntityCacheSensorEntity(coordinator, cache, description)
for cache in status.tracked_caches
for description in CACHE_SENSORS
)
async_add_entities(entities)
# Base class for a cache entity.
# Sets the device, unique ID, and translation settings so the entity is grouped under the correct cache device and given the correct name.
class GeoEntityBaseCache(GeocachingCacheEntity, SensorEntity):
"""Base class for cache entities."""
def __init__(
self,
coordinator: GeocachingDataUpdateCoordinator,
cache: GeocachingCache,
key: str,
) -> None:
"""Initialize the Geocaching sensor."""
super().__init__(coordinator, cache)
self._attr_unique_id = f"{cache.reference_code}_{key}"
# The translation key determines the entity's name, as it is the lookup key into the `strings.json` file.
self._attr_translation_key = f"cache_{key}"
class GeoEntityCacheSensorEntity(GeoEntityBaseCache, SensorEntity):
"""Representation of a cache sensor."""
entity_description: GeocachingCacheSensorDescription
def __init__(
self,
coordinator: GeocachingDataUpdateCoordinator,
cache: GeocachingCache,
description: GeocachingCacheSensorDescription,
) -> None:
"""Initialize the Geocaching sensor."""
super().__init__(coordinator, cache, description.key)
self.entity_description = description
@property
def native_value(self) -> StateType | datetime.date:
"""Return the state of the sensor."""
return self.entity_description.value_fn(self.cache)
class GeocachingProfileSensor(GeocachingBaseEntity, SensorEntity):
class GeocachingSensor(
CoordinatorEntity[GeocachingDataUpdateCoordinator], SensorEntity
):
"""Representation of a Sensor."""
entity_description: GeocachingSensorEntityDescription

View File

@@ -33,36 +33,11 @@
},
"entity": {
"sensor": {
"find_count": {
"name": "Total finds",
"unit_of_measurement": "caches"
},
"hide_count": {
"name": "Total hides",
"unit_of_measurement": "caches"
},
"favorite_points": {
"name": "Favorite points",
"unit_of_measurement": "points"
},
"souvenir_count": {
"name": "Total souvenirs",
"unit_of_measurement": "souvenirs"
},
"awarded_favorite_points": {
"name": "Awarded favorite points",
"unit_of_measurement": "points"
},
"cache_found_date": {
"name": "Found date"
},
"cache_favorite_points": {
"name": "Favorite points",
"unit_of_measurement": "points"
},
"cache_hidden_date": {
"name": "Hidden date"
}
"find_count": { "name": "Total finds" },
"hide_count": { "name": "Total hides" },
"favorite_points": { "name": "Favorite points" },
"souvenir_count": { "name": "Total souvenirs" },
"awarded_favorite_points": { "name": "Awarded favorite points" }
}
}
}

View File

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/google_generative_ai_conversation",
"integration_type": "service",
"iot_class": "cloud_polling",
"requirements": ["google-genai==1.38.0"]
"requirements": ["google-genai==1.29.0"]
}

View File

@@ -6,5 +6,5 @@
"dependencies": ["network"],
"documentation": "https://www.home-assistant.io/integrations/govee_light_local",
"iot_class": "local_push",
"requirements": ["govee-local-api==2.2.0"]
"requirements": ["govee-local-api==2.1.0"]
}

View File

@@ -112,14 +112,11 @@ PLACEHOLDER_KEY_ADDON = "addon"
PLACEHOLDER_KEY_ADDON_URL = "addon_url"
PLACEHOLDER_KEY_REFERENCE = "reference"
PLACEHOLDER_KEY_COMPONENTS = "components"
PLACEHOLDER_KEY_FREE_SPACE = "free_space"
ISSUE_KEY_ADDON_BOOT_FAIL = "issue_addon_boot_fail"
ISSUE_KEY_SYSTEM_DOCKER_CONFIG = "issue_system_docker_config"
ISSUE_KEY_ADDON_DETACHED_ADDON_MISSING = "issue_addon_detached_addon_missing"
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED = "issue_addon_detached_addon_removed"
ISSUE_KEY_ADDON_PWNED = "issue_addon_pwned"
ISSUE_KEY_SYSTEM_FREE_SPACE = "issue_system_free_space"
CORE_CONTAINER = "homeassistant"
SUPERVISOR_CONTAINER = "hassio_supervisor"
@@ -140,24 +137,6 @@ KEY_TO_UPDATE_TYPES: dict[str, set[str]] = {
REQUEST_REFRESH_DELAY = 10
HELP_URLS = {
"help_url": "https://www.home-assistant.io/help/",
"community_url": "https://community.home-assistant.io/",
}
EXTRA_PLACEHOLDERS = {
"issue_mount_mount_failed": {
"storage_url": "/config/storage",
},
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED: HELP_URLS,
ISSUE_KEY_SYSTEM_FREE_SPACE: {
"more_info_free_space": "https://www.home-assistant.io/more-info/free-space",
},
ISSUE_KEY_ADDON_PWNED: {
"more_info_pwned": "https://www.home-assistant.io/more-info/pwned-passwords",
},
}
class SupervisorEntityModel(StrEnum):
"""Supervisor entity model."""

View File

@@ -41,21 +41,17 @@ from .const import (
EVENT_SUPERVISOR_EVENT,
EVENT_SUPERVISOR_UPDATE,
EVENT_SUPPORTED_CHANGED,
EXTRA_PLACEHOLDERS,
ISSUE_KEY_ADDON_BOOT_FAIL,
ISSUE_KEY_ADDON_DETACHED_ADDON_MISSING,
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED,
ISSUE_KEY_ADDON_PWNED,
ISSUE_KEY_SYSTEM_DOCKER_CONFIG,
ISSUE_KEY_SYSTEM_FREE_SPACE,
PLACEHOLDER_KEY_ADDON,
PLACEHOLDER_KEY_ADDON_URL,
PLACEHOLDER_KEY_FREE_SPACE,
PLACEHOLDER_KEY_REFERENCE,
REQUEST_REFRESH_DELAY,
UPDATE_KEY_SUPERVISOR,
)
from .coordinator import get_addons_info, get_host_info
from .coordinator import get_addons_info
from .handler import HassIO, get_supervisor_client
ISSUE_KEY_UNHEALTHY = "unhealthy"
@@ -82,8 +78,6 @@ ISSUE_KEYS_FOR_REPAIRS = {
ISSUE_KEY_ADDON_DETACHED_ADDON_MISSING,
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED,
"issue_system_disk_lifetime",
ISSUE_KEY_SYSTEM_FREE_SPACE,
ISSUE_KEY_ADDON_PWNED,
}
_LOGGER = logging.getLogger(__name__)
@@ -247,17 +241,11 @@ class SupervisorIssues:
def add_issue(self, issue: Issue) -> None:
"""Add or update an issue in the list. Create or update a repair if necessary."""
if issue.key in ISSUE_KEYS_FOR_REPAIRS:
placeholders: dict[str, str] = {}
if not issue.suggestions and issue.key in EXTRA_PLACEHOLDERS:
placeholders |= EXTRA_PLACEHOLDERS[issue.key]
placeholders: dict[str, str] | None = None
if issue.reference:
placeholders[PLACEHOLDER_KEY_REFERENCE] = issue.reference
placeholders = {PLACEHOLDER_KEY_REFERENCE: issue.reference}
if issue.key in {
ISSUE_KEY_ADDON_DETACHED_ADDON_MISSING,
ISSUE_KEY_ADDON_PWNED,
}:
if issue.key == ISSUE_KEY_ADDON_DETACHED_ADDON_MISSING:
placeholders[PLACEHOLDER_KEY_ADDON_URL] = (
f"/hassio/addon/{issue.reference}"
)
@@ -269,19 +257,6 @@ class SupervisorIssues:
else:
placeholders[PLACEHOLDER_KEY_ADDON] = issue.reference
elif issue.key == ISSUE_KEY_SYSTEM_FREE_SPACE:
host_info = get_host_info(self._hass)
if (
host_info
and "data" in host_info
and "disk_free" in host_info["data"]
):
placeholders[PLACEHOLDER_KEY_FREE_SPACE] = str(
host_info["data"]["disk_free"]
)
else:
placeholders[PLACEHOLDER_KEY_FREE_SPACE] = "<2"
async_create_issue(
self._hass,
DOMAIN,
@@ -289,7 +264,7 @@ class SupervisorIssues:
is_fixable=bool(issue.suggestions),
severity=IssueSeverity.WARNING,
translation_key=issue.key,
translation_placeholders=placeholders or None,
translation_placeholders=placeholders,
)
self._issues[issue.uuid] = issue
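
Reviewer note: the repair issues assembled above are surfaced through `async_create_issue` from the issue registry helper. A minimal sketch of raising one with translation placeholders (the issue key and placeholder value here are illustrative):

```python
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue


@callback
def report_free_space_issue(hass: HomeAssistant, free_gb: str) -> None:
    """Sketch: raise a repair whose description interpolates {free_space}."""
    async_create_issue(
        hass,
        "hassio",
        "issue_system_free_space",
        is_fixable=False,
        severity=IssueSeverity.WARNING,
        translation_key="issue_system_free_space",
        translation_placeholders={"free_space": free_gb},
    )
```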

View File

@@ -16,10 +16,8 @@ from homeassistant.data_entry_flow import FlowResult
from . import get_addons_info, get_issues_info
from .const import (
EXTRA_PLACEHOLDERS,
ISSUE_KEY_ADDON_BOOT_FAIL,
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED,
ISSUE_KEY_ADDON_PWNED,
ISSUE_KEY_SYSTEM_DOCKER_CONFIG,
PLACEHOLDER_KEY_ADDON,
PLACEHOLDER_KEY_COMPONENTS,
@@ -28,6 +26,11 @@ from .const import (
from .handler import get_supervisor_client
from .issues import Issue, Suggestion
HELP_URLS = {
"help_url": "https://www.home-assistant.io/help/",
"community_url": "https://community.home-assistant.io/",
}
SUGGESTION_CONFIRMATION_REQUIRED = {
"addon_execute_remove",
"system_adopt_data_disk",
@@ -35,6 +38,14 @@ SUGGESTION_CONFIRMATION_REQUIRED = {
}
EXTRA_PLACEHOLDERS = {
"issue_mount_mount_failed": {
"storage_url": "/config/storage",
},
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED: HELP_URLS,
}
class SupervisorIssueRepairFlow(RepairsFlow):
"""Handler for an issue fixing flow."""
@@ -208,7 +219,6 @@ async def async_create_fix_flow(
if issue and issue.key in {
ISSUE_KEY_ADDON_DETACHED_ADDON_REMOVED,
ISSUE_KEY_ADDON_BOOT_FAIL,
ISSUE_KEY_ADDON_PWNED,
}:
return AddonIssueRepairFlow(hass, issue_id)

View File

@@ -37,14 +37,14 @@
},
"issue_addon_detached_addon_missing": {
"title": "Missing repository for an installed add-on",
"description": "Repository for add-on {addon} is missing. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nPlease check the [add-on's documentation]({addon_url}) for installation instructions and add the repository to the store."
"description": "Repository for add-on {addon} is missing. This means it will not get updates, and backups may not be restored correctly as the supervisor may not be able to build/download the resources required.\n\nPlease check the [add-on's documentation]({addon_url}) for installation instructions and add the repository to the store."
},
"issue_addon_detached_addon_removed": {
"title": "Installed add-on has been removed from repository",
"fix_flow": {
"step": {
"addon_execute_remove": {
"description": "Add-on {addon} has been removed from the repository it was installed from. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nSelecting **Submit** will uninstall this deprecated add-on. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
"description": "Add-on {addon} has been removed from the repository it was installed from. This means it will not get updates, and backups may not be restored correctly as the supervisor may not be able to build/download the resources required.\n\nSelecting **Submit** will uninstall this deprecated add-on. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
}
},
"abort": {
@@ -52,10 +52,6 @@
}
}
},
"issue_addon_pwned": {
"title": "Insecure secrets detected in add-on configuration",
"description": "Add-on {addon} uses secrets/passwords in its configuration which are detected as not secure. See [pwned passwords and secrets]({more_info_pwned}) for more information on this issue."
},
"issue_mount_mount_failed": {
"title": "Network storage device failed",
"fix_flow": {
@@ -123,10 +119,6 @@
"title": "Disk lifetime exceeding 90%",
"description": "The data disk has exceeded 90% of its expected lifespan. The disk may soon malfunction which can lead to data loss. You should replace it soon and migrate your data."
},
"issue_system_free_space": {
"title": "Data disk is running low on free space",
"description": "The data disk has only {free_space}GB free space left. This may cause issues with system stability and interfere with functionality such as backups and updates. See [clear up storage]({more_info_free_space}) for tips on how to free up space."
},
"unhealthy": {
"title": "Unhealthy system - {reason}",
"description": "System is currently unhealthy due to {reason}. For troubleshooting information, select Learn more."
@@ -193,7 +185,7 @@
},
"unsupported_docker_version": {
"title": "Unsupported system - Docker version",
"description": "System is unsupported because the Docker version is out of date. For information about the required version and how to fix this, select Learn more."
"description": "System is unsupported because the wrong version of Docker is in use. Use the link to learn the correct version and how to fix this."
},
"unsupported_job_conditions": {
"title": "Unsupported system - Protections disabled",
@@ -209,7 +201,7 @@
},
"unsupported_os": {
"title": "Unsupported system - Operating System",
"description": "System is unsupported because the operating system in use is not tested or maintained for use with Supervisor. For information about supported operating systems and how to fix this, select Learn more."
"description": "System is unsupported because the operating system in use is not tested or maintained for use with Supervisor. Use the link to which operating systems are supported and how to fix this."
},
"unsupported_os_agent": {
"title": "Unsupported system - OS-Agent issues",

View File

@@ -1,23 +0,0 @@
"""Diagnostics support for history_stats."""
from __future__ import annotations
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: ConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
registry = er.async_get(hass)
entities = registry.entities.get_entries_for_config_entry_id(config_entry.entry_id)
return {
"config_entry": config_entry.as_dict(),
"entity": [entity.extended_dict for entity in entities],
}

Some files were not shown because too many files have changed in this diff.