Compare commits

..

9 Commits

Author SHA1 Message Date
J. Nick Koston  8dc9fa7bba  Add async_current_scanners API to Bluetooth integration  2025-09-11 09:50:38 -05:00
J. Nick Koston  7d2cdf8024  preen  2025-09-10 13:41:46 -05:00
J. Nick Koston  d9091c3d52  Merge remote-tracking branch 'upstream/dev' into switchbot_passive_id  2025-09-10 13:39:58 -05:00
J. Nick Koston  1553933c6c  update tests  2025-09-10 13:32:26 -05:00
J. Nick Koston  d21d708efd  update more tests  2025-09-10 13:22:21 -05:00
J. Nick Koston  3827ba64a9  update more tests  2025-09-10 13:20:15 -05:00
J. Nick Koston  1336095e95  update more tests  2025-09-10 13:19:10 -05:00
J. Nick Koston  f75bebb532  dont ask again  2025-09-10 12:57:38 -05:00
J. Nick Koston  bd1862538b  switchbot passive  2025-09-10 12:47:51 -05:00
4196 changed files with 92821 additions and 276881 deletions

View File

@@ -1,77 +0,0 @@
---
name: quality-scale-rule-verifier
description: |
Use this agent when you need to verify that a Home Assistant integration follows a specific quality scale rule. This includes checking if the integration implements required patterns, configurations, or code structures defined by the quality scale system.
<example>
Context: The user wants to verify if an integration follows a specific quality scale rule.
user: "Check if the peblar integration follows the config-flow rule"
assistant: "I'll use the quality scale rule verifier to check if the peblar integration properly implements the config-flow rule."
<commentary>
Since the user is asking to verify a quality scale rule implementation, use the quality-scale-rule-verifier agent.
</commentary>
</example>
<example>
Context: The user is reviewing if an integration reaches a specific quality scale level.
user: "Verify that this integration reaches the bronze quality scale"
assistant: "Let me use the quality scale rule verifier to check the bronze quality scale implementation."
<commentary>
The user wants to verify the integration has reached a certain quality level, so use multiple quality-scale-rule-verifier agents to verify each bronze rule.
</commentary>
</example>
model: inherit
color: yellow
tools: Read, Bash, Grep, Glob, WebFetch
---
You are an expert Home Assistant integration quality scale auditor specializing in verifying compliance with specific quality scale rules. You have deep knowledge of Home Assistant's architecture, best practices, and the quality scale system that ensures integration consistency and reliability.
You will verify if an integration follows a specific quality scale rule by:
1. **Fetching Rule Documentation**: Retrieve the official rule documentation from:
`https://raw.githubusercontent.com/home-assistant/developers.home-assistant/refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md`
where `{rule_name}` is the rule identifier (e.g., 'config-flow', 'entity-unique-id', 'parallel-updates')
2. **Understanding Rule Requirements**: Parse the rule documentation to identify:
- Core requirements and mandatory implementations
- Specific code patterns or configurations required
- Common violations and anti-patterns
- Exemption criteria (when a rule might not apply)
- The quality tier this rule belongs to (Bronze, Silver, Gold, Platinum)
3. **Analyzing Integration Code**: Examine the integration's codebase at `homeassistant/components/<integration domain>` focusing on:
- `manifest.json` for quality scale declaration and configuration
- `quality_scale.yaml` for rule status (done, todo, exempt)
- Relevant Python modules based on the rule requirements
- Configuration files and service definitions as needed
4. **Verification Process**:
- Check if the rule is marked as 'done', 'todo', or 'exempt' in quality_scale.yaml
- If marked 'exempt', verify the exemption reason is valid
- If marked 'done', verify the actual implementation matches requirements
- Identify specific files and code sections that demonstrate compliance or violations
- Consider the integration's declared quality tier when applying rules
- To fetch the integration docs, use WebFetch to fetch from `https://raw.githubusercontent.com/home-assistant/home-assistant.io/refs/heads/current/source/_integrations/<integration domain>.markdown`
- To fetch information about a PyPI package, use the URL `https://pypi.org/pypi/<package>/json`
5. **Reporting Findings**: Provide a comprehensive verification report that includes:
- **Rule Summary**: Brief description of what the rule requires
- **Compliance Status**: Clear pass/fail/exempt determination
- **Evidence**: Specific code examples showing compliance or violations
- **Issues Found**: Detailed list of any non-compliance issues with file locations
- **Recommendations**: Actionable steps to achieve compliance if needed
- **Exemption Analysis**: If applicable, whether the exemption is justified
When examining code, you will:
- Look for exact implementation patterns specified in the rule
- Verify all required components are present and properly configured
- Check for common mistakes and anti-patterns
- Consider edge cases and error handling requirements
- Validate that implementations follow Home Assistant conventions
You will be thorough but focused, examining only the aspects relevant to the specific rule being verified. You will provide clear, actionable feedback that helps developers understand both what needs to be fixed and why it matters for integration quality.
If you cannot access the rule documentation or find the integration code, clearly state what information is missing and what you would need to complete the verification.
Remember that quality scale rules are cumulative - Bronze rules apply to all integrations with a quality scale, Silver rules apply to Silver+ integrations, and so on. Always consider the integration's target quality level when determining which rules should be enforced.
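
For illustration only (this sketch is not part of the deleted agent file), the two lookup endpoints the instructions above reference could be queried from Python roughly as follows; the `requests` dependency and the sample rule/package names are assumptions, not anything the agent prescribes:

```python
"""Hedged sketch: fetch the rule documentation and PyPI metadata mentioned above."""

import requests  # assumed available; any HTTP client works the same way

RULE_DOC_URL = (
    "https://raw.githubusercontent.com/home-assistant/developers.home-assistant/"
    "refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md"
)
PYPI_JSON_URL = "https://pypi.org/pypi/{package}/json"


def fetch_rule_doc(rule_name: str) -> str:
    """Download the quality scale rule documentation as Markdown text."""
    response = requests.get(RULE_DOC_URL.format(rule_name=rule_name), timeout=10)
    response.raise_for_status()
    return response.text


def fetch_pypi_metadata(package: str) -> dict:
    """Fetch the PyPI JSON metadata for a requirement listed in manifest.json."""
    response = requests.get(PYPI_JSON_URL.format(package=package), timeout=10)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # 'config-flow' is a rule id cited above; the package name is illustrative.
    print(fetch_rule_doc("config-flow").splitlines()[0])
    print(fetch_pypi_metadata("aioacaia")["info"]["version"])
```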

View File

@@ -58,7 +58,6 @@ base_platforms: &base_platforms
# Extra components that trigger the full suite
components: &components
- homeassistant/components/alexa/**
- homeassistant/components/analytics/**
- homeassistant/components/application_credentials/**
- homeassistant/components/assist_pipeline/**
- homeassistant/components/auth/**

View File

@@ -33,7 +33,7 @@
"GitHub.vscode-pull-request-github",
"GitHub.copilot"
],
// Please keep this file in sync with settings in home-assistant/.vscode/settings.default.jsonc
// Please keep this file in sync with settings in home-assistant/.vscode/settings.default.json
"settings": {
"python.experiments.optOutFrom": ["pythonTestAdapter"],
"python.defaultInterpreterPath": "/home/vscode/.local/ha-venv/bin/python",
@@ -41,7 +41,6 @@
"python.terminal.activateEnvInCurrentTerminal": true,
"python.testing.pytestArgs": ["--no-cov"],
"pylint.importStrategy": "fromEnvironment",
"python.analysis.typeCheckingMode": "basic",
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,
@@ -63,9 +62,6 @@
"[python]": {
"editor.defaultFormatter": "charliermarsh.ruff"
},
"[json][jsonc][yaml]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"json.schemas": [
{
"fileMatch": ["homeassistant/components/*/manifest.json"],

View File

@@ -55,12 +55,8 @@
creating the PR. If you're unsure about any of them, don't hesitate to ask.
We're here to help! This is simply a reminder of what we are going to look
for before merging your code.
AI tools are welcome, but contributors are responsible for *fully*
understanding the code before submitting a PR.
-->
- [ ] I understand the code I am submitting and can explain how it works.
- [ ] The code change is tested and works locally.
- [ ] Local tests pass. **Your PR cannot be merged unless tests pass**
- [ ] There is no commented out code in this PR.
@@ -68,7 +64,6 @@
- [ ] I have followed the [perfect PR recommendations][perfect-pr]
- [ ] The code has been formatted using Ruff (`ruff format homeassistant tests`)
- [ ] Tests have been added to verify that the new code works.
- [ ] Any generated code has been carefully reviewed for correctness and compliance with project standards.
If user exposed functionality or configuration variables are added/changed:

View File

@@ -74,7 +74,6 @@ rules:
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)
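
As a hedged illustration of the removed lint/type-fix rule (not part of the original file), the names below are hypothetical; the point is only that typing the boundary removes the need for a `# type: ignore` suppression:

```python
from typing import Any


def read_raw_payload() -> Any:
    """Stand-in for an untyped third-party call (hypothetical)."""
    return {"temperature": 21.5}


# Last resort (discouraged): suppress the checker and keep the real problem.
# temperature: float = read_raw_payload()["temperature"]  # type: ignore[assignment]


# Preferred: fix the underlying issue by typing the boundary explicitly.
def read_temperature() -> float:
    """Typed wrapper so MyPy can verify every caller without suppressions."""
    value = read_raw_payload()["temperature"]
    if not isinstance(value, (int, float)):
        raise TypeError(f"unexpected temperature payload: {value!r}")
    return float(value)
```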

View File

@@ -27,12 +27,12 @@ jobs:
publish: ${{ steps.version.outputs.publish }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
with:
fetch-depth: 0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -69,7 +69,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
uses: actions/upload-artifact@v4.6.2
with:
name: translations
path: translations.tar.gz
@@ -90,11 +90,11 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -105,7 +105,7 @@ jobs:
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -116,7 +116,7 @@ jobs:
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -175,7 +175,7 @@ jobs:
sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt
- name: Download translations
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@v5.0.0
with:
name: translations
@@ -190,15 +190,14 @@ jobs:
echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# home-assistant/builder doesn't support sha pinning
- name: Build base image
uses: home-assistant/builder@2025.09.0
uses: home-assistant/builder@2025.03.0
with:
args: |
$BUILD_ARGS \
@@ -243,7 +242,7 @@ jobs:
- green
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set build additional args
run: |
@@ -257,15 +256,14 @@ jobs:
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
# home-assistant/builder doesn't support sha pinning
- name: Build base image
uses: home-assistant/builder@2025.09.0
uses: home-assistant/builder@2025.03.0
with:
args: |
$BUILD_ARGS \
@@ -281,7 +279,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master
@@ -323,23 +321,23 @@ jobs:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
uses: sigstore/cosign-installer@v3.9.2
with:
cosign-release: "v2.2.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.5.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: matrix.registry == 'ghcr.io/home-assistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -456,15 +454,15 @@ jobs:
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@v5.0.0
with:
name: translations
@@ -482,7 +480,7 @@ jobs:
python -m build
- name: Upload package to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
uses: pypa/gh-action-pypi-publish@v1.13.0
with:
skip-existing: true
@@ -504,7 +502,7 @@ jobs:
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}

File diff suppressed because it is too large.

View File

@@ -21,14 +21,14 @@ jobs:
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@4e94bd11f71e507f7f87df81788dff88d1dacbfb # v4.31.0
uses: github/codeql-action/init@v3.30.1
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@4e94bd11f71e507f7f87df81788dff88d1dacbfb # v4.31.0
uses: github/codeql-action/analyze@v3.30.1
with:
category: "/language:python"

View File

@@ -16,7 +16,7 @@ jobs:
steps:
- name: Check if integration label was added and extract details
id: extract
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
with:
script: |
// Debug: Log the event payload
@@ -113,7 +113,7 @@ jobs:
- name: Fetch similar issues
id: fetch_similar
if: steps.extract.outputs.should_continue == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
INTEGRATION_LABELS: ${{ steps.extract.outputs.integration_labels }}
CURRENT_NUMBER: ${{ steps.extract.outputs.current_number }}
@@ -231,7 +231,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
uses: actions/ai-inference@v2.0.1
with:
model: openai/gpt-4o
system-prompt: |
@@ -280,7 +280,7 @@ jobs:
- name: Post duplicate detection results
id: post_results
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
AI_RESPONSE: ${{ steps.ai_detection.outputs.response }}
SIMILAR_ISSUES: ${{ steps.fetch_similar.outputs.similar_issues }}

View File

@@ -16,7 +16,7 @@ jobs:
steps:
- name: Check issue language
id: detect_language
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
ISSUE_NUMBER: ${{ github.event.issue.number }}
ISSUE_TITLE: ${{ github.event.issue.title }}
@@ -57,7 +57,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
uses: actions/ai-inference@v2.0.1
with:
model: openai/gpt-4o-mini
system-prompt: |
@@ -90,7 +90,7 @@ jobs:
- name: Process non-English issues
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
env:
AI_RESPONSE: ${{ steps.ai_language_detection.outputs.response }}
ISSUE_NUMBER: ${{ steps.detect_language.outputs.issue_number }}

View File

@@ -10,7 +10,7 @@ jobs:
if: github.repository_owner == 'home-assistant'
runs-on: ubuntu-latest
steps:
- uses: dessant/lock-threads@1bf7ec25051fe7c00bdd17e6a7cf3d7bfb7dc771 # v5.0.1
- uses: dessant/lock-threads@v5.0.1
with:
github-token: ${{ github.token }}
issue-inactive-days: "30"

View File

@@ -12,7 +12,7 @@ jobs:
if: github.event.issue.type.name == 'Task'
steps:
- name: Check if user is authorized
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v8
with:
script: |
const issueAuthor = context.payload.issue.user.login;

View File

@@ -17,7 +17,7 @@ jobs:
# - No PRs marked as no-stale
# - No issues (-1)
- name: 60 days stale PRs policy
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
# - No issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: 90 days stale issues
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
# - No Issues marked as no-stale or help-wanted
# - No PRs (-1)
- name: Needs more information stale issues policy
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
uses: actions/stale@v10.0.0
with:
repo-token: ${{ steps.token.outputs.token }}
only-labels: "needs-more-information"

View File

@@ -19,10 +19,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}

View File

@@ -31,13 +31,12 @@ jobs:
outputs:
architectures: ${{ steps.info.outputs.architectures }}
steps:
- &checkout
name: Checkout the repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Checkout the repository
uses: actions/checkout@v5.0.0
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v6.0.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
@@ -92,7 +91,7 @@ jobs:
) > build_constraints.txt
- name: Upload env_file
uses: &actions-upload-artifact actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
uses: actions/upload-artifact@v4.6.2
with:
name: env_file
path: ./.env_file
@@ -100,14 +99,14 @@ jobs:
overwrite: true
- name: Upload build_constraints
uses: *actions-upload-artifact
uses: actions/upload-artifact@v4.6.2
with:
name: build_constraints
path: ./build_constraints.txt
overwrite: true
- name: Upload requirements_diff
uses: *actions-upload-artifact
uses: actions/upload-artifact@v4.6.2
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -119,7 +118,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: *actions-upload-artifact
uses: actions/upload-artifact@v4.6.2
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -128,41 +127,28 @@ jobs:
name: Build Core wheels ${{ matrix.abi }} for ${{ matrix.arch }} (musllinux_1_2)
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix: &matrix-build
abi: ["cp313", "cp314"]
matrix:
abi: ["cp313"]
arch: ${{ fromJson(needs.init.outputs.architectures) }}
include:
- os: ubuntu-latest
- arch: aarch64
os: ubuntu-24.04-arm
exclude:
- abi: cp314
arch: armv7
- abi: cp314
arch: armhf
- abi: cp314
arch: i386
steps:
- *checkout
- name: Checkout the repository
uses: actions/checkout@v5.0.0
- &download-env-file
name: Download env_file
uses: &actions-download-artifact actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
- name: Download env_file
uses: actions/download-artifact@v5.0.0
with:
name: env_file
- &download-build-constraints
name: Download build_constraints
uses: *actions-download-artifact
- name: Download build_constraints
uses: actions/download-artifact@v5.0.0
with:
name: build_constraints
- &download-requirements-diff
name: Download requirements_diff
uses: *actions-download-artifact
- name: Download requirements_diff
uses: actions/download-artifact@v5.0.0
with:
name: requirements_diff
@@ -172,9 +158,8 @@ jobs:
sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt
# home-assistant/wheels doesn't support sha pinning
- name: Build wheels
uses: &home-assistant-wheels home-assistant/wheels@2025.10.0
uses: home-assistant/wheels@2025.07.0
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
@@ -191,19 +176,33 @@ jobs:
name: Build wheels ${{ matrix.abi }} for ${{ matrix.arch }}
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix: *matrix-build
matrix:
abi: ["cp313"]
arch: ${{ fromJson(needs.init.outputs.architectures) }}
steps:
- *checkout
- name: Checkout the repository
uses: actions/checkout@v5.0.0
- *download-env-file
- *download-build-constraints
- *download-requirements-diff
- name: Download env_file
uses: actions/download-artifact@v5.0.0
with:
name: env_file
- name: Download build_constraints
uses: actions/download-artifact@v5.0.0
with:
name: build_constraints
- name: Download requirements_diff
uses: actions/download-artifact@v5.0.0
with:
name: requirements_diff
- name: Download requirements_all_wheels
uses: *actions-download-artifact
uses: actions/download-artifact@v5.0.0
with:
name: requirements_all_wheels
@@ -219,9 +218,8 @@ jobs:
sed -i "/uv/d" requirements.txt
sed -i "/uv/d" requirements_diff.txt
# home-assistant/wheels doesn't support sha pinning
- name: Build wheels
uses: *home-assistant-wheels
uses: home-assistant/wheels@2025.07.0
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2

.gitignore (vendored): 5 changed lines
View File

@@ -79,6 +79,7 @@ junit.xml
.project
.pydevproject
.python-version
.tool-versions
# emacs auto backups
@@ -111,7 +112,6 @@ virtualization/vagrant/config
!.vscode/cSpell.json
!.vscode/extensions.json
!.vscode/tasks.json
!.vscode/settings.default.jsonc
.env
# Windows Explorer
@@ -140,6 +140,5 @@ tmp_cache
pytest_buckets.txt
# AI tooling
.claude/settings.local.json
.serena/
.claude

View File

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.13.0
rev: v0.12.1
hooks:
- id: ruff-check
args:
@@ -33,13 +33,10 @@ repos:
rev: v1.37.1
hooks:
- id: yamllint
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.6.2
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.0.3
hooks:
- id: prettier
additional_dependencies:
- prettier@3.6.2
- prettier-plugin-sort-json@4.1.1
- repo: https://github.com/cdce8p/python-typing-update
rev: v0.6.0
hooks:

View File

@@ -1,24 +0,0 @@
/** @type {import("prettier").Config} */
module.exports = {
overrides: [
{
files: "./homeassistant/**/*.json",
options: {
plugins: [require.resolve("prettier-plugin-sort-json")],
jsonRecursiveSort: true,
jsonSortOrder: JSON.stringify({ [/.*/]: "numeric" }),
},
},
{
files: ["manifest.json", "./**/brands/*.json"],
options: {
// domain and name should stay at the top
jsonSortOrder: JSON.stringify({
domain: null,
name: null,
[/.*/]: "numeric",
}),
},
},
],
};

View File

@@ -1 +0,0 @@
3.13

View File

@@ -142,7 +142,6 @@ homeassistant.components.cloud.*
homeassistant.components.co2signal.*
homeassistant.components.comelit.*
homeassistant.components.command_line.*
homeassistant.components.compit.*
homeassistant.components.config.*
homeassistant.components.configurator.*
homeassistant.components.cookidoo.*
@@ -182,6 +181,7 @@ homeassistant.components.efergy.*
homeassistant.components.eheimdigital.*
homeassistant.components.electrasmart.*
homeassistant.components.electric_kiwi.*
homeassistant.components.elevenlabs.*
homeassistant.components.elgato.*
homeassistant.components.elkm1.*
homeassistant.components.emulated_hue.*
@@ -202,7 +202,6 @@ homeassistant.components.feedreader.*
homeassistant.components.file_upload.*
homeassistant.components.filesize.*
homeassistant.components.filter.*
homeassistant.components.firefly_iii.*
homeassistant.components.fitbit.*
homeassistant.components.flexit_bacnet.*
homeassistant.components.flux_led.*
@@ -220,7 +219,6 @@ homeassistant.components.generic_thermostat.*
homeassistant.components.geo_location.*
homeassistant.components.geocaching.*
homeassistant.components.gios.*
homeassistant.components.github.*
homeassistant.components.glances.*
homeassistant.components.go2rtc.*
homeassistant.components.goalzero.*
@@ -278,7 +276,6 @@ homeassistant.components.imap.*
homeassistant.components.imgw_pib.*
homeassistant.components.immich.*
homeassistant.components.incomfort.*
homeassistant.components.inels.*
homeassistant.components.input_button.*
homeassistant.components.input_select.*
homeassistant.components.input_text.*
@@ -327,7 +324,6 @@ homeassistant.components.london_underground.*
homeassistant.components.lookin.*
homeassistant.components.lovelace.*
homeassistant.components.luftdaten.*
homeassistant.components.lunatone.*
homeassistant.components.madvr.*
homeassistant.components.manual.*
homeassistant.components.mastodon.*
@@ -406,7 +402,6 @@ homeassistant.components.person.*
homeassistant.components.pi_hole.*
homeassistant.components.ping.*
homeassistant.components.plugwise.*
homeassistant.components.portainer.*
homeassistant.components.powerfox.*
homeassistant.components.powerwall.*
homeassistant.components.private_ble_device.*
@@ -446,7 +441,6 @@ homeassistant.components.rituals_perfume_genie.*
homeassistant.components.roborock.*
homeassistant.components.roku.*
homeassistant.components.romy.*
homeassistant.components.route_b_smart_meter.*
homeassistant.components.rpi_power.*
homeassistant.components.rss_feed_template.*
homeassistant.components.russound_rio.*
@@ -478,7 +472,6 @@ homeassistant.components.skybell.*
homeassistant.components.slack.*
homeassistant.components.sleep_as_android.*
homeassistant.components.sleepiq.*
homeassistant.components.sma.*
homeassistant.components.smhi.*
homeassistant.components.smlight.*
homeassistant.components.smtp.*
@@ -557,7 +550,6 @@ homeassistant.components.vacuum.*
homeassistant.components.vallox.*
homeassistant.components.valve.*
homeassistant.components.velbus.*
homeassistant.components.vivotek.*
homeassistant.components.vlc_telnet.*
homeassistant.components.vodafone_station.*
homeassistant.components.volvo.*

View File

@@ -7,19 +7,13 @@
"python.testing.pytestEnabled": false,
// https://code.visualstudio.com/docs/python/linting#_general-settings
"pylint.importStrategy": "fromEnvironment",
// Pyright is too pedantic for Home Assistant
"python.analysis.typeCheckingMode": "basic",
"[python]": {
"editor.defaultFormatter": "charliermarsh.ruff",
},
"[json][jsonc][yaml]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
},
"json.schemas": [
{
"fileMatch": ["homeassistant/components/*/manifest.json"],
// This value differs between working with devcontainer and locally, therefore this value should NOT be in sync!
"url": "./script/json_schemas/manifest_schema.json",
},
],
{
"fileMatch": [
"homeassistant/components/*/manifest.json"
],
// This value differs between working with devcontainer and locally, therefor this value should NOT be in sync!
"url": "./script/json_schemas/manifest_schema.json"
}
]
}

View File

@@ -1 +0,0 @@
.github/copilot-instructions.md

CODEOWNERS (generated): 69 changed lines
View File

@@ -46,8 +46,6 @@ build.json @home-assistant/supervisor
/tests/components/accuweather/ @bieniu
/homeassistant/components/acmeda/ @atmurray
/tests/components/acmeda/ @atmurray
/homeassistant/components/actron_air/ @kclif9 @JagadishDhanamjayam
/tests/components/actron_air/ @kclif9 @JagadishDhanamjayam
/homeassistant/components/adax/ @danielhiversen @lazytarget
/tests/components/adax/ @danielhiversen @lazytarget
/homeassistant/components/adguard/ @frenck
@@ -109,8 +107,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/ambient_station/ @bachya
/tests/components/ambient_station/ @bachya
/homeassistant/components/amcrest/ @flacjacket
/homeassistant/components/analytics/ @home-assistant/core
/tests/components/analytics/ @home-assistant/core
/homeassistant/components/analytics/ @home-assistant/core @ludeeus
/tests/components/analytics/ @home-assistant/core @ludeeus
/homeassistant/components/analytics_insights/ @joostlek
/tests/components/analytics_insights/ @joostlek
/homeassistant/components/android_ip_webcam/ @engrbm87
@@ -294,8 +292,6 @@ build.json @home-assistant/supervisor
/tests/components/command_line/ @gjohansson-ST
/homeassistant/components/compensation/ @Petro31
/tests/components/compensation/ @Petro31
/homeassistant/components/compit/ @Przemko92
/tests/components/compit/ @Przemko92
/homeassistant/components/config/ @home-assistant/core
/tests/components/config/ @home-assistant/core
/homeassistant/components/configurator/ @home-assistant/core
@@ -318,8 +314,6 @@ build.json @home-assistant/supervisor
/tests/components/crownstone/ @Crownstone @RicArch97
/homeassistant/components/cups/ @fabaff
/tests/components/cups/ @fabaff
/homeassistant/components/cync/ @Kinachi249
/tests/components/cync/ @Kinachi249
/homeassistant/components/daikin/ @fredrike
/tests/components/daikin/ @fredrike
/homeassistant/components/date/ @home-assistant/core
@@ -414,8 +408,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/egardia/ @jeroenterheerdt
/homeassistant/components/eheimdigital/ @autinerd
/tests/components/eheimdigital/ @autinerd
/homeassistant/components/ekeybionyx/ @richardpolzer
/tests/components/ekeybionyx/ @richardpolzer
/homeassistant/components/electrasmart/ @jafar-atili
/tests/components/electrasmart/ @jafar-atili
/homeassistant/components/electric_kiwi/ @mikey0000
@@ -450,6 +442,8 @@ build.json @home-assistant/supervisor
/tests/components/energyzero/ @klaasnicolaas
/homeassistant/components/enigma2/ @autinerd
/tests/components/enigma2/ @autinerd
/homeassistant/components/enocean/ @bdurrer
/tests/components/enocean/ @bdurrer
/homeassistant/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
/tests/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
/homeassistant/components/entur_public_transport/ @hfurubotten
@@ -494,10 +488,6 @@ build.json @home-assistant/supervisor
/tests/components/filesize/ @gjohansson-ST
/homeassistant/components/filter/ @dgomes
/tests/components/filter/ @dgomes
/homeassistant/components/fing/ @Lorenzo-Gasparini
/tests/components/fing/ @Lorenzo-Gasparini
/homeassistant/components/firefly_iii/ @erwindouna
/tests/components/firefly_iii/ @erwindouna
/homeassistant/components/fireservicerota/ @cyberjunky
/tests/components/fireservicerota/ @cyberjunky
/homeassistant/components/firmata/ @DaAwesomeP
@@ -621,8 +611,6 @@ build.json @home-assistant/supervisor
/tests/components/greeneye_monitor/ @jkeljo
/homeassistant/components/group/ @home-assistant/core
/tests/components/group/ @home-assistant/core
/homeassistant/components/growatt_server/ @johanzander
/tests/components/growatt_server/ @johanzander
/homeassistant/components/guardian/ @bachya
/tests/components/guardian/ @bachya
/homeassistant/components/habitica/ @tr4nt0r
@@ -743,8 +731,6 @@ build.json @home-assistant/supervisor
/tests/components/improv_ble/ @emontnemery
/homeassistant/components/incomfort/ @jbouwh
/tests/components/incomfort/ @jbouwh
/homeassistant/components/inels/ @epdevlab
/tests/components/inels/ @epdevlab
/homeassistant/components/influxdb/ @mdegat01
/tests/components/influxdb/ @mdegat01
/homeassistant/components/inkbird/ @bdraco
@@ -770,8 +756,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
/tests/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
/homeassistant/components/intesishome/ @jnimmo
/homeassistant/components/iometer/ @jukrebs
/tests/components/iometer/ @jukrebs
/homeassistant/components/iometer/ @MaestroOnICe
/tests/components/iometer/ @MaestroOnICe
/homeassistant/components/ios/ @robbiet480
/tests/components/ios/ @robbiet480
/homeassistant/components/iotawatt/ @gtdiehl @jyavenard
@@ -786,8 +772,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/iqvia/ @bachya
/tests/components/iqvia/ @bachya
/homeassistant/components/irish_rail_transport/ @ttroy50
/homeassistant/components/irm_kmi/ @jdejaegh
/tests/components/irm_kmi/ @jdejaegh
/homeassistant/components/iron_os/ @tr4nt0r
/tests/components/iron_os/ @tr4nt0r
/homeassistant/components/isal/ @bdraco
@@ -918,8 +902,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/luci/ @mzdrale
/homeassistant/components/luftdaten/ @fabaff @frenck
/tests/components/luftdaten/ @fabaff @frenck
/homeassistant/components/lunatone/ @MoonDevLT
/tests/components/lunatone/ @MoonDevLT
/homeassistant/components/lupusec/ @majuss @suaveolent
/tests/components/lupusec/ @majuss @suaveolent
/homeassistant/components/lutron/ @cdheiser @wilburCForce
@@ -965,8 +947,6 @@ build.json @home-assistant/supervisor
/tests/components/met_eireann/ @DylanGore
/homeassistant/components/meteo_france/ @hacf-fr @oncleben31 @Quentame
/tests/components/meteo_france/ @hacf-fr @oncleben31 @Quentame
/homeassistant/components/meteo_lt/ @xE1H
/tests/components/meteo_lt/ @xE1H
/homeassistant/components/meteoalarm/ @rolfberkenbosch
/homeassistant/components/meteoclimatic/ @adrianmo
/tests/components/meteoclimatic/ @adrianmo
@@ -1037,8 +1017,7 @@ build.json @home-assistant/supervisor
/tests/components/nanoleaf/ @milanmeu @joostlek
/homeassistant/components/nasweb/ @nasWebio
/tests/components/nasweb/ @nasWebio
/homeassistant/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
/tests/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
/homeassistant/components/nederlandse_spoorwegen/ @YarmoM
/homeassistant/components/ness_alarm/ @nickw444
/tests/components/ness_alarm/ @nickw444
/homeassistant/components/nest/ @allenporter
@@ -1073,8 +1052,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/nilu/ @hfurubotten
/homeassistant/components/nina/ @DeerMaximum
/tests/components/nina/ @DeerMaximum
/homeassistant/components/nintendo_parental_controls/ @pantherale0
/tests/components/nintendo_parental_controls/ @pantherale0
/homeassistant/components/nissan_leaf/ @filcole
/homeassistant/components/noaa_tides/ @jdelaney72
/homeassistant/components/nobo_hub/ @echoromeo @oyvindwe
@@ -1143,8 +1120,6 @@ build.json @home-assistant/supervisor
/tests/components/opengarage/ @danielhiversen
/homeassistant/components/openhome/ @bazwilliams
/tests/components/openhome/ @bazwilliams
/homeassistant/components/openrgb/ @felipecrs
/tests/components/openrgb/ @felipecrs
/homeassistant/components/opensky/ @joostlek
/tests/components/opensky/ @joostlek
/homeassistant/components/opentherm_gw/ @mvn23
@@ -1208,14 +1183,14 @@ build.json @home-assistant/supervisor
/tests/components/plex/ @jjlawren
/homeassistant/components/plugwise/ @CoMPaTech @bouwew
/tests/components/plugwise/ @CoMPaTech @bouwew
/homeassistant/components/plum_lightpad/ @ColinHarrington @prystupa
/tests/components/plum_lightpad/ @ColinHarrington @prystupa
/homeassistant/components/point/ @fredrike
/tests/components/point/ @fredrike
/homeassistant/components/pooldose/ @lmaertin
/tests/components/pooldose/ @lmaertin
/homeassistant/components/poolsense/ @haemishkyd
/tests/components/poolsense/ @haemishkyd
/homeassistant/components/portainer/ @erwindouna
/tests/components/portainer/ @erwindouna
/homeassistant/components/powerfox/ @klaasnicolaas
/tests/components/powerfox/ @klaasnicolaas
/homeassistant/components/powerwall/ @bdraco @jrester @daniel-simpson
@@ -1350,8 +1325,6 @@ build.json @home-assistant/supervisor
/tests/components/roomba/ @pschmitt @cyr-ius @shenxn @Orhideous
/homeassistant/components/roon/ @pavoni
/tests/components/roon/ @pavoni
/homeassistant/components/route_b_smart_meter/ @SeraphicRav
/tests/components/route_b_smart_meter/ @SeraphicRav
/homeassistant/components/rpi_power/ @shenxn @swetoast
/tests/components/rpi_power/ @shenxn @swetoast
/homeassistant/components/rss_feed_template/ @home-assistant/core
@@ -1374,8 +1347,6 @@ build.json @home-assistant/supervisor
/tests/components/samsungtv/ @chemelli74 @epenet
/homeassistant/components/sanix/ @tomaszsluszniak
/tests/components/sanix/ @tomaszsluszniak
/homeassistant/components/satel_integra/ @Tommatheussen
/tests/components/satel_integra/ @Tommatheussen
/homeassistant/components/scene/ @home-assistant/core
/tests/components/scene/ @home-assistant/core
/homeassistant/components/schedule/ @home-assistant/core
@@ -1423,8 +1394,8 @@ build.json @home-assistant/supervisor
/tests/components/sfr_box/ @epenet
/homeassistant/components/sftp_storage/ @maretodoric
/tests/components/sftp_storage/ @maretodoric
/homeassistant/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre
/tests/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre
/homeassistant/components/sharkiq/ @JeffResc @funkybunch
/tests/components/sharkiq/ @JeffResc @funkybunch
/homeassistant/components/shell_command/ @home-assistant/core
/tests/components/shell_command/ @home-assistant/core
/homeassistant/components/shelly/ @bieniu @thecode @chemelli74 @bdraco
@@ -1489,8 +1460,8 @@ build.json @home-assistant/supervisor
/tests/components/snoo/ @Lash-L
/homeassistant/components/snooz/ @AustinBrunkhorst
/tests/components/snooz/ @AustinBrunkhorst
/homeassistant/components/solaredge/ @frenck @bdraco @tronikos
/tests/components/solaredge/ @frenck @bdraco @tronikos
/homeassistant/components/solaredge/ @frenck @bdraco
/tests/components/solaredge/ @frenck @bdraco
/homeassistant/components/solaredge_local/ @drobtravels @scheric
/homeassistant/components/solarlog/ @Ernst79 @dontinelli
/tests/components/solarlog/ @Ernst79 @dontinelli
@@ -1543,8 +1514,6 @@ build.json @home-assistant/supervisor
/tests/components/suez_water/ @ooii @jb101010-2
/homeassistant/components/sun/ @home-assistant/core
/tests/components/sun/ @home-assistant/core
/homeassistant/components/sunricher_dali_center/ @niracler
/tests/components/sunricher_dali_center/ @niracler
/homeassistant/components/supla/ @mwegrzynek
/homeassistant/components/surepetcare/ @benleb @danielhiversen
/tests/components/surepetcare/ @benleb @danielhiversen
@@ -1559,8 +1528,8 @@ build.json @home-assistant/supervisor
/tests/components/switchbee/ @jafar-atili
/homeassistant/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
/tests/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang
/homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git
/tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git
/homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur
/homeassistant/components/switcher_kis/ @thecode @YogevBokobza
/tests/components/switcher_kis/ @thecode @YogevBokobza
/homeassistant/components/switchmate/ @danielhiversen @qiz-li
@@ -1705,8 +1674,6 @@ build.json @home-assistant/supervisor
/tests/components/uptime_kuma/ @tr4nt0r
/homeassistant/components/uptimerobot/ @ludeeus @chemelli74
/tests/components/uptimerobot/ @ludeeus @chemelli74
/homeassistant/components/usage_prediction/ @home-assistant/core
/tests/components/usage_prediction/ @home-assistant/core
/homeassistant/components/usb/ @bdraco
/tests/components/usb/ @bdraco
/homeassistant/components/usgs_earthquakes_feed/ @exxamalte
@@ -1736,8 +1703,6 @@ build.json @home-assistant/supervisor
/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven
/homeassistant/components/vicare/ @CFenner
/tests/components/vicare/ @CFenner
/homeassistant/components/victron_remote_monitoring/ @AndyTempel
/tests/components/victron_remote_monitoring/ @AndyTempel
/homeassistant/components/vilfo/ @ManneW
/tests/components/vilfo/ @ManneW
/homeassistant/components/vivotek/ @HarlemSquirrel
@@ -1753,8 +1718,8 @@ build.json @home-assistant/supervisor
/tests/components/volumio/ @OnFreund
/homeassistant/components/volvo/ @thomasddn
/tests/components/volvo/ @thomasddn
/homeassistant/components/volvooncall/ @molobrakos @svrooij
/tests/components/volvooncall/ @molobrakos @svrooij
/homeassistant/components/volvooncall/ @molobrakos
/tests/components/volvooncall/ @molobrakos
/homeassistant/components/wake_on_lan/ @ntilley905
/tests/components/wake_on_lan/ @ntilley905
/homeassistant/components/wake_word/ @home-assistant/core @synesthesiam

Dockerfile (generated): 4 changed lines
View File

@@ -25,13 +25,13 @@ RUN \
"armv7") go2rtc_suffix='arm' ;; \
*) go2rtc_suffix=${BUILD_ARCH} ;; \
esac \
&& curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.11/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \
&& curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.9/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \
&& chmod +x /bin/go2rtc \
# Verify go2rtc can be executed
&& go2rtc --version
# Install uv
RUN pip3 install uv==0.9.5
RUN pip3 install uv==0.8.9
WORKDIR /usr/src

View File

@@ -34,11 +34,9 @@ WORKDIR /usr/src
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
RUN uv python install 3.13.2
USER vscode
COPY .python-version ./
RUN uv python install
ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv"
RUN uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

View File

@@ -1,10 +1,13 @@
image: ghcr.io/home-assistant/{arch}-homeassistant
build_from:
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.10.1
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.10.1
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.10.1
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.10.1
i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.10.1
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.09.1
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.09.1
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.09.1
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.09.1
i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.09.1
codenotary:
signer: notary@home-assistant.io
base_image: notary@home-assistant.io
cosign:
base_identity: https://github.com/home-assistant/docker/.*
identity: https://github.com/home-assistant/core/.*

View File

@@ -34,9 +34,6 @@ INPUT_FIELD_CODE = "code"
DUMMY_SECRET = "FPPTH34D4E3MI2HG"
GOOGLE_AUTHENTICATOR_URL = "https://support.google.com/accounts/answer/1066447"
AUTHY_URL = "https://authy.com/"
def _generate_qr_code(data: str) -> str:
"""Generate a base64 PNG string represent QR Code image of data."""
@@ -232,8 +229,6 @@ class TotpSetupFlow(SetupFlow[TotpAuthModule]):
"code": self._ota_secret,
"url": self._url,
"qr_code": self._image,
"google_authenticator_url": GOOGLE_AUTHENTICATOR_URL,
"authy_url": AUTHY_URL,
},
errors=errors,
)

View File

@@ -616,34 +616,34 @@ async def async_enable_logging(
),
)
logger = logging.getLogger()
logger.setLevel(logging.INFO if verbose else logging.WARNING)
# Log errors to a file if we have write access to file or config dir
if log_file is None:
default_log_path = hass.config.path(ERROR_LOG_FILENAME)
if "SUPERVISOR" in os.environ:
_LOGGER.info("Running in Supervisor, not logging to file")
# Rename the default log file if it exists, since previous versions created
# it even on Supervisor
if os.path.isfile(default_log_path):
with contextlib.suppress(OSError):
os.rename(default_log_path, f"{default_log_path}.old")
err_log_path = None
else:
err_log_path = default_log_path
err_log_path = hass.config.path(ERROR_LOG_FILENAME)
else:
err_log_path = os.path.abspath(log_file)
if err_log_path:
err_path_exists = os.path.isfile(err_log_path)
err_dir = os.path.dirname(err_log_path)
# Check if we can write to the error log if it exists or that
# we can create files in the containing directory if not.
if (err_path_exists and os.access(err_log_path, os.W_OK)) or (
not err_path_exists and os.access(err_dir, os.W_OK)
):
err_handler = await hass.async_add_executor_job(
_create_log_file, err_log_path, log_rotate_days
)
err_handler.setFormatter(logging.Formatter(fmt, datefmt=FORMAT_DATETIME))
logger = logging.getLogger()
logger.addHandler(err_handler)
logger.setLevel(logging.INFO if verbose else logging.WARNING)
# Save the log file location for access by other components.
hass.data[DATA_LOGGING] = err_log_path
else:
_LOGGER.error("Unable to set up error log %s (access denied)", err_log_path)
async_activate_log_queue_handler(hass)

View File

@@ -1,5 +0,0 @@
{
"domain": "eltako",
"name": "Eltako",
"iot_standards": ["matter"]
}

View File

@@ -6,6 +6,7 @@
"google_assistant_sdk",
"google_cloud",
"google_drive",
"google_gemini",
"google_generative_ai_conversation",
"google_mail",
"google_maps",

View File

@@ -0,0 +1,5 @@
{
"domain": "ibm",
"name": "IBM",
"integrations": ["watson_iot", "watson_tts"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "konnected",
"name": "Konnected",
"integrations": ["konnected", "konnected_esphome"]
}

View File

@@ -1,5 +0,0 @@
{
"domain": "level",
"name": "Level",
"iot_standards": ["matter"]
}

View File

@@ -1,70 +1,70 @@
{
"config": {
"abort": {
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"invalid_mfa_code": "Invalid MFA code"
},
"step": {
"user": {
"title": "Fill in your Abode login information",
"data": {
"username": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]"
}
},
"mfa": {
"title": "Enter your MFA code for Abode",
"data": {
"mfa_code": "MFA code (6-digits)"
},
"title": "Enter your MFA code for Abode"
}
},
"reauth_confirm": {
"title": "[%key:component::abode::config::step::user::title%]",
"data": {
"password": "[%key:common::config_flow::data::password%]",
"username": "[%key:common::config_flow::data::email%]"
},
"title": "[%key:component::abode::config::step::user::title%]"
},
"user": {
"data": {
"password": "[%key:common::config_flow::data::password%]",
"username": "[%key:common::config_flow::data::email%]"
},
"title": "Fill in your Abode login information"
"username": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]"
}
}
},
"error": {
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_mfa_code": "Invalid MFA code"
},
"abort": {
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
}
},
"services": {
"capture_image": {
"name": "Capture image",
"description": "Requests a new image capture from a camera device.",
"fields": {
"entity_id": {
"description": "Entity ID of the camera to request an image from.",
"name": "Entity"
"name": "Entity",
"description": "Entity ID of the camera to request an image from."
}
},
"name": "Capture image"
}
},
"change_setting": {
"name": "Change setting",
"description": "Changes an Abode system setting.",
"fields": {
"setting": {
"description": "Setting to change.",
"name": "Setting"
"name": "Setting",
"description": "Setting to change."
},
"value": {
"description": "Value of the setting.",
"name": "Value"
"name": "Value",
"description": "Value of the setting."
}
},
"name": "Change setting"
}
},
"trigger_automation": {
"name": "Trigger automation",
"description": "Triggers an Abode automation.",
"fields": {
"entity_id": {
"description": "Entity ID of the automation to trigger.",
"name": "Entity"
"name": "Entity",
"description": "Entity ID of the automation to trigger."
}
},
"name": "Trigger automation"
}
}
}
}

View File

@@ -8,17 +8,14 @@ import logging
from aioacaia.acaiascale import AcaiaScale
from aioacaia.exceptions import AcaiaDeviceNotFound, AcaiaError
from homeassistant.components.bluetooth import async_get_scanner
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ADDRESS
from homeassistant.core import HomeAssistant
from homeassistant.helpers.debounce import Debouncer
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_IS_NEW_STYLE_SCALE
SCAN_INTERVAL = timedelta(seconds=15)
UPDATE_DEBOUNCE_TIME = 0.2
_LOGGER = logging.getLogger(__name__)
@@ -40,20 +37,11 @@ class AcaiaCoordinator(DataUpdateCoordinator[None]):
config_entry=entry,
)
debouncer = Debouncer(
hass=hass,
logger=_LOGGER,
cooldown=UPDATE_DEBOUNCE_TIME,
immediate=True,
function=self.async_update_listeners,
)
self._scale = AcaiaScale(
address_or_ble_device=entry.data[CONF_ADDRESS],
name=entry.title,
is_new_style_scale=entry.data[CONF_IS_NEW_STYLE_SCALE],
notify_callback=debouncer.async_schedule_call,
scanner=async_get_scanner(hass),
notify_callback=self.async_update_listeners,
)
@property

View File

@@ -4,20 +4,20 @@
"timer_running": {
"default": "mdi:timer",
"state": {
"off": "mdi:timer-off",
"on": "mdi:timer-play"
"on": "mdi:timer-play",
"off": "mdi:timer-off"
}
}
},
"button": {
"tare": {
"default": "mdi:scale-balance"
},
"reset_timer": {
"default": "mdi:timer-refresh"
},
"start_stop": {
"default": "mdi:timer-play"
},
"tare": {
"default": "mdi:scale-balance"
}
}
}

View File

@@ -26,5 +26,5 @@
"iot_class": "local_push",
"loggers": ["aioacaia"],
"quality_scale": "platinum",
"requirements": ["aioacaia==0.1.17"]
"requirements": ["aioacaia==0.1.14"]
}

View File

@@ -1,5 +1,6 @@
{
"config": {
"flow_title": "{name}",
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]",
@@ -9,19 +10,18 @@
"device_not_found": "Device could not be found.",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "{name}",
"step": {
"bluetooth_confirm": {
"description": "[%key:component::bluetooth::config::step::bluetooth_confirm::description%]"
},
"user": {
"description": "[%key:component::bluetooth::config::step::user::description%]",
"data": {
"address": "[%key:common::config_flow::data::device%]"
},
"data_description": {
"address": "Select Acaia scale you want to set up"
},
"description": "[%key:component::bluetooth::config::step::user::description%]"
}
}
}
},
@@ -32,14 +32,14 @@
}
},
"button": {
"tare": {
"name": "Tare"
},
"reset_timer": {
"name": "Reset timer"
},
"start_stop": {
"name": "Start/stop timer"
},
"tare": {
"name": "Tare"
}
}
}

View File

@@ -2,23 +2,21 @@
from __future__ import annotations
import asyncio
import logging
from accuweather import AccuWeather
from homeassistant.components.sensor import DOMAIN as SENSOR_PLATFORM
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.const import CONF_API_KEY, CONF_NAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN
from .const import DOMAIN, UPDATE_INTERVAL_DAILY_FORECAST, UPDATE_INTERVAL_OBSERVATION
from .coordinator import (
AccuWeatherConfigEntry,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherData,
AccuWeatherHourlyForecastDataUpdateCoordinator,
AccuWeatherObservationDataUpdateCoordinator,
)
@@ -30,6 +28,7 @@ PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry) -> bool:
"""Set up AccuWeather as config entry."""
api_key: str = entry.data[CONF_API_KEY]
name: str = entry.data[CONF_NAME]
location_key = entry.unique_id
@@ -42,28 +41,26 @@ async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry)
hass,
entry,
accuweather,
name,
"observation",
UPDATE_INTERVAL_OBSERVATION,
)
coordinator_daily_forecast = AccuWeatherDailyForecastDataUpdateCoordinator(
hass,
entry,
accuweather,
)
coordinator_hourly_forecast = AccuWeatherHourlyForecastDataUpdateCoordinator(
hass,
entry,
accuweather,
name,
"daily forecast",
UPDATE_INTERVAL_DAILY_FORECAST,
)
await asyncio.gather(
coordinator_observation.async_config_entry_first_refresh(),
coordinator_daily_forecast.async_config_entry_first_refresh(),
coordinator_hourly_forecast.async_config_entry_first_refresh(),
)
await coordinator_observation.async_config_entry_first_refresh()
await coordinator_daily_forecast.async_config_entry_first_refresh()
entry.runtime_data = AccuWeatherData(
coordinator_observation=coordinator_observation,
coordinator_daily_forecast=coordinator_daily_forecast,
coordinator_hourly_forecast=coordinator_hourly_forecast,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
from asyncio import timeout
from collections.abc import Mapping
from typing import Any
from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExceededError
@@ -23,8 +22,6 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
"""Config flow for AccuWeather."""
VERSION = 1
_latitude: float | None = None
_longitude: float | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
@@ -53,7 +50,6 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(
accuweather.location_key, raise_on_progress=False
)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=user_input[CONF_NAME], data=user_input
@@ -77,46 +73,3 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
),
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle configuration by re-auth."""
self._latitude = entry_data[CONF_LATITUDE]
self._longitude = entry_data[CONF_LONGITUDE]
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
errors: dict[str, str] = {}
if user_input is not None:
websession = async_get_clientsession(self.hass)
try:
async with timeout(10):
accuweather = AccuWeather(
user_input[CONF_API_KEY],
websession,
latitude=self._latitude,
longitude=self._longitude,
)
await accuweather.async_get_location()
except (ApiError, ClientConnectorError, TimeoutError, ClientError):
errors["base"] = "cannot_connect"
except InvalidApiKeyError:
errors["base"] = "invalid_api_key"
except RequestsExceededError:
errors["base"] = "requests_exceeded"
else:
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data_updates=user_input
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema({vol.Required(CONF_API_KEY): str}),
errors=errors,
)

View File

@@ -69,6 +69,5 @@ POLLEN_CATEGORY_MAP = {
4: "very_high",
5: "extreme",
}
UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=10)
UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=40)
UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6)
UPDATE_INTERVAL_HOURLY_FORECAST = timedelta(minutes=30)

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
from asyncio import timeout
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from datetime import timedelta
import logging
@@ -13,9 +12,7 @@ from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExcee
from aiohttp.client_exceptions import ClientConnectorError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import (
DataUpdateCoordinator,
@@ -23,15 +20,9 @@ from homeassistant.helpers.update_coordinator import (
UpdateFailed,
)
from .const import (
DOMAIN,
MANUFACTURER,
UPDATE_INTERVAL_DAILY_FORECAST,
UPDATE_INTERVAL_HOURLY_FORECAST,
UPDATE_INTERVAL_OBSERVATION,
)
from .const import DOMAIN, MANUFACTURER
EXCEPTIONS = (ApiError, ClientConnectorError, RequestsExceededError)
EXCEPTIONS = (ApiError, ClientConnectorError, InvalidApiKeyError, RequestsExceededError)
_LOGGER = logging.getLogger(__name__)
@@ -42,7 +33,6 @@ class AccuWeatherData:
coordinator_observation: AccuWeatherObservationDataUpdateCoordinator
coordinator_daily_forecast: AccuWeatherDailyForecastDataUpdateCoordinator
coordinator_hourly_forecast: AccuWeatherHourlyForecastDataUpdateCoordinator
type AccuWeatherConfigEntry = ConfigEntry[AccuWeatherData]
@@ -53,18 +43,18 @@ class AccuWeatherObservationDataUpdateCoordinator(
):
"""Class to manage fetching AccuWeather data API."""
config_entry: AccuWeatherConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
name: str,
coordinator_type: str,
update_interval: timedelta,
) -> None:
"""Initialize."""
self.accuweather = accuweather
self.location_key = accuweather.location_key
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -75,8 +65,8 @@ class AccuWeatherObservationDataUpdateCoordinator(
hass,
_LOGGER,
config_entry=config_entry,
name=f"{name} (observation)",
update_interval=UPDATE_INTERVAL_OBSERVATION,
name=f"{name} ({coordinator_type})",
update_interval=update_interval,
)
async def _async_update_data(self) -> dict[str, Any]:
@@ -90,39 +80,29 @@ class AccuWeatherObservationDataUpdateCoordinator(
translation_key="current_conditions_update_error",
translation_placeholders={"error": repr(error)},
) from error
except InvalidApiKeyError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="auth_error",
translation_placeholders={"entry": self.config_entry.title},
) from err
_LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)
return result
class AccuWeatherForecastDataUpdateCoordinator(
class AccuWeatherDailyForecastDataUpdateCoordinator(
TimestampDataUpdateCoordinator[list[dict[str, Any]]]
):
"""Base class for AccuWeather forecast."""
config_entry: AccuWeatherConfigEntry
"""Class to manage fetching AccuWeather data API."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
name: str,
coordinator_type: str,
update_interval: timedelta,
fetch_method: Callable[..., Awaitable[list[dict[str, Any]]]],
) -> None:
"""Initialize."""
self.accuweather = accuweather
self.location_key = accuweather.location_key
self._fetch_method = fetch_method
name = config_entry.data[CONF_NAME]
if TYPE_CHECKING:
assert self.location_key is not None
@@ -138,71 +118,24 @@ class AccuWeatherForecastDataUpdateCoordinator(
)
async def _async_update_data(self) -> list[dict[str, Any]]:
"""Update forecast data via library."""
"""Update data via library."""
try:
async with timeout(10):
result = await self._fetch_method(language=self.hass.config.language)
result = await self.accuweather.async_get_daily_forecast(
language=self.hass.config.language
)
except EXCEPTIONS as error:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="forecast_update_error",
translation_placeholders={"error": repr(error)},
) from error
except InvalidApiKeyError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="auth_error",
translation_placeholders={"entry": self.config_entry.title},
) from err
_LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)
return result
class AccuWeatherDailyForecastDataUpdateCoordinator(
AccuWeatherForecastDataUpdateCoordinator
):
"""Coordinator for daily forecast."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
) -> None:
"""Initialize."""
super().__init__(
hass,
config_entry,
accuweather,
"daily forecast",
UPDATE_INTERVAL_DAILY_FORECAST,
fetch_method=accuweather.async_get_daily_forecast,
)
class AccuWeatherHourlyForecastDataUpdateCoordinator(
AccuWeatherForecastDataUpdateCoordinator
):
"""Coordinator for hourly forecast."""
def __init__(
self,
hass: HomeAssistant,
config_entry: AccuWeatherConfigEntry,
accuweather: AccuWeather,
) -> None:
"""Initialize."""
super().__init__(
hass,
config_entry,
accuweather,
"hourly forecast",
UPDATE_INTERVAL_HOURLY_FORECAST,
fetch_method=accuweather.async_get_hourly_forecast,
)
def _get_device_info(location_key: str, name: str) -> DeviceInfo:
"""Get device info."""
return DeviceInfo(

View File

@@ -1,9 +1,6 @@
{
"entity": {
"sensor": {
"air_quality": {
"default": "mdi:air-filter"
},
"cloud_ceiling": {
"default": "mdi:weather-fog"
},
@@ -37,6 +34,9 @@
"thunderstorm_probability_night": {
"default": "mdi:weather-lightning"
},
"translation_key": {
"default": "mdi:air-filter"
},
"tree_pollen": {
"default": "mdi:tree-outline"
},

View File

@@ -7,5 +7,6 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["accuweather"],
"requirements": ["accuweather==4.2.2"]
"requirements": ["accuweather==4.2.1"],
"single_config_entry": true
}

View File

@@ -1,8 +1,14 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_location%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
"step": {
"user": {
"data": {
"name": "[%key:common::config_flow::data::name%]",
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]"
}
}
},
"create_entry": {
"default": "Some sensors are not enabled by default. You can enable them in the entity registry after the integration configuration."
@@ -11,27 +17,6 @@
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]",
"requests_exceeded": "The allowed number of requests to the AccuWeather API has been exceeded. You have to wait or change the API key."
},
"step": {
"reauth_confirm": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
},
"data_description": {
"api_key": "[%key:component::accuweather::config::step::user::data_description::api_key%]"
}
},
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "[%key:common::config_flow::data::name%]"
},
"data_description": {
"api_key": "API key generated in the AccuWeather APIs portal."
}
}
}
},
"entity": {
@@ -120,9 +105,9 @@
"pressure_tendency": {
"name": "Pressure tendency",
"state": {
"falling": "Falling",
"steady": "Steady",
"rising": "Rising",
"steady": "Steady"
"falling": "Falling"
},
"state_attributes": {
"options": {
@@ -227,6 +212,9 @@
"wet_bulb_temperature": {
"name": "Wet bulb temperature"
},
"wind_speed": {
"name": "[%key:component::weather::entity_component::_::state_attributes::wind_speed::name%]"
},
"wind_chill_temperature": {
"name": "Wind chill temperature"
},
@@ -239,9 +227,6 @@
"wind_gust_speed_night": {
"name": "Wind gust speed night {forecast_day}"
},
"wind_speed": {
"name": "[%key:component::weather::entity_component::_::state_attributes::wind_speed::name%]"
},
"wind_speed_day": {
"name": "Wind speed day {forecast_day}"
},
@@ -251,9 +236,6 @@
}
},
"exceptions": {
"auth_error": {
"message": "Authentication failed for {entry}, please update your API key"
},
"current_conditions_update_error": {
"message": "An error occurred while retrieving weather current conditions data from the AccuWeather API: {error}"
},

View File

@@ -45,7 +45,6 @@ from .coordinator import (
AccuWeatherConfigEntry,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherData,
AccuWeatherHourlyForecastDataUpdateCoordinator,
AccuWeatherObservationDataUpdateCoordinator,
)
@@ -65,7 +64,6 @@ class AccuWeatherEntity(
CoordinatorWeatherEntity[
AccuWeatherObservationDataUpdateCoordinator,
AccuWeatherDailyForecastDataUpdateCoordinator,
AccuWeatherHourlyForecastDataUpdateCoordinator,
]
):
"""Define an AccuWeather entity."""
@@ -78,7 +76,6 @@ class AccuWeatherEntity(
super().__init__(
observation_coordinator=accuweather_data.coordinator_observation,
daily_coordinator=accuweather_data.coordinator_daily_forecast,
hourly_coordinator=accuweather_data.coordinator_hourly_forecast,
)
self._attr_native_precipitation_unit = UnitOfPrecipitationDepth.MILLIMETERS
@@ -89,13 +86,10 @@ class AccuWeatherEntity(
self._attr_unique_id = accuweather_data.coordinator_observation.location_key
self._attr_attribution = ATTRIBUTION
self._attr_device_info = accuweather_data.coordinator_observation.device_info
self._attr_supported_features = (
WeatherEntityFeature.FORECAST_DAILY | WeatherEntityFeature.FORECAST_HOURLY
)
self._attr_supported_features = WeatherEntityFeature.FORECAST_DAILY
self.observation_coordinator = accuweather_data.coordinator_observation
self.daily_coordinator = accuweather_data.coordinator_daily_forecast
self.hourly_coordinator = accuweather_data.coordinator_hourly_forecast
@property
def condition(self) -> str | None:
@@ -213,32 +207,3 @@ class AccuWeatherEntity(
}
for item in self.daily_coordinator.data
]
@callback
def _async_forecast_hourly(self) -> list[Forecast] | None:
"""Return the hourly forecast in native units."""
return [
{
ATTR_FORECAST_TIME: utc_from_timestamp(
item["EpochDateTime"]
).isoformat(),
ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCover"],
ATTR_FORECAST_HUMIDITY: item["RelativeHumidity"],
ATTR_FORECAST_NATIVE_TEMP: item["Temperature"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperature"][
ATTR_VALUE
],
ATTR_FORECAST_NATIVE_PRECIPITATION: item["TotalLiquid"][ATTR_VALUE],
ATTR_FORECAST_PRECIPITATION_PROBABILITY: item[
"PrecipitationProbability"
],
ATTR_FORECAST_NATIVE_WIND_SPEED: item["Wind"][ATTR_SPEED][ATTR_VALUE],
ATTR_FORECAST_NATIVE_WIND_GUST_SPEED: item["WindGust"][ATTR_SPEED][
ATTR_VALUE
],
ATTR_FORECAST_UV_INDEX: item["UVIndex"],
ATTR_FORECAST_WIND_BEARING: item["Wind"][ATTR_DIRECTION]["Degrees"],
ATTR_FORECAST_CONDITION: CONDITION_MAP.get(item["WeatherIcon"]),
}
for item in self.hourly_coordinator.data
]

View File

@@ -1,15 +1,15 @@
{
"config": {
"abort": {
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]"
},
"step": {
"user": {
"title": "Pick a hub to add",
"data": {
"id": "Host ID"
},
"title": "Pick a hub to add"
}
}
},
"abort": {
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]"
}
}
}

View File

@@ -1,57 +0,0 @@
"""The Actron Air integration."""
from actron_neo_api import (
ActronAirNeoACSystem,
ActronNeoAPI,
ActronNeoAPIError,
ActronNeoAuthError,
)
from homeassistant.const import CONF_API_TOKEN, Platform
from homeassistant.core import HomeAssistant
from .const import _LOGGER
from .coordinator import (
ActronAirConfigEntry,
ActronAirRuntimeData,
ActronAirSystemCoordinator,
)
PLATFORM = [Platform.CLIMATE]
async def async_setup_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool:
"""Set up Actron Air integration from a config entry."""
api = ActronNeoAPI(refresh_token=entry.data[CONF_API_TOKEN])
systems: list[ActronAirNeoACSystem] = []
try:
systems = await api.get_ac_systems()
await api.update_status()
except ActronNeoAuthError:
_LOGGER.error("Authentication error while setting up Actron Air integration")
raise
except ActronNeoAPIError as err:
_LOGGER.error("API error while setting up Actron Air integration: %s", err)
raise
system_coordinators: dict[str, ActronAirSystemCoordinator] = {}
for system in systems:
coordinator = ActronAirSystemCoordinator(hass, entry, api, system)
_LOGGER.debug("Setting up coordinator for system: %s", system["serial"])
await coordinator.async_config_entry_first_refresh()
system_coordinators[system["serial"]] = coordinator
entry.runtime_data = ActronAirRuntimeData(
api=api,
system_coordinators=system_coordinators,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORM)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORM)

View File

@@ -1,259 +0,0 @@
"""Climate platform for Actron Air integration."""
from typing import Any
from actron_neo_api import ActronAirNeoStatus, ActronAirNeoZone
from homeassistant.components.climate import (
FAN_AUTO,
FAN_HIGH,
FAN_LOW,
FAN_MEDIUM,
ClimateEntity,
ClimateEntityFeature,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
PARALLEL_UPDATES = 0
FAN_MODE_MAPPING_ACTRONAIR_TO_HA = {
"AUTO": FAN_AUTO,
"LOW": FAN_LOW,
"MED": FAN_MEDIUM,
"HIGH": FAN_HIGH,
}
FAN_MODE_MAPPING_HA_TO_ACTRONAIR = {
v: k for k, v in FAN_MODE_MAPPING_ACTRONAIR_TO_HA.items()
}
HVAC_MODE_MAPPING_ACTRONAIR_TO_HA = {
"COOL": HVACMode.COOL,
"HEAT": HVACMode.HEAT,
"FAN": HVACMode.FAN_ONLY,
"AUTO": HVACMode.AUTO,
"OFF": HVACMode.OFF,
}
HVAC_MODE_MAPPING_HA_TO_ACTRONAIR = {
v: k for k, v in HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.items()
}
async def async_setup_entry(
hass: HomeAssistant,
entry: ActronAirConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Actron Air climate entities."""
system_coordinators = entry.runtime_data.system_coordinators
entities: list[ClimateEntity] = []
for coordinator in system_coordinators.values():
status = coordinator.data
name = status.ac_system.system_name
entities.append(ActronSystemClimate(coordinator, name))
entities.extend(
ActronZoneClimate(coordinator, zone)
for zone in status.remote_zone_info
if zone.exists
)
async_add_entities(entities)
class BaseClimateEntity(CoordinatorEntity[ActronAirSystemCoordinator], ClimateEntity):
"""Base class for Actron Air climate entities."""
_attr_has_entity_name = True
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.FAN_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
_attr_name = None
_attr_fan_modes = list(FAN_MODE_MAPPING_ACTRONAIR_TO_HA.values())
_attr_hvac_modes = list(HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.values())
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator)
self._serial_number = coordinator.serial_number
class ActronSystemClimate(BaseClimateEntity):
"""Representation of the Actron Air system."""
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.FAN_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator, name)
serial_number = coordinator.serial_number
self._attr_unique_id = serial_number
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_number)},
name=self._status.ac_system.system_name,
manufacturer="Actron Air",
model_id=self._status.ac_system.master_wc_model,
sw_version=self._status.ac_system.master_wc_firmware_version,
serial_number=serial_number,
)
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
return self._status.min_temp
@property
def max_temp(self) -> float:
"""Return the maximum temperature that can be set."""
return self._status.max_temp
@property
def _status(self) -> ActronAirNeoStatus:
"""Get the current status from the coordinator."""
return self.coordinator.data
@property
def hvac_mode(self) -> HVACMode | None:
"""Return the current HVAC mode."""
if not self._status.user_aircon_settings.is_on:
return HVACMode.OFF
mode = self._status.user_aircon_settings.mode
return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode)
@property
def fan_mode(self) -> str | None:
"""Return the current fan mode."""
fan_mode = self._status.user_aircon_settings.fan_mode
return FAN_MODE_MAPPING_ACTRONAIR_TO_HA.get(fan_mode)
@property
def current_humidity(self) -> float:
"""Return the current humidity."""
return self._status.master_info.live_humidity_pc
@property
def current_temperature(self) -> float:
"""Return the current temperature."""
return self._status.master_info.live_temp_c
@property
def target_temperature(self) -> float:
"""Return the target temperature."""
return self._status.user_aircon_settings.temperature_setpoint_cool_c
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set a new fan mode."""
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode.lower())
await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode)
await self._status.ac_system.set_system_mode(ac_mode)
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
temp = kwargs.get(ATTR_TEMPERATURE)
await self._status.user_aircon_settings.set_temperature(temperature=temp)
class ActronZoneClimate(BaseClimateEntity):
"""Representation of a zone within the Actron Air system."""
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
zone: ActronAirNeoZone,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator, zone.title)
serial_number = coordinator.serial_number
self._zone_id: int = zone.zone_id
self._attr_unique_id: str = f"{serial_number}_zone_{zone.zone_id}"
self._attr_device_info: DeviceInfo = DeviceInfo(
identifiers={(DOMAIN, self._attr_unique_id)},
name=zone.title,
manufacturer="Actron Air",
model="Zone",
suggested_area=zone.title,
via_device=(DOMAIN, serial_number),
)
@property
def min_temp(self) -> float:
"""Return the minimum temperature that can be set."""
return self._zone.min_temp
@property
def max_temp(self) -> float:
"""Return the maximum temperature that can be set."""
return self._zone.max_temp
@property
def _zone(self) -> ActronAirNeoZone:
"""Get the current zone data from the coordinator."""
status = self.coordinator.data
return status.zones[self._zone_id]
@property
def hvac_mode(self) -> HVACMode | None:
"""Return the current HVAC mode."""
if self._zone.is_active:
mode = self._zone.hvac_mode
return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode)
return HVACMode.OFF
@property
def current_humidity(self) -> float | None:
"""Return the current humidity."""
return self._zone.humidity
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
return self._zone.live_temp_c
@property
def target_temperature(self) -> float | None:
"""Return the target temperature."""
return self._zone.temperature_setpoint_cool_c
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
is_enabled = hvac_mode != HVACMode.OFF
await self._zone.enable(is_enabled)
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
await self._zone.set_temperature(temperature=kwargs["temperature"])

View File

@@ -1,132 +0,0 @@
"""Setup config flow for Actron Air integration."""
import asyncio
from typing import Any
from actron_neo_api import ActronNeoAPI, ActronNeoAuthError
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_TOKEN
from homeassistant.exceptions import HomeAssistantError
from .const import _LOGGER, DOMAIN
class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Actron Air."""
def __init__(self) -> None:
"""Initialize the config flow."""
self._api: ActronNeoAPI | None = None
self._device_code: str | None = None
self._user_code: str = ""
self._verification_uri: str = ""
self._expires_minutes: str = "30"
self.login_task: asyncio.Task | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
if self._api is None:
_LOGGER.debug("Initiating device authorization")
self._api = ActronNeoAPI()
try:
device_code_response = await self._api.request_device_code()
except ActronNeoAuthError as err:
_LOGGER.error("OAuth2 flow failed: %s", err)
return self.async_abort(reason="oauth2_error")
self._device_code = device_code_response["device_code"]
self._user_code = device_code_response["user_code"]
self._verification_uri = device_code_response["verification_uri_complete"]
self._expires_minutes = str(device_code_response["expires_in"] // 60)
async def _wait_for_authorization() -> None:
"""Wait for the user to authorize the device."""
assert self._api is not None
assert self._device_code is not None
_LOGGER.debug("Waiting for device authorization")
try:
await self._api.poll_for_token(self._device_code)
_LOGGER.debug("Authorization successful")
except ActronNeoAuthError as ex:
_LOGGER.exception("Error while waiting for device authorization")
raise CannotConnect from ex
_LOGGER.debug("Checking login task")
if self.login_task is None:
_LOGGER.debug("Creating task for device authorization")
self.login_task = self.hass.async_create_task(_wait_for_authorization())
if self.login_task.done():
_LOGGER.debug("Login task is done, checking results")
if exception := self.login_task.exception():
if isinstance(exception, CannotConnect):
return self.async_show_progress_done(
next_step_id="connection_error"
)
return self.async_show_progress_done(next_step_id="timeout")
return self.async_show_progress_done(next_step_id="finish_login")
return self.async_show_progress(
step_id="user",
progress_action="wait_for_authorization",
description_placeholders={
"user_code": self._user_code,
"verification_uri": self._verification_uri,
"expires_minutes": self._expires_minutes,
},
progress_task=self.login_task,
)
async def async_step_finish_login(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the finalization of login."""
_LOGGER.debug("Finalizing authorization")
assert self._api is not None
try:
user_data = await self._api.get_user_info()
except ActronNeoAuthError as err:
_LOGGER.error("Error getting user info: %s", err)
return self.async_abort(reason="oauth2_error")
unique_id = str(user_data["id"])
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=user_data["email"],
data={CONF_API_TOKEN: self._api.refresh_token_value},
)
async def async_step_timeout(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle issues that need transition await from progress step."""
if user_input is None:
return self.async_show_form(
step_id="timeout",
)
del self.login_task
return await self.async_step_user()
async def async_step_connection_error(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle connection error from progress step."""
if user_input is None:
return self.async_show_form(step_id="connection_error")
# Reset state and try again
self._api = None
self._device_code = None
self.login_task = None
return await self.async_step_user()
class CannotConnect(HomeAssistantError):
"""Error to indicate we cannot connect."""

View File

@@ -1,6 +0,0 @@
"""Constants used by Actron Air integration."""
import logging
_LOGGER = logging.getLogger(__package__)
DOMAIN = "actron_air"

View File

@@ -1,69 +0,0 @@
"""Coordinator for Actron Air integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
from actron_neo_api import ActronAirNeoACSystem, ActronAirNeoStatus, ActronNeoAPI
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from .const import _LOGGER
STALE_DEVICE_TIMEOUT = timedelta(hours=24)
ERROR_NO_SYSTEMS_FOUND = "no_systems_found"
ERROR_UNKNOWN = "unknown_error"
@dataclass
class ActronAirRuntimeData:
"""Runtime data for the Actron Air integration."""
api: ActronNeoAPI
system_coordinators: dict[str, ActronAirSystemCoordinator]
type ActronAirConfigEntry = ConfigEntry[ActronAirRuntimeData]
AUTH_ERROR_THRESHOLD = 3
SCAN_INTERVAL = timedelta(seconds=30)
class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirNeoACSystem]):
"""System coordinator for Actron Air integration."""
def __init__(
self,
hass: HomeAssistant,
entry: ActronAirConfigEntry,
api: ActronNeoAPI,
system: ActronAirNeoACSystem,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
name="Actron Air Status",
update_interval=SCAN_INTERVAL,
config_entry=entry,
)
self.system = system
self.serial_number = system["serial"]
self.api = api
self.status = self.api.state_manager.get_status(self.serial_number)
self.last_seen = dt_util.utcnow()
async def _async_update_data(self) -> ActronAirNeoStatus:
"""Fetch updates and merge incremental changes into the full state."""
await self.api.update_status()
self.status = self.api.state_manager.get_status(self.serial_number)
self.last_seen = dt_util.utcnow()
return self.status
def is_device_stale(self) -> bool:
"""Check if a device is stale (not seen for a while)."""
return (dt_util.utcnow() - self.last_seen) > STALE_DEVICE_TIMEOUT

View File

@@ -1,16 +0,0 @@
{
"domain": "actron_air",
"name": "Actron Air",
"codeowners": ["@kclif9", "@JagadishDhanamjayam"],
"config_flow": true,
"dhcp": [
{
"hostname": "neo-*",
"macaddress": "FC0FE7*"
}
],
"documentation": "https://www.home-assistant.io/integrations/actron_air",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["actron-neo-api==0.1.84"]
}

View File

@@ -1,78 +0,0 @@
rules:
# Bronze
action-setup:
status: exempt
comment: This integration does not have custom service actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: This integration does not have custom service actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: This integration does not subscribe to external events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: No options flow
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: todo
test-coverage: todo
# Gold
devices: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: This integration uses DHCP discovery; however, it is cloud polling, so there is no information to update.
discovery: done
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
entity-category:
status: exempt
comment: This integration does not use entity categories.
entity-device-class:
status: exempt
comment: This integration does not use entity device classes.
entity-disabled-by-default:
status: exempt
comment: Not required for this integration at this stage.
entity-translations: todo
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: This integration does not have any known issues that require repair.
stale-devices: todo
# Platinum
async-dependency: done
inject-websession: todo
strict-typing: todo

View File

@@ -1,29 +0,0 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
"oauth2_error": "Failed to start OAuth2 flow"
},
"error": {
"oauth2_error": "Failed to start OAuth2 flow. Please try again later."
},
"progress": {
"wait_for_authorization": "To authenticate, open the following URL and login at Actron Air:\n{verification_uri}\nIf the code is not automatically copied, paste the following code to authorize the integration:\n\n```{user_code}```\n\n\nThe login attempt will time out after {expires_minutes} minutes."
},
"step": {
"connection_error": {
"data": {},
"description": "Failed to connect to Actron Air. Please check your internet connection and try again.",
"title": "Connection error"
},
"timeout": {
"data": {},
"description": "The authorization process timed out. Please try again.",
"title": "Authorization timeout"
},
"user": {
"title": "Actron Air OAuth2 Authorization"
}
}
}
}

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/adax",
"iot_class": "local_polling",
"loggers": ["adax", "adax_local"],
"requirements": ["adax==0.4.0", "Adax-local==0.2.0"]
"requirements": ["adax==0.4.0", "Adax-local==0.1.5"]
}

View File

@@ -1,34 +1,34 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"heater_not_available": "Heater not available. Try to reset the heater by pressing + and OK for some seconds.",
"heater_not_found": "Heater not found. Try to move the heater closer to Home Assistant computer.",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"step": {
"cloud": {
"data": {
"account_id": "Account ID",
"password": "[%key:common::config_flow::data::password%]"
}
},
"local": {
"data": {
"wifi_pswd": "Wi-Fi password",
"wifi_ssid": "Wi-Fi SSID"
},
"description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue LED starts blinking before pressing Submit. Configuring heater might take some minutes."
},
"user": {
"data": {
"connection_type": "Select connection type"
},
"description": "Select connection type. Local requires heaters with Bluetooth"
},
"local": {
"data": {
"wifi_ssid": "Wi-Fi SSID",
"wifi_pswd": "Wi-Fi password"
},
"description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue LED starts blinking before pressing Submit. Configuring heater might take some minutes."
},
"cloud": {
"data": {
"account_id": "Account ID",
"password": "[%key:common::config_flow::data::password%]"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"heater_not_available": "Heater not available. Try to reset the heater by pressing + and OK for some seconds.",
"heater_not_found": "Heater not found. Try to move the heater closer to Home Assistant computer.",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]"
}
}
}

View File

@@ -1,9 +1,6 @@
{
"entity": {
"sensor": {
"average_processing_speed": {
"default": "mdi:speedometer"
},
"dns_queries": {
"default": "mdi:magnify"
},
@@ -16,18 +13,21 @@
"parental_control_blocked": {
"default": "mdi:human-male-girl"
},
"rules_count": {
"default": "mdi:counter"
},
"safe_browsing_blocked": {
"default": "mdi:shield-half-full"
},
"safe_searches_enforced": {
"default": "mdi:shield-search"
},
"average_processing_speed": {
"default": "mdi:speedometer"
},
"rules_count": {
"default": "mdi:counter"
}
},
"switch": {
"filtering": {
"protection": {
"default": "mdi:shield-check",
"state": {
"off": "mdi:shield-off"
@@ -39,13 +39,7 @@
"off": "mdi:shield-off"
}
},
"protection": {
"default": "mdi:shield-check",
"state": {
"off": "mdi:shield-off"
}
},
"query_log": {
"safe_search": {
"default": "mdi:shield-check",
"state": {
"off": "mdi:shield-off"
@@ -57,7 +51,13 @@
"off": "mdi:shield-off"
}
},
"safe_search": {
"filtering": {
"default": "mdi:shield-check",
"state": {
"off": "mdi:shield-off"
}
},
"query_log": {
"default": "mdi:shield-check",
"state": {
"off": "mdi:shield-off"
@@ -69,17 +69,17 @@
"add_url": {
"service": "mdi:link-plus"
},
"disable_url": {
"service": "mdi:link-variant-off"
"remove_url": {
"service": "mdi:link-off"
},
"enable_url": {
"service": "mdi:link-variant"
},
"disable_url": {
"service": "mdi:link-variant-off"
},
"refresh": {
"service": "mdi:refresh"
},
"remove_url": {
"service": "mdi:link-off"
}
}
}

View File

@@ -1,38 +1,35 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"existing_instance_updated": "Updated existing configuration."
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"step": {
"hassio_confirm": {
"description": "Do you want to configure Home Assistant to connect to the AdGuard Home provided by the add-on: {addon}?",
"title": "AdGuard Home via Home Assistant add-on"
},
"user": {
"description": "Set up your AdGuard Home instance to allow monitoring and control.",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"password": "[%key:common::config_flow::data::password%]",
"port": "[%key:common::config_flow::data::port%]",
"ssl": "[%key:common::config_flow::data::ssl%]",
"username": "[%key:common::config_flow::data::username%]",
"ssl": "[%key:common::config_flow::data::ssl%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"host": "The hostname or IP address of the device running your AdGuard Home."
},
"description": "Set up your AdGuard Home instance to allow monitoring and control."
}
},
"hassio_confirm": {
"title": "AdGuard Home via Home Assistant add-on",
"description": "Do you want to configure Home Assistant to connect to the AdGuard Home provided by the add-on: {addon}?"
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"abort": {
"existing_instance_updated": "Updated existing configuration.",
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]"
}
},
"entity": {
"sensor": {
"average_processing_speed": {
"name": "Average processing speed"
},
"dns_queries": {
"name": "DNS queries"
},
@@ -45,91 +42,94 @@
"parental_control_blocked": {
"name": "Parental control blocked"
},
"rules_count": {
"name": "Rules count"
},
"safe_browsing_blocked": {
"name": "Safe browsing blocked"
},
"safe_searches_enforced": {
"name": "Safe searches enforced"
},
"average_processing_speed": {
"name": "Average processing speed"
},
"rules_count": {
"name": "Rules count"
}
},
"switch": {
"filtering": {
"name": "Filtering"
"protection": {
"name": "Protection"
},
"parental": {
"name": "Parental control"
},
"protection": {
"name": "Protection"
},
"query_log": {
"name": "Query log"
"safe_search": {
"name": "Safe search"
},
"safe_browsing": {
"name": "Safe browsing"
},
"safe_search": {
"name": "Safe search"
"filtering": {
"name": "Filtering"
},
"query_log": {
"name": "Query log"
}
}
},
"services": {
"add_url": {
"name": "Add URL",
"description": "Adds a new filter subscription to AdGuard Home.",
"fields": {
"name": {
"description": "The name of the filter subscription.",
"name": "[%key:common::config_flow::data::name%]"
"name": "[%key:common::config_flow::data::name%]",
"description": "The name of the filter subscription."
},
"url": {
"description": "The filter URL to subscribe to, containing the filter rules.",
"name": "[%key:common::config_flow::data::url%]"
"name": "[%key:common::config_flow::data::url%]",
"description": "The filter URL to subscribe to, containing the filter rules."
}
},
"name": "Add URL"
},
"disable_url": {
"description": "Disables a filter subscription in AdGuard Home.",
"fields": {
"url": {
"description": "The filter subscription URL to disable.",
"name": "[%key:common::config_flow::data::url%]"
}
},
"name": "Disable URL"
},
"enable_url": {
"description": "Enables a filter subscription in AdGuard Home.",
"fields": {
"url": {
"description": "The filter subscription URL to enable.",
"name": "[%key:common::config_flow::data::url%]"
}
},
"name": "Enable URL"
},
"refresh": {
"description": "Refreshes all filter subscriptions in AdGuard Home.",
"fields": {
"force": {
"description": "Force update (bypasses AdGuard Home throttling), omit for a regular refresh.",
"name": "Force"
}
},
"name": "Refresh"
}
},
"remove_url": {
"name": "Remove URL",
"description": "Removes a filter subscription from AdGuard Home.",
"fields": {
"url": {
"description": "The filter subscription URL to remove.",
"name": "[%key:common::config_flow::data::url%]"
"name": "[%key:common::config_flow::data::url%]",
"description": "The filter subscription URL to remove."
}
},
"name": "Remove URL"
}
},
"enable_url": {
"name": "Enable URL",
"description": "Enables a filter subscription in AdGuard Home.",
"fields": {
"url": {
"name": "[%key:common::config_flow::data::url%]",
"description": "The filter subscription URL to enable."
}
}
},
"disable_url": {
"name": "Disable URL",
"description": "Disables a filter subscription in AdGuard Home.",
"fields": {
"url": {
"name": "[%key:common::config_flow::data::url%]",
"description": "The filter subscription URL to disable."
}
}
},
"refresh": {
"name": "Refresh",
"description": "Refreshes all filter subscriptions in AdGuard Home.",
"fields": {
"force": {
"name": "Force",
"description": "Force update (bypasses AdGuard Home throttling), omit for a regular refresh."
}
}
}
}
}

View File

@@ -1,22 +1,22 @@
{
"services": {
"write_data_by_name": {
"name": "Write data by name",
"description": "Write a value to the connected ADS device.",
"fields": {
"adstype": {
"description": "The data type of the variable to write to.",
"name": "ADS type"
},
"adsvar": {
"description": "The name of the variable to write to.",
"name": "ADS variable"
"name": "ADS variable",
"description": "The name of the variable to write to."
},
"adstype": {
"name": "ADS type",
"description": "The data type of the variable to write to."
},
"value": {
"description": "The value to write to the variable.",
"name": "Value"
"name": "Value",
"description": "The value to write to the variable."
}
},
"name": "Write data by name"
}
}
}
}

View File

@@ -1,11 +1,11 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"step": {
"user": {
"data": {
@@ -19,14 +19,14 @@
},
"services": {
"set_time_to": {
"name": "Set time to",
"description": "Controls timers to turn the system on or off after a set number of minutes.",
"fields": {
"minutes": {
"description": "Minutes until action.",
"name": "Minutes"
"name": "Minutes",
"description": "Minutes until action."
}
},
"name": "Set time to"
}
}
}
}

View File

@@ -71,14 +71,7 @@ class AemetConfigFlow(ConfigFlow, domain=DOMAIN):
}
)
return self.async_show_form(
step_id="user",
data_schema=schema,
errors=errors,
description_placeholders={
"api_key_url": "https://opendata.aemet.es/centrodedescargas/altaUsuario"
},
)
return self.async_show_form(step_id="user", data_schema=schema, errors=errors)
@staticmethod
@callback

View File

@@ -14,7 +14,7 @@
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "Name of the integration"
},
"description": "To generate API key go to {api_key_url}"
"description": "To generate API key go to https://opendata.aemet.es/centrodedescargas/altaUsuario"
}
}
},

View File

@@ -1,57 +1,57 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"step": {
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
}
}
}
},
"issues": {
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring {integration_title} using YAML is being removed but there was a connection error importing your YAML configuration.\n\nEnsure connection to {integration_title} works and restart Home Assistant to try again or remove the {integration_title} YAML configuration from your configuration.yaml file and continue to [set up the integration]({url}) manually.",
"title": "The {integration_title} YAML configuration import failed"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
"services": {
"add_tracking": {
"name": "Add tracking",
"description": "Adds a new tracking number to Aftership.",
"fields": {
"slug": {
"description": "Slug (carrier) of the new tracking.",
"name": "Slug"
"tracking_number": {
"name": "Tracking number",
"description": "Tracking number for the new tracking."
},
"title": {
"description": "A custom title for the new tracking.",
"name": "Title"
"name": "Title",
"description": "A custom title for the new tracking."
},
"tracking_number": {
"description": "Tracking number for the new tracking.",
"name": "Tracking number"
"slug": {
"name": "Slug",
"description": "Slug (carrier) of the new tracking."
}
},
"name": "Add tracking"
}
},
"remove_tracking": {
"name": "Remove tracking",
"description": "Removes a tracking number from Aftership.",
"fields": {
"slug": {
"description": "Slug (carrier) of the tracking to remove.",
"name": "[%key:component::aftership::services::add_tracking::fields::slug::name%]"
},
"tracking_number": {
"description": "Tracking number of the tracking to remove.",
"name": "[%key:component::aftership::services::add_tracking::fields::tracking_number::name%]"
"name": "[%key:component::aftership::services::add_tracking::fields::tracking_number::name%]",
"description": "Tracking number of the tracking to remove."
},
"slug": {
"name": "[%key:component::aftership::services::add_tracking::fields::slug::name%]",
"description": "Slug (carrier) of the tracking to remove."
}
},
"name": "Remove tracking"
}
}
},
"issues": {
"deprecated_yaml_import_issue_cannot_connect": {
"title": "The {integration_title} YAML configuration import failed",
"description": "Configuring {integration_title} using YAML is being removed but there was a connection error importing your YAML configuration.\n\nEnsure connection to {integration_title} works and restart Home Assistant to try again or remove the {integration_title} YAML configuration from your configuration.yaml file and continue to [set up the integration]({url}) manually."
}
}
}

View File

@@ -1,19 +1,19 @@
{
"services": {
"disable_alerts": {
"service": "mdi:bell-off"
},
"enable_alerts": {
"service": "mdi:bell-alert"
},
"snapshot": {
"service": "mdi:camera"
},
"start_recording": {
"service": "mdi:record-rec"
},
"stop_recording": {
"service": "mdi:stop"
},
"enable_alerts": {
"service": "mdi:bell-alert"
},
"disable_alerts": {
"service": "mdi:bell-off"
},
"snapshot": {
"service": "mdi:camera"
}
}
}

View File

@@ -1,45 +1,45 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"step": {
"user": {
"title": "Set up Agent DVR",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"port": "[%key:common::config_flow::data::port%]"
},
"data_description": {
"host": "The IP address of the Agent DVR server."
},
"title": "Set up Agent DVR"
}
}
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
}
},
"services": {
"disable_alerts": {
"description": "Disables alerts.",
"name": "Disable alerts"
},
"enable_alerts": {
"description": "Enables alerts.",
"name": "Enable alerts"
},
"snapshot": {
"description": "Takes a photo.",
"name": "Snapshot"
},
"start_recording": {
"description": "Enables continuous recording.",
"name": "Start recording"
"name": "Start recording",
"description": "Enables continuous recording."
},
"stop_recording": {
"description": "Disables continuous recording.",
"name": "Stop recording"
"name": "Stop recording",
"description": "Disables continuous recording."
},
"enable_alerts": {
"name": "Enable alerts",
"description": "Enables alerts."
},
"disable_alerts": {
"name": "Disable alerts",
"description": "Disables alerts."
},
"snapshot": {
"name": "Snapshot",
"description": "Takes a photo."
}
}
}

View File

@@ -3,8 +3,10 @@
import logging
from typing import Any
from aiohttp import web
import voluptuous as vol
from homeassistant.components.http import KEY_HASS, HomeAssistantView
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_ENTITY_ID, CONF_DESCRIPTION, CONF_SELECTOR
from homeassistant.core import (
@@ -26,6 +28,7 @@ from .const import (
ATTR_STRUCTURE,
ATTR_TASK_NAME,
DATA_COMPONENT,
DATA_IMAGES,
DATA_PREFERENCES,
DOMAIN,
SERVICE_GENERATE_DATA,
@@ -39,6 +42,7 @@ from .task import (
GenDataTaskResult,
GenImageTask,
GenImageTaskResult,
ImageData,
async_generate_data,
async_generate_image,
)
@@ -51,8 +55,12 @@ __all__ = [
"GenDataTaskResult",
"GenImageTask",
"GenImageTaskResult",
"ImageData",
"async_generate_data",
"async_generate_image",
"async_setup",
"async_setup_entry",
"async_unload_entry",
]
_LOGGER = logging.getLogger(__name__)
@@ -86,8 +94,10 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
entity_component = EntityComponent[AITaskEntity](_LOGGER, DOMAIN, hass)
hass.data[DATA_COMPONENT] = entity_component
hass.data[DATA_PREFERENCES] = AITaskPreferences(hass)
hass.data[DATA_IMAGES] = {}
await hass.data[DATA_PREFERENCES].async_load()
async_setup_http(hass)
hass.http.register_view(ImageView)
hass.services.async_register(
DOMAIN,
SERVICE_GENERATE_DATA,
@@ -199,3 +209,28 @@ class AITaskPreferences:
def as_dict(self) -> dict[str, str | None]:
"""Get the current preferences."""
return {key: getattr(self, key) for key in self.KEYS}
class ImageView(HomeAssistantView):
"""View to generated images."""
url = f"/api/{DOMAIN}/images/{{filename}}"
name = f"api:{DOMAIN}/images"
async def get(
self,
request: web.Request,
filename: str,
) -> web.Response:
"""Serve image."""
hass = request.app[KEY_HASS]
image_storage = hass.data[DATA_IMAGES]
image_data = image_storage.get(filename)
if image_data is None:
raise web.HTTPNotFound
return web.Response(
body=image_data.data,
content_type=image_data.mime_type,
)

View File

@@ -8,19 +8,19 @@ from typing import TYPE_CHECKING, Final
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from homeassistant.components.media_source import local_source
from homeassistant.helpers.entity_component import EntityComponent
from . import AITaskPreferences
from .entity import AITaskEntity
from .task import ImageData
DOMAIN = "ai_task"
DATA_COMPONENT: HassKey[EntityComponent[AITaskEntity]] = HassKey(DOMAIN)
DATA_PREFERENCES: HassKey[AITaskPreferences] = HassKey(f"{DOMAIN}_preferences")
DATA_MEDIA_SOURCE: HassKey[local_source.LocalSource] = HassKey(f"{DOMAIN}_media_source")
DATA_IMAGES: HassKey[dict[str, ImageData]] = HassKey(f"{DOMAIN}_images")
IMAGE_DIR: Final = "image"
IMAGE_EXPIRY_TIME = 60 * 60 # 1 hour
MAX_IMAGES = 20
SERVICE_GENERATE_DATA = "generate_data"
SERVICE_GENERATE_IMAGE = "generate_image"

View File

@@ -1,7 +1,7 @@
{
"domain": "ai_task",
"name": "AI Task",
"after_dependencies": ["camera"],
"after_dependencies": ["camera", "http"],
"codeowners": ["@home-assistant/core"],
"dependencies": ["conversation", "media_source"],
"documentation": "https://www.home-assistant.io/integrations/ai_task",

View File

@@ -2,31 +2,89 @@
from __future__ import annotations
from pathlib import Path
from datetime import timedelta
import logging
from homeassistant.components.media_source import MediaSource, local_source
from homeassistant.components.http.auth import async_sign_path
from homeassistant.components.media_player import BrowseError, MediaClass
from homeassistant.components.media_source import (
BrowseMediaSource,
MediaSource,
MediaSourceItem,
PlayMedia,
Unresolvable,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from .const import DATA_MEDIA_SOURCE, DOMAIN, IMAGE_DIR
from .const import DATA_IMAGES, DOMAIN, IMAGE_EXPIRY_TIME
_LOGGER = logging.getLogger(__name__)
async def async_get_media_source(hass: HomeAssistant) -> MediaSource:
"""Set up local media source."""
media_dirs = list(hass.config.media_dirs.values())
async def async_get_media_source(hass: HomeAssistant) -> ImageMediaSource:
"""Set up image media source."""
_LOGGER.debug("Setting up image media source")
return ImageMediaSource(hass)
if not media_dirs:
raise HomeAssistantError(
"AI Task media source requires at least one media directory configured"
class ImageMediaSource(MediaSource):
"""Provide images as media sources."""
name: str = "AI Generated Images"
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize ImageMediaSource."""
super().__init__(DOMAIN)
self.hass = hass
async def async_resolve_media(self, item: MediaSourceItem) -> PlayMedia:
"""Resolve media to a url."""
image_storage = self.hass.data[DATA_IMAGES]
image = image_storage.get(item.identifier)
if image is None:
raise Unresolvable(f"Could not resolve media item: {item.identifier}")
return PlayMedia(
async_sign_path(
self.hass,
f"/api/{DOMAIN}/images/{item.identifier}",
timedelta(seconds=IMAGE_EXPIRY_TIME or 1800),
),
image.mime_type,
)
media_dir = Path(media_dirs[0]) / DOMAIN / IMAGE_DIR
async def async_browse_media(
self,
item: MediaSourceItem,
) -> BrowseMediaSource:
"""Return media."""
if item.identifier:
raise BrowseError("Unknown item")
hass.data[DATA_MEDIA_SOURCE] = source = local_source.LocalSource(
hass,
DOMAIN,
"AI Generated Images",
{IMAGE_DIR: str(media_dir)},
f"/{DOMAIN}",
)
return source
image_storage = self.hass.data[DATA_IMAGES]
children = [
BrowseMediaSource(
domain=DOMAIN,
identifier=filename,
media_class=MediaClass.IMAGE,
media_content_type=image.mime_type,
title=image.title or filename,
can_play=True,
can_expand=False,
)
for filename, image in image_storage.items()
]
return BrowseMediaSource(
domain=DOMAIN,
identifier=None,
media_class=MediaClass.APP,
media_content_type="",
title="AI Generated Images",
can_play=False,
can_expand=True,
children_media_class=MediaClass.IMAGE,
children=children,
)

View File

@@ -1,52 +1,52 @@
{
"services": {
"generate_data": {
"name": "Generate data",
"description": "Uses AI to run a task that generates data.",
"fields": {
"attachments": {
"description": "List of files to attach for multi-modal AI analysis.",
"name": "Attachments"
},
"entity_id": {
"description": "Entity ID to run the task on. If not provided, the preferred entity will be used.",
"name": "Entity ID"
"task_name": {
"name": "Task name",
"description": "Name of the task."
},
"instructions": {
"description": "Instructions on what needs to be done.",
"name": "Instructions"
"name": "Instructions",
"description": "Instructions on what needs to be done."
},
"entity_id": {
"name": "Entity ID",
"description": "Entity ID to run the task on. If not provided, the preferred entity will be used."
},
"structure": {
"description": "When set, the AI Task will output fields with this in structure. The structure is a dictionary where the keys are the field names and the values contain a 'description', a 'selector', and an optional 'required' field.",
"name": "Structured output"
"name": "Structured output",
"description": "When set, the AI Task will output fields with this in structure. The structure is a dictionary where the keys are the field names and the values contain a 'description', a 'selector', and an optional 'required' field."
},
"task_name": {
"description": "Name of the task.",
"name": "Task name"
"attachments": {
"name": "Attachments",
"description": "List of files to attach for multi-modal AI analysis."
}
},
"name": "Generate data"
}
},
"generate_image": {
"name": "Generate image",
"description": "Uses AI to generate image.",
"fields": {
"attachments": {
"description": "List of files to attach for using as references.",
"name": "Attachments"
},
"entity_id": {
"description": "Entity ID to run the task on.",
"name": "Entity ID"
"task_name": {
"name": "Task name",
"description": "Name of the task."
},
"instructions": {
"description": "Instructions that explains the image to be generated.",
"name": "Instructions"
"name": "Instructions",
"description": "Instructions that explains the image to be generated."
},
"task_name": {
"description": "Name of the task.",
"name": "Task name"
"entity_id": {
"name": "Entity ID",
"description": "Entity ID to run the task on."
},
"attachments": {
"name": "Attachments",
"description": "List of files to attach for using as references."
}
},
"name": "Generate image"
}
}
}
}

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
import io
from functools import partial
import mimetypes
from pathlib import Path
import tempfile
@@ -12,33 +12,35 @@ from typing import Any
import voluptuous as vol
from homeassistant.components import camera, conversation, image, media_source
from homeassistant.components import camera, conversation, media_source
from homeassistant.components.http.auth import async_sign_path
from homeassistant.core import HomeAssistant, ServiceResponse, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import llm
from homeassistant.helpers.chat_session import ChatSession, async_get_chat_session
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.network import get_url
from homeassistant.util import RE_SANITIZE_FILENAME, slugify
from .const import (
DATA_COMPONENT,
DATA_MEDIA_SOURCE,
DATA_IMAGES,
DATA_PREFERENCES,
DOMAIN,
IMAGE_DIR,
IMAGE_EXPIRY_TIME,
MAX_IMAGES,
AITaskEntityFeature,
)
def _save_camera_snapshot(image_data: camera.Image | image.Image) -> Path:
def _save_camera_snapshot(image: camera.Image) -> Path:
"""Save camera snapshot to temp file."""
with tempfile.NamedTemporaryFile(
mode="wb",
suffix=mimetypes.guess_extension(image_data.content_type, False),
suffix=mimetypes.guess_extension(image.content_type, False),
delete=False,
) as temp_file:
temp_file.write(image_data.content)
temp_file.write(image.content)
return Path(temp_file.name)
@@ -54,31 +56,26 @@ async def _resolve_attachments(
for attachment in attachments or []:
media_content_id = attachment["media_content_id"]
# Special case for certain media sources
for integration in camera, image:
media_source_prefix = f"media-source://{integration.DOMAIN}/"
if not media_content_id.startswith(media_source_prefix):
continue
# Special case for camera media sources
if media_content_id.startswith("media-source://camera/"):
# Extract entity_id from the media content ID
entity_id = media_content_id.removeprefix(media_source_prefix)
entity_id = media_content_id.removeprefix("media-source://camera/")
# Get snapshot from entity
image_data = await integration.async_get_image(hass, entity_id)
# Get snapshot from camera
image = await camera.async_get_image(hass, entity_id)
temp_filename = await hass.async_add_executor_job(
_save_camera_snapshot, image_data
_save_camera_snapshot, image
)
created_files.append(temp_filename)
resolved_attachments.append(
conversation.Attachment(
media_content_id=media_content_id,
mime_type=image_data.content_type,
mime_type=image.content_type,
path=temp_filename,
)
)
break
else:
# Handle regular media sources
media = await media_source.async_resolve_media(hass, media_content_id, None)
@@ -161,6 +158,24 @@ async def async_generate_data(
)
def _cleanup_images(image_storage: dict[str, ImageData], num_to_remove: int) -> None:
"""Remove old images to keep the storage size under the limit."""
if num_to_remove <= 0:
return
if num_to_remove >= len(image_storage):
image_storage.clear()
return
sorted_images = sorted(
image_storage.items(),
key=lambda item: item[1].timestamp,
)
for filename, _ in sorted_images[:num_to_remove]:
image_storage.pop(filename, None)
async def async_generate_image(
hass: HomeAssistant,
*,
@@ -210,34 +225,36 @@ async def async_generate_image(
if service_result.get("revised_prompt") is None:
service_result["revised_prompt"] = instructions
source = hass.data[DATA_MEDIA_SOURCE]
image_storage = hass.data[DATA_IMAGES]
if len(image_storage) + 1 > MAX_IMAGES:
_cleanup_images(image_storage, len(image_storage) + 1 - MAX_IMAGES)
current_time = datetime.now()
ext = mimetypes.guess_extension(task_result.mime_type, False) or ".png"
sanitized_task_name = RE_SANITIZE_FILENAME.sub("", slugify(task_name))
filename = f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}"
image_file = ImageData(
filename=f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}",
file=io.BytesIO(image_data),
content_type=task_result.mime_type,
image_storage[filename] = ImageData(
data=image_data,
timestamp=int(current_time.timestamp()),
mime_type=task_result.mime_type,
title=service_result["revised_prompt"],
)
target_folder = media_source.MediaSourceItem.from_uri(
hass, f"media-source://{DOMAIN}/{IMAGE_DIR}", None
)
def _purge_image(filename: str, now: datetime) -> None:
"""Remove image from storage."""
image_storage.pop(filename, None)
service_result["media_source_id"] = await source.async_upload_media(
target_folder, image_file
)
if IMAGE_EXPIRY_TIME > 0:
async_call_later(hass, IMAGE_EXPIRY_TIME, partial(_purge_image, filename))
item = media_source.MediaSourceItem.from_uri(
hass, service_result["media_source_id"], None
)
service_result["url"] = async_sign_path(
service_result["url"] = get_url(hass) + async_sign_path(
hass,
(await source.async_resolve_media(item)).url,
timedelta(seconds=IMAGE_EXPIRY_TIME),
f"/api/{DOMAIN}/images/{filename}",
timedelta(seconds=IMAGE_EXPIRY_TIME or 1800),
)
service_result["media_source_id"] = f"media-source://{DOMAIN}/images/{filename}"
return service_result
@@ -342,8 +359,20 @@ class GenImageTaskResult:
@dataclass(slots=True)
class ImageData:
"""Implementation of media_source.local_source.UploadedFile protocol."""
"""Image data for stored generated images."""
filename: str
file: io.IOBase
content_type: str
data: bytes
"""Raw image data."""
timestamp: int
"""Timestamp when the image was generated, as a Unix timestamp."""
mime_type: str
"""MIME type of the image."""
title: str
"""Title of the image, usually the prompt used to generate it."""
def __str__(self) -> str:
"""Return image data as a string."""
return f"<ImageData {self.title}: {id(self)}>"

View File

@@ -9,17 +9,14 @@
}
},
"number": {
"display_brightness": {
"led_bar_brightness": {
"default": "mdi:brightness-percent"
},
"led_bar_brightness": {
"display_brightness": {
"default": "mdi:brightness-percent"
}
},
"select": {
"co2_automatic_baseline_calibration": {
"default": "mdi:molecule-co2"
},
"configuration_control": {
"default": "mdi:cloud-cog"
},
@@ -34,11 +31,23 @@
},
"voc_index_learning_time_offset": {
"default": "mdi:clock-outline"
},
"co2_automatic_baseline_calibration": {
"default": "mdi:molecule-co2"
}
},
"sensor": {
"co2_automatic_baseline_calibration": {
"default": "mdi:molecule-co2"
"total_volatile_organic_component_index": {
"default": "mdi:molecule"
},
"nitrogen_index": {
"default": "mdi:molecule"
},
"pm003_count": {
"default": "mdi:blur"
},
"led_bar_brightness": {
"default": "mdi:brightness-percent"
},
"display_brightness": {
"default": "mdi:brightness-percent"
@@ -46,26 +55,17 @@
"display_temperature_unit": {
"default": "mdi:thermometer-lines"
},
"led_bar_brightness": {
"default": "mdi:brightness-percent"
},
"led_bar_mode": {
"default": "mdi:led-strip"
},
"nitrogen_index": {
"default": "mdi:molecule"
},
"nox_index_learning_time_offset": {
"default": "mdi:clock-outline"
},
"pm003_count": {
"default": "mdi:blur"
},
"total_volatile_organic_component_index": {
"default": "mdi:molecule"
},
"voc_index_learning_time_offset": {
"default": "mdi:clock-outline"
},
"co2_automatic_baseline_calibration": {
"default": "mdi:molecule-co2"
}
},
"switch": {

View File

@@ -1,5 +1,19 @@
{
"config": {
"flow_title": "{model}",
"step": {
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
},
"data_description": {
"host": "The hostname or IP address of the Airgradient device."
}
},
"discovery_confirm": {
"description": "Do you want to set up {model}?"
}
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
@@ -10,20 +24,6 @@
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "{model}",
"step": {
"discovery_confirm": {
"description": "Do you want to set up {model}?"
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
},
"data_description": {
"host": "The hostname or IP address of the Airgradient device."
}
}
}
},
"entity": {
@@ -36,25 +36,14 @@
}
},
"number": {
"display_brightness": {
"name": "Display brightness"
},
"led_bar_brightness": {
"name": "LED bar brightness"
},
"display_brightness": {
"name": "Display brightness"
}
},
"select": {
"co2_automatic_baseline_calibration": {
"name": "CO2 automatic baseline duration",
"state": {
"0": "[%key:common::state::off%]",
"1": "1 day",
"8": "8 days",
"30": "30 days",
"90": "90 days",
"180": "180 days"
}
},
"configuration_control": {
"name": "Configuration source",
"state": {
@@ -62,13 +51,6 @@
"local": "Local"
}
},
"display_pm_standard": {
"name": "Display PM standard",
"state": {
"ugm3": "μg/m³",
"us_aqi": "US AQI"
}
},
"display_temperature_unit": {
"name": "Display temperature unit",
"state": {
@@ -76,11 +58,18 @@
"f": "Fahrenheit"
}
},
"display_pm_standard": {
"name": "Display PM standard",
"state": {
"ugm3": "μg/m³",
"us_aqi": "US AQI"
}
},
"led_bar_mode": {
"name": "LED bar mode",
"state": {
"co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]",
"off": "[%key:common::state::off%]",
"co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]",
"pm": "Particulate matter"
}
},
@@ -103,14 +92,37 @@
"360": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::state::360%]",
"720": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::state::720%]"
}
},
"co2_automatic_baseline_calibration": {
"name": "CO2 automatic baseline duration",
"state": {
"1": "1 day",
"8": "8 days",
"30": "30 days",
"90": "90 days",
"180": "180 days",
"0": "[%key:common::state::off%]"
}
}
},
"sensor": {
"co2_automatic_baseline_calibration_days": {
"name": "Carbon dioxide automatic baseline calibration"
"total_volatile_organic_component_index": {
"name": "VOC index"
},
"display_brightness": {
"name": "[%key:component::airgradient::entity::number::display_brightness::name%]"
"nitrogen_index": {
"name": "NOx index"
},
"pm003_count": {
"name": "PM0.3"
},
"raw_total_volatile_organic_component": {
"name": "Raw VOC"
},
"raw_nitrogen": {
"name": "Raw NOx"
},
"raw_pm02": {
"name": "Raw PM2.5"
},
"display_pm_standard": {
"name": "[%key:component::airgradient::entity::select::display_pm_standard::name%]",
@@ -119,6 +131,26 @@
"us_aqi": "[%key:component::airgradient::entity::select::display_pm_standard::state::us_aqi%]"
}
},
"co2_automatic_baseline_calibration_days": {
"name": "Carbon dioxide automatic baseline calibration"
},
"nox_learning_offset": {
"name": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::name%]"
},
"tvoc_learning_offset": {
"name": "[%key:component::airgradient::entity::select::voc_index_learning_time_offset::name%]"
},
"led_bar_mode": {
"name": "[%key:component::airgradient::entity::select::led_bar_mode::name%]",
"state": {
"off": "[%key:common::state::off%]",
"co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]",
"pm": "[%key:component::airgradient::entity::select::led_bar_mode::state::pm%]"
}
},
"led_bar_brightness": {
"name": "[%key:component::airgradient::entity::number::led_bar_brightness::name%]"
},
"display_temperature_unit": {
"name": "[%key:component::airgradient::entity::select::display_temperature_unit::name%]",
"state": {
@@ -126,40 +158,8 @@
"f": "[%key:component::airgradient::entity::select::display_temperature_unit::state::f%]"
}
},
"led_bar_brightness": {
"name": "[%key:component::airgradient::entity::number::led_bar_brightness::name%]"
},
"led_bar_mode": {
"name": "[%key:component::airgradient::entity::select::led_bar_mode::name%]",
"state": {
"co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]",
"off": "[%key:common::state::off%]",
"pm": "[%key:component::airgradient::entity::select::led_bar_mode::state::pm%]"
}
},
"nitrogen_index": {
"name": "NOx index"
},
"nox_learning_offset": {
"name": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::name%]"
},
"pm003_count": {
"name": "PM0.3"
},
"raw_nitrogen": {
"name": "Raw NOx"
},
"raw_pm02": {
"name": "Raw PM2.5"
},
"raw_total_volatile_organic_component": {
"name": "Raw VOC"
},
"total_volatile_organic_component_index": {
"name": "VOC index"
},
"tvoc_learning_offset": {
"name": "[%key:component::airgradient::entity::select::voc_index_learning_time_offset::name%]"
"display_brightness": {
"name": "[%key:component::airgradient::entity::number::display_brightness::name%]"
}
},
"switch": {

View File

@@ -1,9 +1,7 @@
"""Airgradient Update platform."""
from datetime import timedelta
import logging
from airgradient import AirGradientConnectionError
from propcache.api import cached_property
from homeassistant.components.update import UpdateDeviceClass, UpdateEntity
@@ -15,7 +13,6 @@ from .entity import AirGradientEntity
PARALLEL_UPDATES = 1
SCAN_INTERVAL = timedelta(hours=1)
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
@@ -34,7 +31,6 @@ class AirGradientUpdate(AirGradientEntity, UpdateEntity):
"""Representation of Airgradient Update."""
_attr_device_class = UpdateDeviceClass.FIRMWARE
_server_unreachable_logged = False
def __init__(self, coordinator: AirGradientCoordinator) -> None:
"""Initialize the entity."""
@@ -51,27 +47,10 @@ class AirGradientUpdate(AirGradientEntity, UpdateEntity):
"""Return the installed version of the entity."""
return self.coordinator.data.measures.firmware_version
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self._attr_available
async def async_update(self) -> None:
"""Update the entity."""
try:
self._attr_latest_version = (
await self.coordinator.client.get_latest_firmware_version(
self.coordinator.serial_number
)
self._attr_latest_version = (
await self.coordinator.client.get_latest_firmware_version(
self.coordinator.serial_number
)
except AirGradientConnectionError:
self._attr_latest_version = None
self._attr_available = False
if not self._server_unreachable_logged:
_LOGGER.error(
"Unable to connect to AirGradient server to check for updates"
)
self._server_unreachable_logged = True
else:
self._server_unreachable_logged = False
self._attr_available = True
)
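
One side of this hunk wraps the latest-firmware lookup in a "log only once while unreachable" availability pattern. A compact sketch of that pattern follows; the exception type and the client call are stand-ins, not the integration's real API.

```python
import logging

_LOGGER = logging.getLogger(__name__)


class FirmwareUpdateEntity:
    """Reduced example of the availability/log-once pattern in the hunk above."""

    _attr_available = True
    _server_unreachable_logged = False

    async def async_update(self) -> None:
        try:
            self._attr_latest_version = await self._fetch_latest_version()
        except ConnectionError:
            # Mark unavailable, but only log the outage once until it recovers.
            self._attr_latest_version = None
            self._attr_available = False
            if not self._server_unreachable_logged:
                _LOGGER.error("Unable to reach the update server to check for updates")
                self._server_unreachable_logged = True
        else:
            self._server_unreachable_logged = False
            self._attr_available = True

    async def _fetch_latest_version(self) -> str:
        """Stand-in for the coordinator/client call used by the integration."""
        raise NotImplementedError
```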

View File

@@ -18,10 +18,6 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_USE_NEAREST, DOMAIN, NO_AIRLY_SENSORS
DESCRIPTION_PLACEHOLDERS = {
"developer_registration_url": "https://developer.airly.eu/register",
}
class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
"""Config flow for Airly."""
@@ -89,7 +85,6 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN):
}
),
errors=errors,
description_placeholders=DESCRIPTION_PLACEHOLDERS,
)
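
This hunk toggles between a URL hard-coded in strings.json and a `description_placeholders` mapping supplied by the flow. A reduced, hypothetical flow illustrating the placeholder wiring is sketched below; the schema is trimmed to a single field, and strings.json would then reference `{developer_registration_url}` in the step description.

```python
from typing import Any

import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY

DESCRIPTION_PLACEHOLDERS = {
    "developer_registration_url": "https://developer.airly.eu/register",
}


class ExampleFlowHandler(ConfigFlow, domain="airly"):
    """Reduced flow showing how the placeholder reaches the form description."""

    VERSION = 1

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        # strings.json can then read:
        # "To generate an API key go to {developer_registration_url}"
        return self.async_show_form(
            step_id="user",
            data_schema=vol.Schema({vol.Required(CONF_API_KEY): str}),
            description_placeholders=DESCRIPTION_PLACEHOLDERS,
        )
```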

View File

@@ -1,23 +1,30 @@
{
"config": {
"step": {
"user": {
"description": "To generate API key go to https://developer.airly.eu/register",
"data": {
"name": "[%key:common::config_flow::data::name%]",
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]"
}
}
},
"error": {
"wrong_location": "No Airly measuring stations in this area.",
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_location%]",
"wrong_location": "[%key:component::airly::config::error::wrong_location%]"
},
"error": {
"invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]",
"wrong_location": "No Airly measuring stations in this area."
},
"step": {
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]",
"name": "[%key:common::config_flow::data::name%]"
},
"description": "To generate API key go to {developer_registration_url}"
}
}
},
"system_health": {
"info": {
"can_reach_server": "Reach Airly server",
"requests_remaining": "Remaining allowed requests",
"requests_per_day": "Allowed requests per day"
}
},
"entity": {
@@ -31,18 +38,11 @@
}
},
"exceptions": {
"no_station": {
"message": "An error occurred while retrieving data from the Airly API for {entry}: no measuring stations in this area"
},
"update_error": {
"message": "An error occurred while retrieving data from the Airly API for {entry}: {error}"
}
},
"system_health": {
"info": {
"can_reach_server": "Reach Airly server",
"requests_per_day": "Allowed requests per day",
"requests_remaining": "Remaining allowed requests"
},
"no_station": {
"message": "An error occurred while retrieving data from the Airly API for {entry}: no measuring stations in this area"
}
}
}

View File

@@ -26,10 +26,6 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
# Documentation URL for API key generation
_API_KEY_URL = "https://docs.airnowapi.org/account/request/"
async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> bool:
"""Validate the user input allows us to connect.
@@ -118,7 +114,6 @@ class AirNowConfigFlow(ConfigFlow, domain=DOMAIN):
),
}
),
description_placeholders={"api_key_url": _API_KEY_URL},
errors=errors,
)

View File

@@ -4,15 +4,15 @@
"aqi": {
"default": "mdi:blur"
},
"o3": {
"default": "mdi:blur"
},
"pm10": {
"default": "mdi:blur"
},
"pm25": {
"default": "mdi:blur"
},
"o3": {
"default": "mdi:blur"
},
"station": {
"default": "mdi:blur"
}

View File

@@ -1,7 +1,15 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
"step": {
"user": {
"description": "To generate API key go to https://docs.airnowapi.org/account/request/",
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]",
"radius": "Station radius (miles; optional)"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -9,15 +17,16 @@
"invalid_location": "No results found for that location, try changing the location or station radius.",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
"options": {
"step": {
"user": {
"init": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"latitude": "[%key:common::config_flow::data::latitude%]",
"longitude": "[%key:common::config_flow::data::longitude%]",
"radius": "Station radius (miles; optional)"
},
"description": "To generate API key go to {api_key_url}"
"radius": "Station radius (miles)"
}
}
}
},
@@ -34,14 +43,5 @@
}
}
}
},
"options": {
"step": {
"init": {
"data": {
"radius": "Station radius (miles)"
}
}
}
}
}

View File

@@ -2,23 +2,12 @@
from __future__ import annotations
import logging
from airos.airos8 import AirOS8
from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
CONF_SSL,
CONF_USERNAME,
CONF_VERIFY_SSL,
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator
_PLATFORMS: list[Platform] = [
@@ -26,24 +15,19 @@ _PLATFORMS: list[Platform] = [
Platform.SENSOR,
]
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
"""Set up Ubiquiti airOS from a config entry."""
# By default airOS 8 comes with self-signed SSL certificates,
# with no option in the web UI to change or upload a custom certificate.
session = async_get_clientsession(
hass, verify_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL]
)
session = async_get_clientsession(hass, verify_ssl=False)
airos_device = AirOS8(
host=entry.data[CONF_HOST],
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
session=session,
use_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device)
@@ -56,77 +40,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
return True
async def async_migrate_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
"""Migrate old config entry."""
# This means the user has downgraded from a future version
if entry.version > 2:
return False
# 1.1 Migrate config_entry to add advanced ssl settings
if entry.version == 1 and entry.minor_version == 1:
new_minor_version = 2
new_data = {**entry.data}
advanced_data = {
CONF_SSL: DEFAULT_SSL,
CONF_VERIFY_SSL: DEFAULT_VERIFY_SSL,
}
new_data[SECTION_ADVANCED_SETTINGS] = advanced_data
hass.config_entries.async_update_entry(
entry,
data=new_data,
minor_version=new_minor_version,
)
# 2.1 Migrate binary_sensor entity unique_id from device_id to mac_address
# Step 1 - migrate binary_sensor entity unique_id
# Step 2 - migrate device entity identifier
if entry.version == 1:
new_version = 2
new_minor_version = 1
mac_adress = dr.format_mac(entry.unique_id)
device_registry = dr.async_get(hass)
if device_entry := device_registry.async_get_device(
connections={(dr.CONNECTION_NETWORK_MAC, mac_adress)}
):
old_device_id = next(
(
device_id
for domain, device_id in device_entry.identifiers
if domain == DOMAIN
),
)
@callback
def update_unique_id(
entity_entry: er.RegistryEntry,
) -> dict[str, str] | None:
"""Update unique id from device_id to mac address."""
if old_device_id and entity_entry.unique_id.startswith(old_device_id):
suffix = entity_entry.unique_id.removeprefix(old_device_id)
new_unique_id = f"{mac_adress}{suffix}"
return {"new_unique_id": new_unique_id}
return None
await er.async_migrate_entries(hass, entry.entry_id, update_unique_id)
new_identifiers = device_entry.identifiers.copy()
new_identifiers.discard((DOMAIN, old_device_id))
new_identifiers.add((DOMAIN, mac_adress))
device_registry.async_update_device(
device_entry.id, new_identifiers=new_identifiers
)
hass.config_entries.async_update_entry(
entry, version=new_version, minor_version=new_minor_version
)
return True
async def async_unload_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, _PLATFORMS)

View File

@@ -98,7 +98,7 @@ class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.derived.mac}_{description.key}"
self._attr_unique_id = f"{coordinator.data.host.device_id}_{description.key}"
@property
def is_on(self) -> bool:

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
@@ -15,28 +14,11 @@ from airos.exceptions import (
)
import voluptuous as vol
from homeassistant.config_entries import (
SOURCE_REAUTH,
SOURCE_RECONFIGURE,
ConfigFlow,
ConfigFlowResult,
)
from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
CONF_SSL,
CONF_USERNAME,
CONF_VERIFY_SSL,
)
from homeassistant.data_entry_flow import section
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import (
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .const import DOMAIN
from .coordinator import AirOS8
_LOGGER = logging.getLogger(__name__)
@@ -46,15 +28,6 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
vol.Required(CONF_HOST): str,
vol.Required(CONF_USERNAME, default="ubnt"): str,
vol.Required(CONF_PASSWORD): str,
vol.Required(SECTION_ADVANCED_SETTINGS): section(
vol.Schema(
{
vol.Required(CONF_SSL, default=DEFAULT_SSL): bool,
vol.Required(CONF_VERIFY_SSL, default=DEFAULT_VERIFY_SSL): bool,
}
),
{"collapsed": True},
),
}
)
@@ -62,161 +35,48 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Ubiquiti airOS."""
VERSION = 2
MINOR_VERSION = 1
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOS8
self.errors: dict[str, str] = {}
VERSION = 1
async def async_step_user(
self, user_input: dict[str, Any] | None = None
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle the manual input of host and credentials."""
self.errors = {}
"""Handle the initial step."""
errors: dict[str, str] = {}
if user_input is not None:
validated_info = await self._validate_and_get_device_info(user_input)
if validated_info:
return self.async_create_entry(
title=validated_info["title"],
data=validated_info["data"],
)
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=self.errors
)
# By default airOS 8 comes with self-signed SSL certificates,
# with no option in the web UI to change or upload a custom certificate.
session = async_get_clientsession(self.hass, verify_ssl=False)
async def _validate_and_get_device_info(
self, config_data: dict[str, Any]
) -> dict[str, Any] | None:
"""Validate user input with the device API."""
# By default airOS 8 comes with self-signed SSL certificates,
# with no option in the web UI to change or upload a custom certificate.
session = async_get_clientsession(
self.hass,
verify_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
airos_device = AirOS8(
host=user_input[CONF_HOST],
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
session=session,
)
try:
await airos_device.login()
airos_data = await airos_device.status()
airos_device = AirOS8(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
await airos_device.login()
airos_data = await airos_device.status()
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
):
self.errors["base"] = "cannot_connect"
except (AirOSConnectionAuthenticationError, AirOSDataMissingError):
self.errors["base"] = "invalid_auth"
except AirOSKeyDataMissingError:
self.errors["base"] = "key_data_missing"
except Exception:
_LOGGER.exception("Unexpected exception during credential validation")
self.errors["base"] = "unknown"
else:
await self.async_set_unique_id(airos_data.derived.mac)
if self.source in [SOURCE_REAUTH, SOURCE_RECONFIGURE]:
self._abort_if_unique_id_mismatch()
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
):
errors["base"] = "cannot_connect"
except (AirOSConnectionAuthenticationError, AirOSDataMissingError):
errors["base"] = "invalid_auth"
except AirOSKeyDataMissingError:
errors["base"] = "key_data_missing"
except Exception:
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(airos_data.derived.mac)
self._abort_if_unique_id_configured()
return {"title": airos_data.host.hostname, "data": config_data}
return None
async def async_step_reauth(
self,
user_input: Mapping[str, Any],
) -> ConfigFlowResult:
"""Perform reauthentication upon an API authentication error."""
return await self.async_step_reauth_confirm(user_input)
async def async_step_reauth_confirm(
self,
user_input: Mapping[str, Any],
) -> ConfigFlowResult:
"""Perform reauthentication upon an API authentication error."""
self.errors = {}
if user_input:
validate_data = {**self._get_reauth_entry().data, **user_input}
if await self._validate_and_get_device_info(config_data=validate_data):
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates=validate_data,
return self.async_create_entry(
title=airos_data.host.hostname, data=user_input
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema(
{
vol.Required(CONF_PASSWORD): TextSelector(
TextSelectorConfig(
type=TextSelectorType.PASSWORD,
autocomplete="current-password",
)
),
}
),
errors=self.errors,
)
async def async_step_reconfigure(
self,
user_input: Mapping[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle reconfiguration of airOS."""
self.errors = {}
entry = self._get_reconfigure_entry()
current_data = entry.data
if user_input is not None:
validate_data = {**current_data, **user_input}
if await self._validate_and_get_device_info(config_data=validate_data):
return self.async_update_reload_and_abort(
entry,
data_updates=validate_data,
)
return self.async_show_form(
step_id="reconfigure",
data_schema=vol.Schema(
{
vol.Required(CONF_PASSWORD): TextSelector(
TextSelectorConfig(
type=TextSelectorType.PASSWORD,
autocomplete="current-password",
)
),
vol.Required(SECTION_ADVANCED_SETTINGS): section(
vol.Schema(
{
vol.Required(
CONF_SSL,
default=current_data[SECTION_ADVANCED_SETTINGS][
CONF_SSL
],
): bool,
vol.Required(
CONF_VERIFY_SSL,
default=current_data[SECTION_ADVANCED_SETTINGS][
CONF_VERIFY_SSL
],
): bool,
}
),
{"collapsed": True},
),
}
),
errors=self.errors,
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
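
The schema change in this hunk groups the SSL options into a collapsed "Advanced settings" sub-form. Reassembled from the added and removed lines (with the defaults taken from the const.py hunk below), the sectioned schema reads roughly as follows.

```python
import voluptuous as vol

from homeassistant.const import (
    CONF_HOST,
    CONF_PASSWORD,
    CONF_SSL,
    CONF_USERNAME,
    CONF_VERIFY_SSL,
)
from homeassistant.data_entry_flow import section

SECTION_ADVANCED_SETTINGS = "advanced_settings"
DEFAULT_SSL = True
DEFAULT_VERIFY_SSL = False

STEP_USER_DATA_SCHEMA = vol.Schema(
    {
        vol.Required(CONF_HOST): str,
        vol.Required(CONF_USERNAME, default="ubnt"): str,
        vol.Required(CONF_PASSWORD): str,
        # Collapsed sub-form; submitted values land under
        # entry.data["advanced_settings"] as used in __init__.py and entity.py.
        vol.Required(SECTION_ADVANCED_SETTINGS): section(
            vol.Schema(
                {
                    vol.Required(CONF_SSL, default=DEFAULT_SSL): bool,
                    vol.Required(CONF_VERIFY_SSL, default=DEFAULT_VERIFY_SSL): bool,
                }
            ),
            {"collapsed": True},
        ),
    }
)
```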

View File

@@ -7,8 +7,3 @@ DOMAIN = "airos"
SCAN_INTERVAL = timedelta(minutes=1)
MANUFACTURER = "Ubiquiti"
DEFAULT_VERIFY_SSL = False
DEFAULT_SSL = True
SECTION_ADVANCED_SETTINGS = "advanced_settings"

View File

@@ -14,7 +14,7 @@ from airos.exceptions import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.exceptions import ConfigEntryError
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, SCAN_INTERVAL
@@ -47,9 +47,9 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
try:
await self.airos_device.login()
return await self.airos_device.status()
except AirOSConnectionAuthenticationError as err:
except (AirOSConnectionAuthenticationError,) as err:
_LOGGER.exception("Error authenticating with airOS device")
raise ConfigEntryAuthFailed(
raise ConfigEntryError(
translation_domain=DOMAIN, translation_key="invalid_auth"
) from err
except (

View File

@@ -2,11 +2,11 @@
from __future__ import annotations
from homeassistant.const import CONF_HOST, CONF_SSL
from homeassistant.const import CONF_HOST
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER, SECTION_ADVANCED_SETTINGS
from .const import DOMAIN, MANUFACTURER
from .coordinator import AirOSDataUpdateCoordinator
@@ -20,27 +20,17 @@ class AirOSEntity(CoordinatorEntity[AirOSDataUpdateCoordinator]):
super().__init__(coordinator)
airos_data = self.coordinator.data
url_schema = (
"https"
if coordinator.config_entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL]
else "http"
)
configuration_url: str | None = (
f"{url_schema}://{coordinator.config_entry.data[CONF_HOST]}"
f"https://{coordinator.config_entry.data[CONF_HOST]}"
)
self._attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, airos_data.derived.mac)},
configuration_url=configuration_url,
identifiers={(DOMAIN, airos_data.derived.mac)},
identifiers={(DOMAIN, str(airos_data.host.device_id))},
manufacturer=MANUFACTURER,
model=airos_data.host.devmodel,
model_id=(
sku
if (sku := airos_data.derived.sku) not in ["UNKNOWN", "AMBIGUOUS"]
else None
),
name=airos_data.host.hostname,
sw_version=airos_data.host.fwversion,
)

View File

@@ -4,8 +4,7 @@
"codeowners": ["@CoMPaTech"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["airos==0.6.0"]
"quality_scale": "bronze",
"requirements": ["airos==0.5.1"]
}

View File

@@ -32,11 +32,11 @@ rules:
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
entity-unavailable: todo
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
log-when-unavailable: todo
parallel-updates: todo
reauthentication-flow: todo
test-coverage: done
# Gold
@@ -48,9 +48,9 @@ rules:
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-supported-functions: todo
docs-troubleshooting: done
docs-use-cases: done
docs-use-cases: todo
dynamic-devices: todo
entity-category: done
entity-device-class: done
@@ -60,7 +60,7 @@ rules:
icon-translations:
status: exempt
comment: no (custom) icons used or envisioned
reconfiguration-flow: done
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo

View File

@@ -1,10 +1,19 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unique_id_mismatch": "Re-authentication should be used for the same device not a new one"
"flow_title": "Ubiquiti airOS device",
"step": {
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"host": "IP address or hostname of the airOS device",
"username": "Administrator username for the airOS device, normally 'ubnt'",
"password": "Password configured through the UISP app or web interface"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -12,68 +21,14 @@
"key_data_missing": "Expected data not returned from the device, check the documentation for supported devices",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "Ubiquiti airOS device",
"step": {
"reauth_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
}
},
"reconfigure": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
},
"sections": {
"advanced_settings": {
"data": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data::ssl%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::ssl%]",
"verify_ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::verify_ssl%]"
},
"name": "[%key:component::airos::config::step::user::sections::advanced_settings::name%]"
}
}
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
"password": "[%key:common::config_flow::data::password%]",
"username": "[%key:common::config_flow::data::username%]"
},
"data_description": {
"host": "IP address or hostname of the airOS device",
"password": "Password configured through the UISP app or web interface",
"username": "Administrator username for the airOS device, normally 'ubnt'"
},
"sections": {
"advanced_settings": {
"data": {
"ssl": "Use HTTPS",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"ssl": "Whether the connection should be encrypted (required for most devices)",
"verify_ssl": "Whether the certificate should be verified when using HTTPS. This should be off for self-signed certificates"
},
"name": "Advanced settings"
}
}
}
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
"entity": {
"binary_sensor": {
"dhcp6_server": {
"name": "DHCPv6 server"
"port_forwarding": {
"name": "Port forwarding"
},
"dhcp_client": {
"name": "DHCP client"
@@ -81,8 +36,8 @@
"dhcp_server": {
"name": "DHCP server"
},
"port_forwarding": {
"name": "Port forwarding"
"dhcp6_server": {
"name": "DHCPv6 server"
},
"pppoe": {
"name": "PPPoE link"
@@ -99,27 +54,20 @@
"router": "Router"
}
},
"host_uptime": {
"name": "Uptime"
},
"wireless_antenna_gain": {
"name": "Antenna gain"
},
"wireless_distance": {
"name": "Wireless distance"
"wireless_frequency": {
"name": "Wireless frequency"
},
"wireless_essid": {
"name": "Wireless SSID"
},
"wireless_frequency": {
"name": "Wireless frequency"
"wireless_antenna_gain": {
"name": "Antenna gain"
},
"wireless_mode": {
"name": "Wireless mode",
"state": {
"point_to_multipoint": "Point-to-multipoint",
"point_to_point": "Point-to-point"
}
"wireless_throughput_tx": {
"name": "Throughput transmit (actual)"
},
"wireless_throughput_rx": {
"name": "Throughput receive (actual)"
},
"wireless_polling_dl_capacity": {
"name": "Download capacity"
@@ -130,6 +78,12 @@
"wireless_remote_hostname": {
"name": "Remote hostname"
},
"host_uptime": {
"name": "Uptime"
},
"wireless_distance": {
"name": "Wireless distance"
},
"wireless_role": {
"name": "Wireless role",
"state": {
@@ -137,26 +91,27 @@
"station": "Station"
}
},
"wireless_throughput_rx": {
"name": "Throughput receive (actual)"
},
"wireless_throughput_tx": {
"name": "Throughput transmit (actual)"
"wireless_mode": {
"name": "Wireless mode",
"state": {
"point_to_point": "Point-to-point",
"point_to_multipoint": "Point-to-multipoint"
}
}
}
},
"exceptions": {
"cannot_connect": {
"message": "[%key:common::config_flow::error::cannot_connect%]"
},
"error_data_missing": {
"message": "Data incomplete or missing"
},
"invalid_auth": {
"message": "[%key:common::config_flow::error::invalid_auth%]"
},
"cannot_connect": {
"message": "[%key:common::config_flow::error::cannot_connect%]"
},
"key_data_missing": {
"message": "Key data not returned from device"
},
"error_data_missing": {
"message": "Data incomplete or missing"
}
}
}

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["aioairq"],
"requirements": ["aioairq==0.4.7"]
"requirements": ["aioairq==0.4.6"]
}

View File

@@ -1,21 +1,36 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
"step": {
"user": {
"title": "Identify the device",
"description": "Provide the IP address or mDNS of the device and its password",
"data": {
"ip_address": "[%key:common::config_flow::data::ip%]",
"password": "[%key:common::config_flow::data::password%]"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"invalid_input": "[%key:common::config_flow::error::invalid_host%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
"options": {
"step": {
"user": {
"init": {
"title": "Configure air-Q integration",
"data": {
"ip_address": "[%key:common::config_flow::data::ip%]",
"password": "[%key:common::config_flow::data::password%]"
"return_average": "Show values averaged by the device",
"clip_negatives": "Clip negative values"
},
"description": "Provide the IP address or mDNS of the device and its password",
"title": "Identify the device"
"data_description": {
"return_average": "air-Q allows to poll both the noisy sensor readings as well as the values averaged on the device (default)",
"clip_negatives": "For baseline calibration purposes, certain sensor values may briefly become negative. The default behaviour is to clip such values to 0"
}
}
}
},
@@ -38,11 +53,8 @@
"bromine": {
"name": "Bromine"
},
"carbon_disulfide": {
"name": "Carbon disulfide"
},
"carbon_monoxide": {
"name": "[%key:component::sensor::entity_component::carbon_monoxide::name%]"
"methanethiol": {
"name": "Methanethiol"
},
"chlorine": {
"name": "Chlorine"
@@ -50,6 +62,12 @@
"chlorine_dioxide": {
"name": "Chlorine dioxide"
},
"carbon_disulfide": {
"name": "Carbon disulfide"
},
"carbon_monoxide": {
"name": "[%key:component::sensor::entity_component::carbon_monoxide::name%]"
},
"dew_point": {
"name": "Dew point"
},
@@ -59,51 +77,36 @@
"ethylene": {
"name": "Ethylene"
},
"fluorine": {
"name": "Fluorine"
},
"formaldehyde": {
"name": "Formaldehyde"
},
"health_index": {
"name": "Health index"
"fluorine": {
"name": "Fluorine"
},
"hydrogen_sulfide": {
"name": "Hydrogen sulfide"
},
"hydrochloric_acid": {
"name": "Hydrochloric acid"
},
"hydrogen": {
"name": "Hydrogen"
},
"hydrogen_cyanide": {
"name": "Hydrogen cyanide"
},
"hydrogen_fluoride": {
"name": "Hydrogen fluoride"
},
"health_index": {
"name": "Health index"
},
"hydrogen": {
"name": "Hydrogen"
},
"hydrogen_peroxide": {
"name": "Hydrogen peroxide"
},
"hydrogen_phosphide": {
"name": "Hydrogen phosphide"
},
"hydrogen_sulfide": {
"name": "Hydrogen sulfide"
},
"industrial_volatile_organic_compounds": {
"name": "VOCs (industrial)"
},
"maximum_noise": {
"name": "Noise (maximum)"
},
"methane": {
"name": "Methane"
},
"methanethiol": {
"name": "Methanethiol"
},
"noise": {
"name": "Noise"
},
"organic_acid": {
"name": "Organic acid"
},
@@ -113,39 +116,36 @@
"performance_index": {
"name": "Performance index"
},
"propane": {
"name": "Propane"
},
"radon": {
"name": "Radon"
},
"refigerant": {
"name": "Refrigerant"
"hydrogen_phosphide": {
"name": "Hydrogen phosphide"
},
"relative_pressure": {
"name": "Relative pressure"
},
"propane": {
"name": "Propane"
},
"refigerant": {
"name": "Refrigerant"
},
"silicon_hydride": {
"name": "Silicon hydride"
},
"noise": {
"name": "Noise"
},
"maximum_noise": {
"name": "Noise (maximum)"
},
"radon": {
"name": "Radon"
},
"industrial_volatile_organic_compounds": {
"name": "VOCs (industrial)"
},
"virus_index": {
"name": "Virus index"
}
}
},
"options": {
"step": {
"init": {
"data": {
"clip_negatives": "Clip negative values",
"return_average": "Show values averaged by the device"
},
"data_description": {
"clip_negatives": "For baseline calibration purposes, certain sensor values may briefly become negative. The default behavior is to clip such values to 0",
"return_average": "air-Q allows to poll both the noisy sensor readings as well as the values averaged on the device (default)"
},
"title": "Configure air-Q integration"
}
}
}
}

View File

@@ -23,10 +23,6 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
URL_API_INTEGRATION = {
"url": "https://dashboard.airthings.com/integrations/api-integration"
}
class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Airthings."""
@@ -41,7 +37,11 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_show_form(
step_id="user",
data_schema=STEP_USER_DATA_SCHEMA,
description_placeholders=URL_API_INTEGRATION,
description_placeholders={
"url": (
"https://dashboard.airthings.com/integrations/api-integration"
),
},
)
errors = {}
@@ -65,8 +65,5 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_create_entry(title="Airthings", data=user_input)
return self.async_show_form(
step_id="user",
data_schema=STEP_USER_DATA_SCHEMA,
errors=errors,
description_placeholders=URL_API_INTEGRATION,
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)

View File

@@ -1,36 +1,36 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
"step": {
"user": {
"data": {
"id": "ID",
"secret": "Secret",
"description": "Login at {url} to find your credentials"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"user": {
"data": {
"id": "ID",
"secret": "Secret"
},
"description": "Log in at {url} to find your credentials"
}
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
}
},
"entity": {
"sensor": {
"light": {
"name": "Light"
},
"mold": {
"name": "Mold"
},
"radon": {
"name": "Radon"
},
"light": {
"name": "Light"
},
"virus_risk": {
"name": "Virus Risk"
},
"mold": {
"name": "Mold"
}
}
}

View File

@@ -6,13 +6,8 @@ import dataclasses
import logging
from typing import Any
from airthings_ble import (
AirthingsBluetoothDeviceData,
AirthingsDevice,
UnsupportedDeviceError,
)
from airthings_ble import AirthingsBluetoothDeviceData, AirthingsDevice
from bleak import BleakError
from habluetooth import BluetoothServiceInfoBleak
import voluptuous as vol
from homeassistant.components import bluetooth
@@ -32,7 +27,6 @@ SERVICE_UUIDS = [
"b42e4a8e-ade7-11e4-89d3-123b93f75cba",
"b42e1c08-ade7-11e4-89d3-123b93f75cba",
"b42e3882-ade7-11e4-89d3-123b93f75cba",
"b42e90a2-ade7-11e4-89d3-123b93f75cba",
]
@@ -43,7 +37,6 @@ class Discovery:
name: str
discovery_info: BluetoothServiceInfo
device: AirthingsDevice
data: AirthingsBluetoothDeviceData
def get_name(device: AirthingsDevice) -> str:
@@ -51,7 +44,7 @@ def get_name(device: AirthingsDevice) -> str:
name = device.friendly_name()
if identifier := device.identifier:
name += f" ({device.model.value}{identifier})"
name += f" ({identifier})"
return name
@@ -69,8 +62,8 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
self._discovered_device: Discovery | None = None
self._discovered_devices: dict[str, Discovery] = {}
async def _get_device(
self, data: AirthingsBluetoothDeviceData, discovery_info: BluetoothServiceInfo
async def _get_device_data(
self, discovery_info: BluetoothServiceInfo
) -> AirthingsDevice:
ble_device = bluetooth.async_ble_device_from_address(
self.hass, discovery_info.address
@@ -79,8 +72,10 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.debug("no ble_device in _get_device_data")
raise AirthingsDeviceUpdateError("No ble_device")
airthings = AirthingsBluetoothDeviceData(_LOGGER)
try:
device = await data.update_device(ble_device)
data = await airthings.update_device(ble_device)
except BleakError as err:
_LOGGER.error(
"Error connecting to and getting data from %s: %s",
@@ -88,15 +83,12 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
err,
)
raise AirthingsDeviceUpdateError("Failed getting device data") from err
except UnsupportedDeviceError:
_LOGGER.debug("Skipping unsupported device: %s", discovery_info.name)
raise
except Exception as err:
_LOGGER.error(
"Unknown error occurred from %s: %s", discovery_info.address, err
)
raise
return device
return data
async def async_step_bluetooth(
self, discovery_info: BluetoothServiceInfo
@@ -106,21 +98,17 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(discovery_info.address)
self._abort_if_unique_id_configured()
data = AirthingsBluetoothDeviceData(logger=_LOGGER)
try:
device = await self._get_device(data=data, discovery_info=discovery_info)
device = await self._get_device_data(discovery_info)
except AirthingsDeviceUpdateError:
return self.async_abort(reason="cannot_connect")
except UnsupportedDeviceError:
return self.async_abort(reason="unsupported_device")
except Exception:
_LOGGER.exception("Unknown error occurred")
return self.async_abort(reason="unknown")
name = get_name(device)
self.context["title_placeholders"] = {"name": name}
self._discovered_device = Discovery(name, discovery_info, device, data=data)
self._discovered_device = Discovery(name, discovery_info, device)
return await self.async_step_bluetooth_confirm()
@@ -129,12 +117,6 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
) -> ConfigFlowResult:
"""Confirm discovery."""
if user_input is not None:
if (
self._discovered_device is not None
and self._discovered_device.device.firmware.need_firmware_upgrade
):
return self.async_abort(reason="firmware_upgrade_required")
return self.async_create_entry(
title=self.context["title_placeholders"]["name"], data={}
)
@@ -155,9 +137,6 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
self._abort_if_unique_id_configured()
discovery = self._discovered_devices[address]
if discovery.device.firmware.need_firmware_upgrade:
return self.async_abort(reason="firmware_upgrade_required")
self.context["title_placeholders"] = {
"name": discovery.name,
}
@@ -167,53 +146,32 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_create_entry(title=discovery.name, data={})
current_addresses = self._async_current_ids(include_ignore=False)
devices: list[BluetoothServiceInfoBleak] = []
for discovery_info in async_discovered_service_info(self.hass):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
continue
if MFCT_ID not in discovery_info.manufacturer_data:
continue
if not any(uuid in SERVICE_UUIDS for uuid in discovery_info.service_uuids):
_LOGGER.debug(
"Skipping unsupported device: %s (%s)", discovery_info.name, address
)
continue
devices.append(discovery_info)
for discovery_info in devices:
address = discovery_info.address
data = AirthingsBluetoothDeviceData(logger=_LOGGER)
if not any(uuid in SERVICE_UUIDS for uuid in discovery_info.service_uuids):
continue
try:
device = await self._get_device(data, discovery_info)
device = await self._get_device_data(discovery_info)
except AirthingsDeviceUpdateError:
_LOGGER.error(
"Error connecting to and getting data from %s (%s)",
discovery_info.name,
discovery_info.address,
)
continue
except UnsupportedDeviceError:
_LOGGER.debug(
"Skipping unsupported device: %s (%s)",
discovery_info.name,
discovery_info.address,
)
continue
return self.async_abort(reason="cannot_connect")
except Exception:
_LOGGER.exception("Unknown error occurred")
return self.async_abort(reason="unknown")
name = get_name(device)
_LOGGER.debug("Discovered Airthings device: %s (%s)", name, address)
self._discovered_devices[address] = Discovery(
name, discovery_info, device, data
)
self._discovered_devices[address] = Discovery(name, discovery_info, device)
if not self._discovered_devices:
return self.async_abort(reason="no_devices_found")
titles = {
address: get_name(discovery.device)
address: discovery.device.name
for (address, discovery) in self._discovered_devices.items()
}
return self.async_show_form(

View File

@@ -4,10 +4,10 @@
"radon_1day_avg": {
"default": "mdi:radioactive"
},
"radon_1day_level": {
"radon_longterm_avg": {
"default": "mdi:radioactive"
},
"radon_longterm_avg": {
"radon_1day_level": {
"default": "mdi:radioactive"
},
"radon_longterm_level": {

View File

@@ -17,10 +17,6 @@
{
"manufacturer_id": 820,
"service_uuid": "b42e3882-ade7-11e4-89d3-123b93f75cba"
},
{
"manufacturer_id": 820,
"service_uuid": "b42e90a2-ade7-11e4-89d3-123b93f75cba"
}
],
"codeowners": ["@vincegio", "@LaStrada"],
@@ -28,5 +24,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/airthings_ble",
"iot_class": "local_polling",
"requirements": ["airthings-ble==1.1.1"]
"requirements": ["airthings-ble==0.9.2"]
}

View File

@@ -16,12 +16,10 @@ from homeassistant.components.sensor import (
from homeassistant.const import (
CONCENTRATION_PARTS_PER_BILLION,
CONCENTRATION_PARTS_PER_MILLION,
LIGHT_LUX,
PERCENTAGE,
EntityCategory,
Platform,
UnitOfPressure,
UnitOfSoundPressure,
UnitOfTemperature,
)
from homeassistant.core import HomeAssistant, callback
@@ -114,25 +112,8 @@ SENSORS_MAPPING_TEMPLATE: dict[str, SensorEntityDescription] = {
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
"lux": SensorEntityDescription(
key="lux",
device_class=SensorDeviceClass.ILLUMINANCE,
native_unit_of_measurement=LIGHT_LUX,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
"noise": SensorEntityDescription(
key="noise",
translation_key="ambient_noise",
device_class=SensorDeviceClass.SOUND_PRESSURE,
native_unit_of_measurement=UnitOfSoundPressure.WEIGHTED_DECIBEL_A,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
),
}
PARALLEL_UPDATES = 0
@callback
def async_migrate(hass: HomeAssistant, address: str, sensor_name: str) -> None:

View File

@@ -1,49 +1,41 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"firmware_upgrade_required": "Your device requires a firmware upgrade. Please use the Airthings app (Android/iOS) to upgrade it.",
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]",
"unknown": "[%key:common::config_flow::error::unknown%]",
"unsupported_device": "Unsupported device"
},
"flow_title": "{name}",
"step": {
"bluetooth_confirm": {
"description": "[%key:component::bluetooth::config::step::bluetooth_confirm::description%]"
},
"user": {
"description": "[%key:component::bluetooth::config::step::user::description%]",
"data": {
"address": "[%key:common::config_flow::data::device%]"
},
"data_description": {
"address": "The Airthings devices discovered via Bluetooth."
},
"description": "[%key:component::bluetooth::config::step::user::description%]"
}
},
"bluetooth_confirm": {
"description": "[%key:component::bluetooth::config::step::bluetooth_confirm::description%]"
}
},
"abort": {
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
}
},
"entity": {
"sensor": {
"ambient_noise": {
"name": "Ambient noise"
},
"illuminance": {
"name": "[%key:component::sensor::entity_component::illuminance::name%]"
},
"radon_1day_avg": {
"name": "Radon 1-day average"
},
"radon_1day_level": {
"name": "Radon 1-day level"
},
"radon_longterm_avg": {
"name": "Radon longterm average"
},
"radon_1day_level": {
"name": "Radon 1-day level"
},
"radon_longterm_level": {
"name": "Radon longterm level"
},
"illuminance": {
"name": "[%key:component::sensor::entity_component::illuminance::name%]"
}
}
}

View File

@@ -2,14 +2,17 @@
from airtouch4pyapi import AirTouch
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from .coordinator import AirTouch4ConfigEntry, AirtouchDataUpdateCoordinator
from .coordinator import AirtouchDataUpdateCoordinator
PLATFORMS = [Platform.CLIMATE]
type AirTouch4ConfigEntry = ConfigEntry[AirtouchDataUpdateCoordinator]
async def async_setup_entry(hass: HomeAssistant, entry: AirTouch4ConfigEntry) -> bool:
"""Set up AirTouch4 from a config entry."""
@@ -19,7 +22,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirTouch4ConfigEntry) ->
info = airtouch.GetAcs()
if not info:
raise ConfigEntryNotReady
coordinator = AirtouchDataUpdateCoordinator(hass, entry, airtouch)
coordinator = AirtouchDataUpdateCoordinator(hass, airtouch)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator

View File

@@ -2,34 +2,26 @@
import logging
from airtouch4pyapi import AirTouch
from airtouch4pyapi.airtouch import AirTouchStatus
from homeassistant.components.climate import SCAN_INTERVAL
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
type AirTouch4ConfigEntry = ConfigEntry[AirtouchDataUpdateCoordinator]
class AirtouchDataUpdateCoordinator(DataUpdateCoordinator):
"""Class to manage fetching Airtouch data."""
def __init__(
self, hass: HomeAssistant, entry: AirTouch4ConfigEntry, airtouch: AirTouch
) -> None:
def __init__(self, hass, airtouch):
"""Initialize global Airtouch data updater."""
self.airtouch = airtouch
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_interval=SCAN_INTERVAL,
)
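
One side of this hunk types the config entry with a `type` alias and passes `config_entry=` into the coordinator base class. Reassembled from the interleaved lines, that variant looks roughly like the sketch below; the AirTouch client and SCAN_INTERVAL come from the imports shown in the hunk.

```python
import logging

from airtouch4pyapi import AirTouch

from homeassistant.components.climate import SCAN_INTERVAL
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator

from .const import DOMAIN

_LOGGER = logging.getLogger(__name__)

type AirTouch4ConfigEntry = ConfigEntry[AirtouchDataUpdateCoordinator]


class AirtouchDataUpdateCoordinator(DataUpdateCoordinator):
    """Fetch AirTouch state on the climate platform's scan interval."""

    def __init__(
        self, hass: HomeAssistant, entry: AirTouch4ConfigEntry, airtouch: AirTouch
    ) -> None:
        self.airtouch = airtouch
        super().__init__(
            hass,
            _LOGGER,
            config_entry=entry,  # associates the coordinator with its config entry
            name=DOMAIN,
            update_interval=SCAN_INTERVAL,
        )
```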

Some files were not shown because too many files have changed in this diff.