Mirror of https://github.com/home-assistant/core.git, synced 2025-09-24 12:29:31 +00:00

Compare commits: 1 commit (light_targ...openai-mod)

Commit 099a480e57
@@ -1,77 +0,0 @@
---
name: quality-scale-rule-verifier
description: |
  Use this agent when you need to verify that a Home Assistant integration follows a specific quality scale rule. This includes checking if the integration implements required patterns, configurations, or code structures defined by the quality scale system.

  <example>
  Context: The user wants to verify if an integration follows a specific quality scale rule.
  user: "Check if the peblar integration follows the config-flow rule"
  assistant: "I'll use the quality scale rule verifier to check if the peblar integration properly implements the config-flow rule."
  <commentary>
  Since the user is asking to verify a quality scale rule implementation, use the quality-scale-rule-verifier agent.
  </commentary>
  </example>

  <example>
  Context: The user is reviewing if an integration reaches a specific quality scale level.
  user: "Verify that this integration reaches the bronze quality scale"
  assistant: "Let me use the quality scale rule verifier to check the bronze quality scale implementation."
  <commentary>
  The user wants to verify the integration has reached a certain quality level, so use multiple quality-scale-rule-verifier agents to verify each bronze rule.
  </commentary>
  </example>
model: inherit
color: yellow
tools: Read, Bash, Grep, Glob, WebFetch
---

You are an expert Home Assistant integration quality scale auditor specializing in verifying compliance with specific quality scale rules. You have deep knowledge of Home Assistant's architecture, best practices, and the quality scale system that ensures integration consistency and reliability.

You will verify if an integration follows a specific quality scale rule by:

1. **Fetching Rule Documentation**: Retrieve the official rule documentation from
   `https://raw.githubusercontent.com/home-assistant/developers.home-assistant/refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md`,
   where `{rule_name}` is the rule identifier (e.g., 'config-flow', 'entity-unique-id', 'parallel-updates').

2. **Understanding Rule Requirements**: Parse the rule documentation to identify:
   - Core requirements and mandatory implementations
   - Specific code patterns or configurations required
   - Common violations and anti-patterns
   - Exemption criteria (when a rule might not apply)
   - The quality tier this rule belongs to (Bronze, Silver, Gold, Platinum)

3. **Analyzing Integration Code**: Examine the integration's codebase at `homeassistant/components/<integration domain>`, focusing on:
   - `manifest.json` for quality scale declaration and configuration
   - `quality_scale.yaml` for rule status (done, todo, exempt)
   - Relevant Python modules based on the rule requirements
   - Configuration files and service definitions as needed

4. **Verification Process**:
   - Check if the rule is marked as 'done', 'todo', or 'exempt' in quality_scale.yaml
   - If marked 'exempt', verify the exemption reason is valid
   - If marked 'done', verify the actual implementation matches requirements
   - Identify specific files and code sections that demonstrate compliance or violations
   - Consider the integration's declared quality tier when applying rules
   - To fetch the integration docs, use WebFetch to fetch from `https://raw.githubusercontent.com/home-assistant/home-assistant.io/refs/heads/current/source/_integrations/<integration domain>.markdown`
   - To fetch information about a PyPI package, use the URL `https://pypi.org/pypi/<package>/json`

5. **Reporting Findings**: Provide a comprehensive verification report that includes:
   - **Rule Summary**: Brief description of what the rule requires
   - **Compliance Status**: Clear pass/fail/exempt determination
   - **Evidence**: Specific code examples showing compliance or violations
   - **Issues Found**: Detailed list of any non-compliance issues with file locations
   - **Recommendations**: Actionable steps to achieve compliance if needed
   - **Exemption Analysis**: If applicable, whether the exemption is justified

When examining code, you will:
- Look for exact implementation patterns specified in the rule
- Verify all required components are present and properly configured
- Check for common mistakes and anti-patterns
- Consider edge cases and error handling requirements
- Validate that implementations follow Home Assistant conventions

You will be thorough but focused, examining only the aspects relevant to the specific rule being verified. You will provide clear, actionable feedback that helps developers understand both what needs to be fixed and why it matters for integration quality.

If you cannot access the rule documentation or find the integration code, clearly state what information is missing and what you would need to complete the verification.

Remember that quality scale rules are cumulative: Bronze rules apply to all integrations with a quality scale, Silver rules apply to Silver+ integrations, and so on. Always consider the integration's target quality level when determining which rules should be enforced.
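Stripped of the agent framing, the deleted verifier boils down to three mechanical operations: fetch the rule documentation, read the rule's declared status out of `quality_scale.yaml`, and look up requirement metadata on PyPI. A minimal sketch of that flow follows, using only the URLs named in the agent definition; the helper names are invented for illustration, PyYAML is assumed to be available, and the file paths assume the script runs from a Home Assistant core checkout.

```python
"""Minimal sketch of the verifier's fetch-and-check flow (illustrative only)."""
import json
import urllib.request

import yaml  # PyYAML, assumed installed in the dev environment

RULE_DOC_URL = (
    "https://raw.githubusercontent.com/home-assistant/developers.home-assistant/"
    "refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md"
)


def fetch_rule_doc(rule_name: str) -> str:
    """Step 1: retrieve the official rule documentation."""
    with urllib.request.urlopen(RULE_DOC_URL.format(rule_name=rule_name)) as resp:
        return resp.read().decode()


def rule_status(domain: str, rule_name: str) -> str:
    """Step 4: read the rule's declared status from quality_scale.yaml.

    Entries are either a bare status string ('done', 'todo', 'exempt') or a
    mapping with a 'status' key; both forms are handled here.
    """
    path = f"homeassistant/components/{domain}/quality_scale.yaml"
    with open(path, encoding="utf-8") as file:
        rules = yaml.safe_load(file).get("rules", {})
    entry = rules.get(rule_name, "todo")
    return entry if isinstance(entry, str) else entry.get("status", "todo")


def pypi_info(package: str) -> dict:
    """Fetch PyPI metadata for a requirement, per the URL in the agent spec."""
    with urllib.request.urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Example from the agent description: how does peblar declare config-flow?
    print(rule_status("peblar", "config-flow"))
```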
@@ -8,8 +8,6 @@
    "PYTHONASYNCIODEBUG": "1"
  },
  "features": {
    // Node feature required for Claude Code until fixed https://github.com/anthropics/devcontainer-features/issues/28
    "ghcr.io/devcontainers/features/node:1": {},
    "ghcr.io/anthropics/devcontainer-features/claude-code:1.0": {},
    "ghcr.io/devcontainers/features/github-cli:1": {}
  },
@@ -14,8 +14,7 @@ tests

# Other virtualization methods
venv
.venv
.vagrant

# Temporary files
**/__pycache__
**/__pycache__
.github/PULL_REQUEST_TEMPLATE.md (vendored, 5 changes)
@@ -55,12 +55,8 @@
  creating the PR. If you're unsure about any of them, don't hesitate to ask.
  We're here to help! This is simply a reminder of what we are going to look
  for before merging your code.

  AI tools are welcome, but contributors are responsible for *fully*
  understanding the code before submitting a PR.
-->

- [ ] I understand the code I am submitting and can explain how it works.
- [ ] The code change is tested and works locally.
- [ ] Local tests pass. **Your PR cannot be merged unless tests pass**
- [ ] There is no commented out code in this PR.
@@ -68,7 +64,6 @@
- [ ] I have followed the [perfect PR recommendations][perfect-pr]
- [ ] The code has been formatted using Ruff (`ruff format homeassistant tests`)
- [ ] Tests have been added to verify that the new code works.
- [ ] Any generated code has been carefully reviewed for correctness and compliance with project standards.

If user exposed functionality or configuration variables are added/changed:
.github/copilot-instructions.md (vendored, 19 changes)
@@ -1073,11 +1073,7 @@ async def test_flow_connection_error(hass, mock_api_error):

### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
    """Overridden fixture to specify platforms to test."""
    return [Platform.SENSOR]  # Or another specific platform as needed.

@pytest.mark.parametrize("init_integration", [Platform.SENSOR], indirect=True)
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
    hass: HomeAssistant,
@@ -1124,25 +1120,16 @@ def mock_device_api() -> Generator[MagicMock]:
    )
    yield api

@pytest.fixture
def platforms() -> list[Platform]:
    """Fixture to specify platforms to test."""
    return PLATFORMS

@pytest.fixture
async def init_integration(
    hass: HomeAssistant,
    mock_config_entry: MockConfigEntry,
    mock_device_api: MagicMock,
    platforms: list[Platform],
) -> MockConfigEntry:
    """Set up the integration for testing."""
    mock_config_entry.add_to_hass(hass)

    with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
        await hass.config_entries.async_setup(mock_config_entry.entry_id)
        await hass.async_block_till_done()

    await hass.config_entries.async_setup(mock_config_entry.entry_id)
    await hass.async_block_till_done()
    return mock_config_entry
```
.github/workflows/builder.yml (vendored, 48 changes)
@@ -27,12 +27,12 @@ jobs:
      publish: ${{ steps.version.outputs.publish }}
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
        with:
          fetch-depth: 0

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

@@ -69,7 +69,7 @@ jobs:
        run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -

      - name: Upload translations
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: translations
          path: translations.tar.gz
@@ -90,11 +90,11 @@ jobs:
      arch: ${{ fromJson(needs.init.outputs.architectures) }}
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Download nightly wheels of frontend
        if: needs.init.outputs.channel == 'dev'
        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        uses: dawidd6/action-download-artifact@v11
        with:
          github_token: ${{secrets.GITHUB_TOKEN}}
          repo: home-assistant/frontend
@@ -105,7 +105,7 @@ jobs:

      - name: Download nightly wheels of intents
        if: needs.init.outputs.channel == 'dev'
        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        uses: dawidd6/action-download-artifact@v11
        with:
          github_token: ${{secrets.GITHUB_TOKEN}}
          repo: OHF-Voice/intents-package
@@ -116,7 +116,7 @@ jobs:

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        if: needs.init.outputs.channel == 'dev'
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

@@ -175,7 +175,7 @@ jobs:
          sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt

      - name: Download translations
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          name: translations

@@ -190,13 +190,12 @@ jobs:
          echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE

      - name: Login to GitHub Container Registry
        uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # home-assistant/builder doesn't support sha pinning
      - name: Build base image
        uses: home-assistant/builder@2025.03.0
        with:
@@ -243,7 +242,7 @@ jobs:
      - green
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Set build additional args
        run: |
@@ -257,13 +256,12 @@ jobs:
          fi

      - name: Login to GitHub Container Registry
        uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # home-assistant/builder doesn't support sha pinning
      - name: Build base image
        uses: home-assistant/builder@2025.03.0
        with:
@@ -281,7 +279,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Initialize git
        uses: home-assistant/actions/helpers/git-init@master
@@ -323,23 +321,23 @@ jobs:
      registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Install Cosign
        uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
        uses: sigstore/cosign-installer@v3.9.2
        with:
          cosign-release: "v2.2.3"

      - name: Login to DockerHub
        if: matrix.registry == 'docker.io/homeassistant'
        uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
        uses: docker/login-action@v3.4.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Login to GitHub Container Registry
        if: matrix.registry == 'ghcr.io/home-assistant'
        uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
@@ -456,15 +454,15 @@ jobs:
    if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

      - name: Download translations
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          name: translations

@@ -482,7 +480,7 @@ jobs:
          python -m build

      - name: Upload package to PyPI
        uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
        uses: pypa/gh-action-pypi-publish@v1.12.4
        with:
          skip-existing: true

@@ -501,10 +499,10 @@ jobs:
      HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Login to GitHub Container Registry
        uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
@@ -533,7 +531,7 @@ jobs:

      - name: Generate artifact attestation
        if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
        uses: actions/attest-build-provenance@977bb373ede98d70efdf65b84cb5f73e068dcc2a # v3.0.0
        uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
        with:
          subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
          subject-digest: ${{ steps.push.outputs.digest }}
.github/workflows/ci.yaml (vendored, 303 changes)
@@ -37,10 +37,10 @@ on:
        type: boolean

env:
  CACHE_VERSION: 8
  CACHE_VERSION: 4
  UV_CACHE_VERSION: 1
  MYPY_CACHE_VERSION: 1
  HA_SHORT_VERSION: "2025.10"
  HA_SHORT_VERSION: "2025.8"
  DEFAULT_PYTHON: "3.13"
  ALL_PYTHON_VERSIONS: "['3.13']"
  # 10.3 is the oldest supported version
@@ -61,9 +61,6 @@ env:
  POSTGRESQL_VERSIONS: "['postgres:12.14','postgres:15.2']"
  PRE_COMMIT_CACHE: ~/.cache/pre-commit
  UV_CACHE_DIR: /tmp/uv-cache
  APT_CACHE_BASE: /home/runner/work/apt
  APT_CACHE_DIR: /home/runner/work/apt/cache
  APT_LIST_CACHE_DIR: /home/runner/work/apt/lists
  SQLALCHEMY_WARN_20: 1
  PYTHONASYNCIODEBUG: 1
  HASS_CI: 1
@@ -81,7 +78,6 @@ jobs:
      core: ${{ steps.core.outputs.changes }}
      integrations_glob: ${{ steps.info.outputs.integrations_glob }}
      integrations: ${{ steps.integrations.outputs.changes }}
      apt_cache_key: ${{ steps.generate_apt_cache_key.outputs.key }}
      pre-commit_cache_key: ${{ steps.generate_pre-commit_cache_key.outputs.key }}
      python_cache_key: ${{ steps.generate_python_cache_key.outputs.key }}
      requirements: ${{ steps.core.outputs.requirements }}
@@ -98,7 +94,7 @@ jobs:
    runs-on: ubuntu-24.04
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Generate partial Python venv restore key
        id: generate_python_cache_key
        run: |
@@ -115,12 +111,8 @@ jobs:
        run: >-
          echo "key=pre-commit-${{ env.CACHE_VERSION }}-${{
          hashFiles('.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
      - name: Generate partial apt restore key
        id: generate_apt_cache_key
        run: |
          echo "key=$(lsb_release -rs)-apt-${{ env.CACHE_VERSION }}-${{ env.HA_SHORT_VERSION }}" >> $GITHUB_OUTPUT
      - name: Filter for core changes
        uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        uses: dorny/paths-filter@v3.0.2
        id: core
        with:
          filters: .core_files.yaml
@@ -135,7 +127,7 @@ jobs:
          echo "Result:"
          cat .integration_paths.yaml
      - name: Filter for integration changes
        uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        uses: dorny/paths-filter@v3.0.2
        id: integrations
        with:
          filters: .integration_paths.yaml
@@ -254,16 +246,16 @@ jobs:
      - info
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache@v4.2.3
        with:
          path: venv
          key: >-
@@ -279,7 +271,7 @@ jobs:
          uv pip install "$(cat requirements_test.txt | grep pre-commit)"
      - name: Restore pre-commit environment from cache
        id: cache-precommit
        uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache@v4.2.3
        with:
          path: ${{ env.PRE_COMMIT_CACHE }}
          lookup-only: true
@@ -300,16 +292,16 @@ jobs:
      - pre-commit
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        id: python
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -318,7 +310,7 @@ jobs:
          needs.info.outputs.pre-commit_cache_key }}
      - name: Restore pre-commit environment from cache
        id: cache-precommit
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: ${{ env.PRE_COMMIT_CACHE }}
          fail-on-cache-miss: true
@@ -340,16 +332,16 @@ jobs:
      - pre-commit
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        id: python
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -358,7 +350,7 @@ jobs:
          needs.info.outputs.pre-commit_cache_key }}
      - name: Restore pre-commit environment from cache
        id: cache-precommit
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: ${{ env.PRE_COMMIT_CACHE }}
          fail-on-cache-miss: true
@@ -380,16 +372,16 @@ jobs:
      - pre-commit
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        id: python
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -398,7 +390,7 @@ jobs:
          needs.info.outputs.pre-commit_cache_key }}
      - name: Restore pre-commit environment from cache
        id: cache-precommit
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: ${{ env.PRE_COMMIT_CACHE }}
          fail-on-cache-miss: true
@@ -470,7 +462,7 @@ jobs:
      - script/hassfest/docker/Dockerfile
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Register hadolint problem matcher
        run: |
          echo "::add-matcher::.github/workflows/matchers/hadolint.json"
@@ -489,10 +481,10 @@ jobs:
        python-version: ${{ fromJSON(needs.info.outputs.python_versions) }}
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
@@ -505,7 +497,7 @@ jobs:
          env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache@v4.2.3
        with:
          path: venv
          key: >-
@@ -513,7 +505,7 @@ jobs:
          needs.info.outputs.python_cache_key }}
      - name: Restore uv wheel cache
        if: steps.cache-venv.outputs.cache-hit != 'true'
        uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache@v4.2.3
        with:
          path: ${{ env.UV_CACHE_DIR }}
          key: >-
@@ -523,36 +515,15 @@ jobs:
          ${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-uv-${{
          env.UV_CACHE_VERSION }}-${{ steps.generate-uv-key.outputs.version }}-${{
          env.HA_SHORT_VERSION }}-
      - name: Restore apt cache
        if: steps.cache-venv.outputs.cache-hit != 'true'
        id: cache-apt
        uses: actions/cache@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        if: steps.cache-venv.outputs.cache-hit != 'true'
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          if [[ "${{ steps.cache-apt.outputs.cache-hit }}" != 'true' ]]; then
            mkdir -p ${{ env.APT_CACHE_DIR }}
            mkdir -p ${{ env.APT_LIST_CACHE_DIR }}
          fi

          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
            libxml2-utils \
            libavcodec-dev \
            libavdevice-dev \
            libavfilter-dev \
@@ -562,10 +533,6 @@ jobs:
            libswresample-dev \
            libswscale-dev \
            libudev-dev

          if [[ "${{ steps.cache-apt.outputs.cache-hit }}" != 'true' ]]; then
            sudo chmod -R 755 ${{ env.APT_CACHE_BASE }}
          fi
      - name: Create Python virtual environment
        if: steps.cache-venv.outputs.cache-hit != 'true'
        run: |
@@ -585,7 +552,7 @@ jobs:
          python --version
          uv pip freeze >> pip_freeze.txt
      - name: Upload pip_freeze artifact
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pip-freeze-${{ matrix.python-version }}
          path: pip_freeze.txt
@@ -610,37 +577,23 @@ jobs:
      - info
      - base
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            libturbojpeg
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -664,16 +617,16 @@ jobs:
      - base
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -698,9 +651,9 @@ jobs:
      && github.event_name == 'pull_request'
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Dependency review
        uses: actions/dependency-review-action@595b5aeba73380359d98a5e087f648dbb0edce1b # v4.7.3
        uses: actions/dependency-review-action@v4.7.1
        with:
          license-check: false # We use our own license audit checks

@@ -721,16 +674,16 @@ jobs:
        python-version: ${{ fromJson(needs.info.outputs.python_versions) }}
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
      - name: Restore full Python ${{ matrix.python-version }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -742,7 +695,7 @@ jobs:
          . venv/bin/activate
          python -m script.licenses extract --output-file=licenses-${{ matrix.python-version }}.json
      - name: Upload licenses
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
          path: licenses-${{ matrix.python-version }}.json
@@ -764,16 +717,16 @@ jobs:
      - base
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -811,16 +764,16 @@ jobs:
      - base
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -856,10 +809,10 @@ jobs:
      - base
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
@@ -872,7 +825,7 @@ jobs:
          env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
      - name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -880,7 +833,7 @@ jobs:
          ${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
          needs.info.outputs.python_cache_key }}
      - name: Restore mypy cache
        uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache@v4.2.3
        with:
          path: .mypy_cache
          key: >-
@@ -923,40 +876,26 @@ jobs:
      - mypy
    name: Split tests for full run
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
            libgammu-dev
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
      - name: Restore base Python virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -968,7 +907,7 @@ jobs:
          . venv/bin/activate
          python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests
      - name: Upload pytest_buckets
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pytest_buckets
          path: pytest_buckets.txt
@@ -997,41 +936,27 @@ jobs:
    name: >-
      Run tests Python ${{ matrix.python-version }} (${{ matrix.group }})
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
            libgammu-dev \
            libxml2-utils
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
      - name: Restore full Python ${{ matrix.python-version }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -1045,7 +970,7 @@ jobs:
        run: |
          echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
      - name: Download pytest_buckets
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          name: pytest_buckets
      - name: Compile English translations
@@ -1084,14 +1009,14 @@ jobs:
          2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
      - name: Upload pytest output
        if: success() || failure() && steps.pytest-full.conclusion == 'failure'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
          path: pytest-*.txt
          overwrite: true
      - name: Upload coverage artifact
        if: needs.info.outputs.skip_coverage != 'true'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
          path: coverage.xml
@@ -1104,7 +1029,7 @@ jobs:
          mv "junit.xml-tmp" "junit.xml"
      - name: Upload test results artifact
        if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
          path: junit.xml
@@ -1144,41 +1069,27 @@ jobs:
    name: >-
      Run ${{ matrix.mariadb-group }} tests Python ${{ matrix.python-version }}
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
            libmariadb-dev-compat \
            libxml2-utils
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
      - name: Restore full Python ${{ matrix.python-version }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -1237,7 +1148,7 @@ jobs:
          2>&1 | tee pytest-${{ matrix.python-version }}-${mariadb}.txt
      - name: Upload pytest output
        if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.mariadb }}
@@ -1245,7 +1156,7 @@ jobs:
          overwrite: true
      - name: Upload coverage artifact
        if: needs.info.outputs.skip_coverage != 'true'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: coverage-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.mariadb }}
@@ -1259,7 +1170,7 @@ jobs:
          mv "junit.xml-tmp" "junit.xml"
      - name: Upload test results artifact
        if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: test-results-mariadb-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.mariadb }}
@@ -1298,25 +1209,11 @@ jobs:
    name: >-
      Run ${{ matrix.postgresql-group }} tests Python ${{ matrix.python-version }}
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
@@ -1325,16 +1222,16 @@ jobs:
          sudo apt-get -y install \
            postgresql-server-dev-14
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
      - name: Restore full Python ${{ matrix.python-version }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -1394,7 +1291,7 @@ jobs:
          2>&1 | tee pytest-${{ matrix.python-version }}-${postgresql}.txt
      - name: Upload pytest output
        if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.postgresql }}
@@ -1402,7 +1299,7 @@ jobs:
          overwrite: true
      - name: Upload coverage artifact
        if: needs.info.outputs.skip_coverage != 'true'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: coverage-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.postgresql }}
@@ -1416,7 +1313,7 @@ jobs:
          mv "junit.xml-tmp" "junit.xml"
      - name: Upload test results artifact
        if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: test-results-postgres-${{ matrix.python-version }}-${{
            steps.pytest-partial.outputs.postgresql }}
@@ -1437,14 +1334,14 @@ jobs:
    timeout-minutes: 10
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Download all coverage artifacts
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          pattern: coverage-*
      - name: Upload coverage to Codecov
        if: needs.info.outputs.test_full_suite == 'true'
        uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
        uses: codecov/codecov-action@v5.4.3
        with:
          fail_ci_if_error: true
          flags: full-suite
@@ -1473,41 +1370,27 @@ jobs:
    name: >-
      Run tests Python ${{ matrix.python-version }} (${{ matrix.group }})
    steps:
      - name: Restore apt cache
        uses: actions/cache/restore@v4.2.4
        with:
          path: |
            ${{ env.APT_CACHE_DIR }}
            ${{ env.APT_LIST_CACHE_DIR }}
          fail-on-cache-miss: true
          key: >-
            ${{ runner.os }}-${{ runner.arch }}-${{ needs.info.outputs.apt_cache_key }}
      - name: Install additional OS dependencies
        timeout-minutes: 10
        run: |
          sudo rm /etc/apt/sources.list.d/microsoft-prod.list
          sudo apt-get update \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }}
          sudo apt-get update
          sudo apt-get -y install \
            -o Dir::Cache=${{ env.APT_CACHE_DIR }} \
            -o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
            bluez \
            ffmpeg \
            libturbojpeg \
            libgammu-dev \
            libxml2-utils
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Set up Python ${{ matrix.python-version }}
        id: python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ matrix.python-version }}
          check-latest: true
      - name: Restore full Python ${{ matrix.python-version }} virtual environment
        id: cache-venv
        uses: actions/cache/restore@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
        uses: actions/cache/restore@v4.2.3
        with:
          path: venv
          fail-on-cache-miss: true
@@ -1563,14 +1446,14 @@ jobs:
          2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
      - name: Upload pytest output
        if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
          path: pytest-*.txt
          overwrite: true
      - name: Upload coverage artifact
        if: needs.info.outputs.skip_coverage != 'true'
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
          path: coverage.xml
@@ -1583,7 +1466,7 @@ jobs:
          mv "junit.xml-tmp" "junit.xml"
      - name: Upload test results artifact
        if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        uses: actions/upload-artifact@v4.6.2
        with:
          name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
          path: junit.xml
@@ -1601,14 +1484,14 @@ jobs:
    timeout-minutes: 10
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
      - name: Download all coverage artifacts
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          pattern: coverage-*
      - name: Upload coverage to Codecov
        if: needs.info.outputs.test_full_suite == 'false'
        uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
        uses: codecov/codecov-action@v5.4.3
        with:
          fail_ci_if_error: true
          token: ${{ secrets.CODECOV_TOKEN }}
@@ -1628,11 +1511,11 @@ jobs:
    timeout-minutes: 10
    steps:
      - name: Download all coverage artifacts
        uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          pattern: test-results-*
      - name: Upload test results to Codecov
        uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
        uses: codecov/test-results-action@v1
        with:
          fail_ci_if_error: true
          verbose: true
.github/workflows/codeql.yml (vendored, 6 changes)
@@ -21,14 +21,14 @@ jobs:

    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Initialize CodeQL
        uses: github/codeql-action/init@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
        uses: github/codeql-action/init@v3.29.4
        with:
          languages: python

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
        uses: github/codeql-action/analyze@v3.29.4
        with:
          category: "/language:python"
@@ -16,7 +16,7 @@
    steps:
      - name: Check if integration label was added and extract details
        id: extract
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        with:
          script: |
            // Debug: Log the event payload
@@ -113,7 +113,7 @@
      - name: Fetch similar issues
        id: fetch_similar
        if: steps.extract.outputs.should_continue == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          INTEGRATION_LABELS: ${{ steps.extract.outputs.integration_labels }}
          CURRENT_NUMBER: ${{ steps.extract.outputs.current_number }}
@@ -231,7 +231,7 @@
      - name: Detect duplicates using AI
        id: ai_detection
        if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
        uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
        uses: actions/ai-inference@v1.2.3
        with:
          model: openai/gpt-4o
          system-prompt: |
@@ -280,7 +280,7 @@
      - name: Post duplicate detection results
        id: post_results
        if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          AI_RESPONSE: ${{ steps.ai_detection.outputs.response }}
          SIMILAR_ISSUES: ${{ steps.fetch_similar.outputs.similar_issues }}
@@ -16,7 +16,7 @@
    steps:
      - name: Check issue language
        id: detect_language
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          ISSUE_NUMBER: ${{ github.event.issue.number }}
          ISSUE_TITLE: ${{ github.event.issue.title }}
@@ -57,7 +57,7 @@
      - name: Detect language using AI
        id: ai_language_detection
        if: steps.detect_language.outputs.should_continue == 'true'
        uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
        uses: actions/ai-inference@v1.2.3
        with:
          model: openai/gpt-4o-mini
          system-prompt: |
@@ -90,7 +90,7 @@

      - name: Process non-English issues
        if: steps.detect_language.outputs.should_continue == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          AI_RESPONSE: ${{ steps.ai_language_detection.outputs.response }}
          ISSUE_NUMBER: ${{ steps.detect_language.outputs.issue_number }}
.github/workflows/lock.yml (vendored, 2 changes)
@@ -10,7 +10,7 @@ jobs:
    if: github.repository_owner == 'home-assistant'
    runs-on: ubuntu-latest
    steps:
      - uses: dessant/lock-threads@1bf7ec25051fe7c00bdd17e6a7cf3d7bfb7dc771 # v5.0.1
      - uses: dessant/lock-threads@v5.0.1
        with:
          github-token: ${{ github.token }}
          issue-inactive-days: "30"
.github/workflows/restrict-task-creation.yml (vendored, 4 changes)
@@ -9,10 +9,10 @@ jobs:
  check-authorization:
    runs-on: ubuntu-latest
    # Only run if this is a Task issue type (from the issue form)
    if: github.event.issue.type.name == 'Task'
    if: github.event.issue.issue_type == 'Task'
    steps:
      - name: Check if user is authorized
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7
        with:
          script: |
            const issueAuthor = context.payload.issue.user.login;
.github/workflows/stale.yml (vendored, 6 changes)
@@ -17,7 +17,7 @@ jobs:
      # - No PRs marked as no-stale
      # - No issues (-1)
      - name: 60 days stale PRs policy
        uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
        uses: actions/stale@v9.1.0
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          days-before-stale: 60
@@ -57,7 +57,7 @@ jobs:
      # - No issues marked as no-stale or help-wanted
      # - No PRs (-1)
      - name: 90 days stale issues
        uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
        uses: actions/stale@v9.1.0
        with:
          repo-token: ${{ steps.token.outputs.token }}
          days-before-stale: 90
@@ -87,7 +87,7 @@ jobs:
      # - No Issues marked as no-stale or help-wanted
      # - No PRs (-1)
      - name: Needs more information stale issues policy
        uses: actions/stale@3a9db7e6a41a89f618792c92c0e97cc736e1b13f # v10.0.0
        uses: actions/stale@v9.1.0
        with:
          repo-token: ${{ steps.token.outputs.token }}
          only-labels: "needs-more-information"
.github/workflows/translations.yml (vendored, 4 changes)

@@ -19,10 +19,10 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout the repository
-       uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+       uses: actions/checkout@v4.2.2

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
-       uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+       uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

.github/workflows/wheels.yml (vendored, 36 changes)

@@ -32,11 +32,11 @@ jobs:
      architectures: ${{ steps.info.outputs.architectures }}
    steps:
      - name: Checkout the repository
-       uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+       uses: actions/checkout@v4.2.2

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        id: python
-       uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
+       uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          check-latest: true
@@ -91,7 +91,7 @@ jobs:
          ) > build_constraints.txt

      - name: Upload env_file
-       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+       uses: actions/upload-artifact@v4.6.2
        with:
          name: env_file
          path: ./.env_file
@@ -99,14 +99,14 @@ jobs:
          overwrite: true

      - name: Upload build_constraints
-       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+       uses: actions/upload-artifact@v4.6.2
        with:
          name: build_constraints
          path: ./build_constraints.txt
          overwrite: true

      - name: Upload requirements_diff
-       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+       uses: actions/upload-artifact@v4.6.2
        with:
          name: requirements_diff
          path: ./requirements_diff.txt
@@ -118,7 +118,7 @@ jobs:
          python -m script.gen_requirements_all ci

      - name: Upload requirements_all_wheels
-       uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+       uses: actions/upload-artifact@v4.6.2
        with:
          name: requirements_all_wheels
          path: ./requirements_all_wheels_*.txt
@@ -135,20 +135,20 @@ jobs:
        arch: ${{ fromJson(needs.init.outputs.architectures) }}
    steps:
      - name: Checkout the repository
-       uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+       uses: actions/checkout@v4.2.2

      - name: Download env_file
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: env_file

      - name: Download build_constraints
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: build_constraints

      - name: Download requirements_diff
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: requirements_diff

@@ -158,9 +158,8 @@ jobs:
        sed -i "/uv/d" requirements.txt
        sed -i "/uv/d" requirements_diff.txt

-     # home-assistant/wheels doesn't support sha pinning
      - name: Build wheels
-       uses: home-assistant/wheels@2025.07.0
+       uses: home-assistant/wheels@2025.03.0
        with:
          abi: ${{ matrix.abi }}
          tag: musllinux_1_2
@@ -185,25 +184,25 @@ jobs:
        arch: ${{ fromJson(needs.init.outputs.architectures) }}
    steps:
      - name: Checkout the repository
-       uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
+       uses: actions/checkout@v4.2.2

      - name: Download env_file
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: env_file

      - name: Download build_constraints
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: build_constraints

      - name: Download requirements_diff
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: requirements_diff

      - name: Download requirements_all_wheels
-       uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
+       uses: actions/download-artifact@v4.3.0
        with:
          name: requirements_all_wheels

@@ -219,9 +218,8 @@ jobs:
        sed -i "/uv/d" requirements.txt
        sed -i "/uv/d" requirements_diff.txt

-     # home-assistant/wheels doesn't support sha pinning
      - name: Build wheels
-       uses: home-assistant/wheels@2025.07.0
+       uses: home-assistant/wheels@2025.03.0
        with:
          abi: ${{ matrix.abi }}
          tag: musllinux_1_2
.gitignore (vendored, 2 changes)

@@ -140,5 +140,5 @@ tmp_cache
 pytest_buckets.txt

 # AI tooling
-.claude/settings.local.json
+.claude

@@ -1,6 +1,6 @@
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.13.0
+    rev: v0.12.1
     hooks:
       - id: ruff-check
         args:
@@ -18,7 +18,7 @@ repos:
     exclude_types: [csv, json, html]
     exclude: ^tests/fixtures/|homeassistant/generated/|tests/components/.*/snapshots/
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v6.0.0
+    rev: v5.0.0
     hooks:
       - id: check-executables-have-shebangs
         stages: [manual]
@@ -53,7 +53,6 @@ homeassistant.components.air_quality.*
 homeassistant.components.airgradient.*
 homeassistant.components.airly.*
 homeassistant.components.airnow.*
-homeassistant.components.airos.*
 homeassistant.components.airq.*
 homeassistant.components.airthings.*
 homeassistant.components.airthings_ble.*
@@ -169,7 +168,6 @@ homeassistant.components.dnsip.*
 homeassistant.components.doorbird.*
 homeassistant.components.dormakaba_dkey.*
 homeassistant.components.downloader.*
-homeassistant.components.droplet.*
 homeassistant.components.dsmr.*
 homeassistant.components.duckdns.*
 homeassistant.components.dunehd.*
@@ -308,10 +306,10 @@ homeassistant.components.ld2410_ble.*
 homeassistant.components.led_ble.*
 homeassistant.components.lektrico.*
 homeassistant.components.letpot.*
-homeassistant.components.libre_hardware_monitor.*
 homeassistant.components.lidarr.*
 homeassistant.components.lifx.*
 homeassistant.components.light.*
+homeassistant.components.linear_garage_door.*
 homeassistant.components.linkplay.*
 homeassistant.components.litejet.*
 homeassistant.components.litterrobot.*
@@ -384,7 +382,6 @@ homeassistant.components.openai_conversation.*
 homeassistant.components.openexchangerates.*
 homeassistant.components.opensky.*
 homeassistant.components.openuv.*
-homeassistant.components.opnsense.*
 homeassistant.components.opower.*
 homeassistant.components.oralb.*
 homeassistant.components.otbr.*
@@ -402,7 +399,6 @@ homeassistant.components.person.*
 homeassistant.components.pi_hole.*
 homeassistant.components.ping.*
 homeassistant.components.plugwise.*
-homeassistant.components.portainer.*
 homeassistant.components.powerfox.*
 homeassistant.components.powerwall.*
 homeassistant.components.private_ble_device.*
@@ -462,7 +458,6 @@ homeassistant.components.sensorpush_cloud.*
 homeassistant.components.sensoterra.*
 homeassistant.components.senz.*
 homeassistant.components.sfr_box.*
-homeassistant.components.sftp_storage.*
 homeassistant.components.shell_command.*
 homeassistant.components.shelly.*
 homeassistant.components.shopping_list.*
@@ -471,7 +466,6 @@ homeassistant.components.simplisafe.*
 homeassistant.components.siren.*
 homeassistant.components.skybell.*
 homeassistant.components.slack.*
-homeassistant.components.sleep_as_android.*
 homeassistant.components.sleepiq.*
 homeassistant.components.smhi.*
 homeassistant.components.smlight.*
@@ -507,7 +501,6 @@ homeassistant.components.tag.*
 homeassistant.components.tailscale.*
 homeassistant.components.tailwind.*
 homeassistant.components.tami4.*
-homeassistant.components.tankerkoenig.*
 homeassistant.components.tautulli.*
 homeassistant.components.tcp.*
 homeassistant.components.technove.*
@@ -553,7 +546,6 @@ homeassistant.components.valve.*
 homeassistant.components.velbus.*
 homeassistant.components.vlc_telnet.*
 homeassistant.components.vodafone_station.*
-homeassistant.components.volvo.*
 homeassistant.components.wake_on_lan.*
 homeassistant.components.wake_word.*
 homeassistant.components.wallbox.*
CODEOWNERS (generated, 103 changes)

@@ -67,8 +67,6 @@ build.json @home-assistant/supervisor
 /tests/components/airly/ @bieniu
 /homeassistant/components/airnow/ @asymworks
 /tests/components/airnow/ @asymworks
-/homeassistant/components/airos/ @CoMPaTech
-/tests/components/airos/ @CoMPaTech
 /homeassistant/components/airq/ @Sibgatulin @dl2080
 /tests/components/airq/ @Sibgatulin @dl2080
 /homeassistant/components/airthings/ @danielhiversen @LaStrada
@@ -87,8 +85,6 @@ build.json @home-assistant/supervisor
 /tests/components/airzone/ @Noltari
 /homeassistant/components/airzone_cloud/ @Noltari
 /tests/components/airzone_cloud/ @Noltari
-/homeassistant/components/aladdin_connect/ @swcloudgenie
-/tests/components/aladdin_connect/ @swcloudgenie
 /homeassistant/components/alarm_control_panel/ @home-assistant/core
 /tests/components/alarm_control_panel/ @home-assistant/core
 /homeassistant/components/alert/ @home-assistant/core @frenck
@@ -154,12 +150,12 @@ build.json @home-assistant/supervisor
 /tests/components/arve/ @ikalnyi
 /homeassistant/components/aseko_pool_live/ @milanmeu
 /tests/components/aseko_pool_live/ @milanmeu
-/homeassistant/components/assist_pipeline/ @synesthesiam @arturpragacz
-/tests/components/assist_pipeline/ @synesthesiam @arturpragacz
-/homeassistant/components/assist_satellite/ @home-assistant/core @synesthesiam @arturpragacz
-/tests/components/assist_satellite/ @home-assistant/core @synesthesiam @arturpragacz
-/homeassistant/components/asuswrt/ @kennedyshead @ollo69 @Vaskivskyi
-/tests/components/asuswrt/ @kennedyshead @ollo69 @Vaskivskyi
+/homeassistant/components/assist_pipeline/ @balloob @synesthesiam
+/tests/components/assist_pipeline/ @balloob @synesthesiam
+/homeassistant/components/assist_satellite/ @home-assistant/core @synesthesiam
+/tests/components/assist_satellite/ @home-assistant/core @synesthesiam
+/homeassistant/components/asuswrt/ @kennedyshead @ollo69
+/tests/components/asuswrt/ @kennedyshead @ollo69
 /homeassistant/components/atag/ @MatsNL
 /tests/components/atag/ @MatsNL
 /homeassistant/components/aten_pe/ @mtdcr
@@ -298,8 +294,8 @@ build.json @home-assistant/supervisor
 /tests/components/configurator/ @home-assistant/core
 /homeassistant/components/control4/ @lawtancool
 /tests/components/control4/ @lawtancool
-/homeassistant/components/conversation/ @home-assistant/core @synesthesiam @arturpragacz
-/tests/components/conversation/ @home-assistant/core @synesthesiam @arturpragacz
+/homeassistant/components/conversation/ @home-assistant/core @synesthesiam
+/tests/components/conversation/ @home-assistant/core @synesthesiam
 /homeassistant/components/cookidoo/ @miaucl
 /tests/components/cookidoo/ @miaucl
 /homeassistant/components/coolmaster/ @OnFreund
@@ -377,8 +373,6 @@ build.json @home-assistant/supervisor
 /tests/components/dremel_3d_printer/ @tkdrob
 /homeassistant/components/drop_connect/ @ChandlerSystems @pfrazer
 /tests/components/drop_connect/ @ChandlerSystems @pfrazer
-/homeassistant/components/droplet/ @sarahseidman
-/tests/components/droplet/ @sarahseidman
 /homeassistant/components/dsmr/ @Robbie1221
 /tests/components/dsmr/ @Robbie1221
 /homeassistant/components/dsmr_reader/ @sorted-bits @glodenox @erwindouna
@@ -426,8 +420,6 @@ build.json @home-assistant/supervisor
 /homeassistant/components/emby/ @mezz64
 /homeassistant/components/emoncms/ @borpin @alexandrecuer
 /tests/components/emoncms/ @borpin @alexandrecuer
-/homeassistant/components/emoncms_history/ @alexandrecuer
-/tests/components/emoncms_history/ @alexandrecuer
 /homeassistant/components/emonitor/ @bdraco
 /tests/components/emonitor/ @bdraco
 /homeassistant/components/emulated_hue/ @bdraco @Tho85
@@ -442,8 +434,10 @@ build.json @home-assistant/supervisor
 /tests/components/energyzero/ @klaasnicolaas
 /homeassistant/components/enigma2/ @autinerd
 /tests/components/enigma2/ @autinerd
-/homeassistant/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
-/tests/components/enphase_envoy/ @bdraco @cgarwood @catsmanac
+/homeassistant/components/enocean/ @bdurrer
+/tests/components/enocean/ @bdurrer
+/homeassistant/components/enphase_envoy/ @bdraco @cgarwood @joostlek @catsmanac
+/tests/components/enphase_envoy/ @bdraco @cgarwood @joostlek @catsmanac
 /homeassistant/components/entur_public_transport/ @hfurubotten
 /homeassistant/components/environment_canada/ @gwww @michaeldavie
 /tests/components/environment_canada/ @gwww @michaeldavie
@@ -464,6 +458,8 @@ build.json @home-assistant/supervisor
 /tests/components/eufylife_ble/ @bdr99
 /homeassistant/components/event/ @home-assistant/core
 /tests/components/event/ @home-assistant/core
+/homeassistant/components/evil_genius_labs/ @balloob
+/tests/components/evil_genius_labs/ @balloob
 /homeassistant/components/evohome/ @zxdavb
 /tests/components/evohome/ @zxdavb
 /homeassistant/components/ezviz/ @RenierM26
@@ -513,8 +509,8 @@ build.json @home-assistant/supervisor
 /homeassistant/components/forked_daapd/ @uvjustin
 /tests/components/forked_daapd/ @uvjustin
 /homeassistant/components/fortios/ @kimfrellsen
-/homeassistant/components/foscam/ @Foscam-wangzhengyu
-/tests/components/foscam/ @Foscam-wangzhengyu
+/homeassistant/components/foscam/ @krmarien
+/tests/components/foscam/ @krmarien
 /homeassistant/components/freebox/ @hacf-fr @Quentame
 /tests/components/freebox/ @hacf-fr @Quentame
 /homeassistant/components/freedompro/ @stefano055415
@@ -648,8 +644,6 @@ build.json @home-assistant/supervisor
 /tests/components/homeassistant/ @home-assistant/core
 /homeassistant/components/homeassistant_alerts/ @home-assistant/core
 /tests/components/homeassistant_alerts/ @home-assistant/core
-/homeassistant/components/homeassistant_connect_zbt2/ @home-assistant/core
-/tests/components/homeassistant_connect_zbt2/ @home-assistant/core
 /homeassistant/components/homeassistant_green/ @home-assistant/core
 /tests/components/homeassistant_green/ @home-assistant/core
 /homeassistant/components/homeassistant_hardware/ @home-assistant/core
@@ -678,8 +672,8 @@ build.json @home-assistant/supervisor
 /tests/components/http/ @home-assistant/core
 /homeassistant/components/huawei_lte/ @scop @fphammerle
 /tests/components/huawei_lte/ @scop @fphammerle
-/homeassistant/components/hue/ @marcelveldt
-/tests/components/hue/ @marcelveldt
+/homeassistant/components/hue/ @balloob @marcelveldt
+/tests/components/hue/ @balloob @marcelveldt
 /homeassistant/components/huisbaasje/ @dennisschroer
 /tests/components/huisbaasje/ @dennisschroer
 /homeassistant/components/humidifier/ @home-assistant/core @Shulyaka
@@ -751,8 +745,8 @@ build.json @home-assistant/supervisor
 /tests/components/integration/ @dgomes
 /homeassistant/components/intellifire/ @jeeftor
 /tests/components/intellifire/ @jeeftor
-/homeassistant/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
-/tests/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
+/homeassistant/components/intent/ @home-assistant/core @synesthesiam
+/tests/components/intent/ @home-assistant/core @synesthesiam
 /homeassistant/components/intesishome/ @jnimmo
 /homeassistant/components/iometer/ @MaestroOnICe
 /tests/components/iometer/ @MaestroOnICe
@@ -860,14 +854,14 @@ build.json @home-assistant/supervisor
 /tests/components/lg_netcast/ @Drafteed @splinter98
 /homeassistant/components/lg_thinq/ @LG-ThinQ-Integration
 /tests/components/lg_thinq/ @LG-ThinQ-Integration
-/homeassistant/components/libre_hardware_monitor/ @Sab44
-/tests/components/libre_hardware_monitor/ @Sab44
 /homeassistant/components/lidarr/ @tkdrob
 /tests/components/lidarr/ @tkdrob
 /homeassistant/components/lifx/ @Djelibeybi
 /tests/components/lifx/ @Djelibeybi
 /homeassistant/components/light/ @home-assistant/core
 /tests/components/light/ @home-assistant/core
+/homeassistant/components/linear_garage_door/ @IceBotYT
+/tests/components/linear_garage_door/ @IceBotYT
 /homeassistant/components/linkplay/ @Velleman
 /tests/components/linkplay/ @Velleman
 /homeassistant/components/linux_battery/ @fabaff
@@ -968,8 +962,6 @@ build.json @home-assistant/supervisor
 /tests/components/moat/ @bdraco
 /homeassistant/components/mobile_app/ @home-assistant/core
 /tests/components/mobile_app/ @home-assistant/core
-/homeassistant/components/modbus/ @janiversen
-/tests/components/modbus/ @janiversen
 /homeassistant/components/modem_callerid/ @tkdrob
 /tests/components/modem_callerid/ @tkdrob
 /homeassistant/components/modern_forms/ @wonderslug
@@ -1017,8 +1009,7 @@ build.json @home-assistant/supervisor
 /tests/components/nanoleaf/ @milanmeu @joostlek
 /homeassistant/components/nasweb/ @nasWebio
 /tests/components/nasweb/ @nasWebio
-/homeassistant/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
-/tests/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
+/homeassistant/components/nederlandse_spoorwegen/ @YarmoM
 /homeassistant/components/ness_alarm/ @nickw444
 /tests/components/ness_alarm/ @nickw444
 /homeassistant/components/nest/ @allenporter
@@ -1113,6 +1104,8 @@ build.json @home-assistant/supervisor
 /tests/components/open_meteo/ @frenck
 /homeassistant/components/open_router/ @joostlek
 /tests/components/open_router/ @joostlek
+/homeassistant/components/openai_conversation/ @balloob
+/tests/components/openai_conversation/ @balloob
 /homeassistant/components/openerz/ @misialq
 /tests/components/openerz/ @misialq
 /homeassistant/components/openexchangerates/ @MartinHjelmare
@@ -1188,12 +1181,8 @@ build.json @home-assistant/supervisor
 /tests/components/plum_lightpad/ @ColinHarrington @prystupa
 /homeassistant/components/point/ @fredrike
 /tests/components/point/ @fredrike
-/homeassistant/components/pooldose/ @lmaertin
-/tests/components/pooldose/ @lmaertin
 /homeassistant/components/poolsense/ @haemishkyd
 /tests/components/poolsense/ @haemishkyd
-/homeassistant/components/portainer/ @erwindouna
-/tests/components/portainer/ @erwindouna
 /homeassistant/components/powerfox/ @klaasnicolaas
 /tests/components/powerfox/ @klaasnicolaas
 /homeassistant/components/powerwall/ @bdraco @jrester @daniel-simpson
@@ -1213,6 +1202,8 @@ build.json @home-assistant/supervisor
 /homeassistant/components/proximity/ @mib1185
 /tests/components/proximity/ @mib1185
 /homeassistant/components/proxmoxve/ @jhollowe @Corbeno
+/homeassistant/components/prusalink/ @balloob
+/tests/components/prusalink/ @balloob
 /homeassistant/components/ps4/ @ktnrg45
 /tests/components/ps4/ @ktnrg45
 /homeassistant/components/pterodactyl/ @elmurato
@@ -1306,8 +1297,8 @@ build.json @home-assistant/supervisor
 /tests/components/rflink/ @javicalle
 /homeassistant/components/rfxtrx/ @danielhiversen @elupus @RobBie1221
 /tests/components/rfxtrx/ @danielhiversen @elupus @RobBie1221
-/homeassistant/components/rhasspy/ @synesthesiam
-/tests/components/rhasspy/ @synesthesiam
+/homeassistant/components/rhasspy/ @balloob @synesthesiam
+/tests/components/rhasspy/ @balloob @synesthesiam
 /homeassistant/components/ridwell/ @bachya
 /tests/components/ridwell/ @bachya
 /homeassistant/components/ring/ @sdb9696
@@ -1395,14 +1386,12 @@ build.json @home-assistant/supervisor
 /tests/components/seventeentrack/ @shaiu
 /homeassistant/components/sfr_box/ @epenet
 /tests/components/sfr_box/ @epenet
-/homeassistant/components/sftp_storage/ @maretodoric
-/tests/components/sftp_storage/ @maretodoric
 /homeassistant/components/sharkiq/ @JeffResc @funkybunch
 /tests/components/sharkiq/ @JeffResc @funkybunch
 /homeassistant/components/shell_command/ @home-assistant/core
 /tests/components/shell_command/ @home-assistant/core
-/homeassistant/components/shelly/ @bieniu @thecode @chemelli74 @bdraco
-/tests/components/shelly/ @bieniu @thecode @chemelli74 @bdraco
+/homeassistant/components/shelly/ @balloob @bieniu @thecode @chemelli74 @bdraco
+/tests/components/shelly/ @balloob @bieniu @thecode @chemelli74 @bdraco
 /homeassistant/components/shodan/ @fabaff
 /homeassistant/components/sia/ @eavanvalkenburg
 /tests/components/sia/ @eavanvalkenburg
@@ -1426,8 +1415,6 @@ build.json @home-assistant/supervisor
 /tests/components/skybell/ @tkdrob
 /homeassistant/components/slack/ @tkdrob @fletcherau
 /tests/components/slack/ @tkdrob @fletcherau
-/homeassistant/components/sleep_as_android/ @tr4nt0r
-/tests/components/sleep_as_android/ @tr4nt0r
 /homeassistant/components/sleepiq/ @mfugate1 @kbickar
 /tests/components/sleepiq/ @mfugate1 @kbickar
 /homeassistant/components/slide/ @ualex73
@@ -1549,8 +1536,8 @@ build.json @home-assistant/supervisor
 /tests/components/systemmonitor/ @gjohansson-ST
 /homeassistant/components/tado/ @erwindouna
 /tests/components/tado/ @erwindouna
-/homeassistant/components/tag/ @home-assistant/core
-/tests/components/tag/ @home-assistant/core
+/homeassistant/components/tag/ @balloob @dmulcahey
+/tests/components/tag/ @balloob @dmulcahey
 /homeassistant/components/tailscale/ @frenck
 /tests/components/tailscale/ @frenck
 /homeassistant/components/tailwind/ @frenck
@@ -1610,8 +1597,6 @@ build.json @home-assistant/supervisor
 /tests/components/todo/ @home-assistant/core
 /homeassistant/components/todoist/ @boralyl
 /tests/components/todoist/ @boralyl
-/homeassistant/components/togrill/ @elupus
-/tests/components/togrill/ @elupus
 /homeassistant/components/tolo/ @MatthiasLohr
 /tests/components/tolo/ @MatthiasLohr
 /homeassistant/components/tomorrowio/ @raman325 @lymanepp
@@ -1626,6 +1611,8 @@ build.json @home-assistant/supervisor
 /tests/components/tplink_omada/ @MarkGodwin
 /homeassistant/components/traccar/ @ludeeus
 /tests/components/traccar/ @ludeeus
+/homeassistant/components/traccar_server/ @ludeeus
+/tests/components/traccar_server/ @ludeeus
 /homeassistant/components/trace/ @home-assistant/core
 /tests/components/trace/ @home-assistant/core
 /homeassistant/components/tractive/ @Danielhiversen @zhulik @bieniu
@@ -1695,15 +1682,15 @@ build.json @home-assistant/supervisor
 /tests/components/vegehub/ @ghowevege
 /homeassistant/components/velbus/ @Cereal2nd @brefra
 /tests/components/velbus/ @Cereal2nd @brefra
-/homeassistant/components/velux/ @Julius2342 @DeerMaximum @pawlizio @wollew
-/tests/components/velux/ @Julius2342 @DeerMaximum @pawlizio @wollew
+/homeassistant/components/velux/ @Julius2342 @DeerMaximum @pawlizio
+/tests/components/velux/ @Julius2342 @DeerMaximum @pawlizio
 /homeassistant/components/venstar/ @garbled1 @jhollowe
 /tests/components/venstar/ @garbled1 @jhollowe
 /homeassistant/components/versasense/ @imstevenxyz
 /homeassistant/components/version/ @ludeeus
 /tests/components/version/ @ludeeus
-/homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven
-/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven
+/homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak
+/tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak
 /homeassistant/components/vicare/ @CFenner
 /tests/components/vicare/ @CFenner
 /homeassistant/components/vilfo/ @ManneW
@@ -1715,14 +1702,14 @@ build.json @home-assistant/supervisor
 /tests/components/vlc_telnet/ @rodripf @MartinHjelmare
 /homeassistant/components/vodafone_station/ @paoloantinori @chemelli74
 /tests/components/vodafone_station/ @paoloantinori @chemelli74
-/homeassistant/components/voip/ @synesthesiam @jaminh
-/tests/components/voip/ @synesthesiam @jaminh
+/homeassistant/components/voip/ @balloob @synesthesiam @jaminh
+/tests/components/voip/ @balloob @synesthesiam @jaminh
 /homeassistant/components/volumio/ @OnFreund
 /tests/components/volumio/ @OnFreund
-/homeassistant/components/volvo/ @thomasddn
-/tests/components/volvo/ @thomasddn
 /homeassistant/components/volvooncall/ @molobrakos
 /tests/components/volvooncall/ @molobrakos
 /homeassistant/components/vulcan/ @Antoni-Czaplicki
 /tests/components/vulcan/ @Antoni-Czaplicki
 /homeassistant/components/wake_on_lan/ @ntilley905
 /tests/components/wake_on_lan/ @ntilley905
 /homeassistant/components/wake_word/ @home-assistant/core @synesthesiam
@@ -1787,8 +1774,8 @@ build.json @home-assistant/supervisor
 /tests/components/worldclock/ @fabaff
 /homeassistant/components/ws66i/ @ssaenger
 /tests/components/ws66i/ @ssaenger
-/homeassistant/components/wyoming/ @synesthesiam
-/tests/components/wyoming/ @synesthesiam
+/homeassistant/components/wyoming/ @balloob @synesthesiam
+/tests/components/wyoming/ @balloob @synesthesiam
 /homeassistant/components/xbox/ @hunterjm
 /tests/components/xbox/ @hunterjm
 /homeassistant/components/xiaomi_aqara/ @danielhiversen @syssi
@@ -14,8 +14,5 @@ Still interested? Then you should take a peek at the [developer documentation](h

 ## Feature suggestions

-If you want to suggest a new feature for Home Assistant (e.g. new integrations), please [start a discussion](https://github.com/orgs/home-assistant/discussions) on GitHub.
-
-## Issue Tracker
-
-If you want to report an issue, please [create an issue](https://github.com/home-assistant/core/issues) on GitHub.
+If you want to suggest a new feature for Home Assistant (e.g., new integrations), please open a thread in our [Community Forum: Feature Requests](https://community.home-assistant.io/c/feature-requests).
+We use [GitHub for tracking issues](https://github.com/home-assistant/core/issues), not for tracking feature requests.
Dockerfile (generated, 2 changes)

@@ -31,7 +31,7 @@ RUN \
    && go2rtc --version

# Install uv
-RUN pip3 install uv==0.8.9
+RUN pip3 install uv==0.7.1

WORKDIR /usr/src

@@ -3,7 +3,8 @@ FROM mcr.microsoft.com/vscode/devcontainers/base:debian
SHELL ["/bin/bash", "-o", "pipefail", "-c"]

RUN \
-    apt-get update \
+    curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
+    && apt-get update \
    && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
    # Additional library needed by some tests and accordingly by VScode Tests Discovery
    bluez \
build.yaml (10 changes)

@@ -1,10 +1,10 @@
 image: ghcr.io/home-assistant/{arch}-homeassistant
 build_from:
-  aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.09.1
-  armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.09.1
-  armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.09.1
-  amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.09.1
-  i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.09.1
+  aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.05.0
+  armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.05.0
+  armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.05.0
+  amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.05.0
+  i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.05.0
 codenotary:
   signer: notary@home-assistant.io
   base_image: notary@home-assistant.io
@@ -187,42 +187,36 @@ def main() -> int:

     from . import config, runner  # noqa: PLC0415

-    # Ensure only one instance runs per config directory
-    with runner.ensure_single_execution(config_dir) as single_execution_lock:
-        # Check if another instance is already running
-        if single_execution_lock.exit_code is not None:
-            return single_execution_lock.exit_code
-
-        safe_mode = config.safe_mode_enabled(config_dir)
-
-        runtime_conf = runner.RuntimeConfig(
-            config_dir=config_dir,
-            verbose=args.verbose,
-            log_rotate_days=args.log_rotate_days,
-            log_file=args.log_file,
-            log_no_color=args.log_no_color,
-            skip_pip=args.skip_pip,
-            skip_pip_packages=args.skip_pip_packages,
-            recovery_mode=args.recovery_mode,
-            debug=args.debug,
-            open_ui=args.open_ui,
-            safe_mode=safe_mode,
-        )
-
-        fault_file_name = os.path.join(config_dir, FAULT_LOG_FILENAME)
-        with open(fault_file_name, mode="a", encoding="utf8") as fault_file:
-            faulthandler.enable(fault_file)
-            exit_code = runner.run(runtime_conf)
-            faulthandler.disable()
-
-        # It's possible for the fault file to disappear, so suppress obvious errors
-        with suppress(FileNotFoundError):
-            if os.path.getsize(fault_file_name) == 0:
-                os.remove(fault_file_name)
-
-        check_threads()
-
-        return exit_code
+    safe_mode = config.safe_mode_enabled(config_dir)
+
+    runtime_conf = runner.RuntimeConfig(
+        config_dir=config_dir,
+        verbose=args.verbose,
+        log_rotate_days=args.log_rotate_days,
+        log_file=args.log_file,
+        log_no_color=args.log_no_color,
+        skip_pip=args.skip_pip,
+        skip_pip_packages=args.skip_pip_packages,
+        recovery_mode=args.recovery_mode,
+        debug=args.debug,
+        open_ui=args.open_ui,
+        safe_mode=safe_mode,
+    )
+
+    fault_file_name = os.path.join(config_dir, FAULT_LOG_FILENAME)
+    with open(fault_file_name, mode="a", encoding="utf8") as fault_file:
+        faulthandler.enable(fault_file)
+        exit_code = runner.run(runtime_conf)
+        faulthandler.disable()
+
+    # It's possible for the fault file to disappear, so suppress obvious errors
+    with suppress(FileNotFoundError):
+        if os.path.getsize(fault_file_name) == 0:
+            os.remove(fault_file_name)
+
+    check_threads()
+
+    return exit_code


 if __name__ == "__main__":
@@ -120,9 +120,6 @@ class AuthStore:

         new_user = models.User(**kwargs)

-        while new_user.id in self._users:
-            new_user = models.User(**kwargs)
-
         self._users[new_user.id] = new_user

         if credentials is None:
@@ -27,7 +27,7 @@ from . import (
     SetupFlow,
 )

-REQUIREMENTS = ["pyotp==2.9.0"]
+REQUIREMENTS = ["pyotp==2.8.0"]

 CONF_MESSAGE = "message"

@@ -20,7 +20,7 @@ from . import (
     SetupFlow,
 )

-REQUIREMENTS = ["pyotp==2.9.0", "PyQRCode==1.2.1"]
+REQUIREMENTS = ["pyotp==2.8.0", "PyQRCode==1.2.1"]

 CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({}, extra=vol.PREVENT_EXTRA)

@@ -33,10 +33,7 @@ class AuthFlowContext(FlowContext, total=False):
     redirect_uri: str


-class AuthFlowResult(FlowResult[AuthFlowContext, tuple[str, str]], total=False):
-    """Typed result dict for auth flow."""
-
-    result: Credentials  # Only present if type is CREATE_ENTRY
+AuthFlowResult = FlowResult[AuthFlowContext, tuple[str, str]]


 @attr.s(slots=True)
@@ -1,5 +0,0 @@
-{
-  "domain": "frient",
-  "name": "Frient",
-  "iot_standards": ["zigbee"]
-}
@@ -1,5 +1,5 @@
 {
   "domain": "fritzbox",
-  "name": "FRITZ!",
+  "name": "FRITZ!Box",
   "integrations": ["fritz", "fritzbox", "fritzbox_callmonitor"]
 }
@@ -6,6 +6,7 @@
     "google_assistant_sdk",
     "google_cloud",
     "google_drive",
+    "google_gemini",
     "google_generative_ai_conversation",
     "google_mail",
     "google_maps",
@@ -1,5 +1,5 @@
 {
   "domain": "third_reality",
   "name": "Third Reality",
-  "iot_standards": ["matter", "zigbee"]
+  "iot_standards": ["zigbee"]
 }
@@ -1,5 +1,5 @@
 {
   "domain": "ubiquiti",
   "name": "Ubiquiti",
-  "integrations": ["airos", "unifi", "unifi_direct", "unifiled", "unifiprotect"]
+  "integrations": ["unifi", "unifi_direct", "unifiled", "unifiprotect"]
 }
@@ -2,23 +2,21 @@

 from __future__ import annotations

-import asyncio
 import logging

 from accuweather import AccuWeather

 from homeassistant.components.sensor import DOMAIN as SENSOR_PLATFORM
-from homeassistant.const import CONF_API_KEY, Platform
+from homeassistant.const import CONF_API_KEY, CONF_NAME, Platform
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers import entity_registry as er
 from homeassistant.helpers.aiohttp_client import async_get_clientsession

-from .const import DOMAIN
+from .const import DOMAIN, UPDATE_INTERVAL_DAILY_FORECAST, UPDATE_INTERVAL_OBSERVATION
 from .coordinator import (
     AccuWeatherConfigEntry,
     AccuWeatherDailyForecastDataUpdateCoordinator,
     AccuWeatherData,
-    AccuWeatherHourlyForecastDataUpdateCoordinator,
     AccuWeatherObservationDataUpdateCoordinator,
 )
@@ -30,6 +28,7 @@ PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
 async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry) -> bool:
     """Set up AccuWeather as config entry."""
     api_key: str = entry.data[CONF_API_KEY]
+    name: str = entry.data[CONF_NAME]

     location_key = entry.unique_id

@@ -42,28 +41,26 @@ async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry)
         hass,
         entry,
         accuweather,
+        name,
+        "observation",
+        UPDATE_INTERVAL_OBSERVATION,
     )

     coordinator_daily_forecast = AccuWeatherDailyForecastDataUpdateCoordinator(
         hass,
         entry,
         accuweather,
-    )
-    coordinator_hourly_forecast = AccuWeatherHourlyForecastDataUpdateCoordinator(
-        hass,
-        entry,
-        accuweather,
+        name,
+        "daily forecast",
+        UPDATE_INTERVAL_DAILY_FORECAST,
     )

-    await asyncio.gather(
-        coordinator_observation.async_config_entry_first_refresh(),
-        coordinator_daily_forecast.async_config_entry_first_refresh(),
-        coordinator_hourly_forecast.async_config_entry_first_refresh(),
-    )
+    await coordinator_observation.async_config_entry_first_refresh()
+    await coordinator_daily_forecast.async_config_entry_first_refresh()

     entry.runtime_data = AccuWeatherData(
         coordinator_observation=coordinator_observation,
         coordinator_daily_forecast=coordinator_daily_forecast,
-        coordinator_hourly_forecast=coordinator_hourly_forecast,
     )

     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -50,7 +50,6 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN):
                 await self.async_set_unique_id(
                     accuweather.location_key, raise_on_progress=False
                 )
-                self._abort_if_unique_id_configured()

                 return self.async_create_entry(
                     title=user_input[CONF_NAME], data=user_input
@@ -69,6 +69,5 @@ POLLEN_CATEGORY_MAP = {
     4: "very_high",
     5: "extreme",
 }
-UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=10)
+UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=40)
 UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6)
-UPDATE_INTERVAL_HOURLY_FORECAST = timedelta(hours=30)
@@ -3,7 +3,6 @@
 from __future__ import annotations

 from asyncio import timeout
-from collections.abc import Awaitable, Callable
 from dataclasses import dataclass
 from datetime import timedelta
 import logging
@@ -13,7 +12,6 @@ from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExceededError
 from aiohttp.client_exceptions import ClientConnectorError

 from homeassistant.config_entries import ConfigEntry
-from homeassistant.const import CONF_NAME
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
 from homeassistant.helpers.update_coordinator import (
@@ -22,13 +20,7 @@ from homeassistant.helpers.update_coordinator import (
     UpdateFailed,
 )

-from .const import (
-    DOMAIN,
-    MANUFACTURER,
-    UPDATE_INTERVAL_DAILY_FORECAST,
-    UPDATE_INTERVAL_HOURLY_FORECAST,
-    UPDATE_INTERVAL_OBSERVATION,
-)
+from .const import DOMAIN, MANUFACTURER

 EXCEPTIONS = (ApiError, ClientConnectorError, InvalidApiKeyError, RequestsExceededError)

@@ -41,7 +33,6 @@ class AccuWeatherData:

     coordinator_observation: AccuWeatherObservationDataUpdateCoordinator
     coordinator_daily_forecast: AccuWeatherDailyForecastDataUpdateCoordinator
-    coordinator_hourly_forecast: AccuWeatherHourlyForecastDataUpdateCoordinator


 type AccuWeatherConfigEntry = ConfigEntry[AccuWeatherData]
@@ -57,11 +48,13 @@ class AccuWeatherObservationDataUpdateCoordinator(
         hass: HomeAssistant,
         config_entry: AccuWeatherConfigEntry,
         accuweather: AccuWeather,
+        name: str,
+        coordinator_type: str,
+        update_interval: timedelta,
     ) -> None:
         """Initialize."""
         self.accuweather = accuweather
         self.location_key = accuweather.location_key
-        name = config_entry.data[CONF_NAME]

         if TYPE_CHECKING:
             assert self.location_key is not None
@@ -72,8 +65,8 @@ class AccuWeatherObservationDataUpdateCoordinator(
             hass,
             _LOGGER,
             config_entry=config_entry,
-            name=f"{name} (observation)",
-            update_interval=UPDATE_INTERVAL_OBSERVATION,
+            name=f"{name} ({coordinator_type})",
+            update_interval=update_interval,
         )

     async def _async_update_data(self) -> dict[str, Any]:
@@ -93,25 +86,23 @@ class AccuWeatherObservationDataUpdateCoordinator(
         return result


-class AccuWeatherForecastDataUpdateCoordinator(
+class AccuWeatherDailyForecastDataUpdateCoordinator(
     TimestampDataUpdateCoordinator[list[dict[str, Any]]]
 ):
-    """Base class for AccuWeather forecast."""
+    """Class to manage fetching AccuWeather data API."""

     def __init__(
         self,
         hass: HomeAssistant,
         config_entry: AccuWeatherConfigEntry,
         accuweather: AccuWeather,
+        name: str,
         coordinator_type: str,
         update_interval: timedelta,
-        fetch_method: Callable[..., Awaitable[list[dict[str, Any]]]],
     ) -> None:
         """Initialize."""
         self.accuweather = accuweather
         self.location_key = accuweather.location_key
-        self._fetch_method = fetch_method
-        name = config_entry.data[CONF_NAME]

         if TYPE_CHECKING:
             assert self.location_key is not None
@@ -127,10 +118,12 @@ class AccuWeatherForecastDataUpdateCoordinator(
         )

     async def _async_update_data(self) -> list[dict[str, Any]]:
-        """Update forecast data via library."""
+        """Update data via library."""
         try:
             async with timeout(10):
-                result = await self._fetch_method(language=self.hass.config.language)
+                result = await self.accuweather.async_get_daily_forecast(
+                    language=self.hass.config.language
+                )
         except EXCEPTIONS as error:
             raise UpdateFailed(
                 translation_domain=DOMAIN,
@@ -139,53 +132,10 @@ class AccuWeatherForecastDataUpdateCoordinator(
             ) from error

         _LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining)

         return result


-class AccuWeatherDailyForecastDataUpdateCoordinator(
-    AccuWeatherForecastDataUpdateCoordinator
-):
-    """Coordinator for daily forecast."""
-
-    def __init__(
-        self,
-        hass: HomeAssistant,
-        config_entry: AccuWeatherConfigEntry,
-        accuweather: AccuWeather,
-    ) -> None:
-        """Initialize."""
-        super().__init__(
-            hass,
-            config_entry,
-            accuweather,
-            "daily forecast",
-            UPDATE_INTERVAL_DAILY_FORECAST,
-            fetch_method=accuweather.async_get_daily_forecast,
-        )
-
-
-class AccuWeatherHourlyForecastDataUpdateCoordinator(
-    AccuWeatherForecastDataUpdateCoordinator
-):
-    """Coordinator for hourly forecast."""
-
-    def __init__(
-        self,
-        hass: HomeAssistant,
-        config_entry: AccuWeatherConfigEntry,
-        accuweather: AccuWeather,
-    ) -> None:
-        """Initialize."""
-        super().__init__(
-            hass,
-            config_entry,
-            accuweather,
-            "hourly forecast",
-            UPDATE_INTERVAL_HOURLY_FORECAST,
-            fetch_method=accuweather.async_get_hourly_forecast,
-        )
-
-
 def _get_device_info(location_key: str, name: str) -> DeviceInfo:
     """Get device info."""
     return DeviceInfo(
@@ -7,5 +7,6 @@
   "integration_type": "service",
   "iot_class": "cloud_polling",
   "loggers": ["accuweather"],
-  "requirements": ["accuweather==4.2.1"]
+  "requirements": ["accuweather==4.2.0"],
+  "single_config_entry": true
 }
@@ -17,9 +17,6 @@
         "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
         "invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]",
         "requests_exceeded": "The allowed number of requests to the AccuWeather API has been exceeded. You have to wait or change the API key."
-      },
-      "abort": {
-        "already_configured": "[%key:common::config_flow::abort::already_configured_location%]"
       }
     },
     "entity": {
@@ -45,7 +45,6 @@ from .coordinator import (
     AccuWeatherConfigEntry,
     AccuWeatherDailyForecastDataUpdateCoordinator,
     AccuWeatherData,
-    AccuWeatherHourlyForecastDataUpdateCoordinator,
     AccuWeatherObservationDataUpdateCoordinator,
 )

@@ -65,7 +64,6 @@ class AccuWeatherEntity(
     CoordinatorWeatherEntity[
         AccuWeatherObservationDataUpdateCoordinator,
         AccuWeatherDailyForecastDataUpdateCoordinator,
-        AccuWeatherHourlyForecastDataUpdateCoordinator,
     ]
 ):
     """Define an AccuWeather entity."""
@@ -78,7 +76,6 @@ class AccuWeatherEntity(
         super().__init__(
             observation_coordinator=accuweather_data.coordinator_observation,
             daily_coordinator=accuweather_data.coordinator_daily_forecast,
-            hourly_coordinator=accuweather_data.coordinator_hourly_forecast,
         )

         self._attr_native_precipitation_unit = UnitOfPrecipitationDepth.MILLIMETERS
@@ -89,13 +86,10 @@ class AccuWeatherEntity(
         self._attr_unique_id = accuweather_data.coordinator_observation.location_key
         self._attr_attribution = ATTRIBUTION
         self._attr_device_info = accuweather_data.coordinator_observation.device_info
-        self._attr_supported_features = (
-            WeatherEntityFeature.FORECAST_DAILY | WeatherEntityFeature.FORECAST_HOURLY
-        )
+        self._attr_supported_features = WeatherEntityFeature.FORECAST_DAILY

         self.observation_coordinator = accuweather_data.coordinator_observation
         self.daily_coordinator = accuweather_data.coordinator_daily_forecast
-        self.hourly_coordinator = accuweather_data.coordinator_hourly_forecast

     @property
     def condition(self) -> str | None:
@@ -213,32 +207,3 @@ class AccuWeatherEntity(
             }
             for item in self.daily_coordinator.data
         ]
-
-    @callback
-    def _async_forecast_hourly(self) -> list[Forecast] | None:
-        """Return the hourly forecast in native units."""
-        return [
-            {
-                ATTR_FORECAST_TIME: utc_from_timestamp(
-                    item["EpochDateTime"]
-                ).isoformat(),
-                ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCover"],
-                ATTR_FORECAST_HUMIDITY: item["RelativeHumidity"],
-                ATTR_FORECAST_NATIVE_TEMP: item["Temperature"][ATTR_VALUE],
-                ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperature"][
-                    ATTR_VALUE
-                ],
-                ATTR_FORECAST_NATIVE_PRECIPITATION: item["TotalLiquid"][ATTR_VALUE],
-                ATTR_FORECAST_PRECIPITATION_PROBABILITY: item[
-                    "PrecipitationProbability"
-                ],
-                ATTR_FORECAST_NATIVE_WIND_SPEED: item["Wind"][ATTR_SPEED][ATTR_VALUE],
-                ATTR_FORECAST_NATIVE_WIND_GUST_SPEED: item["WindGust"][ATTR_SPEED][
-                    ATTR_VALUE
-                ],
-                ATTR_FORECAST_UV_INDEX: item["UVIndex"],
-                ATTR_FORECAST_WIND_BEARING: item["Wind"][ATTR_DIRECTION]["Degrees"],
-                ATTR_FORECAST_CONDITION: CONDITION_MAP.get(item["WeatherIcon"]),
-            }
-            for item in self.hourly_coordinator.data
-        ]
@@ -29,19 +29,11 @@ from .const import (
     DATA_PREFERENCES,
     DOMAIN,
     SERVICE_GENERATE_DATA,
-    SERVICE_GENERATE_IMAGE,
     AITaskEntityFeature,
 )
 from .entity import AITaskEntity
 from .http import async_setup as async_setup_http
-from .task import (
-    GenDataTask,
-    GenDataTaskResult,
-    GenImageTask,
-    GenImageTaskResult,
-    async_generate_data,
-    async_generate_image,
-)
+from .task import GenDataTask, GenDataTaskResult, async_generate_data

 __all__ = [
     "DOMAIN",
@@ -49,10 +41,7 @@ __all__ = [
     "AITaskEntityFeature",
     "GenDataTask",
     "GenDataTaskResult",
-    "GenImageTask",
-    "GenImageTaskResult",
     "async_generate_data",
-    "async_generate_image",
     "async_setup",
     "async_setup_entry",
     "async_unload_entry",
@@ -112,23 +101,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
         supports_response=SupportsResponse.ONLY,
         job_type=HassJobType.Coroutinefunction,
     )
-    hass.services.async_register(
-        DOMAIN,
-        SERVICE_GENERATE_IMAGE,
-        async_service_generate_image,
-        schema=vol.Schema(
-            {
-                vol.Required(ATTR_TASK_NAME): cv.string,
-                vol.Optional(ATTR_ENTITY_ID): cv.entity_id,
-                vol.Required(ATTR_INSTRUCTIONS): cv.string,
-                vol.Optional(ATTR_ATTACHMENTS): vol.All(
-                    cv.ensure_list, [selector.MediaSelector({"accept": ["*/*"]})]
-                ),
-            }
-        ),
-        supports_response=SupportsResponse.ONLY,
-        job_type=HassJobType.Coroutinefunction,
-    )
     return True

@@ -143,23 +115,17 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:


 async def async_service_generate_data(call: ServiceCall) -> ServiceResponse:
-    """Run the data task service."""
+    """Run the run task service."""
     result = await async_generate_data(hass=call.hass, **call.data)
     return result.as_dict()


-async def async_service_generate_image(call: ServiceCall) -> ServiceResponse:
-    """Run the image task service."""
-    return await async_generate_image(hass=call.hass, **call.data)
-
-
 class AITaskPreferences:
     """AI Task preferences."""

-    KEYS = ("gen_data_entity_id", "gen_image_entity_id")
+    KEYS = ("gen_data_entity_id",)

     gen_data_entity_id: str | None = None
-    gen_image_entity_id: str | None = None

     def __init__(self, hass: HomeAssistant) -> None:
         """Initialize the preferences."""
@@ -173,21 +139,17 @@ class AITaskPreferences:
         if data is None:
             return
         for key in self.KEYS:
-            setattr(self, key, data.get(key))
+            setattr(self, key, data[key])

     @callback
     def async_set_preferences(
         self,
         *,
         gen_data_entity_id: str | None | UndefinedType = UNDEFINED,
-        gen_image_entity_id: str | None | UndefinedType = UNDEFINED,
     ) -> None:
         """Set the preferences."""
         changed = False
-        for key, value in (
-            ("gen_data_entity_id", gen_data_entity_id),
-            ("gen_image_entity_id", gen_image_entity_id),
-        ):
+        for key, value in (("gen_data_entity_id", gen_data_entity_id),):
             if value is not UNDEFINED:
                 if getattr(self, key) != value:
                     setattr(self, key, value)
@@ -8,7 +8,6 @@ from typing import TYPE_CHECKING, Final
 from homeassistant.util.hass_dict import HassKey

 if TYPE_CHECKING:
-    from homeassistant.components.media_source import local_source
     from homeassistant.helpers.entity_component import EntityComponent

     from . import AITaskPreferences
@@ -17,13 +16,8 @@ if TYPE_CHECKING:
 DOMAIN = "ai_task"
 DATA_COMPONENT: HassKey[EntityComponent[AITaskEntity]] = HassKey(DOMAIN)
 DATA_PREFERENCES: HassKey[AITaskPreferences] = HassKey(f"{DOMAIN}_preferences")
-DATA_MEDIA_SOURCE: HassKey[local_source.LocalSource] = HassKey(f"{DOMAIN}_media_source")
-
-IMAGE_DIR: Final = "image"
-IMAGE_EXPIRY_TIME = 60 * 60  # 1 hour

 SERVICE_GENERATE_DATA = "generate_data"
-SERVICE_GENERATE_IMAGE = "generate_image"

 ATTR_INSTRUCTIONS: Final = "instructions"
 ATTR_TASK_NAME: Final = "task_name"
@@ -44,6 +38,3 @@ class AITaskEntityFeature(IntFlag):

     SUPPORT_ATTACHMENTS = 2
     """Support attachments with generate data."""
-
-    GENERATE_IMAGE = 4
-    """Generate images based on instructions."""
@@ -18,7 +18,7 @@ from homeassistant.helpers.restore_state import RestoreEntity
 from homeassistant.util import dt as dt_util

 from .const import DEFAULT_SYSTEM_PROMPT, DOMAIN, AITaskEntityFeature
-from .task import GenDataTask, GenDataTaskResult, GenImageTask, GenImageTaskResult
+from .task import GenDataTask, GenDataTaskResult


 class AITaskEntity(RestoreEntity):
@@ -57,13 +57,9 @@ class AITaskEntity(RestoreEntity):
     async def _async_get_ai_task_chat_log(
         self,
         session: ChatSession,
-        task: GenDataTask | GenImageTask,
+        task: GenDataTask,
     ) -> AsyncGenerator[ChatLog]:
         """Context manager used to manage the ChatLog used during an AI Task."""
-        user_llm_hass_api: llm.API | None = None
-        if isinstance(task, GenDataTask):
-            user_llm_hass_api = task.llm_api
-
         # pylint: disable-next=contextmanager-generator-missing-cleanup
         with (
             async_get_chat_log(
@@ -81,7 +77,6 @@ class AITaskEntity(RestoreEntity):
                 device_id=None,
             ),
             user_llm_prompt=DEFAULT_SYSTEM_PROMPT,
-            user_llm_hass_api=user_llm_hass_api,
         )

         chat_log.async_add_user_content(
@@ -109,23 +104,3 @@ class AITaskEntity(RestoreEntity):
     ) -> GenDataTaskResult:
         """Handle a gen data task."""
         raise NotImplementedError
-
-    @final
-    async def internal_async_generate_image(
-        self,
-        session: ChatSession,
-        task: GenImageTask,
-    ) -> GenImageTaskResult:
-        """Run a gen image task."""
-        self.__last_activity = dt_util.utcnow().isoformat()
-        self.async_write_ha_state()
-        async with self._async_get_ai_task_chat_log(session, task) as chat_log:
-            return await self._async_generate_image(task, chat_log)
-
-    async def _async_generate_image(
-        self,
-        task: GenImageTask,
-        chat_log: ChatLog,
-    ) -> GenImageTaskResult:
-        """Handle a gen image task."""
-        raise NotImplementedError
@@ -37,7 +37,6 @@ def websocket_get_preferences(
     {
         vol.Required("type"): "ai_task/preferences/set",
         vol.Optional("gen_data_entity_id"): vol.Any(str, None),
-        vol.Optional("gen_image_entity_id"): vol.Any(str, None),
     }
 )
 @websocket_api.require_admin
@@ -1,15 +1,7 @@
 {
-  "entity_component": {
-    "_": {
-      "default": "mdi:star-four-points"
-    }
-  },
   "services": {
     "generate_data": {
       "service": "mdi:file-star-four-points-outline"
-    },
-    "generate_image": {
-      "service": "mdi:star-four-points-box-outline"
     }
   }
 }
@@ -5,6 +5,6 @@
   "codeowners": ["@home-assistant/core"],
   "dependencies": ["conversation", "media_source"],
   "documentation": "https://www.home-assistant.io/integrations/ai_task",
-  "integration_type": "entity",
+  "integration_type": "system",
   "quality_scale": "internal"
 }
@@ -1,22 +0,0 @@
-"""Expose images as media sources."""
-
-from __future__ import annotations
-
-from homeassistant.components.media_source import MediaSource, local_source
-from homeassistant.core import HomeAssistant
-
-from .const import DATA_MEDIA_SOURCE, DOMAIN, IMAGE_DIR
-
-
-async def async_get_media_source(hass: HomeAssistant) -> MediaSource:
-    """Set up local media source."""
-    media_dir = hass.config.path(f"{DOMAIN}/{IMAGE_DIR}")
-
-    hass.data[DATA_MEDIA_SOURCE] = source = local_source.LocalSource(
-        hass,
-        DOMAIN,
-        "AI Generated Images",
-        {IMAGE_DIR: media_dir},
-        f"/{DOMAIN}",
-    )
-    return source
@@ -20,6 +20,7 @@ generate_data:
          supported_features:
            - ai_task.AITaskEntityFeature.GENERATE_DATA
    structure:
      advanced: true
      required: false
      example: '{ "name": { "selector": { "text": }, "description": "Name of the user", "required": "True" }, "age": { "selector": { "number": }, "description": "Age of the user" } }'
      selector:
@@ -30,30 +31,3 @@ generate_data:
        media:
          accept:
            - "*"
generate_image:
  fields:
    task_name:
      example: "picture of a dog"
      required: true
      selector:
        text:
    instructions:
      example: "Generate a high quality square image of a dog on transparent background"
      required: true
      selector:
        text:
          multiline: true
    entity_id:
      required: true
      selector:
        entity:
          filter:
            domain: ai_task
            supported_features:
              - ai_task.AITaskEntityFeature.GENERATE_IMAGE
    attachments:
      required: false
      selector:
        media:
          accept:
            - "*"

@@ -25,28 +25,6 @@
          "description": "List of files to attach for multi-modal AI analysis."
        }
      }
    },
    "generate_image": {
      "name": "Generate image",
      "description": "Uses AI to generate an image.",
      "fields": {
        "task_name": {
          "name": "Task name",
          "description": "Name of the task."
        },
        "instructions": {
          "name": "Instructions",
          "description": "Instructions that explain the image to be generated."
        },
        "entity_id": {
          "name": "Entity ID",
          "description": "Entity ID to run the task on."
        },
        "attachments": {
          "name": "Attachments",
          "description": "List of files to attach for using as references."
        }
      }
    }
  }
}

@@ -3,8 +3,6 @@
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timedelta
import io
import mimetypes
from pathlib import Path
import tempfile
@@ -13,22 +11,11 @@ from typing import Any
import voluptuous as vol

from homeassistant.components import camera, conversation, media_source
from homeassistant.components.http.auth import async_sign_path
from homeassistant.core import HomeAssistant, ServiceResponse, callback
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import llm
from homeassistant.helpers.chat_session import ChatSession, async_get_chat_session
from homeassistant.util import RE_SANITIZE_FILENAME, slugify
from homeassistant.helpers.chat_session import async_get_chat_session

from .const import (
    DATA_COMPONENT,
    DATA_MEDIA_SOURCE,
    DATA_PREFERENCES,
    DOMAIN,
    IMAGE_DIR,
    IMAGE_EXPIRY_TIME,
    AITaskEntityFeature,
)
from .const import DATA_COMPONENT, DATA_PREFERENCES, AITaskEntityFeature


def _save_camera_snapshot(image: camera.Image) -> Path:
@@ -42,15 +29,43 @@ def _save_camera_snapshot(image: camera.Image) -> Path:
    return Path(temp_file.name)


async def _resolve_attachments(
async def async_generate_data(
    hass: HomeAssistant,
    session: ChatSession,
    *,
    task_name: str,
    entity_id: str | None = None,
    instructions: str,
    structure: vol.Schema | None = None,
    attachments: list[dict] | None = None,
) -> list[conversation.Attachment]:
    """Resolve attachments for a task."""
) -> GenDataTaskResult:
    """Run a task in the AI Task integration."""
    if entity_id is None:
        entity_id = hass.data[DATA_PREFERENCES].gen_data_entity_id

    if entity_id is None:
        raise HomeAssistantError("No entity_id provided and no preferred entity set")

    entity = hass.data[DATA_COMPONENT].get_entity(entity_id)
    if entity is None:
        raise HomeAssistantError(f"AI Task entity {entity_id} not found")

    if AITaskEntityFeature.GENERATE_DATA not in entity.supported_features:
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support generating data"
        )

    # Resolve attachments
    resolved_attachments: list[conversation.Attachment] = []
    created_files: list[Path] = []

    if (
        attachments
        and AITaskEntityFeature.SUPPORT_ATTACHMENTS not in entity.supported_features
    ):
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support attachments"
        )

    for attachment in attachments or []:
        media_content_id = attachment["media_content_id"]

@@ -89,60 +104,20 @@ async def _resolve_attachments(
            )
        )

    if not created_files:
        return resolved_attachments

    def cleanup_files() -> None:
        """Cleanup temporary files."""
        for file in created_files:
            file.unlink(missing_ok=True)

    @callback
    def cleanup_files_callback() -> None:
        """Cleanup temporary files."""
        hass.async_add_executor_job(cleanup_files)

    session.async_on_cleanup(cleanup_files_callback)

    return resolved_attachments


async def async_generate_data(
    hass: HomeAssistant,
    *,
    task_name: str,
    entity_id: str | None = None,
    instructions: str,
    structure: vol.Schema | None = None,
    attachments: list[dict] | None = None,
    llm_api: llm.API | None = None,
) -> GenDataTaskResult:
    """Run a data generation task in the AI Task integration."""
    if entity_id is None:
        entity_id = hass.data[DATA_PREFERENCES].gen_data_entity_id

    if entity_id is None:
        raise HomeAssistantError("No entity_id provided and no preferred entity set")

    entity = hass.data[DATA_COMPONENT].get_entity(entity_id)
    if entity is None:
        raise HomeAssistantError(f"AI Task entity {entity_id} not found")

    if AITaskEntityFeature.GENERATE_DATA not in entity.supported_features:
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support generating data"
        )

    if (
        attachments
        and AITaskEntityFeature.SUPPORT_ATTACHMENTS not in entity.supported_features
    ):
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support attachments"
        )

    with async_get_chat_session(hass) as session:
        resolved_attachments = await _resolve_attachments(hass, session, attachments)
        if created_files:

            def cleanup_files() -> None:
                """Cleanup temporary files."""
                for file in created_files:
                    file.unlink(missing_ok=True)

            @callback
            def cleanup_files_callback() -> None:
                """Cleanup temporary files."""
                hass.async_add_executor_job(cleanup_files)

            session.async_on_cleanup(cleanup_files_callback)

        return await entity.internal_async_generate_data(
            session,
@@ -151,92 +126,10 @@ async def async_generate_data(
                instructions=instructions,
                structure=structure,
                attachments=resolved_attachments or None,
                llm_api=llm_api,
            ),
        )


async def async_generate_image(
    hass: HomeAssistant,
    *,
    task_name: str,
    entity_id: str | None = None,
    instructions: str,
    attachments: list[dict] | None = None,
) -> ServiceResponse:
    """Run an image generation task in the AI Task integration."""
    if entity_id is None:
        entity_id = hass.data[DATA_PREFERENCES].gen_image_entity_id

    if entity_id is None:
        raise HomeAssistantError("No entity_id provided and no preferred entity set")

    entity = hass.data[DATA_COMPONENT].get_entity(entity_id)
    if entity is None:
        raise HomeAssistantError(f"AI Task entity {entity_id} not found")

    if AITaskEntityFeature.GENERATE_IMAGE not in entity.supported_features:
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support generating images"
        )

    if (
        attachments
        and AITaskEntityFeature.SUPPORT_ATTACHMENTS not in entity.supported_features
    ):
        raise HomeAssistantError(
            f"AI Task entity {entity_id} does not support attachments"
        )

    with async_get_chat_session(hass) as session:
        resolved_attachments = await _resolve_attachments(hass, session, attachments)

        task_result = await entity.internal_async_generate_image(
            session,
            GenImageTask(
                name=task_name,
                instructions=instructions,
                attachments=resolved_attachments or None,
            ),
        )

    service_result = task_result.as_dict()
    image_data = service_result.pop("image_data")
    if service_result.get("revised_prompt") is None:
        service_result["revised_prompt"] = instructions

    source = hass.data[DATA_MEDIA_SOURCE]

    current_time = datetime.now()
    ext = mimetypes.guess_extension(task_result.mime_type, False) or ".png"
    sanitized_task_name = RE_SANITIZE_FILENAME.sub("", slugify(task_name))

    image_file = ImageData(
        filename=f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}",
        file=io.BytesIO(image_data),
        content_type=task_result.mime_type,
    )

    target_folder = media_source.MediaSourceItem.from_uri(
        hass, f"media-source://{DOMAIN}/{IMAGE_DIR}", None
    )

    service_result["media_source_id"] = await source.async_upload_media(
        target_folder, image_file
    )

    item = media_source.MediaSourceItem.from_uri(
        hass, service_result["media_source_id"], None
    )
    service_result["url"] = async_sign_path(
        hass,
        (await source.async_resolve_media(item)).url,
        timedelta(seconds=IMAGE_EXPIRY_TIME),
    )

    return service_result


@dataclass(slots=True)
class GenDataTask:
    """Gen data task to be processed."""
@@ -253,9 +146,6 @@ class GenDataTask:
    attachments: list[conversation.Attachment] | None = None
    """List of attachments to go along the instructions."""

    llm_api: llm.API | None = None
    """API to provide to the LLM."""

    def __str__(self) -> str:
        """Return task as a string."""
        return f"<GenDataTask {self.name}: {id(self)}>"
@@ -277,68 +167,3 @@ class GenDataTaskResult:
            "conversation_id": self.conversation_id,
            "data": self.data,
        }


@dataclass(slots=True)
class GenImageTask:
    """Gen image task to be processed."""

    name: str
    """Name of the task."""

    instructions: str
    """Instructions on what needs to be done."""

    attachments: list[conversation.Attachment] | None = None
    """List of attachments to go along the instructions."""

    def __str__(self) -> str:
        """Return task as a string."""
        return f"<GenImageTask {self.name}: {id(self)}>"


@dataclass(slots=True)
class GenImageTaskResult:
    """Result of gen image task."""

    image_data: bytes
    """Raw image data generated by the model."""

    conversation_id: str
    """Unique identifier for the conversation."""

    mime_type: str
    """MIME type of the generated image."""

    width: int | None = None
    """Width of the generated image, if available."""

    height: int | None = None
    """Height of the generated image, if available."""

    model: str | None = None
    """Model used to generate the image, if available."""

    revised_prompt: str | None = None
    """Revised prompt used to generate the image, if applicable."""

    def as_dict(self) -> dict[str, Any]:
        """Return result as a dict."""
        return {
            "image_data": self.image_data,
            "conversation_id": self.conversation_id,
            "mime_type": self.mime_type,
            "width": self.width,
            "height": self.height,
            "model": self.model,
            "revised_prompt": self.revised_prompt,
        }


@dataclass(slots=True)
class ImageData:
    """Implementation of media_source.local_source.UploadedFile protocol."""

    filename: str
    file: io.IOBase
    content_type: str

@@ -61,7 +61,7 @@
      "display_pm_standard": {
        "name": "Display PM standard",
        "state": {
          "ugm3": "μg/m³",
          "ugm3": "µg/m³",
          "us_aqi": "US AQI"
        }
      },

@@ -1,45 +0,0 @@
"""The Ubiquiti airOS integration."""

from __future__ import annotations

from airos.airos8 import AirOS8

from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator

_PLATFORMS: list[Platform] = [
    Platform.BINARY_SENSOR,
    Platform.SENSOR,
]


async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
    """Set up Ubiquiti airOS from a config entry."""

    # By default airOS 8 comes with self-signed SSL certificates,
    # with no option in the web UI to change or upload a custom certificate.
    session = async_get_clientsession(hass, verify_ssl=False)

    airos_device = AirOS8(
        host=entry.data[CONF_HOST],
        username=entry.data[CONF_USERNAME],
        password=entry.data[CONF_PASSWORD],
        session=session,
    )

    coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device)
    await coordinator.async_config_entry_first_refresh()

    entry.runtime_data = coordinator

    await hass.config_entries.async_forward_entry_setups(entry, _PLATFORMS)

    return True


async def async_unload_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, _PLATFORMS)

@@ -1,106 +0,0 @@
"""AirOS Binary Sensor component for Home Assistant."""

from __future__ import annotations

from collections.abc import Callable
from dataclasses import dataclass
import logging

from homeassistant.components.binary_sensor import (
    BinarySensorDeviceClass,
    BinarySensorEntity,
    BinarySensorEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .coordinator import AirOS8Data, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity

_LOGGER = logging.getLogger(__name__)

PARALLEL_UPDATES = 0


@dataclass(frozen=True, kw_only=True)
class AirOSBinarySensorEntityDescription(BinarySensorEntityDescription):
    """Describe an AirOS binary sensor."""

    value_fn: Callable[[AirOS8Data], bool]


BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
    AirOSBinarySensorEntityDescription(
        key="portfw",
        translation_key="port_forwarding",
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda data: data.portfw,
    ),
    AirOSBinarySensorEntityDescription(
        key="dhcp_client",
        translation_key="dhcp_client",
        device_class=BinarySensorDeviceClass.RUNNING,
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda data: data.services.dhcpc,
    ),
    AirOSBinarySensorEntityDescription(
        key="dhcp_server",
        translation_key="dhcp_server",
        device_class=BinarySensorDeviceClass.RUNNING,
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda data: data.services.dhcpd,
        entity_registry_enabled_default=False,
    ),
    AirOSBinarySensorEntityDescription(
        key="dhcp6_server",
        translation_key="dhcp6_server",
        device_class=BinarySensorDeviceClass.RUNNING,
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda data: data.services.dhcp6d_stateful,
        entity_registry_enabled_default=False,
    ),
    AirOSBinarySensorEntityDescription(
        key="pppoe",
        translation_key="pppoe",
        device_class=BinarySensorDeviceClass.CONNECTIVITY,
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda data: data.services.pppoe,
        entity_registry_enabled_default=False,
    ),
)


async def async_setup_entry(
    hass: HomeAssistant,
    config_entry: AirOSConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the AirOS binary sensors from a config entry."""
    coordinator = config_entry.runtime_data

    async_add_entities(
        AirOSBinarySensor(coordinator, description) for description in BINARY_SENSORS
    )


class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):
    """Representation of a binary sensor."""

    entity_description: AirOSBinarySensorEntityDescription

    def __init__(
        self,
        coordinator: AirOSDataUpdateCoordinator,
        description: AirOSBinarySensorEntityDescription,
    ) -> None:
        """Initialize the binary sensor."""
        super().__init__(coordinator)

        self.entity_description = description
        self._attr_unique_id = f"{coordinator.data.host.device_id}_{description.key}"

    @property
    def is_on(self) -> bool:
        """Return the state of the binary sensor."""
        return self.entity_description.value_fn(self.coordinator.data)

@@ -1,82 +0,0 @@
"""Config flow for the Ubiquiti airOS integration."""

from __future__ import annotations

import logging
from typing import Any

from airos.exceptions import (
    AirOSConnectionAuthenticationError,
    AirOSConnectionSetupError,
    AirOSDataMissingError,
    AirOSDeviceConnectionError,
    AirOSKeyDataMissingError,
)
import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from .const import DOMAIN
from .coordinator import AirOS8

_LOGGER = logging.getLogger(__name__)

STEP_USER_DATA_SCHEMA = vol.Schema(
    {
        vol.Required(CONF_HOST): str,
        vol.Required(CONF_USERNAME, default="ubnt"): str,
        vol.Required(CONF_PASSWORD): str,
    }
)


class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
    """Handle a config flow for Ubiquiti airOS."""

    VERSION = 1

    async def async_step_user(
        self,
        user_input: dict[str, Any] | None = None,
    ) -> ConfigFlowResult:
        """Handle the initial step."""
        errors: dict[str, str] = {}
        if user_input is not None:
            # By default airOS 8 comes with self-signed SSL certificates,
            # with no option in the web UI to change or upload a custom certificate.
            session = async_get_clientsession(self.hass, verify_ssl=False)

            airos_device = AirOS8(
                host=user_input[CONF_HOST],
                username=user_input[CONF_USERNAME],
                password=user_input[CONF_PASSWORD],
                session=session,
            )
            try:
                await airos_device.login()
                airos_data = await airos_device.status()

            except (
                AirOSConnectionSetupError,
                AirOSDeviceConnectionError,
            ):
                errors["base"] = "cannot_connect"
            except (AirOSConnectionAuthenticationError, AirOSDataMissingError):
                errors["base"] = "invalid_auth"
            except AirOSKeyDataMissingError:
                errors["base"] = "key_data_missing"
            except Exception:
                _LOGGER.exception("Unexpected exception")
                errors["base"] = "unknown"
            else:
                await self.async_set_unique_id(airos_data.derived.mac)
                self._abort_if_unique_id_configured()
                return self.async_create_entry(
                    title=airos_data.host.hostname, data=user_input
                )

        return self.async_show_form(
            step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
        )

@@ -1,9 +0,0 @@
"""Constants for the Ubiquiti airOS integration."""

from datetime import timedelta

DOMAIN = "airos"

SCAN_INTERVAL = timedelta(minutes=1)

MANUFACTURER = "Ubiquiti"

@@ -1,70 +0,0 @@
"""DataUpdateCoordinator for AirOS."""

from __future__ import annotations

import logging

from airos.airos8 import AirOS8, AirOS8Data
from airos.exceptions import (
    AirOSConnectionAuthenticationError,
    AirOSConnectionSetupError,
    AirOSDataMissingError,
    AirOSDeviceConnectionError,
)

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

from .const import DOMAIN, SCAN_INTERVAL

_LOGGER = logging.getLogger(__name__)

type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]


class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
    """Class to manage fetching AirOS data from single endpoint."""

    config_entry: AirOSConfigEntry

    def __init__(
        self, hass: HomeAssistant, config_entry: AirOSConfigEntry, airos_device: AirOS8
    ) -> None:
        """Initialize the coordinator."""
        self.airos_device = airos_device
        super().__init__(
            hass,
            _LOGGER,
            config_entry=config_entry,
            name=DOMAIN,
            update_interval=SCAN_INTERVAL,
        )

    async def _async_update_data(self) -> AirOS8Data:
        """Fetch data from AirOS."""
        try:
            await self.airos_device.login()
            return await self.airos_device.status()
        except (AirOSConnectionAuthenticationError,) as err:
            _LOGGER.exception("Error authenticating with airOS device")
            raise ConfigEntryError(
                translation_domain=DOMAIN, translation_key="invalid_auth"
            ) from err
        except (
            AirOSConnectionSetupError,
            AirOSDeviceConnectionError,
            TimeoutError,
        ) as err:
            _LOGGER.error("Error connecting to airOS device: %s", err)
            raise UpdateFailed(
                translation_domain=DOMAIN,
                translation_key="cannot_connect",
            ) from err
        except (AirOSDataMissingError,) as err:
            _LOGGER.error("Expected data not returned by airOS device: %s", err)
            raise UpdateFailed(
                translation_domain=DOMAIN,
                translation_key="error_data_missing",
            ) from err

@@ -1,33 +0,0 @@
"""Diagnostics support for airOS."""

from __future__ import annotations

from typing import Any

from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_HOST, CONF_PASSWORD
from homeassistant.core import HomeAssistant

from .coordinator import AirOSConfigEntry

IP_REDACT = ["addr", "ipaddr", "ip6addr", "lastip"]  # IP related
HW_REDACT = ["apmac", "hwaddr", "mac"]  # MAC address
TO_REDACT_HA = [CONF_HOST, CONF_PASSWORD]
TO_REDACT_AIROS = [
    "hostname",  # Prevent leaking device naming
    "essid",  # Network SSID
    "lat",  # GPS latitude to prevent exposing location data.
    "lon",  # GPS longitude to prevent exposing location data.
    *HW_REDACT,
    *IP_REDACT,
]


async def async_get_config_entry_diagnostics(
    hass: HomeAssistant, entry: AirOSConfigEntry
) -> dict[str, Any]:
    """Return diagnostics for a config entry."""
    return {
        "entry_data": async_redact_data(entry.data, TO_REDACT_HA),
        "data": async_redact_data(entry.runtime_data.data.to_dict(), TO_REDACT_AIROS),
    }

@@ -1,36 +0,0 @@
"""Generic AirOS Entity Class."""

from __future__ import annotations

from homeassistant.const import CONF_HOST
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity

from .const import DOMAIN, MANUFACTURER
from .coordinator import AirOSDataUpdateCoordinator


class AirOSEntity(CoordinatorEntity[AirOSDataUpdateCoordinator]):
    """Represent an AirOS Entity."""

    _attr_has_entity_name = True

    def __init__(self, coordinator: AirOSDataUpdateCoordinator) -> None:
        """Initialise the gateway."""
        super().__init__(coordinator)

        airos_data = self.coordinator.data

        configuration_url: str | None = (
            f"https://{coordinator.config_entry.data[CONF_HOST]}"
        )

        self._attr_device_info = DeviceInfo(
            connections={(CONNECTION_NETWORK_MAC, airos_data.derived.mac)},
            configuration_url=configuration_url,
            identifiers={(DOMAIN, str(airos_data.host.device_id))},
            manufacturer=MANUFACTURER,
            model=airos_data.host.devmodel,
            name=airos_data.host.hostname,
            sw_version=airos_data.host.fwversion,
        )

@@ -1,10 +0,0 @@
{
  "domain": "airos",
  "name": "Ubiquiti airOS",
  "codeowners": ["@CoMPaTech"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/airos",
  "iot_class": "local_polling",
  "quality_scale": "bronze",
  "requirements": ["airos==0.5.1"]
}

@@ -1,70 +0,0 @@
rules:
  # Bronze
  action-setup:
    status: exempt
    comment: airOS does not have actions
  appropriate-polling: done
  brands: done
  common-modules: done
  config-flow-test-coverage: done
  config-flow: done
  dependency-transparency: done
  docs-actions:
    status: exempt
    comment: airOS does not have actions
  docs-high-level-description: done
  docs-installation-instructions: done
  docs-removal-instructions: done
  entity-event-setup:
    status: exempt
    comment: local_polling without events
  entity-unique-id: done
  has-entity-name: done
  runtime-data: done
  test-before-configure: done
  test-before-setup: done
  unique-config-entry: done

  # Silver
  action-exceptions:
    status: exempt
    comment: airOS does not have actions
  config-entry-unloading: done
  docs-configuration-parameters: done
  docs-installation-parameters: done
  entity-unavailable: todo
  integration-owner: done
  log-when-unavailable: todo
  parallel-updates: todo
  reauthentication-flow: todo
  test-coverage: done

  # Gold
  devices: done
  diagnostics: done
  discovery-update-info: todo
  discovery: todo
  docs-data-update: done
  docs-examples: todo
  docs-known-limitations: done
  docs-supported-devices: done
  docs-supported-functions: todo
  docs-troubleshooting: done
  docs-use-cases: todo
  dynamic-devices: todo
  entity-category: done
  entity-device-class: done
  entity-disabled-by-default: done
  entity-translations: done
  exception-translations: done
  icon-translations:
    status: exempt
    comment: no (custom) icons used or envisioned
  reconfiguration-flow: todo
  repair-issues: todo
  stale-devices: todo

  # Platinum
  async-dependency: done
  inject-websession: done
  strict-typing: done

@@ -1,194 +0,0 @@
"""AirOS Sensor component for Home Assistant."""

from __future__ import annotations

from collections.abc import Callable
from dataclasses import dataclass
import logging

from airos.data import DerivedWirelessMode, DerivedWirelessRole, NetRole

from homeassistant.components.sensor import (
    SensorDeviceClass,
    SensorEntity,
    SensorEntityDescription,
    SensorStateClass,
)
from homeassistant.const import (
    PERCENTAGE,
    SIGNAL_STRENGTH_DECIBELS,
    UnitOfDataRate,
    UnitOfFrequency,
    UnitOfLength,
    UnitOfTime,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType

from .coordinator import AirOS8Data, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity

_LOGGER = logging.getLogger(__name__)

NETROLE_OPTIONS = [mode.value for mode in NetRole]
WIRELESS_MODE_OPTIONS = [mode.value for mode in DerivedWirelessMode]
WIRELESS_ROLE_OPTIONS = [mode.value for mode in DerivedWirelessRole]

PARALLEL_UPDATES = 0


@dataclass(frozen=True, kw_only=True)
class AirOSSensorEntityDescription(SensorEntityDescription):
    """Describe an AirOS sensor."""

    value_fn: Callable[[AirOS8Data], StateType]


SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
    AirOSSensorEntityDescription(
        key="host_cpuload",
        translation_key="host_cpuload",
        native_unit_of_measurement=PERCENTAGE,
        state_class=SensorStateClass.MEASUREMENT,
        suggested_display_precision=1,
        value_fn=lambda data: data.host.cpuload,
        entity_registry_enabled_default=False,
    ),
    AirOSSensorEntityDescription(
        key="host_netrole",
        translation_key="host_netrole",
        device_class=SensorDeviceClass.ENUM,
        value_fn=lambda data: data.host.netrole.value,
        options=NETROLE_OPTIONS,
    ),
    AirOSSensorEntityDescription(
        key="wireless_frequency",
        translation_key="wireless_frequency",
        native_unit_of_measurement=UnitOfFrequency.MEGAHERTZ,
        device_class=SensorDeviceClass.FREQUENCY,
        state_class=SensorStateClass.MEASUREMENT,
        value_fn=lambda data: data.wireless.frequency,
    ),
    AirOSSensorEntityDescription(
        key="wireless_essid",
        translation_key="wireless_essid",
        value_fn=lambda data: data.wireless.essid,
    ),
    AirOSSensorEntityDescription(
        key="wireless_antenna_gain",
        translation_key="wireless_antenna_gain",
        native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
        device_class=SensorDeviceClass.SIGNAL_STRENGTH,
        state_class=SensorStateClass.MEASUREMENT,
        value_fn=lambda data: data.wireless.antenna_gain,
    ),
    AirOSSensorEntityDescription(
        key="wireless_throughput_tx",
        translation_key="wireless_throughput_tx",
        native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
        device_class=SensorDeviceClass.DATA_RATE,
        state_class=SensorStateClass.MEASUREMENT,
        suggested_display_precision=0,
        suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
        value_fn=lambda data: data.wireless.throughput.tx,
    ),
    AirOSSensorEntityDescription(
        key="wireless_throughput_rx",
        translation_key="wireless_throughput_rx",
        native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
        device_class=SensorDeviceClass.DATA_RATE,
        state_class=SensorStateClass.MEASUREMENT,
        suggested_display_precision=0,
        suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
        value_fn=lambda data: data.wireless.throughput.rx,
    ),
    AirOSSensorEntityDescription(
        key="wireless_polling_dl_capacity",
        translation_key="wireless_polling_dl_capacity",
        native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
        device_class=SensorDeviceClass.DATA_RATE,
        state_class=SensorStateClass.MEASUREMENT,
        suggested_display_precision=0,
        suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
        value_fn=lambda data: data.wireless.polling.dl_capacity,
    ),
    AirOSSensorEntityDescription(
        key="wireless_polling_ul_capacity",
        translation_key="wireless_polling_ul_capacity",
        native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
        device_class=SensorDeviceClass.DATA_RATE,
        state_class=SensorStateClass.MEASUREMENT,
        suggested_display_precision=0,
        suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
        value_fn=lambda data: data.wireless.polling.ul_capacity,
    ),
    AirOSSensorEntityDescription(
        key="host_uptime",
        translation_key="host_uptime",
        native_unit_of_measurement=UnitOfTime.SECONDS,
        device_class=SensorDeviceClass.DURATION,
        suggested_display_precision=0,
        suggested_unit_of_measurement=UnitOfTime.DAYS,
        value_fn=lambda data: data.host.uptime,
        entity_registry_enabled_default=False,
    ),
    AirOSSensorEntityDescription(
        key="wireless_distance",
        translation_key="wireless_distance",
        native_unit_of_measurement=UnitOfLength.METERS,
        device_class=SensorDeviceClass.DISTANCE,
        suggested_display_precision=1,
        suggested_unit_of_measurement=UnitOfLength.KILOMETERS,
        value_fn=lambda data: data.wireless.distance,
    ),
    AirOSSensorEntityDescription(
        key="wireless_mode",
        translation_key="wireless_mode",
        device_class=SensorDeviceClass.ENUM,
        value_fn=lambda data: data.derived.mode.value,
        options=WIRELESS_MODE_OPTIONS,
        entity_registry_enabled_default=False,
    ),
    AirOSSensorEntityDescription(
        key="wireless_role",
        translation_key="wireless_role",
        device_class=SensorDeviceClass.ENUM,
        value_fn=lambda data: data.derived.role.value,
        options=WIRELESS_ROLE_OPTIONS,
        entity_registry_enabled_default=False,
    ),
)


async def async_setup_entry(
    hass: HomeAssistant,
    config_entry: AirOSConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the AirOS sensors from a config entry."""
    coordinator = config_entry.runtime_data

    async_add_entities(AirOSSensor(coordinator, description) for description in SENSORS)


class AirOSSensor(AirOSEntity, SensorEntity):
    """Representation of a Sensor."""

    entity_description: AirOSSensorEntityDescription

    def __init__(
        self,
        coordinator: AirOSDataUpdateCoordinator,
        description: AirOSSensorEntityDescription,
    ) -> None:
        """Initialize the sensor."""
        super().__init__(coordinator)

        self.entity_description = description
        self._attr_unique_id = f"{coordinator.data.derived.mac}_{description.key}"

    @property
    def native_value(self) -> StateType:
        """Return the state of the sensor."""
        return self.entity_description.value_fn(self.coordinator.data)

@@ -1,117 +0,0 @@
{
  "config": {
    "flow_title": "Ubiquiti airOS device",
    "step": {
      "user": {
        "data": {
          "host": "[%key:common::config_flow::data::host%]",
          "username": "[%key:common::config_flow::data::username%]",
          "password": "[%key:common::config_flow::data::password%]"
        },
        "data_description": {
          "host": "IP address or hostname of the airOS device",
          "username": "Administrator username for the airOS device, normally 'ubnt'",
          "password": "Password configured through the UISP app or web interface"
        }
      }
    },
    "error": {
      "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
      "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
      "key_data_missing": "Expected data not returned from the device, check the documentation for supported devices",
      "unknown": "[%key:common::config_flow::error::unknown%]"
    },
    "abort": {
      "already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
    }
  },
  "entity": {
    "binary_sensor": {
      "port_forwarding": {
        "name": "Port forwarding"
      },
      "dhcp_client": {
        "name": "DHCP client"
      },
      "dhcp_server": {
        "name": "DHCP server"
      },
      "dhcp6_server": {
        "name": "DHCPv6 server"
      },
      "pppoe": {
        "name": "PPPoE link"
      }
    },
    "sensor": {
      "host_cpuload": {
        "name": "CPU load"
      },
      "host_netrole": {
        "name": "Network role",
        "state": {
          "bridge": "Bridge",
          "router": "Router"
        }
      },
      "wireless_frequency": {
        "name": "Wireless frequency"
      },
      "wireless_essid": {
        "name": "Wireless SSID"
      },
      "wireless_antenna_gain": {
        "name": "Antenna gain"
      },
      "wireless_throughput_tx": {
        "name": "Throughput transmit (actual)"
      },
      "wireless_throughput_rx": {
        "name": "Throughput receive (actual)"
      },
      "wireless_polling_dl_capacity": {
        "name": "Download capacity"
      },
      "wireless_polling_ul_capacity": {
        "name": "Upload capacity"
      },
      "wireless_remote_hostname": {
        "name": "Remote hostname"
      },
      "host_uptime": {
        "name": "Uptime"
      },
      "wireless_distance": {
        "name": "Wireless distance"
      },
      "wireless_role": {
        "name": "Wireless role",
        "state": {
          "access_point": "Access point",
          "station": "Station"
        }
      },
      "wireless_mode": {
        "name": "Wireless mode",
        "state": {
          "point_to_point": "Point-to-point",
          "point_to_multipoint": "Point-to-multipoint"
        }
      }
    }
  },
  "exceptions": {
    "invalid_auth": {
      "message": "[%key:common::config_flow::error::invalid_auth%]"
    },
    "cannot_connect": {
      "message": "[%key:common::config_flow::error::cannot_connect%]"
    },
    "key_data_missing": {
      "message": "Key data not returned from device"
    },
    "error_data_missing": {
      "message": "Data incomplete or missing"
    }
  }
}

@@ -9,7 +9,7 @@ from homeassistant.core import HomeAssistant
from .const import CONF_CLIP_NEGATIVE, CONF_RETURN_AVERAGE
from .coordinator import AirQCoordinator

PLATFORMS: list[Platform] = [Platform.NUMBER, Platform.SENSOR]
PLATFORMS: list[Platform] = [Platform.SENSOR]

AirQConfigEntry = ConfigEntry[AirQCoordinator]

@@ -75,7 +75,6 @@ class AirQCoordinator(DataUpdateCoordinator):
            return_average=self.return_average,
            clip_negative_values=self.clip_negative,
        )
        data["brightness"] = await self.airq.get_current_brightness()
        if warming_up_sensors := identify_warming_up_sensors(data):
            _LOGGER.debug(
                "Following sensors are still warming up: %s", warming_up_sensors

@@ -1,85 +0,0 @@
"""Definition of air-Q number platform used to control the LED strips."""

from __future__ import annotations

from collections.abc import Awaitable, Callable
from dataclasses import dataclass
import logging

from aioairq.core import AirQ

from homeassistant.components.number import NumberEntity, NumberEntityDescription
from homeassistant.const import PERCENTAGE
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity

from . import AirQConfigEntry, AirQCoordinator

_LOGGER = logging.getLogger(__name__)


@dataclass(frozen=True, kw_only=True)
class AirQBrightnessDescription(NumberEntityDescription):
    """Describes AirQ number entity responsible for brightness control."""

    value: Callable[[dict], float]
    set_value: Callable[[AirQ, float], Awaitable[None]]


AIRQ_LED_BRIGHTNESS = AirQBrightnessDescription(
    key="airq_led_brightness",
    translation_key="airq_led_brightness",
    native_min_value=0.0,
    native_max_value=100.0,
    native_step=1.0,
    native_unit_of_measurement=PERCENTAGE,
    value=lambda data: data["brightness"],
    set_value=lambda device, value: device.set_current_brightness(value),
)


async def async_setup_entry(
    hass: HomeAssistant,
    entry: AirQConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up number entities: a single entity for the LEDs."""

    coordinator = entry.runtime_data
    entities = [AirQLEDBrightness(coordinator, AIRQ_LED_BRIGHTNESS)]

    async_add_entities(entities)


class AirQLEDBrightness(CoordinatorEntity[AirQCoordinator], NumberEntity):
    """Representation of the LEDs from a single AirQ."""

    _attr_has_entity_name = True

    def __init__(
        self,
        coordinator: AirQCoordinator,
        description: AirQBrightnessDescription,
    ) -> None:
        """Initialize a single sensor."""
        super().__init__(coordinator)
        self.entity_description: AirQBrightnessDescription = description

        self._attr_device_info = coordinator.device_info
        self._attr_unique_id = f"{coordinator.device_id}_{description.key}"

    @property
    def native_value(self) -> float:
        """Return the brightness of the LEDs in %."""
        return self.entity_description.value(self.coordinator.data)

    async def async_set_native_value(self, value: float) -> None:
        """Set the brightness of the LEDs to the value in %."""
        _LOGGER.debug(
            "Changing LED brightness from %.0f%% to %.0f%%",
            self.coordinator.data["brightness"],
            value,
        )
        await self.entity_description.set_value(self.coordinator.airq, value)
        await self.coordinator.async_request_refresh()

@@ -35,11 +35,6 @@
    }
  },
  "entity": {
    "number": {
      "airq_led_brightness": {
        "name": "LED brightness"
      }
    },
    "sensor": {
      "acetaldehyde": {
        "name": "Acetaldehyde"

@@ -7,18 +7,21 @@ import logging

from airthings import Airthings

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ID, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from .const import CONF_SECRET
from .coordinator import AirthingsConfigEntry, AirthingsDataUpdateCoordinator
from .coordinator import AirthingsDataUpdateCoordinator

_LOGGER = logging.getLogger(__name__)

PLATFORMS: list[Platform] = [Platform.SENSOR]
SCAN_INTERVAL = timedelta(minutes=6)

type AirthingsConfigEntry = ConfigEntry[AirthingsDataUpdateCoordinator]


async def async_setup_entry(hass: HomeAssistant, entry: AirthingsConfigEntry) -> bool:
    """Set up Airthings from a config entry."""
@@ -28,7 +31,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirthingsConfigEntry) -> bool:
        async_get_clientsession(hass),
    )

    coordinator = AirthingsDataUpdateCoordinator(hass, airthings, entry)
    coordinator = AirthingsDataUpdateCoordinator(hass, airthings)

    await coordinator.async_config_entry_first_refresh()

@@ -5,7 +5,6 @@ import logging

from airthings import Airthings, AirthingsDevice, AirthingsError

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

@@ -14,23 +13,15 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(minutes=6)

type AirthingsConfigEntry = ConfigEntry[AirthingsDataUpdateCoordinator]


class AirthingsDataUpdateCoordinator(DataUpdateCoordinator[dict[str, AirthingsDevice]]):
    """Coordinator for Airthings data updates."""

    def __init__(
        self,
        hass: HomeAssistant,
        airthings: Airthings,
        config_entry: AirthingsConfigEntry,
    ) -> None:
    def __init__(self, hass: HomeAssistant, airthings: Airthings) -> None:
        """Initialize the coordinator."""
        super().__init__(
            hass,
            _LOGGER,
            config_entry=config_entry,
            name=DOMAIN,
            update_method=self._update_method,
            update_interval=SCAN_INTERVAL,

@@ -11,5 +11,5 @@
  "documentation": "https://www.home-assistant.io/integrations/airzone",
  "iot_class": "local_polling",
  "loggers": ["aioairzone"],
  "requirements": ["aioairzone==1.0.1"]
  "requirements": ["aioairzone==1.0.0"]
}

@@ -6,5 +6,5 @@
  "documentation": "https://www.home-assistant.io/integrations/airzone_cloud",
  "iot_class": "cloud_push",
  "loggers": ["aioairzone_cloud"],
  "requirements": ["aioairzone-cloud==0.7.2"]
  "requirements": ["aioairzone-cloud==0.7.0"]
}

@@ -2,96 +2,39 @@

from __future__ import annotations

from genie_partner_sdk.client import AladdinConnectClient

from homeassistant.const import Platform
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import (
    aiohttp_client,
    config_entry_oauth2_flow,
    device_registry as dr,
)
from homeassistant.helpers import issue_registry as ir

from . import api
from .const import CONFIG_FLOW_MINOR_VERSION, CONFIG_FLOW_VERSION, DOMAIN
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator

PLATFORMS: list[Platform] = [Platform.COVER, Platform.SENSOR]
DOMAIN = "aladdin_connect"


async def async_setup_entry(
    hass: HomeAssistant, entry: AladdinConnectConfigEntry
) -> bool:
    """Set up Aladdin Connect Genie from a config entry."""
    implementation = (
        await config_entry_oauth2_flow.async_get_config_entry_implementation(
            hass, entry
        )
async def async_setup_entry(hass: HomeAssistant, _: ConfigEntry) -> bool:
    """Set up Aladdin Connect from a config entry."""
    ir.async_create_issue(
        hass,
        DOMAIN,
        DOMAIN,
        is_fixable=False,
        severity=ir.IssueSeverity.ERROR,
        translation_key="integration_removed",
        translation_placeholders={
            "entries": "/config/integrations/integration/aladdin_connect",
        },
    )

    session = config_entry_oauth2_flow.OAuth2Session(hass, entry, implementation)

    client = AladdinConnectClient(
        api.AsyncConfigEntryAuth(aiohttp_client.async_get_clientsession(hass), session)
    )

    doors = await client.get_doors()

    entry.runtime_data = {
        door.unique_id: AladdinConnectCoordinator(hass, entry, client, door)
        for door in doors
    }

    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

    remove_stale_devices(hass, entry)

    return True


async def async_unload_entry(
    hass: HomeAssistant, entry: AladdinConnectConfigEntry
) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)


async def async_migrate_entry(
    hass: HomeAssistant, config_entry: AladdinConnectConfigEntry
) -> bool:
    """Migrate old config."""
    if config_entry.version < CONFIG_FLOW_VERSION:
        config_entry.async_start_reauth(hass)
        new_data = {**config_entry.data}
        hass.config_entries.async_update_entry(
            config_entry,
            data=new_data,
            version=CONFIG_FLOW_VERSION,
            minor_version=CONFIG_FLOW_MINOR_VERSION,
        )

    return True


def remove_stale_devices(
    hass: HomeAssistant,
    config_entry: AladdinConnectConfigEntry,
) -> None:
    """Remove stale devices from device registry."""
    device_registry = dr.async_get(hass)
    device_entries = dr.async_entries_for_config_entry(
        device_registry, config_entry.entry_id
    )
    all_device_ids = set(config_entry.runtime_data)

    for device_entry in device_entries:
        device_id: str | None = None
        for identifier in device_entry.identifiers:
            if identifier[0] == DOMAIN:
                device_id = identifier[1]
                break

        if device_id and device_id not in all_device_ids:
            device_registry.async_update_device(
                device_entry.id, remove_config_entry_id=config_entry.entry_id
            )
async def async_remove_entry(hass: HomeAssistant, entry: ConfigEntry) -> None:
    """Remove a config entry."""
    if not hass.config_entries.async_loaded_entries(DOMAIN):
        ir.async_delete_issue(hass, DOMAIN, DOMAIN)
    # Remove any remaining disabled or ignored entries
    for _entry in hass.config_entries.async_entries(DOMAIN):
        hass.async_create_task(hass.config_entries.async_remove(_entry.entry_id))

@@ -1,33 +0,0 @@
"""API for Aladdin Connect Genie bound to Home Assistant OAuth."""

from typing import cast

from aiohttp import ClientSession
from genie_partner_sdk.auth import Auth

from homeassistant.helpers import config_entry_oauth2_flow

API_URL = "https://twdvzuefzh.execute-api.us-east-2.amazonaws.com/v1"
API_KEY = "k6QaiQmcTm2zfaNns5L1Z8duBtJmhDOW8JawlCC3"


class AsyncConfigEntryAuth(Auth):
    """Provide Aladdin Connect Genie authentication tied to an OAuth2 based config entry."""

    def __init__(
        self,
        websession: ClientSession,
        oauth_session: config_entry_oauth2_flow.OAuth2Session,
    ) -> None:
        """Initialize Aladdin Connect Genie auth."""
        super().__init__(
            websession, API_URL, oauth_session.token["access_token"], API_KEY
        )
        self._oauth_session = oauth_session

    async def async_get_access_token(self) -> str:
        """Return a valid access token."""
        if not self._oauth_session.valid_token:
            await self._oauth_session.async_ensure_token_valid()

        return cast(str, self._oauth_session.token["access_token"])

@@ -1,14 +0,0 @@
"""application_credentials platform for the Aladdin Connect Genie integration."""

from homeassistant.components.application_credentials import AuthorizationServer
from homeassistant.core import HomeAssistant

from .const import OAUTH2_AUTHORIZE, OAUTH2_TOKEN


async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:
    """Return authorization server."""
    return AuthorizationServer(
        authorize_url=OAUTH2_AUTHORIZE,
        token_url=OAUTH2_TOKEN,
    )

@@ -1,63 +1,11 @@
"""Config flow for Aladdin Connect Genie."""
"""Config flow for Aladdin Connect integration."""

from collections.abc import Mapping
import logging
from typing import Any
from homeassistant.config_entries import ConfigFlow

import jwt
import voluptuous as vol

from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.helpers import config_entry_oauth2_flow

from .const import CONFIG_FLOW_MINOR_VERSION, CONFIG_FLOW_VERSION, DOMAIN
from . import DOMAIN


class OAuth2FlowHandler(
    config_entry_oauth2_flow.AbstractOAuth2FlowHandler, domain=DOMAIN
):
    """Config flow to handle Aladdin Connect Genie OAuth2 authentication."""
class AladdinConnectConfigFlow(ConfigFlow, domain=DOMAIN):
    """Handle a config flow for Aladdin Connect."""

    DOMAIN = DOMAIN
    VERSION = CONFIG_FLOW_VERSION
    MINOR_VERSION = CONFIG_FLOW_MINOR_VERSION

    async def async_step_reauth(
        self, user_input: Mapping[str, Any]
    ) -> ConfigFlowResult:
        """Perform reauth upon API auth error or upgrade from v1 to v2."""
        return await self.async_step_reauth_confirm()

    async def async_step_reauth_confirm(
        self, user_input: Mapping[str, Any] | None = None
    ) -> ConfigFlowResult:
        """Dialog that informs the user that reauth is required."""
        if user_input is None:
            return self.async_show_form(
                step_id="reauth_confirm",
                data_schema=vol.Schema({}),
            )
        return await self.async_step_user()

    async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
        """Create an oauth config entry or update existing entry for reauth."""
        # Extract the user ID from the JWT token's 'sub' field
        token = jwt.decode(
            data["token"]["access_token"], options={"verify_signature": False}
        )
        user_id = token["sub"]
        await self.async_set_unique_id(user_id)

        if self.source == SOURCE_REAUTH:
            self._abort_if_unique_id_mismatch(reason="wrong_account")
            return self.async_update_reload_and_abort(
                self._get_reauth_entry(), data=data
            )

        self._abort_if_unique_id_configured()
        return self.async_create_entry(title="Aladdin Connect", data=data)

    @property
    def logger(self) -> logging.Logger:
        """Return logger."""
        return logging.getLogger(__name__)
    VERSION = 1

@@ -1,14 +0,0 @@
"""Constants for the Aladdin Connect Genie integration."""

from typing import Final

from homeassistant.components.cover import CoverEntityFeature

DOMAIN = "aladdin_connect"
CONFIG_FLOW_VERSION = 2
CONFIG_FLOW_MINOR_VERSION = 1

OAUTH2_AUTHORIZE = "https://app.aladdinconnect.com/login.html"
OAUTH2_TOKEN = "https://twdvzuefzh.execute-api.us-east-2.amazonaws.com/v1/oauth2/token"

SUPPORTED_FEATURES: Final = CoverEntityFeature.OPEN | CoverEntityFeature.CLOSE

@@ -1,50 +0,0 @@
"""Coordinator for Aladdin Connect integration."""

from __future__ import annotations

from datetime import timedelta
import logging

from genie_partner_sdk.client import AladdinConnectClient
from genie_partner_sdk.model import GarageDoor

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator

_LOGGER = logging.getLogger(__name__)

type AladdinConnectConfigEntry = ConfigEntry[dict[str, AladdinConnectCoordinator]]

SCAN_INTERVAL = timedelta(seconds=15)


class AladdinConnectCoordinator(DataUpdateCoordinator[GarageDoor]):
    """Coordinator for Aladdin Connect integration."""

    def __init__(
        self,
        hass: HomeAssistant,
        entry: AladdinConnectConfigEntry,
        client: AladdinConnectClient,
        garage_door: GarageDoor,
    ) -> None:
        """Initialize the coordinator."""
        super().__init__(
            hass,
            logger=_LOGGER,
            config_entry=entry,
            name="Aladdin Connect Coordinator",
            update_interval=SCAN_INTERVAL,
        )
        self.client = client
        self.data = garage_door

    async def _async_update_data(self) -> GarageDoor:
        """Fetch data from the Aladdin Connect API."""
        await self.client.update_door(self.data.device_id, self.data.door_number)
        self.data.status = self.client.get_door_status(
            self.data.device_id, self.data.door_number
        )
        self.data.battery_level = self.client.get_battery_status(
            self.data.device_id, self.data.door_number
        )
        return self.data
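The class above leans on the DataUpdateCoordinator contract: whatever `_async_update_data` returns becomes `coordinator.data`, and registered listeners are notified after each refresh. A toy stand-in (explicitly not the Home Assistant class) that mimics just that cycle:

import asyncio
from dataclasses import dataclass

@dataclass
class FakeDoor:
    status: str = "unknown"
    battery_level: int | None = None

class TinyCoordinator:
    """Toy stand-in for DataUpdateCoordinator's refresh cycle."""

    def __init__(self) -> None:
        self.data = FakeDoor()
        self._listeners: list = []

    def async_add_listener(self, update_callback) -> None:
        self._listeners.append(update_callback)

    async def async_refresh(self) -> None:
        # The return value of the update hook becomes the new .data,
        # then every registered listener is told to re-render state.
        self.data = await self._async_update_data()
        for update_callback in self._listeners:
            update_callback()

    async def _async_update_data(self) -> FakeDoor:
        return FakeDoor(status="closed", battery_level=87)  # stands in for the API call

coordinator = TinyCoordinator()
coordinator.async_add_listener(lambda: print("entity sees:", coordinator.data))
asyncio.run(coordinator.async_refresh())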
@@ -1,64 +0,0 @@
"""Cover Entity for Genie Garage Door."""

from __future__ import annotations

from typing import Any

from homeassistant.components.cover import CoverDeviceClass, CoverEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .const import SUPPORTED_FEATURES
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity


async def async_setup_entry(
    hass: HomeAssistant,
    entry: AladdinConnectConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the cover platform."""
    coordinators = entry.runtime_data

    async_add_entities(
        AladdinCoverEntity(coordinator) for coordinator in coordinators.values()
    )


class AladdinCoverEntity(AladdinConnectEntity, CoverEntity):
    """Representation of Aladdin Connect cover."""

    _attr_device_class = CoverDeviceClass.GARAGE
    _attr_supported_features = SUPPORTED_FEATURES
    _attr_name = None

    def __init__(self, coordinator: AladdinConnectCoordinator) -> None:
        """Initialize the Aladdin Connect cover."""
        super().__init__(coordinator)
        self._attr_unique_id = coordinator.data.unique_id

    async def async_open_cover(self, **kwargs: Any) -> None:
        """Issue open command to cover."""
        await self.client.open_door(self._device_id, self._number)

    async def async_close_cover(self, **kwargs: Any) -> None:
        """Issue close command to cover."""
        await self.client.close_door(self._device_id, self._number)

    @property
    def is_closed(self) -> bool | None:
        """Update is closed attribute."""
        if (status := self.coordinator.data.status) is None:
            return None
        return status == "closed"

    @property
    def is_closing(self) -> bool | None:
        """Update is closing attribute."""
        return self.coordinator.data.status == "closing"

    @property
    def is_opening(self) -> bool | None:
        """Update is opening attribute."""
        return self.coordinator.data.status == "opening"
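The three properties above map the door's string status onto the tri-state cover model, where None means "state unknown". The same logic in isolation (the free-standing helper name is ours, for illustration only):

def is_closed(status: str | None) -> bool | None:
    """None while the door state is unknown, otherwise compare the reported status."""
    if status is None:
        return None
    return status == "closed"

assert is_closed(None) is None
assert is_closed("closed") is True
assert is_closed("opening") is False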
@@ -1,32 +0,0 @@
"""Base class for Aladdin Connect entities."""

from genie_partner_sdk.client import AladdinConnectClient

from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity

from .const import DOMAIN
from .coordinator import AladdinConnectCoordinator


class AladdinConnectEntity(CoordinatorEntity[AladdinConnectCoordinator]):
    """Defines a base Aladdin Connect entity."""

    _attr_has_entity_name = True

    def __init__(self, coordinator: AladdinConnectCoordinator) -> None:
        """Initialize Aladdin Connect entity."""
        super().__init__(coordinator)
        device = coordinator.data
        self._attr_device_info = DeviceInfo(
            identifiers={(DOMAIN, device.unique_id)},
            manufacturer="Aladdin Connect",
            name=device.name,
        )
        self._device_id = device.device_id
        self._number = device.door_number

    @property
    def client(self) -> AladdinConnectClient:
        """Return the client for this entity."""
        return self.coordinator.client
@@ -1,16 +1,9 @@
{
  "domain": "aladdin_connect",
  "name": "Aladdin Connect",
  "codeowners": ["@swcloudgenie"],
  "config_flow": true,
  "dependencies": ["application_credentials"],
  "dhcp": [
    {
      "hostname": "gdocntl-*"
    }
  ],
  "codeowners": [],
  "documentation": "https://www.home-assistant.io/integrations/aladdin_connect",
  "integration_type": "hub",
  "integration_type": "system",
  "iot_class": "cloud_polling",
  "requirements": ["genie-partner-sdk==1.0.11"]
  "requirements": []
}
@@ -1,94 +0,0 @@
rules:
  # Bronze
  action-setup:
    status: exempt
    comment: Integration does not register any service actions.
  appropriate-polling: done
  brands: done
  common-modules: done
  config-flow: done
  config-flow-test-coverage: todo
  dependency-transparency: done
  docs-actions:
    status: exempt
    comment: Integration does not register any service actions.
  docs-high-level-description: done
  docs-installation-instructions:
    status: todo
    comment: Documentation needs to be created.
  docs-removal-instructions:
    status: todo
    comment: Documentation needs to be created.
  entity-event-setup:
    status: exempt
    comment: Integration does not subscribe to external events.
  entity-unique-id: done
  has-entity-name: done
  runtime-data: done
  test-before-configure:
    status: todo
    comment: The config flow does not currently test the connection during setup.
  test-before-setup: todo
  unique-config-entry: done

  # Silver
  action-exceptions: todo
  config-entry-unloading: done
  docs-configuration-parameters:
    status: todo
    comment: Documentation needs to be created.
  docs-installation-parameters:
    status: todo
    comment: Documentation needs to be created.
  entity-unavailable: todo
  integration-owner: done
  log-when-unavailable: todo
  parallel-updates: todo
  reauthentication-flow: done
  test-coverage:
    status: todo
    comment: Platform tests for cover and sensor need to be implemented to reach 95% coverage.

  # Gold
  devices: done
  diagnostics: todo
  discovery: todo
  discovery-update-info: todo
  docs-data-update:
    status: todo
    comment: Documentation needs to be created.
  docs-examples:
    status: todo
    comment: Documentation needs to be created.
  docs-known-limitations:
    status: todo
    comment: Documentation needs to be created.
  docs-supported-devices:
    status: todo
    comment: Documentation needs to be created.
  docs-supported-functions:
    status: todo
    comment: Documentation needs to be created.
  docs-troubleshooting:
    status: todo
    comment: Documentation needs to be created.
  docs-use-cases:
    status: todo
    comment: Documentation needs to be created.
  dynamic-devices: todo
  entity-category: done
  entity-device-class: done
  entity-disabled-by-default: done
  entity-translations: done
  exception-translations: todo
  icon-translations: todo
  reconfiguration-flow: todo
  repair-issues: todo
  stale-devices:
    status: todo
    comment: Stale device removal can be handled dynamically.

  # Platinum
  async-dependency: todo
  inject-websession: done
  strict-typing: done
@@ -1,77 +0,0 @@
"""Support for Aladdin Connect Genie sensors."""

from __future__ import annotations

from collections.abc import Callable
from dataclasses import dataclass

from genie_partner_sdk.model import GarageDoor

from homeassistant.components.sensor import (
    SensorDeviceClass,
    SensorEntity,
    SensorEntityDescription,
    SensorStateClass,
)
from homeassistant.const import PERCENTAGE, EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback

from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity


@dataclass(frozen=True, kw_only=True)
class AladdinConnectSensorEntityDescription(SensorEntityDescription):
    """Sensor entity description for Aladdin Connect."""

    value_fn: Callable[[GarageDoor], float | None]


SENSOR_TYPES: tuple[AladdinConnectSensorEntityDescription, ...] = (
    AladdinConnectSensorEntityDescription(
        key="battery_level",
        device_class=SensorDeviceClass.BATTERY,
        entity_registry_enabled_default=False,
        native_unit_of_measurement=PERCENTAGE,
        state_class=SensorStateClass.MEASUREMENT,
        entity_category=EntityCategory.DIAGNOSTIC,
        value_fn=lambda garage_door: garage_door.battery_level,
    ),
)


async def async_setup_entry(
    hass: HomeAssistant,
    entry: AladdinConnectConfigEntry,
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up Aladdin Connect sensor devices."""
    coordinators = entry.runtime_data

    async_add_entities(
        AladdinConnectSensor(coordinator, description)
        for coordinator in coordinators.values()
        for description in SENSOR_TYPES
    )


class AladdinConnectSensor(AladdinConnectEntity, SensorEntity):
    """A sensor implementation for Aladdin Connect device."""

    entity_description: AladdinConnectSensorEntityDescription

    def __init__(
        self,
        coordinator: AladdinConnectCoordinator,
        entity_description: AladdinConnectSensorEntityDescription,
    ) -> None:
        """Initialize the Aladdin Connect sensor."""
        super().__init__(coordinator)
        self.entity_description = entity_description
        self._attr_unique_id = f"{coordinator.data.unique_id}-{entity_description.key}"

    @property
    def native_value(self) -> float | None:
        """Return the state of the sensor."""
        return self.entity_description.value_fn(self.coordinator.data)
@@ -1,33 +1,8 @@
{
  "config": {
    "step": {
      "pick_implementation": {
        "title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
      },
      "reauth_confirm": {
        "title": "[%key:common::config_flow::title::reauth%]",
        "description": "Aladdin Connect needs to re-authenticate your account"
      },
      "oauth_discovery": {
        "description": "Home Assistant has found an Aladdin Connect device on your network. Press **Submit** to continue setting up Aladdin Connect."
      }
    },
    "abort": {
      "already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
      "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
      "oauth_error": "[%key:common::config_flow::abort::oauth2_error%]",
      "oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
      "oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
      "oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
      "missing_configuration": "[%key:common::config_flow::abort::oauth2_missing_configuration%]",
      "authorize_url_timeout": "[%key:common::config_flow::abort::oauth2_authorize_url_timeout%]",
      "no_url_available": "[%key:common::config_flow::abort::oauth2_no_url_available%]",
      "user_rejected_authorize": "[%key:common::config_flow::abort::oauth2_user_rejected_authorize%]",
      "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
      "wrong_account": "You are authenticated with a different account than the one set up. Please authenticate with the configured account."
    },
    "create_entry": {
      "default": "[%key:common::config_flow::create_entry::authenticated%]"
  "issues": {
    "integration_removed": {
      "title": "The Aladdin Connect integration has been removed",
      "description": "The Aladdin Connect integration has been removed from Home Assistant.\n\nTo resolve this issue, please remove the (now defunct) integration entries from your Home Assistant setup. [Click here to see your existing Aladdin Connect integration entries]({entries})."
    }
  }
}
@@ -61,7 +61,7 @@ ALARM_SERVICE_SCHEMA: Final = make_entity_service_schema(
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up the alarm control panel component."""
    """Track states and offer events for sensors."""
    component = hass.data[DATA_COMPONENT] = EntityComponent[AlarmControlPanelEntity](
        _LOGGER, DOMAIN, hass, SCAN_INTERVAL
    )
@@ -1,7 +1,4 @@
"""Support for repeating alerts when conditions are met.

DEVELOPMENT OF THE ALERT INTEGRATION IS FROZEN.
"""
"""Support for repeating alerts when conditions are met."""

from __future__ import annotations

@@ -66,10 +63,7 @@ CONFIG_SCHEMA = vol.Schema(


async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up the Alert component.

    DEVELOPMENT OF THE ALERT INTEGRATION IS FROZEN.
    """
    """Set up the Alert component."""
    component = EntityComponent[AlertEntity](LOGGER, DOMAIN, hass)

    entities: list[AlertEntity] = []
@@ -1,7 +1,4 @@
"""Support for repeating alerts when conditions are met.

DEVELOPMENT OF THE ALERT INTEGRATION IS FROZEN.
"""
"""Support for repeating alerts when conditions are met."""

from __future__ import annotations

@@ -30,10 +27,7 @@ from .const import DOMAIN, LOGGER


class AlertEntity(Entity):
    """Representation of an alert.

    DEVELOPMENT OF THE ALERT INTEGRATION IS FROZEN.
    """
    """Representation of an alert."""

    _attr_should_poll = False
@@ -1,7 +1,4 @@
"""Reproduce an Alert state.

DEVELOPMENT OF THE ALERT INTEGRATION IS FROZEN.
"""
"""Reproduce an Alert state."""

from __future__ import annotations
@@ -1,13 +1,9 @@
"""Alexa Devices integration."""

from homeassistant.const import CONF_COUNTRY, Platform
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import aiohttp_client, config_validation as cv
from homeassistant.helpers.typing import ConfigType

from .const import _LOGGER, CONF_LOGIN_DATA, CONF_SITE, COUNTRY_DOMAINS, DOMAIN
from .coordinator import AmazonConfigEntry, AmazonDevicesCoordinator
from .services import async_setup_services

PLATFORMS = [
    Platform.BINARY_SENSOR,
@@ -16,20 +12,11 @@ PLATFORMS = [
    Platform.SWITCH,
]

CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)


async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up the Alexa Devices component."""
    async_setup_services(hass)
    return True


async def async_setup_entry(hass: HomeAssistant, entry: AmazonConfigEntry) -> bool:
    """Set up Alexa Devices platform."""

    session = aiohttp_client.async_create_clientsession(hass)
    coordinator = AmazonDevicesCoordinator(hass, entry, session)
    coordinator = AmazonDevicesCoordinator(hass, entry)

    await coordinator.async_config_entry_first_refresh()

@@ -40,48 +27,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: AmazonConfigEntry) -> bo
    return True


async def async_migrate_entry(hass: HomeAssistant, entry: AmazonConfigEntry) -> bool:
    """Migrate old entry."""

    if entry.version == 1 and entry.minor_version < 3:
        if CONF_SITE in entry.data:
            # Site in data (wrong place), just move it to the login data
            new_data = entry.data.copy()
            new_data[CONF_LOGIN_DATA][CONF_SITE] = new_data[CONF_SITE]
            new_data.pop(CONF_SITE)
            hass.config_entries.async_update_entry(
                entry, data=new_data, version=1, minor_version=3
            )
            return True

        if CONF_SITE in entry.data[CONF_LOGIN_DATA]:
            # Site is already there, just bump the version to avoid future migrations
            hass.config_entries.async_update_entry(entry, version=1, minor_version=3)
            return True

        _LOGGER.debug(
            "Migrating from version %s.%s", entry.version, entry.minor_version
        )

        # Convert country to domain
        country = entry.data[CONF_COUNTRY].lower()
        domain = COUNTRY_DOMAINS.get(country, country)

        # Add site to login data
        new_data = entry.data.copy()
        new_data[CONF_LOGIN_DATA][CONF_SITE] = f"https://www.amazon.{domain}"

        hass.config_entries.async_update_entry(
            entry, data=new_data, version=1, minor_version=3
        )

        _LOGGER.info(
            "Migration to version %s.%s successful", entry.version, entry.minor_version
        )

    return True


async def async_unload_entry(hass: HomeAssistant, entry: AmazonConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
    coordinator = entry.runtime_data
    if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
        await coordinator.api.close()

    return unload_ok
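The core of the migration above is the country-to-domain lookup used to rebuild the Amazon site URL. A self-contained sketch using an excerpt of the COUNTRY_DOMAINS table from const.py (shown later in this diff); the helper name is ours:

DEFAULT_DOMAIN = "com"
COUNTRY_DOMAINS = {"gb": "co.uk", "jp": "co.jp", "us": DEFAULT_DOMAIN}  # excerpt

def site_for_country(country: str) -> str:
    """Fall back to the country code itself when it is not in the table."""
    domain = COUNTRY_DOMAINS.get(country.lower(), country.lower())
    return f"https://www.amazon.{domain}"

assert site_for_country("GB") == "https://www.amazon.co.uk"
assert site_for_country("de") == "https://www.amazon.de"  # not in the excerpt, falls back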
@@ -10,14 +10,15 @@ from aioamazondevices.exceptions import (
    CannotAuthenticate,
    CannotConnect,
    CannotRetrieveData,
    WrongCountry,
)
import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_CODE, CONF_PASSWORD, CONF_USERNAME
from homeassistant.const import CONF_CODE, CONF_COUNTRY, CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from homeassistant.helpers import aiohttp_client
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.selector import CountrySelector

from .const import CONF_LOGIN_DATA, DOMAIN

@@ -27,33 +28,28 @@ STEP_REAUTH_DATA_SCHEMA = vol.Schema(
        vol.Required(CONF_CODE): cv.string,
    }
)
STEP_RECONFIGURE = vol.Schema(
    {
        vol.Required(CONF_PASSWORD): cv.string,
        vol.Required(CONF_CODE): cv.string,
    }
)


async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> dict[str, Any]:
    """Validate the user input allows us to connect."""

    session = aiohttp_client.async_create_clientsession(hass)
    api = AmazonEchoApi(
        session,
        data[CONF_COUNTRY],
        data[CONF_USERNAME],
        data[CONF_PASSWORD],
    )

    return await api.login_mode_interactive(data[CONF_CODE])
    try:
        data = await api.login_mode_interactive(data[CONF_CODE])
    finally:
        await api.close()

    return data


class AmazonDevicesConfigFlow(ConfigFlow, domain=DOMAIN):
    """Handle a config flow for Alexa Devices."""

    VERSION = 1
    MINOR_VERSION = 3

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
@@ -64,10 +60,12 @@ class AmazonDevicesConfigFlow(ConfigFlow, domain=DOMAIN):
                data = await validate_input(self.hass, user_input)
            except CannotConnect:
                errors["base"] = "cannot_connect"
            except (CannotAuthenticate, TypeError):
            except CannotAuthenticate:
                errors["base"] = "invalid_auth"
            except CannotRetrieveData:
                errors["base"] = "cannot_retrieve_data"
            except WrongCountry:
                errors["base"] = "wrong_country"
            else:
                await self.async_set_unique_id(data["customer_info"]["user_id"])
                self._abort_if_unique_id_configured()
@@ -82,6 +80,9 @@
            errors=errors,
            data_schema=vol.Schema(
                {
                    vol.Required(
                        CONF_COUNTRY, default=self.hass.config.country
                    ): CountrySelector(),
                    vol.Required(CONF_USERNAME): cv.string,
                    vol.Required(CONF_PASSWORD): cv.string,
                    vol.Required(CONF_CODE): cv.string,
@@ -107,12 +108,10 @@

        if user_input is not None:
            try:
                data = await validate_input(
                    self.hass, {**reauth_entry.data, **user_input}
                )
                await validate_input(self.hass, {**reauth_entry.data, **user_input})
            except CannotConnect:
                errors["base"] = "cannot_connect"
            except (CannotAuthenticate, TypeError):
            except CannotAuthenticate:
                errors["base"] = "invalid_auth"
            except CannotRetrieveData:
                errors["base"] = "cannot_retrieve_data"
@@ -121,9 +120,8 @@
                    reauth_entry,
                    data={
                        CONF_USERNAME: entry_data[CONF_USERNAME],
                        CONF_PASSWORD: user_input[CONF_PASSWORD],
                        CONF_PASSWORD: entry_data[CONF_PASSWORD],
                        CONF_CODE: user_input[CONF_CODE],
                        CONF_LOGIN_DATA: data,
                    },
                )

@@ -133,47 +131,3 @@
            data_schema=STEP_REAUTH_DATA_SCHEMA,
            errors=errors,
        )

    async def async_step_reconfigure(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        """Handle reconfiguration of the device."""
        reconfigure_entry = self._get_reconfigure_entry()
        if not user_input:
            return self.async_show_form(
                step_id="reconfigure",
                data_schema=STEP_RECONFIGURE,
            )

        updated_password = user_input[CONF_PASSWORD]

        self._async_abort_entries_match(
            {CONF_USERNAME: reconfigure_entry.data[CONF_USERNAME]}
        )

        errors: dict[str, str] = {}

        try:
            data = await validate_input(
                self.hass, {**reconfigure_entry.data, **user_input}
            )
        except CannotConnect:
            errors["base"] = "cannot_connect"
        except CannotAuthenticate:
            errors["base"] = "invalid_auth"
        except CannotRetrieveData:
            errors["base"] = "cannot_retrieve_data"
        else:
            return self.async_update_reload_and_abort(
                reconfigure_entry,
                data_updates={
                    CONF_PASSWORD: updated_password,
                    CONF_LOGIN_DATA: data,
                },
            )

        return self.async_show_form(
            step_id="reconfigure",
            data_schema=STEP_RECONFIGURE,
            errors=errors,
        )
@@ -6,23 +6,3 @@ _LOGGER = logging.getLogger(__package__)
DOMAIN = "alexa_devices"
CONF_LOGIN_DATA = "login_data"
CONF_SITE = "site"

DEFAULT_DOMAIN = "com"
COUNTRY_DOMAINS = {
    "ar": DEFAULT_DOMAIN,
    "at": DEFAULT_DOMAIN,
    "au": "com.au",
    "be": "com.be",
    "br": DEFAULT_DOMAIN,
    "gb": "co.uk",
    "il": DEFAULT_DOMAIN,
    "jp": "co.jp",
    "mx": "com.mx",
    "no": DEFAULT_DOMAIN,
    "nz": "com.au",
    "pl": DEFAULT_DOMAIN,
    "tr": "com.tr",
    "us": DEFAULT_DOMAIN,
    "za": "co.za",
}
@@ -8,13 +8,11 @@ from aioamazondevices.exceptions import (
    CannotConnect,
    CannotRetrieveData,
)
from aiohttp import ClientSession

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from homeassistant.const import CONF_COUNTRY, CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

from .const import _LOGGER, CONF_LOGIN_DATA, DOMAIN
@@ -33,7 +31,6 @@ class AmazonDevicesCoordinator(DataUpdateCoordinator[dict[str, AmazonDevice]]):
        self,
        hass: HomeAssistant,
        entry: AmazonConfigEntry,
        session: ClientSession,
    ) -> None:
        """Initialize the scanner."""
        super().__init__(
@@ -44,18 +41,17 @@ class AmazonDevicesCoordinator(DataUpdateCoordinator[dict[str, AmazonDevice]]):
            update_interval=timedelta(seconds=SCAN_INTERVAL),
        )
        self.api = AmazonEchoApi(
            session,
            entry.data[CONF_COUNTRY],
            entry.data[CONF_USERNAME],
            entry.data[CONF_PASSWORD],
            entry.data[CONF_LOGIN_DATA],
        )
        self.previous_devices: set[str] = set()

    async def _async_update_data(self) -> dict[str, AmazonDevice]:
        """Update device data."""
        try:
            await self.api.login_mode_stored_data()
            data = await self.api.get_devices_data()
            return await self.api.get_devices_data()
        except CannotConnect as err:
            raise UpdateFailed(
                translation_domain=DOMAIN,
@@ -68,37 +64,9 @@ class AmazonDevicesCoordinator(DataUpdateCoordinator[dict[str, AmazonDevice]]):
                translation_key="cannot_retrieve_data_with_error",
                translation_placeholders={"error": repr(err)},
            ) from err
        except (CannotAuthenticate, TypeError) as err:
        except CannotAuthenticate as err:
            raise ConfigEntryAuthFailed(
                translation_domain=DOMAIN,
                translation_key="invalid_auth",
                translation_placeholders={"error": repr(err)},
            ) from err
        else:
            current_devices = set(data.keys())
            if stale_devices := self.previous_devices - current_devices:
                await self._async_remove_device_stale(stale_devices)

            self.previous_devices = current_devices
            return data

    async def _async_remove_device_stale(
        self,
        stale_devices: set[str],
    ) -> None:
        """Remove stale device."""
        device_registry = dr.async_get(self.hass)

        for serial_num in stale_devices:
            _LOGGER.debug(
                "Detected change in devices: serial %s removed",
                serial_num,
            )
            device = device_registry.async_get_device(
                identifiers={(DOMAIN, serial_num)}
            )
            if device:
                device_registry.async_update_device(
                    device_id=device.id,
                    remove_config_entry_id=self.config_entry.entry_id,
                )
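The stale-device handling above boils down to set arithmetic on serial numbers between polls; a minimal illustration (the function name is ours):

def stale_serials(previous: set[str], current: set[str]) -> set[str]:
    """Serial numbers seen on the last poll but absent from this one."""
    return previous - current

assert stale_serials({"A1", "B2"}, {"B2", "C3"}) == {"A1"}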
@@ -38,13 +38,5 @@
      }
    }
  }
  },
  "services": {
    "send_sound": {
      "service": "mdi:cast-audio"
    },
    "send_text_command": {
      "service": "mdi:microphone-message"
    }
  }
}
@@ -8,5 +8,5 @@
  "iot_class": "cloud_polling",
  "loggers": ["aioamazondevices"],
  "quality_scale": "silver",
  "requirements": ["aioamazondevices==6.0.0"]
  "requirements": ["aioamazondevices==3.5.1"]
}
@@ -48,25 +48,27 @@ rules:
    comment: There are a ton of MAC address ranges in use, but they are also used by Kindles, which are not supported by this integration.
  docs-data-update: done
  docs-examples: done
  docs-known-limitations: done
  docs-known-limitations: todo
  docs-supported-devices: done
  docs-supported-functions: done
  docs-troubleshooting: done
  docs-troubleshooting: todo
  docs-use-cases: done
  dynamic-devices: todo
  entity-category: done
  entity-device-class: done
  entity-disabled-by-default: done
  entity-translations: done
  exception-translations: done
  exception-translations: todo
  icon-translations: done
  reconfiguration-flow: done
  reconfiguration-flow: todo
  repair-issues:
    status: exempt
    comment: No known use cases for repair issues or flows yet.
  stale-devices: done
  stale-devices:
    status: todo
    comment: Automate the cleanup process.

  # Platinum
  async-dependency: done
  inject-websession: done
  inject-websession: todo
  strict-typing: done
@@ -12,7 +12,6 @@ from homeassistant.components.sensor import (
    SensorDeviceClass,
    SensorEntity,
    SensorEntityDescription,
    SensorStateClass,
)
from homeassistant.const import LIGHT_LUX, UnitOfTemperature
from homeassistant.core import HomeAssistant
@@ -42,13 +41,11 @@ SENSORS: Final = (
            if device.sensors[_key].scale == "CELSIUS"
            else UnitOfTemperature.FAHRENHEIT
        ),
        state_class=SensorStateClass.MEASUREMENT,
    ),
    AmazonSensorEntityDescription(
        key="illuminance",
        device_class=SensorDeviceClass.ILLUMINANCE,
        native_unit_of_measurement=LIGHT_LUX,
        state_class=SensorStateClass.MEASUREMENT,
    ),
)
@@ -1,116 +0,0 @@
"""Support for services."""

from aioamazondevices.sounds import SOUNDS_LIST
import voluptuous as vol

from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_DEVICE_ID
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv, device_registry as dr

from .const import DOMAIN
from .coordinator import AmazonConfigEntry

ATTR_TEXT_COMMAND = "text_command"
ATTR_SOUND = "sound"
SERVICE_TEXT_COMMAND = "send_text_command"
SERVICE_SOUND_NOTIFICATION = "send_sound"

SCHEMA_SOUND_SERVICE = vol.Schema(
    {
        vol.Required(ATTR_SOUND): cv.string,
        vol.Required(ATTR_DEVICE_ID): cv.string,
    },
)
SCHEMA_CUSTOM_COMMAND = vol.Schema(
    {
        vol.Required(ATTR_TEXT_COMMAND): cv.string,
        vol.Required(ATTR_DEVICE_ID): cv.string,
    }
)


@callback
def async_get_entry_id_for_service_call(
    call: ServiceCall,
) -> tuple[dr.DeviceEntry, AmazonConfigEntry]:
    """Get the entry ID related to a service call (by device ID)."""
    device_registry = dr.async_get(call.hass)
    device_id = call.data[ATTR_DEVICE_ID]
    if (device_entry := device_registry.async_get(device_id)) is None:
        raise ServiceValidationError(
            translation_domain=DOMAIN,
            translation_key="invalid_device_id",
            translation_placeholders={"device_id": device_id},
        )

    for entry_id in device_entry.config_entries:
        if (entry := call.hass.config_entries.async_get_entry(entry_id)) is None:
            continue
        if entry.domain == DOMAIN:
            if entry.state is not ConfigEntryState.LOADED:
                raise ServiceValidationError(
                    translation_domain=DOMAIN,
                    translation_key="entry_not_loaded",
                    translation_placeholders={"entry": entry.title},
                )
            return (device_entry, entry)

    raise ServiceValidationError(
        translation_domain=DOMAIN,
        translation_key="config_entry_not_found",
        translation_placeholders={"device_id": device_id},
    )


async def _async_execute_action(call: ServiceCall, attribute: str) -> None:
    """Execute action on the device."""
    device, config_entry = async_get_entry_id_for_service_call(call)
    assert device.serial_number
    value: str = call.data[attribute]

    coordinator = config_entry.runtime_data

    if attribute == ATTR_SOUND:
        if value not in SOUNDS_LIST:
            raise ServiceValidationError(
                translation_domain=DOMAIN,
                translation_key="invalid_sound_value",
                translation_placeholders={"sound": value},
            )
        await coordinator.api.call_alexa_sound(
            coordinator.data[device.serial_number], value
        )
    elif attribute == ATTR_TEXT_COMMAND:
        await coordinator.api.call_alexa_text_command(
            coordinator.data[device.serial_number], value
        )


async def async_send_sound_notification(call: ServiceCall) -> None:
    """Send a sound notification to an AmazonDevice."""
    await _async_execute_action(call, ATTR_SOUND)


async def async_send_text_command(call: ServiceCall) -> None:
    """Send a custom command to an AmazonDevice."""
    await _async_execute_action(call, ATTR_TEXT_COMMAND)


@callback
def async_setup_services(hass: HomeAssistant) -> None:
    """Set up the services for the Amazon Devices integration."""
    for service_name, method, schema in (
        (
            SERVICE_SOUND_NOTIFICATION,
            async_send_sound_notification,
            SCHEMA_SOUND_SERVICE,
        ),
        (
            SERVICE_TEXT_COMMAND,
            async_send_text_command,
            SCHEMA_CUSTOM_COMMAND,
        ),
    ):
        hass.services.async_register(DOMAIN, service_name, method, schema=schema)
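The service schemas above are plain voluptuous mappings, and validation raises before the handler ever runs. A standalone sketch of the same shape (with cv.string simplified to str so it runs outside Home Assistant):

import voluptuous as vol

SCHEMA_SOUND_SERVICE = vol.Schema(
    {
        vol.Required("sound"): str,
        vol.Required("device_id"): str,
    }
)

payload = SCHEMA_SOUND_SERVICE({"sound": "bell_02", "device_id": "abc123"})
print(payload)

try:
    SCHEMA_SOUND_SERVICE({"sound": "bell_02"})  # missing device_id
except vol.MultipleInvalid as err:
    print("rejected:", err)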
@@ -1,69 +0,0 @@
send_text_command:
  fields:
    device_id:
      required: true
      selector:
        device:
          integration: alexa_devices
    text_command:
      required: true
      example: "Play B.B.C. on TuneIn"
      selector:
        text:

send_sound:
  fields:
    device_id:
      required: true
      selector:
        device:
          integration: alexa_devices
    sound:
      required: true
      example: amzn_sfx_doorbell_chime
      default: amzn_sfx_doorbell_chime
      selector:
        select:
          options:
            - air_horn_03
            - amzn_sfx_cat_meow_1x_01
            - amzn_sfx_church_bell_1x_02
            - amzn_sfx_crowd_applause_01
            - amzn_sfx_dog_med_bark_1x_02
            - amzn_sfx_doorbell_01
            - amzn_sfx_doorbell_chime_01
            - amzn_sfx_doorbell_chime_02
            - amzn_sfx_large_crowd_cheer_01
            - amzn_sfx_lion_roar_02
            - amzn_sfx_rooster_crow_01
            - amzn_sfx_scifi_alarm_01
            - amzn_sfx_scifi_alarm_04
            - amzn_sfx_scifi_engines_on_02
            - amzn_sfx_scifi_sheilds_up_01
            - amzn_sfx_trumpet_bugle_04
            - amzn_sfx_wolf_howl_02
            - bell_02
            - boing_01
            - boing_03
            - buzzers_pistols_01
            - camera_01
            - christmas_05
            - clock_01
            - futuristic_10
            - halloween_bats
            - halloween_crows
            - halloween_footsteps
            - halloween_wind
            - halloween_wolf
            - holiday_halloween_ghost
            - horror_10
            - med_system_alerts_minimal_dragon_short
            - med_system_alerts_minimal_owl_short
            - med_system_alerts_minimals_blue_wave_small
            - med_system_alerts_minimals_galaxy_short
            - med_system_alerts_minimals_panda_short
            - med_system_alerts_minimals_tiger_short
            - med_ui_success_generic_1-1
            - squeaky_12
            - zap_01
          translation_key: sound
@@ -1,21 +1,23 @@
{
  "common": {
    "data_code": "One-time password (OTP code)",
    "data_description_country": "The country where your Amazon account is registered.",
    "data_description_username": "The email address of your Amazon account.",
    "data_description_password": "The password of your Amazon account.",
    "data_description_code": "The one-time password to log in to your account. Currently, only tokens from OTP applications are supported.",
    "device_id_description": "The ID of the device to send the command to."
    "data_description_code": "The one-time password to log in to your account. Currently, only tokens from OTP applications are supported."
  },
  "config": {
    "flow_title": "{username}",
    "step": {
      "user": {
        "data": {
          "country": "[%key:common::config_flow::data::country%]",
          "username": "[%key:common::config_flow::data::username%]",
          "password": "[%key:common::config_flow::data::password%]",
          "code": "[%key:component::alexa_devices::common::data_code%]"
        },
        "data_description": {
          "country": "[%key:component::alexa_devices::common::data_description_country%]",
          "username": "[%key:component::alexa_devices::common::data_description_username%]",
          "password": "[%key:component::alexa_devices::common::data_description_password%]",
          "code": "[%key:component::alexa_devices::common::data_description_code%]"
@@ -30,16 +32,6 @@
          "password": "[%key:component::alexa_devices::common::data_description_password%]",
          "code": "[%key:component::alexa_devices::common::data_description_code%]"
        }
      },
      "reconfigure": {
        "data": {
          "password": "[%key:common::config_flow::data::password%]",
          "code": "[%key:component::alexa_devices::common::data_code%]"
        },
        "data_description": {
          "password": "[%key:component::alexa_devices::common::data_description_password%]",
          "code": "[%key:component::alexa_devices::common::data_description_code%]"
        }
      }
    },
    "abort": {
@@ -47,13 +39,13 @@
      "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
      "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
      "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
      "reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
      "unknown": "[%key:common::config_flow::error::unknown%]"
    },
    "error": {
      "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
      "cannot_retrieve_data": "Unable to retrieve data from Amazon. Please try again later.",
      "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
      "wrong_country": "Wrong country selected. Please select the country where your Amazon account is registered.",
      "unknown": "[%key:common::config_flow::error::unknown%]"
    }
  },
@@ -92,101 +84,12 @@
      }
    }
  },
  "services": {
    "send_sound": {
      "name": "Send sound",
      "description": "Sends a sound to a device",
      "fields": {
        "device_id": {
          "name": "Device",
          "description": "[%key:component::alexa_devices::common::device_id_description%]"
        },
        "sound": {
          "name": "Alexa Skill sound file",
          "description": "The sound file to play."
        }
      }
    },
    "send_text_command": {
      "name": "Send text command",
      "description": "Sends a text command to a device",
      "fields": {
        "text_command": {
          "name": "Alexa text command",
          "description": "The text command to send."
        },
        "device_id": {
          "name": "Device",
          "description": "[%key:component::alexa_devices::common::device_id_description%]"
        }
      }
    }
  },
  "selector": {
    "sound": {
      "options": {
        "air_horn_03": "Air horn",
        "amzn_sfx_cat_meow_1x_01": "Cat meow",
        "amzn_sfx_church_bell_1x_02": "Church bell",
        "amzn_sfx_crowd_applause_01": "Crowd applause",
        "amzn_sfx_dog_med_bark_1x_02": "Dog bark",
        "amzn_sfx_doorbell_01": "Doorbell 1",
        "amzn_sfx_doorbell_chime_01": "Doorbell 2",
        "amzn_sfx_doorbell_chime_02": "Doorbell 3",
        "amzn_sfx_large_crowd_cheer_01": "Crowd cheers",
        "amzn_sfx_lion_roar_02": "Lion roar",
        "amzn_sfx_rooster_crow_01": "Rooster",
        "amzn_sfx_scifi_alarm_01": "Sirens",
        "amzn_sfx_scifi_alarm_04": "Red alert",
        "amzn_sfx_scifi_engines_on_02": "Engines on",
        "amzn_sfx_scifi_sheilds_up_01": "Shields up",
        "amzn_sfx_trumpet_bugle_04": "Trumpet",
        "amzn_sfx_wolf_howl_02": "Wolf howl",
        "bell_02": "Bells",
        "boing_01": "Boing 1",
        "boing_03": "Boing 2",
        "buzzers_pistols_01": "Buzzer",
        "camera_01": "Camera",
        "christmas_05": "Christmas bells",
        "clock_01": "Ticking clock",
        "futuristic_10": "Aircraft",
        "halloween_bats": "Halloween bats",
        "halloween_crows": "Halloween crows",
        "halloween_footsteps": "Halloween spooky footsteps",
        "halloween_wind": "Halloween wind",
        "halloween_wolf": "Halloween wolf",
        "holiday_halloween_ghost": "Halloween ghost",
        "horror_10": "Halloween creepy door",
        "med_system_alerts_minimal_dragon_short": "Friendly dragon",
        "med_system_alerts_minimal_owl_short": "Happy owl",
        "med_system_alerts_minimals_blue_wave_small": "Underwater World Sonata",
        "med_system_alerts_minimals_galaxy_short": "Infinite Galaxy",
        "med_system_alerts_minimals_panda_short": "Baby panda",
        "med_system_alerts_minimals_tiger_short": "Playful tiger",
        "med_ui_success_generic_1-1": "Success 1",
        "squeaky_12": "Squeaky door",
        "zap_01": "Zap"
      }
    }
  },
  "exceptions": {
    "cannot_connect_with_error": {
      "message": "Error connecting: {error}"
    },
    "cannot_retrieve_data_with_error": {
      "message": "Error retrieving data: {error}"
    },
    "device_serial_number_missing": {
      "message": "Device serial number missing: {device_id}"
    },
    "invalid_device_id": {
      "message": "Invalid device ID specified: {device_id}"
    },
    "invalid_sound_value": {
      "message": "Invalid sound {sound} specified"
    },
    "entry_not_loaded": {
      "message": "Entry not loaded: {entry}"
    }
  }
}
@@ -16,7 +16,7 @@ from homeassistant.helpers.selector import (
    SelectSelectorMode,
)

from .const import CONF_SITE_ID, CONF_SITE_NAME, DOMAIN, REQUEST_TIMEOUT
from .const import CONF_SITE_ID, CONF_SITE_NAME, DOMAIN

API_URL = "https://app.amber.com.au/developers"

@@ -64,9 +64,7 @@ class AmberElectricConfigFlow(ConfigFlow, domain=DOMAIN):
        api = amberelectric.AmberApi(api_client)

        try:
            sites: list[Site] = filter_sites(
                api.get_sites(_request_timeout=REQUEST_TIMEOUT)
            )
            sites: list[Site] = filter_sites(api.get_sites())
        except amberelectric.ApiException as api_exception:
            if api_exception.status == 403:
                self._errors[CONF_API_TOKEN] = "invalid_api_token"
@@ -9,6 +9,7 @@ DOMAIN: Final = "amberelectric"
CONF_SITE_NAME = "site_name"
CONF_SITE_ID = "site_id"

ATTR_CONFIG_ENTRY_ID = "config_entry_id"
ATTR_CHANNEL_TYPE = "channel_type"

ATTRIBUTION = "Data provided by Amber Electric"
@@ -21,5 +22,3 @@ SERVICE_GET_FORECASTS = "get_forecasts"
GENERAL_CHANNEL = "general"
CONTROLLED_LOAD_CHANNEL = "controlled_load"
FEED_IN_CHANNEL = "feed_in"

REQUEST_TIMEOUT = 15
@@ -16,7 +16,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

from .const import LOGGER, REQUEST_TIMEOUT
from .const import LOGGER
from .helpers import normalize_descriptor

type AmberConfigEntry = ConfigEntry[AmberUpdateCoordinator]
@@ -82,11 +82,7 @@ class AmberUpdateCoordinator(DataUpdateCoordinator):
            "grid": {},
        }
        try:
            data = self._api.get_current_prices(
                self.site_id,
                next=288,
                _request_timeout=REQUEST_TIMEOUT,
            )
            data = self._api.get_current_prices(self.site_id, next=288)
            intervals = [interval.actual_instance for interval in data]
        except ApiException as api_exception:
            raise UpdateFailed("Missing price data, skipping update") from api_exception
Some files were not shown because too many files have changed in this diff.