mirror of https://github.com/home-assistant/core.git
synced 2025-10-29 21:49:33 +00:00

Compare commits (1 commit): fix_ecovac...fix-radio-
| Author | SHA1 | Date |
|---|---|---|
|  | 35c8fefbd6 |  |
```diff
@@ -1,77 +0,0 @@
---
name: quality-scale-rule-verifier
description: |
  Use this agent when you need to verify that a Home Assistant integration follows a specific quality scale rule. This includes checking if the integration implements required patterns, configurations, or code structures defined by the quality scale system.

  <example>
  Context: The user wants to verify if an integration follows a specific quality scale rule.
  user: "Check if the peblar integration follows the config-flow rule"
  assistant: "I'll use the quality scale rule verifier to check if the peblar integration properly implements the config-flow rule."
  <commentary>
  Since the user is asking to verify a quality scale rule implementation, use the quality-scale-rule-verifier agent.
  </commentary>
  </example>

  <example>
  Context: The user is reviewing if an integration reaches a specific quality scale level.
  user: "Verify that this integration reaches the bronze quality scale"
  assistant: "Let me use the quality scale rule verifier to check the bronze quality scale implementation."
  <commentary>
  The user wants to verify the integration has reached a certain quality level, so use multiple quality-scale-rule-verifier agents to verify each bronze rule.
  </commentary>
  </example>
model: inherit
color: yellow
tools: Read, Bash, Grep, Glob, WebFetch
---

You are an expert Home Assistant integration quality scale auditor specializing in verifying compliance with specific quality scale rules. You have deep knowledge of Home Assistant's architecture, best practices, and the quality scale system that ensures integration consistency and reliability.

You will verify if an integration follows a specific quality scale rule by:

1. **Fetching Rule Documentation**: Retrieve the official rule documentation from:
   `https://raw.githubusercontent.com/home-assistant/developers.home-assistant/refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md`
   where `{rule_name}` is the rule identifier (e.g., 'config-flow', 'entity-unique-id', 'parallel-updates')

2. **Understanding Rule Requirements**: Parse the rule documentation to identify:
   - Core requirements and mandatory implementations
   - Specific code patterns or configurations required
   - Common violations and anti-patterns
   - Exemption criteria (when a rule might not apply)
   - The quality tier this rule belongs to (Bronze, Silver, Gold, Platinum)

3. **Analyzing Integration Code**: Examine the integration's codebase at `homeassistant/components/<integration domain>` focusing on:
   - `manifest.json` for quality scale declaration and configuration
   - `quality_scale.yaml` for rule status (done, todo, exempt)
   - Relevant Python modules based on the rule requirements
   - Configuration files and service definitions as needed

4. **Verification Process**:
   - Check if the rule is marked as 'done', 'todo', or 'exempt' in quality_scale.yaml
   - If marked 'exempt', verify the exemption reason is valid
   - If marked 'done', verify the actual implementation matches requirements
   - Identify specific files and code sections that demonstrate compliance or violations
   - Consider the integration's declared quality tier when applying rules
   - To fetch the integration docs, use WebFetch to fetch from `https://raw.githubusercontent.com/home-assistant/home-assistant.io/refs/heads/current/source/_integrations/<integration domain>.markdown`
   - To fetch information about a PyPI package, use the URL `https://pypi.org/pypi/<package>/json`

5. **Reporting Findings**: Provide a comprehensive verification report that includes:
   - **Rule Summary**: Brief description of what the rule requires
   - **Compliance Status**: Clear pass/fail/exempt determination
   - **Evidence**: Specific code examples showing compliance or violations
   - **Issues Found**: Detailed list of any non-compliance issues with file locations
   - **Recommendations**: Actionable steps to achieve compliance if needed
   - **Exemption Analysis**: If applicable, whether the exemption is justified

When examining code, you will:
- Look for exact implementation patterns specified in the rule
- Verify all required components are present and properly configured
- Check for common mistakes and anti-patterns
- Consider edge cases and error handling requirements
- Validate that implementations follow Home Assistant conventions

You will be thorough but focused, examining only the aspects relevant to the specific rule being verified. You will provide clear, actionable feedback that helps developers understand both what needs to be fixed and why it matters for integration quality.

If you cannot access the rule documentation or find the integration code, clearly state what information is missing and what you would need to complete the verification.

Remember that quality scale rules are cumulative - Bronze rules apply to all integrations with a quality scale, Silver rules apply to Silver+ integrations, and so on. Always consider the integration's target quality level when determining which rules should be enforced.
```
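The lookup steps spelled out in the removed agent prompt (URL templates plus the `quality_scale.yaml` status check) can be sketched in Python. The rule names and the pre-parsed dictionary below are illustrative assumptions; real code would load the YAML with `yaml.safe_load` and fetch the URLs over HTTP.

```python
# Sketch of the verifier's lookup steps. The URL templates mirror the
# prompt above; the sample data stands in for a parsed quality_scale.yaml.
RULE_DOC_URL = (
    "https://raw.githubusercontent.com/home-assistant/developers.home-assistant/"
    "refs/heads/master/docs/core/integration-quality-scale/rules/{rule_name}.md"
)
INTEGRATION_DOC_URL = (
    "https://raw.githubusercontent.com/home-assistant/home-assistant.io/"
    "refs/heads/current/source/_integrations/{domain}.markdown"
)
PYPI_JSON_URL = "https://pypi.org/pypi/{package}/json"


def rule_status(quality_scale: dict, rule_name: str) -> str:
    """Return 'done', 'todo', or 'exempt' for a rule, defaulting to 'todo'."""
    entry = quality_scale.get("rules", {}).get(rule_name, "todo")
    # Entries are either a bare status string or a mapping with a status key
    # (exempt rules usually carry a comment explaining the exemption).
    if isinstance(entry, dict):
        return entry.get("status", "todo")
    return entry


# Example data shaped like quality_scale.yaml after yaml.safe_load.
parsed = {
    "rules": {
        "config-flow": "done",
        "diagnostics": {"status": "exempt", "comment": "No data to expose."},
    }
}
print(RULE_DOC_URL.format(rule_name="config-flow"))
print(rule_status(parsed, "config-flow"))       # done
print(rule_status(parsed, "diagnostics"))       # exempt
print(rule_status(parsed, "entity-unique-id"))  # todo
```

Defaulting an absent rule to `todo` matches the three statuses the prompt enumerates; an `exempt` mapping still has to be checked by hand for a valid reason.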
```diff
@@ -58,7 +58,6 @@ base_platforms: &base_platforms
# Extra components that trigger the full suite
components: &components
  - homeassistant/components/alexa/**
  - homeassistant/components/analytics/**
  - homeassistant/components/application_credentials/**
  - homeassistant/components/assist_pipeline/**
  - homeassistant/components/auth/**
```
```diff
@@ -8,9 +8,6 @@
    "PYTHONASYNCIODEBUG": "1"
  },
  "features": {
    // Node feature required for Claude Code until fixed https://github.com/anthropics/devcontainer-features/issues/28
    "ghcr.io/devcontainers/features/node:1": {},
    "ghcr.io/anthropics/devcontainer-features/claude-code:1.0": {},
    "ghcr.io/devcontainers/features/github-cli:1": {}
  },
  // Port 5683 udp is used by Shelly integration
@@ -33,7 +30,7 @@
        "GitHub.vscode-pull-request-github",
        "GitHub.copilot"
      ],
      // Please keep this file in sync with settings in home-assistant/.vscode/settings.default.jsonc
      // Please keep this file in sync with settings in home-assistant/.vscode/settings.default.json
      "settings": {
        "python.experiments.optOutFrom": ["pythonTestAdapter"],
        "python.defaultInterpreterPath": "/home/vscode/.local/ha-venv/bin/python",
@@ -41,7 +38,6 @@
        "python.terminal.activateEnvInCurrentTerminal": true,
        "python.testing.pytestArgs": ["--no-cov"],
        "pylint.importStrategy": "fromEnvironment",
        "python.analysis.typeCheckingMode": "basic",
        "editor.formatOnPaste": false,
        "editor.formatOnSave": true,
        "editor.formatOnType": true,
@@ -63,9 +59,6 @@
        "[python]": {
          "editor.defaultFormatter": "charliermarsh.ruff"
        },
        "[json][jsonc][yaml]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode"
        },
        "json.schemas": [
          {
            "fileMatch": ["homeassistant/components/*/manifest.json"],
```
```diff
@@ -14,8 +14,7 @@ tests

# Other virtualization methods
venv
.venv
.vagrant

# Temporary files
**/__pycache__
**/__pycache__
```
							
								
								
									
**.github/ISSUE_TEMPLATE/task.yml** (6 changes, vendored)
									
									
								
```diff
@@ -21,7 +21,7 @@ body:
  - type: textarea
    id: description
    attributes:
      label: Description
      label: Task description
      description: |
        Provide a clear and detailed description of the task that needs to be accomplished.

@@ -43,11 +43,9 @@ body:

        Include links to related issues, research, prototypes, roadmap opportunities etc.
      placeholder: |
        - Roadmap opportunity: [link]
        - Epic: [link]
        - Roadmap opportunity: [links]
        - Feature request: [link]
        - Technical design documents: [link]
        - Prototype/mockup: [link]
        - Dependencies: [links]
    validations:
      required: false
```
							
								
								
									
**.github/PULL_REQUEST_TEMPLATE.md** (5 changes, vendored)
									
									
								
```diff
@@ -55,12 +55,8 @@
  creating the PR. If you're unsure about any of them, don't hesitate to ask.
  We're here to help! This is simply a reminder of what we are going to look
  for before merging your code.

  AI tools are welcome, but contributors are responsible for *fully*
  understanding the code before submitting a PR.
-->

- [ ] I understand the code I am submitting and can explain how it works.
- [ ] The code change is tested and works locally.
- [ ] Local tests pass. **Your PR cannot be merged unless tests pass**
- [ ] There is no commented out code in this PR.
@@ -68,7 +64,6 @@
- [ ] I have followed the [perfect PR recommendations][perfect-pr]
- [ ] The code has been formatted using Ruff (`ruff format homeassistant tests`)
- [ ] Tests have been added to verify that the new code works.
- [ ] Any generated code has been carefully reviewed for correctness and compliance with project standards.

If user exposed functionality or configuration variables are added/changed:
```
							
								
								
									
**.github/copilot-instructions.md** (28 changes, vendored)
									
									
								
````diff
@@ -45,12 +45,6 @@ rules:

**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.

## Code Review Guidelines

**When reviewing code, do NOT comment on:**
- **Missing imports** - We use static analysis tooling to catch that
- **Code formatting** - We have ruff as a formatting tool that will catch those if needed (unless specifically instructed otherwise in these instructions)

## Python Requirements

- **Compatibility**: Python 3.13+
@@ -74,7 +68,6 @@ rules:
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)

@@ -1074,11 +1067,7 @@ async def test_flow_connection_error(hass, mock_api_error):

### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
    """Overridden fixture to specify platforms to test."""
    return [Platform.SENSOR]  # Or another specific platform as needed.

@pytest.mark.parametrize("init_integration", [Platform.SENSOR], indirect=True)
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
    hass: HomeAssistant,
@@ -1125,25 +1114,16 @@ def mock_device_api() -> Generator[MagicMock]:
        )
        yield api

@pytest.fixture
def platforms() -> list[Platform]:
    """Fixture to specify platforms to test."""
    return PLATFORMS

@pytest.fixture
async def init_integration(
    hass: HomeAssistant,
    mock_config_entry: MockConfigEntry,
    mock_device_api: MagicMock,
    platforms: list[Platform],
) -> MockConfigEntry:
    """Set up the integration for testing."""
    mock_config_entry.add_to_hass(hass)

    with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
        await hass.config_entries.async_setup(mock_config_entry.entry_id)
        await hass.async_block_till_done()

    await hass.config_entries.async_setup(mock_config_entry.entry_id)
    await hass.async_block_till_done()
    return mock_config_entry
```

@@ -1169,7 +1149,7 @@ _LOGGER.debug("Processing data: %s", data)  # Use lazy logging

### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
python -m script.hassfest --integration my_integration

# Validate quality scale
# Check quality_scale.yaml against current rules
````
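The lazy-logging line referenced in the hunk above defers message interpolation to the logging framework; a minimal self-contained illustration follows, where the logger name and payload are arbitrary examples rather than anything from the repository.

```python
import logging

logging.basicConfig(level=logging.INFO)
_LOGGER = logging.getLogger("my_integration")

data = {"temperature": 21.5}

# Lazy: the "%s" placeholder is only interpolated if a handler actually
# accepts the DEBUG record, so the formatting cost is skipped at INFO level.
_LOGGER.debug("Processing data: %s", data)

# Eager (discouraged): the f-string is always built, even though the
# resulting record is then discarded by the level filter.
_LOGGER.debug(f"Processing data: {data}")
```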
							
								
								
									
										3
									
								
								.github/dependabot.yml
									
									
									
									
										vendored
									
									
								
							
							
						
						
									
										3
									
								
								.github/dependabot.yml
									
									
									
									
										vendored
									
									
								
```diff
@@ -6,6 +6,3 @@ updates:
      interval: daily
      time: "06:00"
    open-pull-requests-limit: 10
    labels:
      - dependency
      - github_actions
```
							
								
								
									
**.github/workflows/builder.yml** (52 changes, vendored)
									
									
								
```diff
@@ -27,12 +27,12 @@ jobs:
      publish: ${{ steps.version.outputs.publish }}
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2
        with:
          fetch-depth: 0

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

@@ -69,7 +69,7 @@ jobs:
        run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -

      - name: Upload translations
        uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
        uses: actions/upload-artifact@v4.6.2
        with:
          name: translations
          path: translations.tar.gz
@@ -90,11 +90,11 @@ jobs:
        arch: ${{ fromJson(needs.init.outputs.architectures) }}
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Download nightly wheels of frontend
        if: needs.init.outputs.channel == 'dev'
        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        uses: dawidd6/action-download-artifact@v11
        with:
          github_token: ${{secrets.GITHUB_TOKEN}}
          repo: home-assistant/frontend
@@ -105,7 +105,7 @@ jobs:

      - name: Download nightly wheels of intents
        if: needs.init.outputs.channel == 'dev'
        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        uses: dawidd6/action-download-artifact@v11
        with:
          github_token: ${{secrets.GITHUB_TOKEN}}
          repo: OHF-Voice/intents-package
@@ -116,7 +116,7 @@ jobs:

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        if: needs.init.outputs.channel == 'dev'
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

@@ -175,7 +175,7 @@ jobs:
          sed -i "s|pykrakenapi|# pykrakenapi|g" requirements_all.txt

      - name: Download translations
        uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          name: translations

@@ -190,15 +190,14 @@ jobs:
          echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE

      - name: Login to GitHub Container Registry
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # home-assistant/builder doesn't support sha pinning
      - name: Build base image
        uses: home-assistant/builder@2025.09.0
        uses: home-assistant/builder@2025.03.0
        with:
          args: |
            $BUILD_ARGS \
@@ -243,7 +242,7 @@ jobs:
          - green
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Set build additional args
        run: |
@@ -257,15 +256,14 @@ jobs:
          fi

      - name: Login to GitHub Container Registry
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # home-assistant/builder doesn't support sha pinning
      - name: Build base image
        uses: home-assistant/builder@2025.09.0
        uses: home-assistant/builder@2025.03.0
        with:
          args: |
            $BUILD_ARGS \
@@ -281,7 +279,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Initialize git
        uses: home-assistant/actions/helpers/git-init@master
@@ -323,23 +321,23 @@ jobs:
        registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Install Cosign
        uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
        uses: sigstore/cosign-installer@v3.9.1
        with:
          cosign-release: "v2.2.3"

      - name: Login to DockerHub
        if: matrix.registry == 'docker.io/homeassistant'
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        uses: docker/login-action@v3.4.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Login to GitHub Container Registry
        if: matrix.registry == 'ghcr.io/home-assistant'
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        uses: docker/login-action@v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
@@ -456,15 +454,15 @@ jobs:
    if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
    steps:
      - name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Set up Python ${{ env.DEFAULT_PYTHON }}
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        uses: actions/setup-python@v5.6.0
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}

      - name: Download translations
        uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
        uses: actions/download-artifact@v4.3.0
        with:
          name: translations

@@ -482,7 +480,7 @@ jobs:
          python -m build

      - name: Upload package to PyPI
        uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
        uses: pypa/gh-action-pypi-publish@v1.12.4
        with:
          skip-existing: true

@@ -501,10 +499,10 @@ jobs:
      HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Login to GitHub Container Registry
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
@@ -533,7 +531,7 @@ jobs:

      - name: Generate artifact attestation
        if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
        uses: actions/attest-build-provenance@977bb373ede98d70efdf65b84cb5f73e068dcc2a # v3.0.0
        uses: actions/attest-build-provenance@e8998f949152b193b063cb0ec769d69d929409be # v2.4.0
        with:
          subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
          subject-digest: ${{ steps.push.outputs.digest }}
```
							
								
								
									
**.github/workflows/ci.yaml** (780 changes, vendored): file diff suppressed because it is too large.
											
										
									
								
							
							
								
								
									
**.github/workflows/codeql.yml** (6 changes, vendored)
									
									
								
```diff
@@ -21,14 +21,14 @@ jobs:

    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
        uses: actions/checkout@v4.2.2

      - name: Initialize CodeQL
        uses: github/codeql-action/init@4e94bd11f71e507f7f87df81788dff88d1dacbfb # v4.31.0
        uses: github/codeql-action/init@v3.29.2
        with:
          languages: python

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@4e94bd11f71e507f7f87df81788dff88d1dacbfb # v4.31.0
        uses: github/codeql-action/analyze@v3.29.2
        with:
          category: "/language:python"
```
```diff
@@ -16,7 +16,7 @@ jobs:
    steps:
      - name: Check if integration label was added and extract details
        id: extract
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        with:
          script: |
            // Debug: Log the event payload
@@ -113,7 +113,7 @@ jobs:
      - name: Fetch similar issues
        id: fetch_similar
        if: steps.extract.outputs.should_continue == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          INTEGRATION_LABELS: ${{ steps.extract.outputs.integration_labels }}
          CURRENT_NUMBER: ${{ steps.extract.outputs.current_number }}
@@ -231,7 +231,7 @@ jobs:
      - name: Detect duplicates using AI
        id: ai_detection
        if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
        uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
        uses: actions/ai-inference@v1.1.0
        with:
          model: openai/gpt-4o
          system-prompt: |
@@ -280,7 +280,7 @@ jobs:
      - name: Post duplicate detection results
        id: post_results
        if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          AI_RESPONSE: ${{ steps.ai_detection.outputs.response }}
          SIMILAR_ISSUES: ${{ steps.fetch_similar.outputs.similar_issues }}
```
```diff
@@ -16,7 +16,7 @@ jobs:
    steps:
      - name: Check issue language
        id: detect_language
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          ISSUE_NUMBER: ${{ github.event.issue.number }}
          ISSUE_TITLE: ${{ github.event.issue.title }}
@@ -57,7 +57,7 @@ jobs:
      - name: Detect language using AI
        id: ai_language_detection
        if: steps.detect_language.outputs.should_continue == 'true'
        uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
        uses: actions/ai-inference@v1.1.0
        with:
          model: openai/gpt-4o-mini
          system-prompt: |
@@ -90,7 +90,7 @@ jobs:

      - name: Process non-English issues
        if: steps.detect_language.outputs.should_continue == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        uses: actions/github-script@v7.0.1
        env:
          AI_RESPONSE: ${{ steps.ai_language_detection.outputs.response }}
          ISSUE_NUMBER: ${{ steps.detect_language.outputs.issue_number }}
```
							
								
								
									
**.github/workflows/lock.yml** (2 changes, vendored)
									
									
								
```diff
@@ -10,7 +10,7 @@ jobs:
    if: github.repository_owner == 'home-assistant'
    runs-on: ubuntu-latest
    steps:
      - uses: dessant/lock-threads@1bf7ec25051fe7c00bdd17e6a7cf3d7bfb7dc771 # v5.0.1
      - uses: dessant/lock-threads@v5.0.1
        with:
          github-token: ${{ github.token }}
          issue-inactive-days: "30"
```
							
								
								
									
**.github/workflows/restrict-task-creation.yml** (4 changes, vendored)
									
									
								
							| @@ -9,10 +9,10 @@ jobs: | ||||
|   check-authorization: | ||||
|     runs-on: ubuntu-latest | ||||
|     # Only run if this is a Task issue type (from the issue form) | ||||
|     if: github.event.issue.type.name == 'Task' | ||||
|     if: github.event.issue.issue_type == 'Task' | ||||
|     steps: | ||||
|       - name: Check if user is authorized | ||||
|         uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0 | ||||
|         uses: actions/github-script@v7 | ||||
|         with: | ||||
|           script: | | ||||
|             const issueAuthor = context.payload.issue.user.login; | ||||
.github/workflows/stale.yml (vendored, 6 changes)
							| @@ -17,7 +17,7 @@ jobs: | ||||
|       # - No PRs marked as no-stale | ||||
|       # - No issues (-1) | ||||
|       - name: 60 days stale PRs policy | ||||
|         uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0 | ||||
|         uses: actions/stale@v9.1.0 | ||||
|         with: | ||||
|           repo-token: ${{ secrets.GITHUB_TOKEN }} | ||||
|           days-before-stale: 60 | ||||
| @@ -57,7 +57,7 @@ jobs: | ||||
|       # - No issues marked as no-stale or help-wanted | ||||
|       # - No PRs (-1) | ||||
|       - name: 90 days stale issues | ||||
|         uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0 | ||||
|         uses: actions/stale@v9.1.0 | ||||
|         with: | ||||
|           repo-token: ${{ steps.token.outputs.token }} | ||||
|           days-before-stale: 90 | ||||
| @@ -87,7 +87,7 @@ jobs: | ||||
|       # - No Issues marked as no-stale or help-wanted | ||||
|       # - No PRs (-1) | ||||
|       - name: Needs more information stale issues policy | ||||
|         uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0 | ||||
|         uses: actions/stale@v9.1.0 | ||||
|         with: | ||||
|           repo-token: ${{ steps.token.outputs.token }} | ||||
|           only-labels: "needs-more-information" | ||||
.github/workflows/translations.yml (vendored, 4 changes)
							| @@ -19,10 +19,10 @@ jobs: | ||||
|     runs-on: ubuntu-latest | ||||
|     steps: | ||||
|       - name: Checkout the repository | ||||
|         uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 | ||||
|         uses: actions/checkout@v4.2.2 | ||||
|  | ||||
|       - name: Set up Python ${{ env.DEFAULT_PYTHON }} | ||||
|         uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0 | ||||
|         uses: actions/setup-python@v5.6.0 | ||||
|         with: | ||||
|           python-version: ${{ env.DEFAULT_PYTHON }} | ||||
|  | ||||
.github/workflows/wheels.yml (vendored, 84 changes)
							| @@ -31,13 +31,12 @@ jobs: | ||||
|     outputs: | ||||
|       architectures: ${{ steps.info.outputs.architectures }} | ||||
|     steps: | ||||
|       - &checkout | ||||
|         name: Checkout the repository | ||||
|         uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 | ||||
|       - name: Checkout the repository | ||||
|         uses: actions/checkout@v4.2.2 | ||||
|  | ||||
|       - name: Set up Python ${{ env.DEFAULT_PYTHON }} | ||||
|         id: python | ||||
|         uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0 | ||||
|         uses: actions/setup-python@v5.6.0 | ||||
|         with: | ||||
|           python-version: ${{ env.DEFAULT_PYTHON }} | ||||
|           check-latest: true | ||||
| @@ -92,7 +91,7 @@ jobs: | ||||
|           ) > build_constraints.txt | ||||
|  | ||||
|       - name: Upload env_file | ||||
|         uses: &actions-upload-artifact actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0 | ||||
|         uses: actions/upload-artifact@v4.6.2 | ||||
|         with: | ||||
|           name: env_file | ||||
|           path: ./.env_file | ||||
| @@ -100,14 +99,14 @@ jobs: | ||||
|           overwrite: true | ||||
|  | ||||
|       - name: Upload build_constraints | ||||
|         uses: *actions-upload-artifact | ||||
|         uses: actions/upload-artifact@v4.6.2 | ||||
|         with: | ||||
|           name: build_constraints | ||||
|           path: ./build_constraints.txt | ||||
|           overwrite: true | ||||
|  | ||||
|       - name: Upload requirements_diff | ||||
|         uses: *actions-upload-artifact | ||||
|         uses: actions/upload-artifact@v4.6.2 | ||||
|         with: | ||||
|           name: requirements_diff | ||||
|           path: ./requirements_diff.txt | ||||
| @@ -119,7 +118,7 @@ jobs: | ||||
|           python -m script.gen_requirements_all ci | ||||
|  | ||||
|       - name: Upload requirements_all_wheels | ||||
|         uses: *actions-upload-artifact | ||||
|         uses: actions/upload-artifact@v4.6.2 | ||||
|         with: | ||||
|           name: requirements_all_wheels | ||||
|           path: ./requirements_all_wheels_*.txt | ||||
| @@ -128,41 +127,28 @@ jobs: | ||||
|     name: Build Core wheels ${{ matrix.abi }} for ${{ matrix.arch }} (musllinux_1_2) | ||||
|     if: github.repository_owner == 'home-assistant' | ||||
|     needs: init | ||||
|     runs-on: ${{ matrix.os }} | ||||
|     runs-on: ubuntu-latest | ||||
|     strategy: | ||||
|       fail-fast: false | ||||
|       matrix: &matrix-build | ||||
|         abi: ["cp313", "cp314"] | ||||
|       matrix: | ||||
|         abi: ["cp313"] | ||||
|         arch: ${{ fromJson(needs.init.outputs.architectures) }} | ||||
|         include: | ||||
|           - os: ubuntu-latest | ||||
|           - arch: aarch64 | ||||
|             os: ubuntu-24.04-arm | ||||
|         exclude: | ||||
|           - abi: cp314 | ||||
|             arch: armv7 | ||||
|           - abi: cp314 | ||||
|             arch: armhf | ||||
|           - abi: cp314 | ||||
|             arch: i386 | ||||
|     steps: | ||||
|       - *checkout | ||||
|       - name: Checkout the repository | ||||
|         uses: actions/checkout@v4.2.2 | ||||
|  | ||||
|       - &download-env-file | ||||
|         name: Download env_file | ||||
|         uses: &actions-download-artifact actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0 | ||||
|       - name: Download env_file | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: env_file | ||||
|  | ||||
|       - &download-build-constraints | ||||
|         name: Download build_constraints | ||||
|         uses: *actions-download-artifact | ||||
|       - name: Download build_constraints | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: build_constraints | ||||
|  | ||||
|       - &download-requirements-diff | ||||
|         name: Download requirements_diff | ||||
|         uses: *actions-download-artifact | ||||
|       - name: Download requirements_diff | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: requirements_diff | ||||
|  | ||||
| @@ -172,9 +158,8 @@ jobs: | ||||
|           sed -i "/uv/d" requirements.txt | ||||
|           sed -i "/uv/d" requirements_diff.txt | ||||
|  | ||||
|       # home-assistant/wheels doesn't support sha pinning | ||||
|       - name: Build wheels | ||||
|         uses: &home-assistant-wheels home-assistant/wheels@2025.10.0 | ||||
|         uses: home-assistant/wheels@2025.03.0 | ||||
|         with: | ||||
|           abi: ${{ matrix.abi }} | ||||
|           tag: musllinux_1_2 | ||||
| @@ -191,19 +176,33 @@ jobs: | ||||
|     name: Build wheels ${{ matrix.abi }} for ${{ matrix.arch }} | ||||
|     if: github.repository_owner == 'home-assistant' | ||||
|     needs: init | ||||
|     runs-on: ${{ matrix.os }} | ||||
|     runs-on: ubuntu-latest | ||||
|     strategy: | ||||
|       fail-fast: false | ||||
|       matrix: *matrix-build | ||||
|       matrix: | ||||
|         abi: ["cp313"] | ||||
|         arch: ${{ fromJson(needs.init.outputs.architectures) }} | ||||
|     steps: | ||||
|       - *checkout | ||||
|       - name: Checkout the repository | ||||
|         uses: actions/checkout@v4.2.2 | ||||
|  | ||||
|       - *download-env-file | ||||
|       - *download-build-constraints | ||||
|       - *download-requirements-diff | ||||
|       - name: Download env_file | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: env_file | ||||
|  | ||||
|       - name: Download build_constraints | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: build_constraints | ||||
|  | ||||
|       - name: Download requirements_diff | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: requirements_diff | ||||
|  | ||||
|       - name: Download requirements_all_wheels | ||||
|         uses: *actions-download-artifact | ||||
|         uses: actions/download-artifact@v4.3.0 | ||||
|         with: | ||||
|           name: requirements_all_wheels | ||||
|  | ||||
| @@ -219,9 +218,8 @@ jobs: | ||||
|           sed -i "/uv/d" requirements.txt | ||||
|           sed -i "/uv/d" requirements_diff.txt | ||||
|  | ||||
|       # home-assistant/wheels doesn't support sha pinning | ||||
|       - name: Build wheels | ||||
|         uses: *home-assistant-wheels | ||||
|         uses: home-assistant/wheels@2025.03.0 | ||||
|         with: | ||||
|           abi: ${{ matrix.abi }} | ||||
|           tag: musllinux_1_2 | ||||
|   | ||||
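Besides the SHA pins, the wheels.yml diff above removes YAML anchors (`&checkout`, `&matrix-build`) and their aliases (`*checkout`, `*matrix-build`), expanding each alias back into a full copy of the step. A minimal sketch of the pattern being removed, with step contents abbreviated from the diff (anchor support in GitHub Actions workflow files is comparatively recent, which may be why it is being backed out here):

```yaml
# An anchor (&name) labels a YAML node; an alias (*name) reuses it verbatim,
# so a step or matrix defined once can be shared across jobs without copy-paste.
jobs:
  build_core:
    strategy:
      matrix: &matrix-build        # anchor: define the matrix once
        abi: ["cp313", "cp314"]
    steps:
      - &checkout                  # anchor: define the checkout step once
        name: Checkout the repository
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
  build_integrations:
    strategy:
      matrix: *matrix-build        # alias: reuse the same matrix
    steps:
      - *checkout                  # alias: reuse the same step
```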
.gitignore (vendored, 5 changes)
							| @@ -79,6 +79,7 @@ junit.xml | ||||
| .project | ||||
| .pydevproject | ||||
|  | ||||
| .python-version | ||||
| .tool-versions | ||||
|  | ||||
| # emacs auto backups | ||||
| @@ -111,7 +112,6 @@ virtualization/vagrant/config | ||||
| !.vscode/cSpell.json | ||||
| !.vscode/extensions.json | ||||
| !.vscode/tasks.json | ||||
| !.vscode/settings.default.jsonc | ||||
| .env | ||||
|  | ||||
| # Windows Explorer | ||||
| @@ -140,6 +140,5 @@ tmp_cache | ||||
| pytest_buckets.txt | ||||
|  | ||||
| # AI tooling | ||||
| .claude/settings.local.json | ||||
| .serena/ | ||||
| .claude | ||||
|  | ||||
|   | ||||
| @@ -1,6 +1,6 @@ | ||||
| repos: | ||||
|   - repo: https://github.com/astral-sh/ruff-pre-commit | ||||
|     rev: v0.13.0 | ||||
|     rev: v0.12.1 | ||||
|     hooks: | ||||
|       - id: ruff-check | ||||
|         args: | ||||
| @@ -18,7 +18,7 @@ repos: | ||||
|         exclude_types: [csv, json, html] | ||||
|         exclude: ^tests/fixtures/|homeassistant/generated/|tests/components/.*/snapshots/ | ||||
|   - repo: https://github.com/pre-commit/pre-commit-hooks | ||||
|     rev: v6.0.0 | ||||
|     rev: v5.0.0 | ||||
|     hooks: | ||||
|       - id: check-executables-have-shebangs | ||||
|         stages: [manual] | ||||
| @@ -33,13 +33,10 @@ repos: | ||||
|     rev: v1.37.1 | ||||
|     hooks: | ||||
|       - id: yamllint | ||||
|   - repo: https://github.com/rbubley/mirrors-prettier | ||||
|     rev: v3.6.2 | ||||
|   - repo: https://github.com/pre-commit/mirrors-prettier | ||||
|     rev: v3.0.3 | ||||
|     hooks: | ||||
|       - id: prettier | ||||
|         additional_dependencies: | ||||
|           - prettier@3.6.2 | ||||
|           - prettier-plugin-sort-json@4.1.1 | ||||
|   - repo: https://github.com/cdce8p/python-typing-update | ||||
|     rev: v0.6.0 | ||||
|     hooks: | ||||
|   | ||||
| @@ -1,24 +0,0 @@ | ||||
| /** @type {import("prettier").Config} */ | ||||
| module.exports = { | ||||
|   overrides: [ | ||||
|     { | ||||
|       files: "./homeassistant/**/*.json", | ||||
|       options: { | ||||
|         plugins: [require.resolve("prettier-plugin-sort-json")], | ||||
|         jsonRecursiveSort: true, | ||||
|         jsonSortOrder: JSON.stringify({ [/.*/]: "numeric" }), | ||||
|       }, | ||||
|     }, | ||||
|     { | ||||
|       files: ["manifest.json", "./**/brands/*.json"], | ||||
|       options: { | ||||
|         // domain and name should stay at the top | ||||
|         jsonSortOrder: JSON.stringify({ | ||||
|           domain: null, | ||||
|           name: null, | ||||
|           [/.*/]: "numeric", | ||||
|         }), | ||||
|       }, | ||||
|     }, | ||||
|   ], | ||||
| }; | ||||
| @@ -1 +0,0 @@ | ||||
| 3.13 | ||||
| @@ -53,7 +53,6 @@ homeassistant.components.air_quality.* | ||||
| homeassistant.components.airgradient.* | ||||
| homeassistant.components.airly.* | ||||
| homeassistant.components.airnow.* | ||||
| homeassistant.components.airos.* | ||||
| homeassistant.components.airq.* | ||||
| homeassistant.components.airthings.* | ||||
| homeassistant.components.airthings_ble.* | ||||
| @@ -142,7 +141,6 @@ homeassistant.components.cloud.* | ||||
| homeassistant.components.co2signal.* | ||||
| homeassistant.components.comelit.* | ||||
| homeassistant.components.command_line.* | ||||
| homeassistant.components.compit.* | ||||
| homeassistant.components.config.* | ||||
| homeassistant.components.configurator.* | ||||
| homeassistant.components.cookidoo.* | ||||
| @@ -170,7 +168,6 @@ homeassistant.components.dnsip.* | ||||
| homeassistant.components.doorbird.* | ||||
| homeassistant.components.dormakaba_dkey.* | ||||
| homeassistant.components.downloader.* | ||||
| homeassistant.components.droplet.* | ||||
| homeassistant.components.dsmr.* | ||||
| homeassistant.components.duckdns.* | ||||
| homeassistant.components.dunehd.* | ||||
| @@ -182,6 +179,7 @@ homeassistant.components.efergy.* | ||||
| homeassistant.components.eheimdigital.* | ||||
| homeassistant.components.electrasmart.* | ||||
| homeassistant.components.electric_kiwi.* | ||||
| homeassistant.components.elevenlabs.* | ||||
| homeassistant.components.elgato.* | ||||
| homeassistant.components.elkm1.* | ||||
| homeassistant.components.emulated_hue.* | ||||
| @@ -202,7 +200,6 @@ homeassistant.components.feedreader.* | ||||
| homeassistant.components.file_upload.* | ||||
| homeassistant.components.filesize.* | ||||
| homeassistant.components.filter.* | ||||
| homeassistant.components.firefly_iii.* | ||||
| homeassistant.components.fitbit.* | ||||
| homeassistant.components.flexit_bacnet.* | ||||
| homeassistant.components.flux_led.* | ||||
| @@ -220,7 +217,6 @@ homeassistant.components.generic_thermostat.* | ||||
| homeassistant.components.geo_location.* | ||||
| homeassistant.components.geocaching.* | ||||
| homeassistant.components.gios.* | ||||
| homeassistant.components.github.* | ||||
| homeassistant.components.glances.* | ||||
| homeassistant.components.go2rtc.* | ||||
| homeassistant.components.goalzero.* | ||||
| @@ -278,7 +274,6 @@ homeassistant.components.imap.* | ||||
| homeassistant.components.imgw_pib.* | ||||
| homeassistant.components.immich.* | ||||
| homeassistant.components.incomfort.* | ||||
| homeassistant.components.inels.* | ||||
| homeassistant.components.input_button.* | ||||
| homeassistant.components.input_select.* | ||||
| homeassistant.components.input_text.* | ||||
| @@ -311,10 +306,10 @@ homeassistant.components.ld2410_ble.* | ||||
| homeassistant.components.led_ble.* | ||||
| homeassistant.components.lektrico.* | ||||
| homeassistant.components.letpot.* | ||||
| homeassistant.components.libre_hardware_monitor.* | ||||
| homeassistant.components.lidarr.* | ||||
| homeassistant.components.lifx.* | ||||
| homeassistant.components.light.* | ||||
| homeassistant.components.linear_garage_door.* | ||||
| homeassistant.components.linkplay.* | ||||
| homeassistant.components.litejet.* | ||||
| homeassistant.components.litterrobot.* | ||||
| @@ -327,7 +322,6 @@ homeassistant.components.london_underground.* | ||||
| homeassistant.components.lookin.* | ||||
| homeassistant.components.lovelace.* | ||||
| homeassistant.components.luftdaten.* | ||||
| homeassistant.components.lunatone.* | ||||
| homeassistant.components.madvr.* | ||||
| homeassistant.components.manual.* | ||||
| homeassistant.components.mastodon.* | ||||
| @@ -383,12 +377,10 @@ homeassistant.components.onedrive.* | ||||
| homeassistant.components.onewire.* | ||||
| homeassistant.components.onkyo.* | ||||
| homeassistant.components.open_meteo.* | ||||
| homeassistant.components.open_router.* | ||||
| homeassistant.components.openai_conversation.* | ||||
| homeassistant.components.openexchangerates.* | ||||
| homeassistant.components.opensky.* | ||||
| homeassistant.components.openuv.* | ||||
| homeassistant.components.opnsense.* | ||||
| homeassistant.components.opower.* | ||||
| homeassistant.components.oralb.* | ||||
| homeassistant.components.otbr.* | ||||
| @@ -406,7 +398,6 @@ homeassistant.components.person.* | ||||
| homeassistant.components.pi_hole.* | ||||
| homeassistant.components.ping.* | ||||
| homeassistant.components.plugwise.* | ||||
| homeassistant.components.portainer.* | ||||
| homeassistant.components.powerfox.* | ||||
| homeassistant.components.powerwall.* | ||||
| homeassistant.components.private_ble_device.* | ||||
| @@ -446,7 +437,6 @@ homeassistant.components.rituals_perfume_genie.* | ||||
| homeassistant.components.roborock.* | ||||
| homeassistant.components.roku.* | ||||
| homeassistant.components.romy.* | ||||
| homeassistant.components.route_b_smart_meter.* | ||||
| homeassistant.components.rpi_power.* | ||||
| homeassistant.components.rss_feed_template.* | ||||
| homeassistant.components.russound_rio.* | ||||
| @@ -467,7 +457,6 @@ homeassistant.components.sensorpush_cloud.* | ||||
| homeassistant.components.sensoterra.* | ||||
| homeassistant.components.senz.* | ||||
| homeassistant.components.sfr_box.* | ||||
| homeassistant.components.sftp_storage.* | ||||
| homeassistant.components.shell_command.* | ||||
| homeassistant.components.shelly.* | ||||
| homeassistant.components.shopping_list.* | ||||
| @@ -476,9 +465,7 @@ homeassistant.components.simplisafe.* | ||||
| homeassistant.components.siren.* | ||||
| homeassistant.components.skybell.* | ||||
| homeassistant.components.slack.* | ||||
| homeassistant.components.sleep_as_android.* | ||||
| homeassistant.components.sleepiq.* | ||||
| homeassistant.components.sma.* | ||||
| homeassistant.components.smhi.* | ||||
| homeassistant.components.smlight.* | ||||
| homeassistant.components.smtp.* | ||||
| @@ -513,7 +500,6 @@ homeassistant.components.tag.* | ||||
| homeassistant.components.tailscale.* | ||||
| homeassistant.components.tailwind.* | ||||
| homeassistant.components.tami4.* | ||||
| homeassistant.components.tankerkoenig.* | ||||
| homeassistant.components.tautulli.* | ||||
| homeassistant.components.tcp.* | ||||
| homeassistant.components.technove.* | ||||
| @@ -549,7 +535,6 @@ homeassistant.components.unifiprotect.* | ||||
| homeassistant.components.upcloud.* | ||||
| homeassistant.components.update.* | ||||
| homeassistant.components.uptime.* | ||||
| homeassistant.components.uptime_kuma.* | ||||
| homeassistant.components.uptimerobot.* | ||||
| homeassistant.components.usb.* | ||||
| homeassistant.components.uvc.* | ||||
| @@ -557,10 +542,8 @@ homeassistant.components.vacuum.* | ||||
| homeassistant.components.vallox.* | ||||
| homeassistant.components.valve.* | ||||
| homeassistant.components.velbus.* | ||||
| homeassistant.components.vivotek.* | ||||
| homeassistant.components.vlc_telnet.* | ||||
| homeassistant.components.vodafone_station.* | ||||
| homeassistant.components.volvo.* | ||||
| homeassistant.components.wake_on_lan.* | ||||
| homeassistant.components.wake_word.* | ||||
| homeassistant.components.wallbox.* | ||||
|   | ||||
| @@ -7,19 +7,13 @@ | ||||
|   "python.testing.pytestEnabled": false, | ||||
|   // https://code.visualstudio.com/docs/python/linting#_general-settings | ||||
|   "pylint.importStrategy": "fromEnvironment", | ||||
|   // Pyright is too pedantic for Home Assistant | ||||
|   "python.analysis.typeCheckingMode": "basic", | ||||
|   "[python]": { | ||||
|     "editor.defaultFormatter": "charliermarsh.ruff", | ||||
|   }, | ||||
|   "[json][jsonc][yaml]": { | ||||
|     "editor.defaultFormatter": "esbenp.prettier-vscode", | ||||
|   }, | ||||
|   "json.schemas": [ | ||||
|     { | ||||
|       "fileMatch": ["homeassistant/components/*/manifest.json"], | ||||
|       // This value differs between working with devcontainer and locally, therefore this value should NOT be in sync! | ||||
|       "url": "./script/json_schemas/manifest_schema.json", | ||||
|     }, | ||||
|   ], | ||||
|         { | ||||
|             "fileMatch": [ | ||||
|                 "homeassistant/components/*/manifest.json" | ||||
|             ], | ||||
|             // This value differs between working with devcontainer and locally, therefore this value should NOT be in sync! | ||||
|             "url": "./script/json_schemas/manifest_schema.json" | ||||
|         } | ||||
|     ] | ||||
| } | ||||
CODEOWNERS (generated, 175 changes)
							| @@ -46,8 +46,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/accuweather/ @bieniu | ||||
| /homeassistant/components/acmeda/ @atmurray | ||||
| /tests/components/acmeda/ @atmurray | ||||
| /homeassistant/components/actron_air/ @kclif9 @JagadishDhanamjayam | ||||
| /tests/components/actron_air/ @kclif9 @JagadishDhanamjayam | ||||
| /homeassistant/components/adax/ @danielhiversen @lazytarget | ||||
| /tests/components/adax/ @danielhiversen @lazytarget | ||||
| /homeassistant/components/adguard/ @frenck | ||||
| @@ -69,8 +67,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/airly/ @bieniu | ||||
| /homeassistant/components/airnow/ @asymworks | ||||
| /tests/components/airnow/ @asymworks | ||||
| /homeassistant/components/airos/ @CoMPaTech | ||||
| /tests/components/airos/ @CoMPaTech | ||||
| /homeassistant/components/airq/ @Sibgatulin @dl2080 | ||||
| /tests/components/airq/ @Sibgatulin @dl2080 | ||||
| /homeassistant/components/airthings/ @danielhiversen @LaStrada | ||||
| @@ -89,8 +85,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/airzone/ @Noltari | ||||
| /homeassistant/components/airzone_cloud/ @Noltari | ||||
| /tests/components/airzone_cloud/ @Noltari | ||||
| /homeassistant/components/aladdin_connect/ @swcloudgenie | ||||
| /tests/components/aladdin_connect/ @swcloudgenie | ||||
| /homeassistant/components/alarm_control_panel/ @home-assistant/core | ||||
| /tests/components/alarm_control_panel/ @home-assistant/core | ||||
| /homeassistant/components/alert/ @home-assistant/core @frenck | ||||
| @@ -109,8 +103,8 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/ambient_station/ @bachya | ||||
| /tests/components/ambient_station/ @bachya | ||||
| /homeassistant/components/amcrest/ @flacjacket | ||||
| /homeassistant/components/analytics/ @home-assistant/core | ||||
| /tests/components/analytics/ @home-assistant/core | ||||
| /homeassistant/components/analytics/ @home-assistant/core @ludeeus | ||||
| /tests/components/analytics/ @home-assistant/core @ludeeus | ||||
| /homeassistant/components/analytics_insights/ @joostlek | ||||
| /tests/components/analytics_insights/ @joostlek | ||||
| /homeassistant/components/android_ip_webcam/ @engrbm87 | ||||
| @@ -156,12 +150,12 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/arve/ @ikalnyi | ||||
| /homeassistant/components/aseko_pool_live/ @milanmeu | ||||
| /tests/components/aseko_pool_live/ @milanmeu | ||||
| /homeassistant/components/assist_pipeline/ @synesthesiam @arturpragacz | ||||
| /tests/components/assist_pipeline/ @synesthesiam @arturpragacz | ||||
| /homeassistant/components/assist_satellite/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /tests/components/assist_satellite/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /homeassistant/components/asuswrt/ @kennedyshead @ollo69 @Vaskivskyi | ||||
| /tests/components/asuswrt/ @kennedyshead @ollo69 @Vaskivskyi | ||||
| /homeassistant/components/assist_pipeline/ @balloob @synesthesiam | ||||
| /tests/components/assist_pipeline/ @balloob @synesthesiam | ||||
| /homeassistant/components/assist_satellite/ @home-assistant/core @synesthesiam | ||||
| /tests/components/assist_satellite/ @home-assistant/core @synesthesiam | ||||
| /homeassistant/components/asuswrt/ @kennedyshead @ollo69 | ||||
| /tests/components/asuswrt/ @kennedyshead @ollo69 | ||||
| /homeassistant/components/atag/ @MatsNL | ||||
| /tests/components/atag/ @MatsNL | ||||
| /homeassistant/components/aten_pe/ @mtdcr | ||||
| @@ -294,16 +288,14 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/command_line/ @gjohansson-ST | ||||
| /homeassistant/components/compensation/ @Petro31 | ||||
| /tests/components/compensation/ @Petro31 | ||||
| /homeassistant/components/compit/ @Przemko92 | ||||
| /tests/components/compit/ @Przemko92 | ||||
| /homeassistant/components/config/ @home-assistant/core | ||||
| /tests/components/config/ @home-assistant/core | ||||
| /homeassistant/components/configurator/ @home-assistant/core | ||||
| /tests/components/configurator/ @home-assistant/core | ||||
| /homeassistant/components/control4/ @lawtancool | ||||
| /tests/components/control4/ @lawtancool | ||||
| /homeassistant/components/conversation/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /tests/components/conversation/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /homeassistant/components/conversation/ @home-assistant/core @synesthesiam | ||||
| /tests/components/conversation/ @home-assistant/core @synesthesiam | ||||
| /homeassistant/components/cookidoo/ @miaucl | ||||
| /tests/components/cookidoo/ @miaucl | ||||
| /homeassistant/components/coolmaster/ @OnFreund | ||||
| @@ -318,8 +310,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/crownstone/ @Crownstone @RicArch97 | ||||
| /homeassistant/components/cups/ @fabaff | ||||
| /tests/components/cups/ @fabaff | ||||
| /homeassistant/components/cync/ @Kinachi249 | ||||
| /tests/components/cync/ @Kinachi249 | ||||
| /homeassistant/components/daikin/ @fredrike | ||||
| /tests/components/daikin/ @fredrike | ||||
| /homeassistant/components/date/ @home-assistant/core | ||||
| @@ -383,8 +373,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/dremel_3d_printer/ @tkdrob | ||||
| /homeassistant/components/drop_connect/ @ChandlerSystems @pfrazer | ||||
| /tests/components/drop_connect/ @ChandlerSystems @pfrazer | ||||
| /homeassistant/components/droplet/ @sarahseidman | ||||
| /tests/components/droplet/ @sarahseidman | ||||
| /homeassistant/components/dsmr/ @Robbie1221 | ||||
| /tests/components/dsmr/ @Robbie1221 | ||||
| /homeassistant/components/dsmr_reader/ @sorted-bits @glodenox @erwindouna | ||||
| @@ -414,8 +402,6 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/egardia/ @jeroenterheerdt | ||||
| /homeassistant/components/eheimdigital/ @autinerd | ||||
| /tests/components/eheimdigital/ @autinerd | ||||
| /homeassistant/components/ekeybionyx/ @richardpolzer | ||||
| /tests/components/ekeybionyx/ @richardpolzer | ||||
| /homeassistant/components/electrasmart/ @jafar-atili | ||||
| /tests/components/electrasmart/ @jafar-atili | ||||
| /homeassistant/components/electric_kiwi/ @mikey0000 | ||||
| @@ -434,8 +420,6 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/emby/ @mezz64 | ||||
| /homeassistant/components/emoncms/ @borpin @alexandrecuer | ||||
| /tests/components/emoncms/ @borpin @alexandrecuer | ||||
| /homeassistant/components/emoncms_history/ @alexandrecuer | ||||
| /tests/components/emoncms_history/ @alexandrecuer | ||||
| /homeassistant/components/emonitor/ @bdraco | ||||
| /tests/components/emonitor/ @bdraco | ||||
| /homeassistant/components/emulated_hue/ @bdraco @Tho85 | ||||
| @@ -450,8 +434,10 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/energyzero/ @klaasnicolaas | ||||
| /homeassistant/components/enigma2/ @autinerd | ||||
| /tests/components/enigma2/ @autinerd | ||||
| /homeassistant/components/enphase_envoy/ @bdraco @cgarwood @catsmanac | ||||
| /tests/components/enphase_envoy/ @bdraco @cgarwood @catsmanac | ||||
| /homeassistant/components/enocean/ @bdurrer | ||||
| /tests/components/enocean/ @bdurrer | ||||
| /homeassistant/components/enphase_envoy/ @bdraco @cgarwood @joostlek @catsmanac | ||||
| /tests/components/enphase_envoy/ @bdraco @cgarwood @joostlek @catsmanac | ||||
| /homeassistant/components/entur_public_transport/ @hfurubotten | ||||
| /homeassistant/components/environment_canada/ @gwww @michaeldavie | ||||
| /tests/components/environment_canada/ @gwww @michaeldavie | ||||
| @@ -472,6 +458,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/eufylife_ble/ @bdr99 | ||||
| /homeassistant/components/event/ @home-assistant/core | ||||
| /tests/components/event/ @home-assistant/core | ||||
| /homeassistant/components/evil_genius_labs/ @balloob | ||||
| /tests/components/evil_genius_labs/ @balloob | ||||
| /homeassistant/components/evohome/ @zxdavb | ||||
| /tests/components/evohome/ @zxdavb | ||||
| /homeassistant/components/ezviz/ @RenierM26 | ||||
| @@ -494,10 +482,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/filesize/ @gjohansson-ST | ||||
| /homeassistant/components/filter/ @dgomes | ||||
| /tests/components/filter/ @dgomes | ||||
| /homeassistant/components/fing/ @Lorenzo-Gasparini | ||||
| /tests/components/fing/ @Lorenzo-Gasparini | ||||
| /homeassistant/components/firefly_iii/ @erwindouna | ||||
| /tests/components/firefly_iii/ @erwindouna | ||||
| /homeassistant/components/fireservicerota/ @cyberjunky | ||||
| /tests/components/fireservicerota/ @cyberjunky | ||||
| /homeassistant/components/firmata/ @DaAwesomeP | ||||
| @@ -525,8 +509,8 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/forked_daapd/ @uvjustin | ||||
| /tests/components/forked_daapd/ @uvjustin | ||||
| /homeassistant/components/fortios/ @kimfrellsen | ||||
| /homeassistant/components/foscam/ @Foscam-wangzhengyu | ||||
| /tests/components/foscam/ @Foscam-wangzhengyu | ||||
| /homeassistant/components/foscam/ @krmarien | ||||
| /tests/components/foscam/ @krmarien | ||||
| /homeassistant/components/freebox/ @hacf-fr @Quentame | ||||
| /tests/components/freebox/ @hacf-fr @Quentame | ||||
| /homeassistant/components/freedompro/ @stefano055415 | ||||
| @@ -621,8 +605,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/greeneye_monitor/ @jkeljo | ||||
| /homeassistant/components/group/ @home-assistant/core | ||||
| /tests/components/group/ @home-assistant/core | ||||
| /homeassistant/components/growatt_server/ @johanzander | ||||
| /tests/components/growatt_server/ @johanzander | ||||
| /homeassistant/components/guardian/ @bachya | ||||
| /tests/components/guardian/ @bachya | ||||
| /homeassistant/components/habitica/ @tr4nt0r | ||||
| @@ -662,8 +644,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/homeassistant/ @home-assistant/core | ||||
| /homeassistant/components/homeassistant_alerts/ @home-assistant/core | ||||
| /tests/components/homeassistant_alerts/ @home-assistant/core | ||||
| /homeassistant/components/homeassistant_connect_zbt2/ @home-assistant/core | ||||
| /tests/components/homeassistant_connect_zbt2/ @home-assistant/core | ||||
| /homeassistant/components/homeassistant_green/ @home-assistant/core | ||||
| /tests/components/homeassistant_green/ @home-assistant/core | ||||
| /homeassistant/components/homeassistant_hardware/ @home-assistant/core | ||||
| @@ -692,8 +672,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/http/ @home-assistant/core | ||||
| /homeassistant/components/huawei_lte/ @scop @fphammerle | ||||
| /tests/components/huawei_lte/ @scop @fphammerle | ||||
| /homeassistant/components/hue/ @marcelveldt | ||||
| /tests/components/hue/ @marcelveldt | ||||
| /homeassistant/components/hue/ @balloob @marcelveldt | ||||
| /tests/components/hue/ @balloob @marcelveldt | ||||
| /homeassistant/components/huisbaasje/ @dennisschroer | ||||
| /tests/components/huisbaasje/ @dennisschroer | ||||
| /homeassistant/components/humidifier/ @home-assistant/core @Shulyaka | ||||
| @@ -704,8 +684,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/husqvarna_automower/ @Thomas55555 | ||||
| /homeassistant/components/husqvarna_automower_ble/ @alistair23 | ||||
| /tests/components/husqvarna_automower_ble/ @alistair23 | ||||
| /homeassistant/components/huum/ @frwickst @vincentwolsink | ||||
| /tests/components/huum/ @frwickst @vincentwolsink | ||||
| /homeassistant/components/huum/ @frwickst | ||||
| /tests/components/huum/ @frwickst | ||||
| /homeassistant/components/hvv_departures/ @vigonotion | ||||
| /tests/components/hvv_departures/ @vigonotion | ||||
| /homeassistant/components/hydrawise/ @dknowles2 @thomaskistler @ptcryan | ||||
| @@ -743,8 +723,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/improv_ble/ @emontnemery | ||||
| /homeassistant/components/incomfort/ @jbouwh | ||||
| /tests/components/incomfort/ @jbouwh | ||||
| /homeassistant/components/inels/ @epdevlab | ||||
| /tests/components/inels/ @epdevlab | ||||
| /homeassistant/components/influxdb/ @mdegat01 | ||||
| /tests/components/influxdb/ @mdegat01 | ||||
| /homeassistant/components/inkbird/ @bdraco | ||||
| @@ -767,11 +745,11 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/integration/ @dgomes | ||||
| /homeassistant/components/intellifire/ @jeeftor | ||||
| /tests/components/intellifire/ @jeeftor | ||||
| /homeassistant/components/intent/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /tests/components/intent/ @home-assistant/core @synesthesiam @arturpragacz | ||||
| /homeassistant/components/intent/ @home-assistant/core @synesthesiam | ||||
| /tests/components/intent/ @home-assistant/core @synesthesiam | ||||
| /homeassistant/components/intesishome/ @jnimmo | ||||
| /homeassistant/components/iometer/ @jukrebs | ||||
| /tests/components/iometer/ @jukrebs | ||||
| /homeassistant/components/iometer/ @MaestroOnICe | ||||
| /tests/components/iometer/ @MaestroOnICe | ||||
| /homeassistant/components/ios/ @robbiet480 | ||||
| /tests/components/ios/ @robbiet480 | ||||
| /homeassistant/components/iotawatt/ @gtdiehl @jyavenard | ||||
| @@ -786,8 +764,6 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/iqvia/ @bachya | ||||
| /tests/components/iqvia/ @bachya | ||||
| /homeassistant/components/irish_rail_transport/ @ttroy50 | ||||
| /homeassistant/components/irm_kmi/ @jdejaegh | ||||
| /tests/components/irm_kmi/ @jdejaegh | ||||
| /homeassistant/components/iron_os/ @tr4nt0r | ||||
| /tests/components/iron_os/ @tr4nt0r | ||||
| /homeassistant/components/isal/ @bdraco | ||||
| @@ -878,14 +854,14 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/lg_netcast/ @Drafteed @splinter98 | ||||
| /homeassistant/components/lg_thinq/ @LG-ThinQ-Integration | ||||
| /tests/components/lg_thinq/ @LG-ThinQ-Integration | ||||
| /homeassistant/components/libre_hardware_monitor/ @Sab44 | ||||
| /tests/components/libre_hardware_monitor/ @Sab44 | ||||
| /homeassistant/components/lidarr/ @tkdrob | ||||
| /tests/components/lidarr/ @tkdrob | ||||
| /homeassistant/components/lifx/ @Djelibeybi | ||||
| /tests/components/lifx/ @Djelibeybi | ||||
| /homeassistant/components/light/ @home-assistant/core | ||||
| /tests/components/light/ @home-assistant/core | ||||
| /homeassistant/components/linear_garage_door/ @IceBotYT | ||||
| /tests/components/linear_garage_door/ @IceBotYT | ||||
| /homeassistant/components/linkplay/ @Velleman | ||||
| /tests/components/linkplay/ @Velleman | ||||
| /homeassistant/components/linux_battery/ @fabaff | ||||
| @@ -918,8 +894,6 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/luci/ @mzdrale | ||||
| /homeassistant/components/luftdaten/ @fabaff @frenck | ||||
| /tests/components/luftdaten/ @fabaff @frenck | ||||
| /homeassistant/components/lunatone/ @MoonDevLT | ||||
| /tests/components/lunatone/ @MoonDevLT | ||||
| /homeassistant/components/lupusec/ @majuss @suaveolent | ||||
| /tests/components/lupusec/ @majuss @suaveolent | ||||
| /homeassistant/components/lutron/ @cdheiser @wilburCForce | ||||
| @@ -965,8 +939,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/met_eireann/ @DylanGore | ||||
| /homeassistant/components/meteo_france/ @hacf-fr @oncleben31 @Quentame | ||||
| /tests/components/meteo_france/ @hacf-fr @oncleben31 @Quentame | ||||
| /homeassistant/components/meteo_lt/ @xE1H | ||||
| /tests/components/meteo_lt/ @xE1H | ||||
| /homeassistant/components/meteoalarm/ @rolfberkenbosch | ||||
| /homeassistant/components/meteoclimatic/ @adrianmo | ||||
| /tests/components/meteoclimatic/ @adrianmo | ||||
| @@ -1037,8 +1009,7 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/nanoleaf/ @milanmeu @joostlek | ||||
| /homeassistant/components/nasweb/ @nasWebio | ||||
| /tests/components/nasweb/ @nasWebio | ||||
| /homeassistant/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul | ||||
| /tests/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul | ||||
| /homeassistant/components/nederlandse_spoorwegen/ @YarmoM | ||||
| /homeassistant/components/ness_alarm/ @nickw444 | ||||
| /tests/components/ness_alarm/ @nickw444 | ||||
| /homeassistant/components/nest/ @allenporter | ||||
| @@ -1073,8 +1044,6 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/nilu/ @hfurubotten | ||||
| /homeassistant/components/nina/ @DeerMaximum | ||||
| /tests/components/nina/ @DeerMaximum | ||||
| /homeassistant/components/nintendo_parental_controls/ @pantherale0 | ||||
| /tests/components/nintendo_parental_controls/ @pantherale0 | ||||
| /homeassistant/components/nissan_leaf/ @filcole | ||||
| /homeassistant/components/noaa_tides/ @jdelaney72 | ||||
| /homeassistant/components/nobo_hub/ @echoromeo @oyvindwe | ||||
| @@ -1133,8 +1102,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/onvif/ @hunterjm @jterrace | ||||
| /homeassistant/components/open_meteo/ @frenck | ||||
| /tests/components/open_meteo/ @frenck | ||||
| /homeassistant/components/open_router/ @joostlek | ||||
| /tests/components/open_router/ @joostlek | ||||
| /homeassistant/components/openai_conversation/ @balloob | ||||
| /tests/components/openai_conversation/ @balloob | ||||
| /homeassistant/components/openerz/ @misialq | ||||
| /tests/components/openerz/ @misialq | ||||
| /homeassistant/components/openexchangerates/ @MartinHjelmare | ||||
| @@ -1143,8 +1112,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/opengarage/ @danielhiversen | ||||
| /homeassistant/components/openhome/ @bazwilliams | ||||
| /tests/components/openhome/ @bazwilliams | ||||
| /homeassistant/components/openrgb/ @felipecrs | ||||
| /tests/components/openrgb/ @felipecrs | ||||
| /homeassistant/components/opensky/ @joostlek | ||||
| /tests/components/opensky/ @joostlek | ||||
| /homeassistant/components/opentherm_gw/ @mvn23 | ||||
| @@ -1208,14 +1175,12 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/plex/ @jjlawren | ||||
| /homeassistant/components/plugwise/ @CoMPaTech @bouwew | ||||
| /tests/components/plugwise/ @CoMPaTech @bouwew | ||||
| /homeassistant/components/plum_lightpad/ @ColinHarrington @prystupa | ||||
| /tests/components/plum_lightpad/ @ColinHarrington @prystupa | ||||
| /homeassistant/components/point/ @fredrike | ||||
| /tests/components/point/ @fredrike | ||||
| /homeassistant/components/pooldose/ @lmaertin | ||||
| /tests/components/pooldose/ @lmaertin | ||||
| /homeassistant/components/poolsense/ @haemishkyd | ||||
| /tests/components/poolsense/ @haemishkyd | ||||
| /homeassistant/components/portainer/ @erwindouna | ||||
| /tests/components/portainer/ @erwindouna | ||||
| /homeassistant/components/powerfox/ @klaasnicolaas | ||||
| /tests/components/powerfox/ @klaasnicolaas | ||||
| /homeassistant/components/powerwall/ @bdraco @jrester @daniel-simpson | ||||
| @@ -1235,6 +1200,8 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/proximity/ @mib1185 | ||||
| /tests/components/proximity/ @mib1185 | ||||
| /homeassistant/components/proxmoxve/ @jhollowe @Corbeno | ||||
| /homeassistant/components/prusalink/ @balloob | ||||
| /tests/components/prusalink/ @balloob | ||||
| /homeassistant/components/ps4/ @ktnrg45 | ||||
| /tests/components/ps4/ @ktnrg45 | ||||
| /homeassistant/components/pterodactyl/ @elmurato | ||||
| @@ -1328,8 +1295,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/rflink/ @javicalle | ||||
| /homeassistant/components/rfxtrx/ @danielhiversen @elupus @RobBie1221 | ||||
| /tests/components/rfxtrx/ @danielhiversen @elupus @RobBie1221 | ||||
| /homeassistant/components/rhasspy/ @synesthesiam | ||||
| /tests/components/rhasspy/ @synesthesiam | ||||
| /homeassistant/components/rhasspy/ @balloob @synesthesiam | ||||
| /tests/components/rhasspy/ @balloob @synesthesiam | ||||
| /homeassistant/components/ridwell/ @bachya | ||||
| /tests/components/ridwell/ @bachya | ||||
| /homeassistant/components/ring/ @sdb9696 | ||||
| @@ -1350,8 +1317,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/roomba/ @pschmitt @cyr-ius @shenxn @Orhideous | ||||
| /homeassistant/components/roon/ @pavoni | ||||
| /tests/components/roon/ @pavoni | ||||
| /homeassistant/components/route_b_smart_meter/ @SeraphicRav | ||||
| /tests/components/route_b_smart_meter/ @SeraphicRav | ||||
| /homeassistant/components/rpi_power/ @shenxn @swetoast | ||||
| /tests/components/rpi_power/ @shenxn @swetoast | ||||
| /homeassistant/components/rss_feed_template/ @home-assistant/core | ||||
| @@ -1374,8 +1339,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/samsungtv/ @chemelli74 @epenet | ||||
| /homeassistant/components/sanix/ @tomaszsluszniak | ||||
| /tests/components/sanix/ @tomaszsluszniak | ||||
| /homeassistant/components/satel_integra/ @Tommatheussen | ||||
| /tests/components/satel_integra/ @Tommatheussen | ||||
| /homeassistant/components/scene/ @home-assistant/core | ||||
| /tests/components/scene/ @home-assistant/core | ||||
| /homeassistant/components/schedule/ @home-assistant/core | ||||
| @@ -1421,14 +1384,12 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/seventeentrack/ @shaiu | ||||
| /homeassistant/components/sfr_box/ @epenet | ||||
| /tests/components/sfr_box/ @epenet | ||||
| /homeassistant/components/sftp_storage/ @maretodoric | ||||
| /tests/components/sftp_storage/ @maretodoric | ||||
| /homeassistant/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre | ||||
| /tests/components/sharkiq/ @JeffResc @funkybunch @TheOneOgre | ||||
| /homeassistant/components/sharkiq/ @JeffResc @funkybunch | ||||
| /tests/components/sharkiq/ @JeffResc @funkybunch | ||||
| /homeassistant/components/shell_command/ @home-assistant/core | ||||
| /tests/components/shell_command/ @home-assistant/core | ||||
| /homeassistant/components/shelly/ @bieniu @thecode @chemelli74 @bdraco | ||||
| /tests/components/shelly/ @bieniu @thecode @chemelli74 @bdraco | ||||
| /homeassistant/components/shelly/ @balloob @bieniu @thecode @chemelli74 @bdraco | ||||
| /tests/components/shelly/ @balloob @bieniu @thecode @chemelli74 @bdraco | ||||
| /homeassistant/components/shodan/ @fabaff | ||||
| /homeassistant/components/sia/ @eavanvalkenburg | ||||
| /tests/components/sia/ @eavanvalkenburg | ||||
| @@ -1452,8 +1413,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/skybell/ @tkdrob | ||||
| /homeassistant/components/slack/ @tkdrob @fletcherau | ||||
| /tests/components/slack/ @tkdrob @fletcherau | ||||
| /homeassistant/components/sleep_as_android/ @tr4nt0r | ||||
| /tests/components/sleep_as_android/ @tr4nt0r | ||||
| /homeassistant/components/sleepiq/ @mfugate1 @kbickar | ||||
| /tests/components/sleepiq/ @mfugate1 @kbickar | ||||
| /homeassistant/components/slide/ @ualex73 | ||||
| @@ -1489,8 +1448,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/snoo/ @Lash-L | ||||
| /homeassistant/components/snooz/ @AustinBrunkhorst | ||||
| /tests/components/snooz/ @AustinBrunkhorst | ||||
| /homeassistant/components/solaredge/ @frenck @bdraco @tronikos | ||||
| /tests/components/solaredge/ @frenck @bdraco @tronikos | ||||
| /homeassistant/components/solaredge/ @frenck @bdraco | ||||
| /tests/components/solaredge/ @frenck @bdraco | ||||
| /homeassistant/components/solaredge_local/ @drobtravels @scheric | ||||
| /homeassistant/components/solarlog/ @Ernst79 @dontinelli | ||||
| /tests/components/solarlog/ @Ernst79 @dontinelli | ||||
| @@ -1543,8 +1502,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/suez_water/ @ooii @jb101010-2 | ||||
| /homeassistant/components/sun/ @home-assistant/core | ||||
| /tests/components/sun/ @home-assistant/core | ||||
| /homeassistant/components/sunricher_dali_center/ @niracler | ||||
| /tests/components/sunricher_dali_center/ @niracler | ||||
| /homeassistant/components/supla/ @mwegrzynek | ||||
| /homeassistant/components/surepetcare/ @benleb @danielhiversen | ||||
| /tests/components/surepetcare/ @benleb @danielhiversen | ||||
| @@ -1559,8 +1516,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/switchbee/ @jafar-atili | ||||
| /homeassistant/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang | ||||
| /tests/components/switchbot/ @danielhiversen @RenierM26 @murtas @Eloston @dsypniewski @zerzhang | ||||
| /homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git | ||||
| /tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur @XiaoLing-git | ||||
| /homeassistant/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur | ||||
| /tests/components/switchbot_cloud/ @SeraphicRav @laurence-presland @Gigatrappeur | ||||
| /homeassistant/components/switcher_kis/ @thecode @YogevBokobza | ||||
| /tests/components/switcher_kis/ @thecode @YogevBokobza | ||||
| /homeassistant/components/switchmate/ @danielhiversen @qiz-li | ||||
| @@ -1577,8 +1534,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/systemmonitor/ @gjohansson-ST | ||||
| /homeassistant/components/tado/ @erwindouna | ||||
| /tests/components/tado/ @erwindouna | ||||
| /homeassistant/components/tag/ @home-assistant/core | ||||
| /tests/components/tag/ @home-assistant/core | ||||
| /homeassistant/components/tag/ @balloob @dmulcahey | ||||
| /tests/components/tag/ @balloob @dmulcahey | ||||
| /homeassistant/components/tailscale/ @frenck | ||||
| /tests/components/tailscale/ @frenck | ||||
| /homeassistant/components/tailwind/ @frenck | ||||
| @@ -1638,8 +1595,6 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/todo/ @home-assistant/core | ||||
| /homeassistant/components/todoist/ @boralyl | ||||
| /tests/components/todoist/ @boralyl | ||||
| /homeassistant/components/togrill/ @elupus | ||||
| /tests/components/togrill/ @elupus | ||||
| /homeassistant/components/tolo/ @MatthiasLohr | ||||
| /tests/components/tolo/ @MatthiasLohr | ||||
| /homeassistant/components/tomorrowio/ @raman325 @lymanepp | ||||
| @@ -1654,6 +1609,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/tplink_omada/ @MarkGodwin | ||||
| /homeassistant/components/traccar/ @ludeeus | ||||
| /tests/components/traccar/ @ludeeus | ||||
| /homeassistant/components/traccar_server/ @ludeeus | ||||
| /tests/components/traccar_server/ @ludeeus | ||||
| /homeassistant/components/trace/ @home-assistant/core | ||||
| /tests/components/trace/ @home-assistant/core | ||||
| /homeassistant/components/tractive/ @Danielhiversen @zhulik @bieniu | ||||
| @@ -1701,12 +1658,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/upnp/ @StevenLooman | ||||
| /homeassistant/components/uptime/ @frenck | ||||
| /tests/components/uptime/ @frenck | ||||
| /homeassistant/components/uptime_kuma/ @tr4nt0r | ||||
| /tests/components/uptime_kuma/ @tr4nt0r | ||||
| /homeassistant/components/uptimerobot/ @ludeeus @chemelli74 | ||||
| /tests/components/uptimerobot/ @ludeeus @chemelli74 | ||||
| /homeassistant/components/usage_prediction/ @home-assistant/core | ||||
| /tests/components/usage_prediction/ @home-assistant/core | ||||
| /homeassistant/components/usb/ @bdraco | ||||
| /tests/components/usb/ @bdraco | ||||
| /homeassistant/components/usgs_earthquakes_feed/ @exxamalte | ||||
| @@ -1725,19 +1678,17 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/vegehub/ @ghowevege | ||||
| /homeassistant/components/velbus/ @Cereal2nd @brefra | ||||
| /tests/components/velbus/ @Cereal2nd @brefra | ||||
| /homeassistant/components/velux/ @Julius2342 @DeerMaximum @pawlizio @wollew | ||||
| /tests/components/velux/ @Julius2342 @DeerMaximum @pawlizio @wollew | ||||
| /homeassistant/components/velux/ @Julius2342 @DeerMaximum @pawlizio | ||||
| /tests/components/velux/ @Julius2342 @DeerMaximum @pawlizio | ||||
| /homeassistant/components/venstar/ @garbled1 @jhollowe | ||||
| /tests/components/venstar/ @garbled1 @jhollowe | ||||
| /homeassistant/components/versasense/ @imstevenxyz | ||||
| /homeassistant/components/version/ @ludeeus | ||||
| /tests/components/version/ @ludeeus | ||||
| /homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven | ||||
| /tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak @sapuseven | ||||
| /homeassistant/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak | ||||
| /tests/components/vesync/ @markperdue @webdjoe @thegardenmonkey @cdnninja @iprak | ||||
| /homeassistant/components/vicare/ @CFenner | ||||
| /tests/components/vicare/ @CFenner | ||||
| /homeassistant/components/victron_remote_monitoring/ @AndyTempel | ||||
| /tests/components/victron_remote_monitoring/ @AndyTempel | ||||
| /homeassistant/components/vilfo/ @ManneW | ||||
| /tests/components/vilfo/ @ManneW | ||||
| /homeassistant/components/vivotek/ @HarlemSquirrel | ||||
| @@ -1747,14 +1698,14 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/vlc_telnet/ @rodripf @MartinHjelmare | ||||
| /homeassistant/components/vodafone_station/ @paoloantinori @chemelli74 | ||||
| /tests/components/vodafone_station/ @paoloantinori @chemelli74 | ||||
| /homeassistant/components/voip/ @synesthesiam @jaminh | ||||
| /tests/components/voip/ @synesthesiam @jaminh | ||||
| /homeassistant/components/voip/ @balloob @synesthesiam @jaminh | ||||
| /tests/components/voip/ @balloob @synesthesiam @jaminh | ||||
| /homeassistant/components/volumio/ @OnFreund | ||||
| /tests/components/volumio/ @OnFreund | ||||
| /homeassistant/components/volvo/ @thomasddn | ||||
| /tests/components/volvo/ @thomasddn | ||||
| /homeassistant/components/volvooncall/ @molobrakos @svrooij | ||||
| /tests/components/volvooncall/ @molobrakos @svrooij | ||||
| /homeassistant/components/volvooncall/ @molobrakos | ||||
| /tests/components/volvooncall/ @molobrakos | ||||
| /homeassistant/components/vulcan/ @Antoni-Czaplicki | ||||
| /tests/components/vulcan/ @Antoni-Czaplicki | ||||
| /homeassistant/components/wake_on_lan/ @ntilley905 | ||||
| /tests/components/wake_on_lan/ @ntilley905 | ||||
| /homeassistant/components/wake_word/ @home-assistant/core @synesthesiam | ||||
| @@ -1805,8 +1756,8 @@ build.json @home-assistant/supervisor | ||||
| /homeassistant/components/wirelesstag/ @sergeymaysak | ||||
| /homeassistant/components/withings/ @joostlek | ||||
| /tests/components/withings/ @joostlek | ||||
| /homeassistant/components/wiz/ @sbidy @arturpragacz | ||||
| /tests/components/wiz/ @sbidy @arturpragacz | ||||
| /homeassistant/components/wiz/ @sbidy | ||||
| /tests/components/wiz/ @sbidy | ||||
| /homeassistant/components/wled/ @frenck | ||||
| /tests/components/wled/ @frenck | ||||
| /homeassistant/components/wmspro/ @mback2k | ||||
| @@ -1819,8 +1770,8 @@ build.json @home-assistant/supervisor | ||||
| /tests/components/worldclock/ @fabaff | ||||
| /homeassistant/components/ws66i/ @ssaenger | ||||
| /tests/components/ws66i/ @ssaenger | ||||
| /homeassistant/components/wyoming/ @synesthesiam | ||||
| /tests/components/wyoming/ @synesthesiam | ||||
| /homeassistant/components/wyoming/ @balloob @synesthesiam | ||||
| /tests/components/wyoming/ @balloob @synesthesiam | ||||
| /homeassistant/components/xbox/ @hunterjm | ||||
| /tests/components/xbox/ @hunterjm | ||||
| /homeassistant/components/xiaomi_aqara/ @danielhiversen @syssi | ||||
|   | ||||
| @@ -14,8 +14,5 @@ Still interested? Then you should take a peek at the [developer documentation](h | ||||
|  | ||||
| ## Feature suggestions | ||||
|  | ||||
| If you want to suggest a new feature for Home Assistant (e.g. new integrations), please [start a discussion](https://github.com/orgs/home-assistant/discussions) on GitHub. | ||||
|  | ||||
| ## Issue Tracker | ||||
|  | ||||
| If you want to report an issue, please [create an issue](https://github.com/home-assistant/core/issues) on GitHub. | ||||
| If you want to suggest a new feature for Home Assistant (e.g., new integrations), please open a thread in our [Community Forum: Feature Requests](https://community.home-assistant.io/c/feature-requests). | ||||
| We use [GitHub for tracking issues](https://github.com/home-assistant/core/issues), not for tracking feature requests. | ||||
|   | ||||
Dockerfile (generated, 4 changes)
							| @@ -25,13 +25,13 @@ RUN \ | ||||
|         "armv7") go2rtc_suffix='arm' ;; \ | ||||
|         *) go2rtc_suffix=${BUILD_ARCH} ;; \ | ||||
|     esac \ | ||||
|     && curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.11/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \ | ||||
|     && curl -L https://github.com/AlexxIT/go2rtc/releases/download/v1.9.9/go2rtc_linux_${go2rtc_suffix} --output /bin/go2rtc \ | ||||
|     && chmod +x /bin/go2rtc \ | ||||
|     # Verify go2rtc can be executed | ||||
|     && go2rtc --version | ||||
|  | ||||
| # Install uv | ||||
| RUN pip3 install uv==0.9.5 | ||||
| RUN pip3 install uv==0.7.1 | ||||
|  | ||||
| WORKDIR /usr/src | ||||
|  | ||||
|   | ||||
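The Dockerfile hunk above maps the build architecture to a go2rtc release-asset suffix with a shell `case` (only `armv7` needs remapping to `arm`) and pins the download URL to a specific version (the two sides of the hunk pin v1.9.11 and v1.9.9 respectively). A minimal Python sketch of the same mapping; the helper itself is illustrative and not part of the repo:

```python
# Version pinned on one side of the hunk above; the other side uses 1.9.9.
GO2RTC_VERSION = "1.9.11"


def go2rtc_download_url(build_arch: str) -> str:
    """Build the go2rtc release URL for a given BUILD_ARCH (sketch)."""
    # armv7 images need the generic "arm" build; every other arch name
    # matches the release asset suffix directly.
    suffix = {"armv7": "arm"}.get(build_arch, build_arch)
    return (
        "https://github.com/AlexxIT/go2rtc/releases/download/"
        f"v{GO2RTC_VERSION}/go2rtc_linux_{suffix}"
    )
```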
| @@ -3,7 +3,8 @@ FROM mcr.microsoft.com/vscode/devcontainers/base:debian | ||||
| SHELL ["/bin/bash", "-o", "pipefail", "-c"] | ||||
|  | ||||
| RUN \ | ||||
|     apt-get update \ | ||||
|     curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \ | ||||
|     && apt-get update \ | ||||
|     && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \ | ||||
|         # Additional library needed by some tests and accordingly by VScode Tests Discovery | ||||
|         bluez \ | ||||
| @@ -34,11 +35,9 @@ WORKDIR /usr/src | ||||
|  | ||||
| COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv | ||||
|  | ||||
| RUN uv python install 3.13.2 | ||||
|  | ||||
| USER vscode | ||||
|  | ||||
| COPY .python-version ./ | ||||
| RUN uv python install | ||||
|  | ||||
| ENV VIRTUAL_ENV="/home/vscode/.local/ha-venv" | ||||
| RUN uv venv $VIRTUAL_ENV | ||||
| ENV PATH="$VIRTUAL_ENV/bin:$PATH" | ||||
|   | ||||
build.yaml (13 changes)
							| @@ -1,10 +1,13 @@ | ||||
| image: ghcr.io/home-assistant/{arch}-homeassistant | ||||
| build_from: | ||||
|   aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.10.1 | ||||
|   armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.10.1 | ||||
|   armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.10.1 | ||||
|   amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.10.1 | ||||
|   i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.10.1 | ||||
|   aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2025.05.0 | ||||
|   armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2025.05.0 | ||||
|   armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2025.05.0 | ||||
|   amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2025.05.0 | ||||
|   i386: ghcr.io/home-assistant/i386-homeassistant-base:2025.05.0 | ||||
| codenotary: | ||||
|   signer: notary@home-assistant.io | ||||
|   base_image: notary@home-assistant.io | ||||
| cosign: | ||||
|   base_identity: https://github.com/home-assistant/docker/.* | ||||
|   identity: https://github.com/home-assistant/core/.* | ||||
|   | ||||
| @@ -187,42 +187,36 @@ def main() -> int: | ||||
|  | ||||
|     from . import config, runner  # noqa: PLC0415 | ||||
|  | ||||
|     # Ensure only one instance runs per config directory | ||||
|     with runner.ensure_single_execution(config_dir) as single_execution_lock: | ||||
|         # Check if another instance is already running | ||||
|         if single_execution_lock.exit_code is not None: | ||||
|             return single_execution_lock.exit_code | ||||
|     safe_mode = config.safe_mode_enabled(config_dir) | ||||
|  | ||||
|         safe_mode = config.safe_mode_enabled(config_dir) | ||||
|     runtime_conf = runner.RuntimeConfig( | ||||
|         config_dir=config_dir, | ||||
|         verbose=args.verbose, | ||||
|         log_rotate_days=args.log_rotate_days, | ||||
|         log_file=args.log_file, | ||||
|         log_no_color=args.log_no_color, | ||||
|         skip_pip=args.skip_pip, | ||||
|         skip_pip_packages=args.skip_pip_packages, | ||||
|         recovery_mode=args.recovery_mode, | ||||
|         debug=args.debug, | ||||
|         open_ui=args.open_ui, | ||||
|         safe_mode=safe_mode, | ||||
|     ) | ||||
|  | ||||
|         runtime_conf = runner.RuntimeConfig( | ||||
|             config_dir=config_dir, | ||||
|             verbose=args.verbose, | ||||
|             log_rotate_days=args.log_rotate_days, | ||||
|             log_file=args.log_file, | ||||
|             log_no_color=args.log_no_color, | ||||
|             skip_pip=args.skip_pip, | ||||
|             skip_pip_packages=args.skip_pip_packages, | ||||
|             recovery_mode=args.recovery_mode, | ||||
|             debug=args.debug, | ||||
|             open_ui=args.open_ui, | ||||
|             safe_mode=safe_mode, | ||||
|         ) | ||||
|     fault_file_name = os.path.join(config_dir, FAULT_LOG_FILENAME) | ||||
|     with open(fault_file_name, mode="a", encoding="utf8") as fault_file: | ||||
|         faulthandler.enable(fault_file) | ||||
|         exit_code = runner.run(runtime_conf) | ||||
|         faulthandler.disable() | ||||
|  | ||||
|         fault_file_name = os.path.join(config_dir, FAULT_LOG_FILENAME) | ||||
|         with open(fault_file_name, mode="a", encoding="utf8") as fault_file: | ||||
|             faulthandler.enable(fault_file) | ||||
|             exit_code = runner.run(runtime_conf) | ||||
|             faulthandler.disable() | ||||
|     # It's possible for the fault file to disappear, so suppress obvious errors | ||||
|     with suppress(FileNotFoundError): | ||||
|         if os.path.getsize(fault_file_name) == 0: | ||||
|             os.remove(fault_file_name) | ||||
|  | ||||
|         # It's possible for the fault file to disappear, so suppress obvious errors | ||||
|         with suppress(FileNotFoundError): | ||||
|             if os.path.getsize(fault_file_name) == 0: | ||||
|                 os.remove(fault_file_name) | ||||
|     check_threads() | ||||
|  | ||||
|         check_threads() | ||||
|  | ||||
|         return exit_code | ||||
|     return exit_code | ||||
|  | ||||
|  | ||||
| if __name__ == "__main__": | ||||
|   | ||||
| @@ -120,9 +120,6 @@ class AuthStore: | ||||
|  | ||||
|         new_user = models.User(**kwargs) | ||||
|  | ||||
|         while new_user.id in self._users: | ||||
|             new_user = models.User(**kwargs) | ||||
|  | ||||
|         self._users[new_user.id] = new_user | ||||
|  | ||||
|         if credentials is None: | ||||
|   | ||||
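The `AuthStore` hunk above concerns a defensive `while new_user.id in self._users` loop that re-created the user until its randomly generated id was unused. Since `models.User` defaults its id to a random UUID, a collision is practically impossible; a standalone sketch of the guarded generation (the helper name is illustrative):

```python
import uuid


def unique_user_id(existing_ids: set[str]) -> str:
    """Generate a user id, re-rolling on the (negligible) chance of collision."""
    new_id = uuid.uuid4().hex
    while new_id in existing_ids:
        # 122 bits of randomness make this branch effectively unreachable
        new_id = uuid.uuid4().hex
    return new_id
```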
| @@ -27,7 +27,7 @@ from . import ( | ||||
|     SetupFlow, | ||||
| ) | ||||
|  | ||||
| REQUIREMENTS = ["pyotp==2.9.0"] | ||||
| REQUIREMENTS = ["pyotp==2.8.0"] | ||||
|  | ||||
| CONF_MESSAGE = "message" | ||||
|  | ||||
|   | ||||
| @@ -20,7 +20,7 @@ from . import ( | ||||
|     SetupFlow, | ||||
| ) | ||||
|  | ||||
| REQUIREMENTS = ["pyotp==2.9.0", "PyQRCode==1.2.1"] | ||||
| REQUIREMENTS = ["pyotp==2.8.0", "PyQRCode==1.2.1"] | ||||
|  | ||||
| CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({}, extra=vol.PREVENT_EXTRA) | ||||
|  | ||||
| @@ -34,9 +34,6 @@ INPUT_FIELD_CODE = "code" | ||||
|  | ||||
| DUMMY_SECRET = "FPPTH34D4E3MI2HG" | ||||
|  | ||||
| GOOGLE_AUTHENTICATOR_URL = "https://support.google.com/accounts/answer/1066447" | ||||
| AUTHY_URL = "https://authy.com/" | ||||
|  | ||||
|  | ||||
| def _generate_qr_code(data: str) -> str: | ||||
|     """Generate a base64 PNG string represent QR Code image of data.""" | ||||
| @@ -232,8 +229,6 @@ class TotpSetupFlow(SetupFlow[TotpAuthModule]): | ||||
|                 "code": self._ota_secret, | ||||
|                 "url": self._url, | ||||
|                 "qr_code": self._image, | ||||
|                 "google_authenticator_url": GOOGLE_AUTHENTICATOR_URL, | ||||
|                 "authy_url": AUTHY_URL, | ||||
|             }, | ||||
|             errors=errors, | ||||
|         ) | ||||
|   | ||||
| @@ -33,10 +33,7 @@ class AuthFlowContext(FlowContext, total=False): | ||||
|     redirect_uri: str | ||||
|  | ||||
|  | ||||
| class AuthFlowResult(FlowResult[AuthFlowContext, tuple[str, str]], total=False): | ||||
|     """Typed result dict for auth flow.""" | ||||
|  | ||||
|     result: Credentials  # Only present if type is CREATE_ENTRY | ||||
| AuthFlowResult = FlowResult[AuthFlowContext, tuple[str, str]] | ||||
|  | ||||
|  | ||||
| @attr.s(slots=True) | ||||
|   | ||||
| @@ -332,9 +332,6 @@ async def async_setup_hass( | ||||
|             if not is_virtual_env(): | ||||
|                 await async_mount_local_lib_path(runtime_config.config_dir) | ||||
|  | ||||
|             if hass.config.safe_mode: | ||||
|                 _LOGGER.info("Starting in safe mode") | ||||
|  | ||||
|             basic_setup_success = ( | ||||
|                 await async_from_config_dict(config_dict, hass) is not None | ||||
|             ) | ||||
| @@ -387,6 +384,8 @@ async def async_setup_hass( | ||||
|             {"recovery_mode": {}, "http": http_conf}, | ||||
|             hass, | ||||
|         ) | ||||
|     elif hass.config.safe_mode: | ||||
|         _LOGGER.info("Starting in safe mode") | ||||
|  | ||||
|     if runtime_config.open_ui: | ||||
|         hass.add_job(open_hass_ui, hass) | ||||
| @@ -616,34 +615,34 @@ async def async_enable_logging( | ||||
|         ), | ||||
|     ) | ||||
|  | ||||
|     logger = logging.getLogger() | ||||
|     logger.setLevel(logging.INFO if verbose else logging.WARNING) | ||||
|  | ||||
|     # Log errors to a file if we have write access to file or config dir | ||||
|     if log_file is None: | ||||
|         default_log_path = hass.config.path(ERROR_LOG_FILENAME) | ||||
|         if "SUPERVISOR" in os.environ: | ||||
|             _LOGGER.info("Running in Supervisor, not logging to file") | ||||
|             # Rename the default log file if it exists, since previous versions created | ||||
|             # it even on Supervisor | ||||
|             if os.path.isfile(default_log_path): | ||||
|                 with contextlib.suppress(OSError): | ||||
|                     os.rename(default_log_path, f"{default_log_path}.old") | ||||
|             err_log_path = None | ||||
|         else: | ||||
|             err_log_path = default_log_path | ||||
|         err_log_path = hass.config.path(ERROR_LOG_FILENAME) | ||||
|     else: | ||||
|         err_log_path = os.path.abspath(log_file) | ||||
|  | ||||
|     if err_log_path: | ||||
|     err_path_exists = os.path.isfile(err_log_path) | ||||
|     err_dir = os.path.dirname(err_log_path) | ||||
|  | ||||
|     # Check if we can write to the error log if it exists or that | ||||
|     # we can create files in the containing directory if not. | ||||
|     if (err_path_exists and os.access(err_log_path, os.W_OK)) or ( | ||||
|         not err_path_exists and os.access(err_dir, os.W_OK) | ||||
|     ): | ||||
|         err_handler = await hass.async_add_executor_job( | ||||
|             _create_log_file, err_log_path, log_rotate_days | ||||
|         ) | ||||
|  | ||||
|         err_handler.setFormatter(logging.Formatter(fmt, datefmt=FORMAT_DATETIME)) | ||||
|  | ||||
|         logger = logging.getLogger() | ||||
|         logger.addHandler(err_handler) | ||||
|         logger.setLevel(logging.INFO if verbose else logging.WARNING) | ||||
|  | ||||
|         # Save the log file location for access by other components. | ||||
|         hass.data[DATA_LOGGING] = err_log_path | ||||
|     else: | ||||
|         _LOGGER.error("Unable to set up error log %s (access denied)", err_log_path) | ||||
|  | ||||
|     async_activate_log_queue_handler(hass) | ||||
|  | ||||
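The logging hunk restores a writability check before attaching the error-log handler: append to the file if it exists, otherwise confirm the containing directory allows creating it. That check can be sketched standalone (the helper name is illustrative):

```python
import os
import tempfile

def can_write_log(path: str) -> bool:
    """True if we can write an existing file, or create one in its directory."""
    if os.path.isfile(path):
        return os.access(path, os.W_OK)
    return os.access(os.path.dirname(path) or ".", os.W_OK)

# A fresh temp directory is writable, so a not-yet-existing log file qualifies.
with tempfile.TemporaryDirectory() as tmp:
    writable = can_write_log(os.path.join(tmp, "home-assistant.log"))
```

Note that `os.access` is a best-effort check; the actual open can still fail, which is why the code logs an error rather than asserting.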
| @@ -695,10 +694,10 @@ async def async_mount_local_lib_path(config_dir: str) -> str: | ||||
|  | ||||
| def _get_domains(hass: core.HomeAssistant, config: dict[str, Any]) -> set[str]: | ||||
|     """Get domains of components to set up.""" | ||||
|     # The common config section [homeassistant] could be filtered here, | ||||
|     # but that is not necessary, since it corresponds to the core integration, | ||||
|     # that is always unconditionally loaded. | ||||
|     domains = {cv.domain_key(key) for key in config} | ||||
|     # Filter out the repeating and common config section [homeassistant] | ||||
|     domains = { | ||||
|         domain for key in config if (domain := cv.domain_key(key)) != core.DOMAIN | ||||
|     } | ||||
|  | ||||
|     # Add config entry and default domains | ||||
|     if not hass.config.recovery_mode: | ||||
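The `_get_domains` hunk above reverts to filtering the `homeassistant` core section out of the config keys with a walrus-guarded set comprehension. A simplified stand-in for `cv.domain_key` (which maps keys like `"sensor 2"` back to `"sensor"`) shows the shape of that filter:

```python
DOMAIN = "homeassistant"

def domain_key(key: str) -> str:
    """Simplified: 'sensor 2' and 'sensor' both map to the 'sensor' domain."""
    return key.partition(" ")[0]

config = {"homeassistant": {}, "sensor 2": [], "sensor": [], "light": []}
domains = {d for key in config if (d := domain_key(key)) != DOMAIN}
# sorted(domains) → ['light', 'sensor']
```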
| @@ -726,28 +725,34 @@ async def _async_resolve_domains_and_preload( | ||||
|       together with all their dependencies. | ||||
|     """ | ||||
|     domains_to_setup = _get_domains(hass, config) | ||||
|  | ||||
|     # Also process all base platforms since we do not require the manifest | ||||
|     # to list them as dependencies. | ||||
|     # We want to later avoid lock contention when multiple integrations try to load | ||||
|     # their manifests at once. | ||||
|     platform_integrations = conf_util.extract_platform_integrations( | ||||
|         config, BASE_PLATFORMS | ||||
|     ) | ||||
|     # Ensure base platforms that have platform integrations are added to `domains`, | ||||
|     # so they can be setup first instead of discovering them later when a config | ||||
|     # entry setup task notices that it's needed and there is already a long line | ||||
|     # to use the import executor. | ||||
|     # | ||||
|     # Additionally process integrations that are defined under base platforms | ||||
|     # to speed things up. | ||||
|     # For example if we have | ||||
|     # sensor: | ||||
|     #   - platform: template | ||||
|     # | ||||
|     # `template` has to be loaded to validate the config for sensor. | ||||
|     # The more platforms under `sensor:`, the longer | ||||
|     # `template` has to be loaded to validate the config for sensor | ||||
|     # so we want to start loading `sensor` as soon as we know | ||||
|     # it will be needed. The more platforms under `sensor:`, the longer | ||||
|     # it will take to finish setup for `sensor` because each of these | ||||
|     # platforms has to be imported before we can validate the config. | ||||
|     # | ||||
|     # Thankfully we are migrating away from the platform pattern | ||||
|     # so this will be less of a problem in the future. | ||||
|     platform_integrations = conf_util.extract_platform_integrations( | ||||
|         config, BASE_PLATFORMS | ||||
|     ) | ||||
|     domains_to_setup.update(platform_integrations) | ||||
|  | ||||
|     # Additionally process base platforms since we do not require the manifest | ||||
|     # to list them as dependencies. | ||||
|     # We want to later avoid lock contention when multiple integrations try to load | ||||
|     # their manifests at once. | ||||
|     # Also process integrations that are defined under base platforms | ||||
|     # to speed things up. | ||||
|     additional_domains_to_process = { | ||||
|         *BASE_PLATFORMS, | ||||
|         *chain.from_iterable(platform_integrations.values()), | ||||
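The comment block in this hunk describes scanning YAML config for integrations nested under base platforms (e.g. `template` under `sensor:`). A minimal sketch of that extraction, assuming the real `conf_util.extract_platform_integrations` behaves similarly but with more validation:

```python
BASE_PLATFORMS = {"sensor", "binary_sensor", "light"}

def extract_platform_integrations(
    config: dict, base_platforms: set[str]
) -> dict[str, set[str]]:
    """Map each base platform to the integrations configured under it."""
    result: dict[str, set[str]] = {}
    for domain, entries in config.items():
        if domain not in base_platforms or not isinstance(entries, list):
            continue
        for entry in entries:
            if isinstance(entry, dict) and (platform := entry.get("platform")):
                result.setdefault(domain, set()).add(platform)
    return result

config = {
    "sensor": [{"platform": "template"}, {"platform": "rest"}],
    "light": [{"platform": "group"}],
    "automation": [],
}
platform_integrations = extract_platform_integrations(config, BASE_PLATFORMS)
```

Pre-seeding `domains_to_setup` with these platforms is what lets `sensor` and `template` start loading before a config-entry setup task discovers the dependency late.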
| @@ -865,9 +870,9 @@ async def _async_set_up_integrations( | ||||
|     domains = set(integrations) & all_domains | ||||
|  | ||||
|     _LOGGER.info( | ||||
|         "Domains to be set up: %s\nDependencies: %s", | ||||
|         domains or "{}", | ||||
|         (all_domains - domains) or "{}", | ||||
|         "Domains to be set up: %s | %s", | ||||
|         domains, | ||||
|         all_domains - domains, | ||||
|     ) | ||||
|  | ||||
|     async_set_domains_to_be_loaded(hass, all_domains) | ||||
| @@ -908,13 +913,12 @@ async def _async_set_up_integrations( | ||||
|         stage_all_domains = stage_domains | stage_dep_domains | ||||
|  | ||||
|         _LOGGER.info( | ||||
|             "Setting up stage %s: %s; already set up: %s\n" | ||||
|             "Dependencies: %s; already set up: %s", | ||||
|             "Setting up stage %s: %s | %s\nDependencies: %s | %s", | ||||
|             name, | ||||
|             stage_domains, | ||||
|             (stage_domains_unfiltered - stage_domains) or "{}", | ||||
|             stage_dep_domains or "{}", | ||||
|             (stage_dep_domains_unfiltered - stage_dep_domains) or "{}", | ||||
|             stage_domains_unfiltered - stage_domains, | ||||
|             stage_dep_domains, | ||||
|             stage_dep_domains_unfiltered - stage_dep_domains, | ||||
|         ) | ||||
|  | ||||
|         if timeout is None: | ||||
|   | ||||
| @@ -1,5 +0,0 @@ | ||||
| { | ||||
|   "domain": "eltako", | ||||
|   "name": "Eltako", | ||||
|   "iot_standards": ["matter"] | ||||
| } | ||||
| @@ -1,5 +0,0 @@ | ||||
| { | ||||
|   "domain": "frient", | ||||
|   "name": "Frient", | ||||
|   "iot_standards": ["zigbee"] | ||||
| } | ||||
| @@ -1,5 +1,5 @@ | ||||
| { | ||||
|   "domain": "fritzbox", | ||||
|   "name": "FRITZ!", | ||||
|   "name": "FRITZ!Box", | ||||
|   "integrations": ["fritz", "fritzbox", "fritzbox_callmonitor"] | ||||
| } | ||||
|   | ||||
| @@ -6,6 +6,7 @@ | ||||
|     "google_assistant_sdk", | ||||
|     "google_cloud", | ||||
|     "google_drive", | ||||
|     "google_gemini", | ||||
|     "google_generative_ai_conversation", | ||||
|     "google_mail", | ||||
|     "google_maps", | ||||
|   | ||||
							

								
								
									
homeassistant/brands/ibm.json (new file, 5 lines)
							| @@ -0,0 +1,5 @@ | ||||
| { | ||||
|   "domain": "ibm", | ||||
|   "name": "IBM", | ||||
|   "integrations": ["watson_iot", "watson_tts"] | ||||
| } | ||||
| @@ -1,5 +0,0 @@ | ||||
| { | ||||
|   "domain": "konnected", | ||||
|   "name": "Konnected", | ||||
|   "integrations": ["konnected", "konnected_esphome"] | ||||
| } | ||||
| @@ -1,5 +0,0 @@ | ||||
| { | ||||
|   "domain": "level", | ||||
|   "name": "Level", | ||||
|   "iot_standards": ["matter"] | ||||
| } | ||||
| @@ -1,5 +1,5 @@ | ||||
| { | ||||
|   "domain": "third_reality", | ||||
|   "name": "Third Reality", | ||||
|   "iot_standards": ["matter", "zigbee"] | ||||
|   "iot_standards": ["zigbee"] | ||||
| } | ||||
|   | ||||
| @@ -1,5 +1,5 @@ | ||||
| { | ||||
|   "domain": "ubiquiti", | ||||
|   "name": "Ubiquiti", | ||||
|   "integrations": ["airos", "unifi", "unifi_direct", "unifiled", "unifiprotect"] | ||||
|   "integrations": ["unifi", "unifi_direct", "unifiled", "unifiprotect"] | ||||
| } | ||||
|   | ||||
| @@ -1,70 +1,70 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", | ||||
|       "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]", | ||||
|       "invalid_mfa_code": "Invalid MFA code" | ||||
|     }, | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "title": "Fill in your Abode login information", | ||||
|         "data": { | ||||
|           "username": "[%key:common::config_flow::data::email%]", | ||||
|           "password": "[%key:common::config_flow::data::password%]" | ||||
|         } | ||||
|       }, | ||||
|       "mfa": { | ||||
|         "title": "Enter your MFA code for Abode", | ||||
|         "data": { | ||||
|           "mfa_code": "MFA code (6-digits)" | ||||
|         }, | ||||
|         "title": "Enter your MFA code for Abode" | ||||
|         } | ||||
|       }, | ||||
|       "reauth_confirm": { | ||||
|         "title": "[%key:component::abode::config::step::user::title%]", | ||||
|         "data": { | ||||
|           "password": "[%key:common::config_flow::data::password%]", | ||||
|           "username": "[%key:common::config_flow::data::email%]" | ||||
|         }, | ||||
|         "title": "[%key:component::abode::config::step::user::title%]" | ||||
|       }, | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "password": "[%key:common::config_flow::data::password%]", | ||||
|           "username": "[%key:common::config_flow::data::email%]" | ||||
|         }, | ||||
|         "title": "Fill in your Abode login information" | ||||
|           "username": "[%key:common::config_flow::data::email%]", | ||||
|           "password": "[%key:common::config_flow::data::password%]" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "error": { | ||||
|       "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]", | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", | ||||
|       "invalid_mfa_code": "Invalid MFA code" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]" | ||||
|     } | ||||
|   }, | ||||
|   "services": { | ||||
|     "capture_image": { | ||||
|       "name": "Capture image", | ||||
|       "description": "Requests a new image capture from a camera device.", | ||||
|       "fields": { | ||||
|         "entity_id": { | ||||
|           "description": "Entity ID of the camera to request an image from.", | ||||
|           "name": "Entity" | ||||
|           "name": "Entity", | ||||
|           "description": "Entity ID of the camera to request an image from." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Capture image" | ||||
|       } | ||||
|     }, | ||||
|     "change_setting": { | ||||
|       "name": "Change setting", | ||||
|       "description": "Changes an Abode system setting.", | ||||
|       "fields": { | ||||
|         "setting": { | ||||
|           "description": "Setting to change.", | ||||
|           "name": "Setting" | ||||
|           "name": "Setting", | ||||
|           "description": "Setting to change." | ||||
|         }, | ||||
|         "value": { | ||||
|           "description": "Value of the setting.", | ||||
|           "name": "Value" | ||||
|           "name": "Value", | ||||
|           "description": "Value of the setting." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Change setting" | ||||
|       } | ||||
|     }, | ||||
|     "trigger_automation": { | ||||
|       "name": "Trigger automation", | ||||
|       "description": "Triggers an Abode automation.", | ||||
|       "fields": { | ||||
|         "entity_id": { | ||||
|           "description": "Entity ID of the automation to trigger.", | ||||
|           "name": "Entity" | ||||
|           "name": "Entity", | ||||
|           "description": "Entity ID of the automation to trigger." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Trigger automation" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -8,17 +8,14 @@ import logging | ||||
| from aioacaia.acaiascale import AcaiaScale | ||||
| from aioacaia.exceptions import AcaiaDeviceNotFound, AcaiaError | ||||
|  | ||||
| from homeassistant.components.bluetooth import async_get_scanner | ||||
| from homeassistant.config_entries import ConfigEntry | ||||
| from homeassistant.const import CONF_ADDRESS | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.helpers.debounce import Debouncer | ||||
| from homeassistant.helpers.update_coordinator import DataUpdateCoordinator | ||||
|  | ||||
| from .const import CONF_IS_NEW_STYLE_SCALE | ||||
|  | ||||
| SCAN_INTERVAL = timedelta(seconds=15) | ||||
| UPDATE_DEBOUNCE_TIME = 0.2 | ||||
|  | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
|  | ||||
| @@ -40,20 +37,11 @@ class AcaiaCoordinator(DataUpdateCoordinator[None]): | ||||
|             config_entry=entry, | ||||
|         ) | ||||
|  | ||||
|         debouncer = Debouncer( | ||||
|             hass=hass, | ||||
|             logger=_LOGGER, | ||||
|             cooldown=UPDATE_DEBOUNCE_TIME, | ||||
|             immediate=True, | ||||
|             function=self.async_update_listeners, | ||||
|         ) | ||||
|  | ||||
|         self._scale = AcaiaScale( | ||||
|             address_or_ble_device=entry.data[CONF_ADDRESS], | ||||
|             name=entry.title, | ||||
|             is_new_style_scale=entry.data[CONF_IS_NEW_STYLE_SCALE], | ||||
|             notify_callback=debouncer.async_schedule_call, | ||||
|             scanner=async_get_scanner(hass), | ||||
|             notify_callback=self.async_update_listeners, | ||||
|         ) | ||||
|  | ||||
|     @property | ||||
|   | ||||
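The acaia hunk removes a `Debouncer` that rate-limited listener updates with a 0.2 s cooldown. The real `homeassistant.helpers.debounce.Debouncer` is async and, with `immediate=True`, also schedules a trailing call; this synchronous sketch only illustrates the immediate-plus-cooldown idea:

```python
import time

class CooldownDebouncer:
    """Run the wrapped function at once, then suppress calls during cooldown."""

    def __init__(self, cooldown: float, function) -> None:
        self._cooldown = cooldown
        self._function = function
        self._last = float("-inf")

    def schedule(self) -> None:
        now = time.monotonic()
        if now - self._last >= self._cooldown:
            self._last = now
            self._function()

calls: list[int] = []
deb = CooldownDebouncer(0.2, lambda: calls.append(1))
deb.schedule()
deb.schedule()  # suppressed: still inside the 0.2 s cooldown
```

Dropping the debouncer means every scale notification now reaches `async_update_listeners` directly.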
| @@ -4,20 +4,20 @@ | ||||
|       "timer_running": { | ||||
|         "default": "mdi:timer", | ||||
|         "state": { | ||||
|           "off": "mdi:timer-off", | ||||
|           "on": "mdi:timer-play" | ||||
|           "on": "mdi:timer-play", | ||||
|           "off": "mdi:timer-off" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "button": { | ||||
|       "tare": { | ||||
|         "default": "mdi:scale-balance" | ||||
|       }, | ||||
|       "reset_timer": { | ||||
|         "default": "mdi:timer-refresh" | ||||
|       }, | ||||
|       "start_stop": { | ||||
|         "default": "mdi:timer-play" | ||||
|       }, | ||||
|       "tare": { | ||||
|         "default": "mdi:scale-balance" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   | ||||
| @@ -26,5 +26,5 @@ | ||||
|   "iot_class": "local_push", | ||||
|   "loggers": ["aioacaia"], | ||||
|   "quality_scale": "platinum", | ||||
|   "requirements": ["aioacaia==0.1.17"] | ||||
|   "requirements": ["aioacaia==0.1.14"] | ||||
| } | ||||
|   | ||||
| @@ -1,5 +1,6 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "flow_title": "{name}", | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", | ||||
|       "no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]", | ||||
| @@ -9,19 +10,18 @@ | ||||
|       "device_not_found": "Device could not be found.", | ||||
|       "unknown": "[%key:common::config_flow::error::unknown%]" | ||||
|     }, | ||||
|     "flow_title": "{name}", | ||||
|     "step": { | ||||
|       "bluetooth_confirm": { | ||||
|         "description": "[%key:component::bluetooth::config::step::bluetooth_confirm::description%]" | ||||
|       }, | ||||
|       "user": { | ||||
|         "description": "[%key:component::bluetooth::config::step::user::description%]", | ||||
|         "data": { | ||||
|           "address": "[%key:common::config_flow::data::device%]" | ||||
|         }, | ||||
|         "data_description": { | ||||
|           "address": "Select Acaia scale you want to set up" | ||||
|         }, | ||||
|         "description": "[%key:component::bluetooth::config::step::user::description%]" | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
| @@ -32,14 +32,14 @@ | ||||
|       } | ||||
|     }, | ||||
|     "button": { | ||||
|       "tare": { | ||||
|         "name": "Tare" | ||||
|       }, | ||||
|       "reset_timer": { | ||||
|         "name": "Reset timer" | ||||
|       }, | ||||
|       "start_stop": { | ||||
|         "name": "Start/stop timer" | ||||
|       }, | ||||
|       "tare": { | ||||
|         "name": "Tare" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   | ||||
| @@ -2,23 +2,21 @@ | ||||
|  | ||||
| from __future__ import annotations | ||||
|  | ||||
| import asyncio | ||||
| import logging | ||||
|  | ||||
| from accuweather import AccuWeather | ||||
|  | ||||
| from homeassistant.components.sensor import DOMAIN as SENSOR_PLATFORM | ||||
| from homeassistant.const import CONF_API_KEY, Platform | ||||
| from homeassistant.const import CONF_API_KEY, CONF_NAME, Platform | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.helpers import entity_registry as er | ||||
| from homeassistant.helpers.aiohttp_client import async_get_clientsession | ||||
|  | ||||
| from .const import DOMAIN | ||||
| from .const import DOMAIN, UPDATE_INTERVAL_DAILY_FORECAST, UPDATE_INTERVAL_OBSERVATION | ||||
| from .coordinator import ( | ||||
|     AccuWeatherConfigEntry, | ||||
|     AccuWeatherDailyForecastDataUpdateCoordinator, | ||||
|     AccuWeatherData, | ||||
|     AccuWeatherHourlyForecastDataUpdateCoordinator, | ||||
|     AccuWeatherObservationDataUpdateCoordinator, | ||||
| ) | ||||
|  | ||||
| @@ -30,6 +28,7 @@ PLATFORMS = [Platform.SENSOR, Platform.WEATHER] | ||||
| async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry) -> bool: | ||||
|     """Set up AccuWeather as config entry.""" | ||||
|     api_key: str = entry.data[CONF_API_KEY] | ||||
|     name: str = entry.data[CONF_NAME] | ||||
|  | ||||
|     location_key = entry.unique_id | ||||
|  | ||||
| @@ -42,28 +41,26 @@ async def async_setup_entry(hass: HomeAssistant, entry: AccuWeatherConfigEntry) | ||||
|         hass, | ||||
|         entry, | ||||
|         accuweather, | ||||
|         name, | ||||
|         "observation", | ||||
|         UPDATE_INTERVAL_OBSERVATION, | ||||
|     ) | ||||
|  | ||||
|     coordinator_daily_forecast = AccuWeatherDailyForecastDataUpdateCoordinator( | ||||
|         hass, | ||||
|         entry, | ||||
|         accuweather, | ||||
|     ) | ||||
|     coordinator_hourly_forecast = AccuWeatherHourlyForecastDataUpdateCoordinator( | ||||
|         hass, | ||||
|         entry, | ||||
|         accuweather, | ||||
|         name, | ||||
|         "daily forecast", | ||||
|         UPDATE_INTERVAL_DAILY_FORECAST, | ||||
|     ) | ||||
|  | ||||
|     await asyncio.gather( | ||||
|         coordinator_observation.async_config_entry_first_refresh(), | ||||
|         coordinator_daily_forecast.async_config_entry_first_refresh(), | ||||
|         coordinator_hourly_forecast.async_config_entry_first_refresh(), | ||||
|     ) | ||||
|     await coordinator_observation.async_config_entry_first_refresh() | ||||
|     await coordinator_daily_forecast.async_config_entry_first_refresh() | ||||
|  | ||||
|     entry.runtime_data = AccuWeatherData( | ||||
|         coordinator_observation=coordinator_observation, | ||||
|         coordinator_daily_forecast=coordinator_daily_forecast, | ||||
|         coordinator_hourly_forecast=coordinator_hourly_forecast, | ||||
|     ) | ||||
|  | ||||
|     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS) | ||||
|   | ||||
| @@ -3,7 +3,6 @@ | ||||
| from __future__ import annotations | ||||
|  | ||||
| from asyncio import timeout | ||||
| from collections.abc import Mapping | ||||
| from typing import Any | ||||
|  | ||||
| from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExceededError | ||||
| @@ -23,8 +22,6 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN): | ||||
|     """Config flow for AccuWeather.""" | ||||
|  | ||||
|     VERSION = 1 | ||||
|     _latitude: float | None = None | ||||
|     _longitude: float | None = None | ||||
|  | ||||
|     async def async_step_user( | ||||
|         self, user_input: dict[str, Any] | None = None | ||||
| @@ -53,7 +50,6 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN): | ||||
|                 await self.async_set_unique_id( | ||||
|                     accuweather.location_key, raise_on_progress=False | ||||
|                 ) | ||||
|                 self._abort_if_unique_id_configured() | ||||
|  | ||||
|                 return self.async_create_entry( | ||||
|                     title=user_input[CONF_NAME], data=user_input | ||||
| @@ -77,46 +73,3 @@ class AccuWeatherFlowHandler(ConfigFlow, domain=DOMAIN): | ||||
|             ), | ||||
|             errors=errors, | ||||
|         ) | ||||
|  | ||||
|     async def async_step_reauth( | ||||
|         self, entry_data: Mapping[str, Any] | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Handle configuration by re-auth.""" | ||||
|         self._latitude = entry_data[CONF_LATITUDE] | ||||
|         self._longitude = entry_data[CONF_LONGITUDE] | ||||
|  | ||||
|         return await self.async_step_reauth_confirm() | ||||
|  | ||||
|     async def async_step_reauth_confirm( | ||||
|         self, user_input: dict[str, Any] | None = None | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Dialog that informs the user that reauth is required.""" | ||||
|         errors: dict[str, str] = {} | ||||
|  | ||||
|         if user_input is not None: | ||||
|             websession = async_get_clientsession(self.hass) | ||||
|             try: | ||||
|                 async with timeout(10): | ||||
|                     accuweather = AccuWeather( | ||||
|                         user_input[CONF_API_KEY], | ||||
|                         websession, | ||||
|                         latitude=self._latitude, | ||||
|                         longitude=self._longitude, | ||||
|                     ) | ||||
|                     await accuweather.async_get_location() | ||||
|             except (ApiError, ClientConnectorError, TimeoutError, ClientError): | ||||
|                 errors["base"] = "cannot_connect" | ||||
|             except InvalidApiKeyError: | ||||
|                 errors["base"] = "invalid_api_key" | ||||
|             except RequestsExceededError: | ||||
|                 errors["base"] = "requests_exceeded" | ||||
|             else: | ||||
|                 return self.async_update_reload_and_abort( | ||||
|                     self._get_reauth_entry(), data_updates=user_input | ||||
|                 ) | ||||
|  | ||||
|         return self.async_show_form( | ||||
|             step_id="reauth_confirm", | ||||
|             data_schema=vol.Schema({vol.Required(CONF_API_KEY): str}), | ||||
|             errors=errors, | ||||
|         ) | ||||
|   | ||||
| @@ -69,6 +69,5 @@ POLLEN_CATEGORY_MAP = { | ||||
|     4: "very_high", | ||||
|     5: "extreme", | ||||
| } | ||||
| UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=10) | ||||
| UPDATE_INTERVAL_OBSERVATION = timedelta(minutes=40) | ||||
| UPDATE_INTERVAL_DAILY_FORECAST = timedelta(hours=6) | ||||
| UPDATE_INTERVAL_HOURLY_FORECAST = timedelta(minutes=30) | ||||
|   | ||||
| @@ -3,7 +3,6 @@ | ||||
| from __future__ import annotations | ||||
|  | ||||
| from asyncio import timeout | ||||
| from collections.abc import Awaitable, Callable | ||||
| from dataclasses import dataclass | ||||
| from datetime import timedelta | ||||
| import logging | ||||
| @@ -13,9 +12,7 @@ from accuweather import AccuWeather, ApiError, InvalidApiKeyError, RequestsExcee | ||||
| from aiohttp.client_exceptions import ClientConnectorError | ||||
|  | ||||
| from homeassistant.config_entries import ConfigEntry | ||||
| from homeassistant.const import CONF_NAME | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.exceptions import ConfigEntryAuthFailed | ||||
| from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo | ||||
| from homeassistant.helpers.update_coordinator import ( | ||||
|     DataUpdateCoordinator, | ||||
| @@ -23,15 +20,9 @@ from homeassistant.helpers.update_coordinator import ( | ||||
|     UpdateFailed, | ||||
| ) | ||||
|  | ||||
| from .const import ( | ||||
|     DOMAIN, | ||||
|     MANUFACTURER, | ||||
|     UPDATE_INTERVAL_DAILY_FORECAST, | ||||
|     UPDATE_INTERVAL_HOURLY_FORECAST, | ||||
|     UPDATE_INTERVAL_OBSERVATION, | ||||
| ) | ||||
| from .const import DOMAIN, MANUFACTURER | ||||
|  | ||||
| EXCEPTIONS = (ApiError, ClientConnectorError, RequestsExceededError) | ||||
| EXCEPTIONS = (ApiError, ClientConnectorError, InvalidApiKeyError, RequestsExceededError) | ||||
|  | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
|  | ||||
| @@ -42,7 +33,6 @@ class AccuWeatherData: | ||||
|  | ||||
|     coordinator_observation: AccuWeatherObservationDataUpdateCoordinator | ||||
|     coordinator_daily_forecast: AccuWeatherDailyForecastDataUpdateCoordinator | ||||
|     coordinator_hourly_forecast: AccuWeatherHourlyForecastDataUpdateCoordinator | ||||
|  | ||||
|  | ||||
| type AccuWeatherConfigEntry = ConfigEntry[AccuWeatherData] | ||||
| @@ -53,18 +43,18 @@ class AccuWeatherObservationDataUpdateCoordinator( | ||||
| ): | ||||
|     """Class to manage fetching AccuWeather data API.""" | ||||
|  | ||||
|     config_entry: AccuWeatherConfigEntry | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         hass: HomeAssistant, | ||||
|         config_entry: AccuWeatherConfigEntry, | ||||
|         accuweather: AccuWeather, | ||||
|         name: str, | ||||
|         coordinator_type: str, | ||||
|         update_interval: timedelta, | ||||
|     ) -> None: | ||||
|         """Initialize.""" | ||||
|         self.accuweather = accuweather | ||||
|         self.location_key = accuweather.location_key | ||||
|         name = config_entry.data[CONF_NAME] | ||||
|  | ||||
|         if TYPE_CHECKING: | ||||
|             assert self.location_key is not None | ||||
| @@ -75,8 +65,8 @@ class AccuWeatherObservationDataUpdateCoordinator( | ||||
|             hass, | ||||
|             _LOGGER, | ||||
|             config_entry=config_entry, | ||||
|             name=f"{name} (observation)", | ||||
|             update_interval=UPDATE_INTERVAL_OBSERVATION, | ||||
|             name=f"{name} ({coordinator_type})", | ||||
|             update_interval=update_interval, | ||||
|         ) | ||||
|  | ||||
|     async def _async_update_data(self) -> dict[str, Any]: | ||||
| @@ -90,39 +80,29 @@ class AccuWeatherObservationDataUpdateCoordinator( | ||||
|                 translation_key="current_conditions_update_error", | ||||
|                 translation_placeholders={"error": repr(error)}, | ||||
|             ) from error | ||||
|         except InvalidApiKeyError as err: | ||||
|             raise ConfigEntryAuthFailed( | ||||
|                 translation_domain=DOMAIN, | ||||
|                 translation_key="auth_error", | ||||
|                 translation_placeholders={"entry": self.config_entry.title}, | ||||
|             ) from err | ||||
|  | ||||
|         _LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining) | ||||
|  | ||||
|         return result | ||||
|  | ||||
|  | ||||
| class AccuWeatherForecastDataUpdateCoordinator( | ||||
| class AccuWeatherDailyForecastDataUpdateCoordinator( | ||||
|     TimestampDataUpdateCoordinator[list[dict[str, Any]]] | ||||
| ): | ||||
|     """Base class for AccuWeather forecast.""" | ||||
|  | ||||
|     config_entry: AccuWeatherConfigEntry | ||||
|     """Class to manage fetching AccuWeather data API.""" | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         hass: HomeAssistant, | ||||
|         config_entry: AccuWeatherConfigEntry, | ||||
|         accuweather: AccuWeather, | ||||
|         name: str, | ||||
|         coordinator_type: str, | ||||
|         update_interval: timedelta, | ||||
|         fetch_method: Callable[..., Awaitable[list[dict[str, Any]]]], | ||||
|     ) -> None: | ||||
|         """Initialize.""" | ||||
|         self.accuweather = accuweather | ||||
|         self.location_key = accuweather.location_key | ||||
|         self._fetch_method = fetch_method | ||||
|         name = config_entry.data[CONF_NAME] | ||||
|  | ||||
|         if TYPE_CHECKING: | ||||
|             assert self.location_key is not None | ||||
| @@ -138,71 +118,24 @@ class AccuWeatherForecastDataUpdateCoordinator( | ||||
|         ) | ||||
|  | ||||
|     async def _async_update_data(self) -> list[dict[str, Any]]: | ||||
|         """Update forecast data via library.""" | ||||
|         """Update data via library.""" | ||||
|         try: | ||||
|             async with timeout(10): | ||||
|                 result = await self._fetch_method(language=self.hass.config.language) | ||||
|                 result = await self.accuweather.async_get_daily_forecast( | ||||
|                     language=self.hass.config.language | ||||
|                 ) | ||||
|         except EXCEPTIONS as error: | ||||
|             raise UpdateFailed( | ||||
|                 translation_domain=DOMAIN, | ||||
|                 translation_key="forecast_update_error", | ||||
|                 translation_placeholders={"error": repr(error)}, | ||||
|             ) from error | ||||
| -        except InvalidApiKeyError as err: | ||||
| -            raise ConfigEntryAuthFailed( | ||||
| -                translation_domain=DOMAIN, | ||||
| -                translation_key="auth_error", | ||||
| -                translation_placeholders={"entry": self.config_entry.title}, | ||||
| -            ) from err | ||||
|  | ||||
|         _LOGGER.debug("Requests remaining: %d", self.accuweather.requests_remaining) | ||||
|  | ||||
|         return result | ||||
|  | ||||
|  | ||||
| -class AccuWeatherDailyForecastDataUpdateCoordinator( | ||||
| -    AccuWeatherForecastDataUpdateCoordinator | ||||
| -): | ||||
| -    """Coordinator for daily forecast.""" | ||||
| - | ||||
| -    def __init__( | ||||
| -        self, | ||||
| -        hass: HomeAssistant, | ||||
| -        config_entry: AccuWeatherConfigEntry, | ||||
| -        accuweather: AccuWeather, | ||||
| -    ) -> None: | ||||
| -        """Initialize.""" | ||||
| -        super().__init__( | ||||
| -            hass, | ||||
| -            config_entry, | ||||
| -            accuweather, | ||||
| -            "daily forecast", | ||||
| -            UPDATE_INTERVAL_DAILY_FORECAST, | ||||
| -            fetch_method=accuweather.async_get_daily_forecast, | ||||
| -        ) | ||||
| - | ||||
| - | ||||
| -class AccuWeatherHourlyForecastDataUpdateCoordinator( | ||||
| -    AccuWeatherForecastDataUpdateCoordinator | ||||
| -): | ||||
| -    """Coordinator for hourly forecast.""" | ||||
| - | ||||
| -    def __init__( | ||||
| -        self, | ||||
| -        hass: HomeAssistant, | ||||
| -        config_entry: AccuWeatherConfigEntry, | ||||
| -        accuweather: AccuWeather, | ||||
| -    ) -> None: | ||||
| -        """Initialize.""" | ||||
| -        super().__init__( | ||||
| -            hass, | ||||
| -            config_entry, | ||||
| -            accuweather, | ||||
| -            "hourly forecast", | ||||
| -            UPDATE_INTERVAL_HOURLY_FORECAST, | ||||
| -            fetch_method=accuweather.async_get_hourly_forecast, | ||||
| -        ) | ||||
|  | ||||
|  | ||||
| def _get_device_info(location_key: str, name: str) -> DeviceInfo: | ||||
|     """Get device info.""" | ||||
|     return DeviceInfo( | ||||
|   | ||||
| @@ -1,9 +1,6 @@ | ||||
| { | ||||
|   "entity": { | ||||
|     "sensor": { | ||||
| -      "air_quality": { | ||||
| -        "default": "mdi:air-filter" | ||||
| -      }, | ||||
|       "cloud_ceiling": { | ||||
|         "default": "mdi:weather-fog" | ||||
|       }, | ||||
| @@ -37,6 +34,9 @@ | ||||
|       "thunderstorm_probability_night": { | ||||
|         "default": "mdi:weather-lightning" | ||||
|       }, | ||||
| +      "translation_key": { | ||||
| +        "default": "mdi:air-filter" | ||||
| +      }, | ||||
|       "tree_pollen": { | ||||
|         "default": "mdi:tree-outline" | ||||
|       }, | ||||
|   | ||||
| @@ -7,5 +7,6 @@ | ||||
|   "integration_type": "service", | ||||
|   "iot_class": "cloud_polling", | ||||
|   "loggers": ["accuweather"], | ||||
| -  "requirements": ["accuweather==4.2.2"] | ||||
| +  "requirements": ["accuweather==4.2.0"], | ||||
| +  "single_config_entry": true | ||||
| } | ||||
|   | ||||
| @@ -1,8 +1,14 @@ | ||||
| { | ||||
|   "config": { | ||||
| -    "abort": { | ||||
| -      "already_configured": "[%key:common::config_flow::abort::already_configured_location%]", | ||||
| -      "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]" | ||||
| +    "step": { | ||||
| +      "user": { | ||||
| +        "data": { | ||||
| +          "name": "[%key:common::config_flow::data::name%]", | ||||
| +          "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
| +          "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
| +          "longitude": "[%key:common::config_flow::data::longitude%]" | ||||
| +        } | ||||
| +      } | ||||
|     }, | ||||
|     "create_entry": { | ||||
|       "default": "Some sensors are not enabled by default. You can enable them in the entity registry after the integration configuration." | ||||
| @@ -11,27 +17,6 @@ | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", | ||||
|       "invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]", | ||||
|       "requests_exceeded": "The allowed number of requests to the AccuWeather API has been exceeded. You have to wait or change the API key." | ||||
| -    }, | ||||
| -    "step": { | ||||
| -      "reauth_confirm": { | ||||
| -        "data": { | ||||
| -          "api_key": "[%key:common::config_flow::data::api_key%]" | ||||
| -        }, | ||||
| -        "data_description": { | ||||
| -          "api_key": "[%key:component::accuweather::config::step::user::data_description::api_key%]" | ||||
| -        } | ||||
| -      }, | ||||
| -      "user": { | ||||
| -        "data": { | ||||
| -          "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
| -          "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
| -          "longitude": "[%key:common::config_flow::data::longitude%]", | ||||
| -          "name": "[%key:common::config_flow::data::name%]" | ||||
| -        }, | ||||
| -        "data_description": { | ||||
| -          "api_key": "API key generated in the AccuWeather APIs portal." | ||||
| -        } | ||||
| -      } | ||||
| +    } | ||||
|   }, | ||||
|   "entity": { | ||||
| @@ -120,9 +105,9 @@ | ||||
|       "pressure_tendency": { | ||||
|         "name": "Pressure tendency", | ||||
|         "state": { | ||||
| -          "falling": "Falling", | ||||
| -          "steady": "Steady", | ||||
|           "rising": "Rising", | ||||
| +          "steady": "Steady", | ||||
| +          "falling": "Falling" | ||||
|         }, | ||||
|         "state_attributes": { | ||||
|           "options": { | ||||
| @@ -227,6 +212,9 @@ | ||||
|       "wet_bulb_temperature": { | ||||
|         "name": "Wet bulb temperature" | ||||
|       }, | ||||
| +      "wind_speed": { | ||||
| +        "name": "[%key:component::weather::entity_component::_::state_attributes::wind_speed::name%]" | ||||
| +      }, | ||||
|       "wind_chill_temperature": { | ||||
|         "name": "Wind chill temperature" | ||||
|       }, | ||||
| @@ -239,9 +227,6 @@ | ||||
|       "wind_gust_speed_night": { | ||||
|         "name": "Wind gust speed night {forecast_day}" | ||||
|       }, | ||||
| -      "wind_speed": { | ||||
| -        "name": "[%key:component::weather::entity_component::_::state_attributes::wind_speed::name%]" | ||||
| -      }, | ||||
|       "wind_speed_day": { | ||||
|         "name": "Wind speed day {forecast_day}" | ||||
|       }, | ||||
| @@ -251,9 +236,6 @@ | ||||
|     } | ||||
|   }, | ||||
|   "exceptions": { | ||||
| -    "auth_error": { | ||||
| -      "message": "Authentication failed for {entry}, please update your API key" | ||||
| -    }, | ||||
|     "current_conditions_update_error": { | ||||
|       "message": "An error occurred while retrieving weather current conditions data from the AccuWeather API: {error}" | ||||
|     }, | ||||
|   | ||||
| @@ -45,7 +45,6 @@ from .coordinator import ( | ||||
|     AccuWeatherConfigEntry, | ||||
|     AccuWeatherDailyForecastDataUpdateCoordinator, | ||||
|     AccuWeatherData, | ||||
| -    AccuWeatherHourlyForecastDataUpdateCoordinator, | ||||
|     AccuWeatherObservationDataUpdateCoordinator, | ||||
| ) | ||||
|  | ||||
| @@ -65,7 +64,6 @@ class AccuWeatherEntity( | ||||
|     CoordinatorWeatherEntity[ | ||||
|         AccuWeatherObservationDataUpdateCoordinator, | ||||
|         AccuWeatherDailyForecastDataUpdateCoordinator, | ||||
| -        AccuWeatherHourlyForecastDataUpdateCoordinator, | ||||
|     ] | ||||
| ): | ||||
|     """Define an AccuWeather entity.""" | ||||
| @@ -78,7 +76,6 @@ class AccuWeatherEntity( | ||||
|         super().__init__( | ||||
|             observation_coordinator=accuweather_data.coordinator_observation, | ||||
|             daily_coordinator=accuweather_data.coordinator_daily_forecast, | ||||
| -            hourly_coordinator=accuweather_data.coordinator_hourly_forecast, | ||||
|         ) | ||||
|  | ||||
|         self._attr_native_precipitation_unit = UnitOfPrecipitationDepth.MILLIMETERS | ||||
| @@ -89,13 +86,10 @@ class AccuWeatherEntity( | ||||
|         self._attr_unique_id = accuweather_data.coordinator_observation.location_key | ||||
|         self._attr_attribution = ATTRIBUTION | ||||
|         self._attr_device_info = accuweather_data.coordinator_observation.device_info | ||||
| -        self._attr_supported_features = ( | ||||
| -            WeatherEntityFeature.FORECAST_DAILY | WeatherEntityFeature.FORECAST_HOURLY | ||||
| -        ) | ||||
| +        self._attr_supported_features = WeatherEntityFeature.FORECAST_DAILY | ||||
|  | ||||
|         self.observation_coordinator = accuweather_data.coordinator_observation | ||||
|         self.daily_coordinator = accuweather_data.coordinator_daily_forecast | ||||
| -        self.hourly_coordinator = accuweather_data.coordinator_hourly_forecast | ||||
|  | ||||
|     @property | ||||
|     def condition(self) -> str | None: | ||||
| @@ -213,32 +207,3 @@ class AccuWeatherEntity( | ||||
|             } | ||||
|             for item in self.daily_coordinator.data | ||||
|         ] | ||||
| - | ||||
| -    @callback | ||||
| -    def _async_forecast_hourly(self) -> list[Forecast] | None: | ||||
| -        """Return the hourly forecast in native units.""" | ||||
| -        return [ | ||||
| -            { | ||||
| -                ATTR_FORECAST_TIME: utc_from_timestamp( | ||||
| -                    item["EpochDateTime"] | ||||
| -                ).isoformat(), | ||||
| -                ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCover"], | ||||
| -                ATTR_FORECAST_HUMIDITY: item["RelativeHumidity"], | ||||
| -                ATTR_FORECAST_NATIVE_TEMP: item["Temperature"][ATTR_VALUE], | ||||
| -                ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperature"][ | ||||
| -                    ATTR_VALUE | ||||
| -                ], | ||||
| -                ATTR_FORECAST_NATIVE_PRECIPITATION: item["TotalLiquid"][ATTR_VALUE], | ||||
| -                ATTR_FORECAST_PRECIPITATION_PROBABILITY: item[ | ||||
| -                    "PrecipitationProbability" | ||||
| -                ], | ||||
| -                ATTR_FORECAST_NATIVE_WIND_SPEED: item["Wind"][ATTR_SPEED][ATTR_VALUE], | ||||
| -                ATTR_FORECAST_NATIVE_WIND_GUST_SPEED: item["WindGust"][ATTR_SPEED][ | ||||
| -                    ATTR_VALUE | ||||
| -                ], | ||||
| -                ATTR_FORECAST_UV_INDEX: item["UVIndex"], | ||||
| -                ATTR_FORECAST_WIND_BEARING: item["Wind"][ATTR_DIRECTION]["Degrees"], | ||||
| -                ATTR_FORECAST_CONDITION: CONDITION_MAP.get(item["WeatherIcon"]), | ||||
| -            } | ||||
| -            for item in self.hourly_coordinator.data | ||||
| -        ] | ||||
|   | ||||
| @@ -1,15 +1,15 @@ | ||||
| { | ||||
|   "config": { | ||||
| +    "abort": { | ||||
| +      "no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]" | ||||
| +    }, | ||||
|     "step": { | ||||
|       "user": { | ||||
| +        "title": "Pick a hub to add", | ||||
|         "data": { | ||||
|           "id": "Host ID" | ||||
| -        }, | ||||
| -        "title": "Pick a hub to add" | ||||
| +        } | ||||
|       } | ||||
|     }, | ||||
| -    "abort": { | ||||
| -      "no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]" | ||||
| -    } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,57 +0,0 @@ | ||||
| """The Actron Air integration.""" | ||||
|  | ||||
| from actron_neo_api import ( | ||||
|     ActronAirNeoACSystem, | ||||
|     ActronNeoAPI, | ||||
|     ActronNeoAPIError, | ||||
|     ActronNeoAuthError, | ||||
| ) | ||||
|  | ||||
| from homeassistant.const import CONF_API_TOKEN, Platform | ||||
| from homeassistant.core import HomeAssistant | ||||
|  | ||||
| from .const import _LOGGER | ||||
| from .coordinator import ( | ||||
|     ActronAirConfigEntry, | ||||
|     ActronAirRuntimeData, | ||||
|     ActronAirSystemCoordinator, | ||||
| ) | ||||
|  | ||||
| PLATFORM = [Platform.CLIMATE] | ||||
|  | ||||
|  | ||||
| async def async_setup_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool: | ||||
|     """Set up Actron Air integration from a config entry.""" | ||||
|  | ||||
|     api = ActronNeoAPI(refresh_token=entry.data[CONF_API_TOKEN]) | ||||
|     systems: list[ActronAirNeoACSystem] = [] | ||||
|  | ||||
|     try: | ||||
|         systems = await api.get_ac_systems() | ||||
|         await api.update_status() | ||||
|     except ActronNeoAuthError: | ||||
|         _LOGGER.error("Authentication error while setting up Actron Air integration") | ||||
|         raise | ||||
|     except ActronNeoAPIError as err: | ||||
|         _LOGGER.error("API error while setting up Actron Air integration: %s", err) | ||||
|         raise | ||||
|  | ||||
|     system_coordinators: dict[str, ActronAirSystemCoordinator] = {} | ||||
|     for system in systems: | ||||
|         coordinator = ActronAirSystemCoordinator(hass, entry, api, system) | ||||
|         _LOGGER.debug("Setting up coordinator for system: %s", system["serial"]) | ||||
|         await coordinator.async_config_entry_first_refresh() | ||||
|         system_coordinators[system["serial"]] = coordinator | ||||
|  | ||||
|     entry.runtime_data = ActronAirRuntimeData( | ||||
|         api=api, | ||||
|         system_coordinators=system_coordinators, | ||||
|     ) | ||||
|  | ||||
|     await hass.config_entries.async_forward_entry_setups(entry, PLATFORM) | ||||
|     return True | ||||
|  | ||||
|  | ||||
| async def async_unload_entry(hass: HomeAssistant, entry: ActronAirConfigEntry) -> bool: | ||||
|     """Unload a config entry.""" | ||||
|     return await hass.config_entries.async_unload_platforms(entry, PLATFORM) | ||||
| @@ -1,259 +0,0 @@ | ||||
| """Climate platform for Actron Air integration.""" | ||||
|  | ||||
| from typing import Any | ||||
|  | ||||
| from actron_neo_api import ActronAirNeoStatus, ActronAirNeoZone | ||||
|  | ||||
| from homeassistant.components.climate import ( | ||||
|     FAN_AUTO, | ||||
|     FAN_HIGH, | ||||
|     FAN_LOW, | ||||
|     FAN_MEDIUM, | ||||
|     ClimateEntity, | ||||
|     ClimateEntityFeature, | ||||
|     HVACMode, | ||||
| ) | ||||
| from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.helpers.device_registry import DeviceInfo | ||||
| from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback | ||||
| from homeassistant.helpers.update_coordinator import CoordinatorEntity | ||||
|  | ||||
| from .const import DOMAIN | ||||
| from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator | ||||
|  | ||||
| PARALLEL_UPDATES = 0 | ||||
|  | ||||
| FAN_MODE_MAPPING_ACTRONAIR_TO_HA = { | ||||
|     "AUTO": FAN_AUTO, | ||||
|     "LOW": FAN_LOW, | ||||
|     "MED": FAN_MEDIUM, | ||||
|     "HIGH": FAN_HIGH, | ||||
| } | ||||
| FAN_MODE_MAPPING_HA_TO_ACTRONAIR = { | ||||
|     v: k for k, v in FAN_MODE_MAPPING_ACTRONAIR_TO_HA.items() | ||||
| } | ||||
| HVAC_MODE_MAPPING_ACTRONAIR_TO_HA = { | ||||
|     "COOL": HVACMode.COOL, | ||||
|     "HEAT": HVACMode.HEAT, | ||||
|     "FAN": HVACMode.FAN_ONLY, | ||||
|     "AUTO": HVACMode.AUTO, | ||||
|     "OFF": HVACMode.OFF, | ||||
| } | ||||
| HVAC_MODE_MAPPING_HA_TO_ACTRONAIR = { | ||||
|     v: k for k, v in HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.items() | ||||
| } | ||||
|  | ||||
|  | ||||
| async def async_setup_entry( | ||||
|     hass: HomeAssistant, | ||||
|     entry: ActronAirConfigEntry, | ||||
|     async_add_entities: AddConfigEntryEntitiesCallback, | ||||
| ) -> None: | ||||
|     """Set up Actron Air climate entities.""" | ||||
|     system_coordinators = entry.runtime_data.system_coordinators | ||||
|     entities: list[ClimateEntity] = [] | ||||
|  | ||||
|     for coordinator in system_coordinators.values(): | ||||
|         status = coordinator.data | ||||
|         name = status.ac_system.system_name | ||||
|         entities.append(ActronSystemClimate(coordinator, name)) | ||||
|  | ||||
|         entities.extend( | ||||
|             ActronZoneClimate(coordinator, zone) | ||||
|             for zone in status.remote_zone_info | ||||
|             if zone.exists | ||||
|         ) | ||||
|  | ||||
|     async_add_entities(entities) | ||||
|  | ||||
|  | ||||
| class BaseClimateEntity(CoordinatorEntity[ActronAirSystemCoordinator], ClimateEntity): | ||||
|     """Base class for Actron Air climate entities.""" | ||||
|  | ||||
|     _attr_has_entity_name = True | ||||
|     _attr_temperature_unit = UnitOfTemperature.CELSIUS | ||||
|     _attr_supported_features = ( | ||||
|         ClimateEntityFeature.TARGET_TEMPERATURE | ||||
|         | ClimateEntityFeature.FAN_MODE | ||||
|         | ClimateEntityFeature.TURN_ON | ||||
|         | ClimateEntityFeature.TURN_OFF | ||||
|     ) | ||||
|     _attr_name = None | ||||
|     _attr_fan_modes = list(FAN_MODE_MAPPING_ACTRONAIR_TO_HA.values()) | ||||
|     _attr_hvac_modes = list(HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.values()) | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         coordinator: ActronAirSystemCoordinator, | ||||
|         name: str, | ||||
|     ) -> None: | ||||
|         """Initialize an Actron Air unit.""" | ||||
|         super().__init__(coordinator) | ||||
|         self._serial_number = coordinator.serial_number | ||||
|  | ||||
|  | ||||
| class ActronSystemClimate(BaseClimateEntity): | ||||
|     """Representation of the Actron Air system.""" | ||||
|  | ||||
|     _attr_supported_features = ( | ||||
|         ClimateEntityFeature.TARGET_TEMPERATURE | ||||
|         | ClimateEntityFeature.FAN_MODE | ||||
|         | ClimateEntityFeature.TURN_ON | ||||
|         | ClimateEntityFeature.TURN_OFF | ||||
|     ) | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         coordinator: ActronAirSystemCoordinator, | ||||
|         name: str, | ||||
|     ) -> None: | ||||
|         """Initialize an Actron Air unit.""" | ||||
|         super().__init__(coordinator, name) | ||||
|         serial_number = coordinator.serial_number | ||||
|         self._attr_unique_id = serial_number | ||||
|         self._attr_device_info = DeviceInfo( | ||||
|             identifiers={(DOMAIN, serial_number)}, | ||||
|             name=self._status.ac_system.system_name, | ||||
|             manufacturer="Actron Air", | ||||
|             model_id=self._status.ac_system.master_wc_model, | ||||
|             sw_version=self._status.ac_system.master_wc_firmware_version, | ||||
|             serial_number=serial_number, | ||||
|         ) | ||||
|  | ||||
|     @property | ||||
|     def min_temp(self) -> float: | ||||
|         """Return the minimum temperature that can be set.""" | ||||
|         return self._status.min_temp | ||||
|  | ||||
|     @property | ||||
|     def max_temp(self) -> float: | ||||
|         """Return the maximum temperature that can be set.""" | ||||
|         return self._status.max_temp | ||||
|  | ||||
|     @property | ||||
|     def _status(self) -> ActronAirNeoStatus: | ||||
|         """Get the current status from the coordinator.""" | ||||
|         return self.coordinator.data | ||||
|  | ||||
|     @property | ||||
|     def hvac_mode(self) -> HVACMode | None: | ||||
|         """Return the current HVAC mode.""" | ||||
|         if not self._status.user_aircon_settings.is_on: | ||||
|             return HVACMode.OFF | ||||
|  | ||||
|         mode = self._status.user_aircon_settings.mode | ||||
|         return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode) | ||||
|  | ||||
|     @property | ||||
|     def fan_mode(self) -> str | None: | ||||
|         """Return the current fan mode.""" | ||||
|         fan_mode = self._status.user_aircon_settings.fan_mode | ||||
|         return FAN_MODE_MAPPING_ACTRONAIR_TO_HA.get(fan_mode) | ||||
|  | ||||
|     @property | ||||
|     def current_humidity(self) -> float: | ||||
|         """Return the current humidity.""" | ||||
|         return self._status.master_info.live_humidity_pc | ||||
|  | ||||
|     @property | ||||
|     def current_temperature(self) -> float: | ||||
|         """Return the current temperature.""" | ||||
|         return self._status.master_info.live_temp_c | ||||
|  | ||||
|     @property | ||||
|     def target_temperature(self) -> float: | ||||
|         """Return the target temperature.""" | ||||
|         return self._status.user_aircon_settings.temperature_setpoint_cool_c | ||||
|  | ||||
|     async def async_set_fan_mode(self, fan_mode: str) -> None: | ||||
|         """Set a new fan mode.""" | ||||
|         api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode.lower()) | ||||
|         await self._status.user_aircon_settings.set_fan_mode(api_fan_mode) | ||||
|  | ||||
|     async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None: | ||||
|         """Set the HVAC mode.""" | ||||
|         ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode) | ||||
|         await self._status.ac_system.set_system_mode(ac_mode) | ||||
|  | ||||
|     async def async_set_temperature(self, **kwargs: Any) -> None: | ||||
|         """Set the temperature.""" | ||||
|         temp = kwargs.get(ATTR_TEMPERATURE) | ||||
|         await self._status.user_aircon_settings.set_temperature(temperature=temp) | ||||
|  | ||||
|  | ||||
| class ActronZoneClimate(BaseClimateEntity): | ||||
|     """Representation of a zone within the Actron Air system.""" | ||||
|  | ||||
|     _attr_supported_features = ( | ||||
|         ClimateEntityFeature.TARGET_TEMPERATURE | ||||
|         | ClimateEntityFeature.TURN_ON | ||||
|         | ClimateEntityFeature.TURN_OFF | ||||
|     ) | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         coordinator: ActronAirSystemCoordinator, | ||||
|         zone: ActronAirNeoZone, | ||||
|     ) -> None: | ||||
|         """Initialize an Actron Air unit.""" | ||||
|         super().__init__(coordinator, zone.title) | ||||
|         serial_number = coordinator.serial_number | ||||
|         self._zone_id: int = zone.zone_id | ||||
|         self._attr_unique_id: str = f"{serial_number}_zone_{zone.zone_id}" | ||||
|         self._attr_device_info: DeviceInfo = DeviceInfo( | ||||
|             identifiers={(DOMAIN, self._attr_unique_id)}, | ||||
|             name=zone.title, | ||||
|             manufacturer="Actron Air", | ||||
|             model="Zone", | ||||
|             suggested_area=zone.title, | ||||
|             via_device=(DOMAIN, serial_number), | ||||
|         ) | ||||
|  | ||||
|     @property | ||||
|     def min_temp(self) -> float: | ||||
|         """Return the minimum temperature that can be set.""" | ||||
|         return self._zone.min_temp | ||||
|  | ||||
|     @property | ||||
|     def max_temp(self) -> float: | ||||
|         """Return the maximum temperature that can be set.""" | ||||
|         return self._zone.max_temp | ||||
|  | ||||
|     @property | ||||
|     def _zone(self) -> ActronAirNeoZone: | ||||
|         """Get the current zone data from the coordinator.""" | ||||
|         status = self.coordinator.data | ||||
|         return status.zones[self._zone_id] | ||||
|  | ||||
|     @property | ||||
|     def hvac_mode(self) -> HVACMode | None: | ||||
|         """Return the current HVAC mode.""" | ||||
|         if self._zone.is_active: | ||||
|             mode = self._zone.hvac_mode | ||||
|             return HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.get(mode) | ||||
|         return HVACMode.OFF | ||||
|  | ||||
|     @property | ||||
|     def current_humidity(self) -> float | None: | ||||
|         """Return the current humidity.""" | ||||
|         return self._zone.humidity | ||||
|  | ||||
|     @property | ||||
|     def current_temperature(self) -> float | None: | ||||
|         """Return the current temperature.""" | ||||
|         return self._zone.live_temp_c | ||||
|  | ||||
|     @property | ||||
|     def target_temperature(self) -> float | None: | ||||
|         """Return the target temperature.""" | ||||
|         return self._zone.temperature_setpoint_cool_c | ||||
|  | ||||
|     async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None: | ||||
|         """Set the HVAC mode.""" | ||||
|         is_enabled = hvac_mode != HVACMode.OFF | ||||
|         await self._zone.enable(is_enabled) | ||||
|  | ||||
|     async def async_set_temperature(self, **kwargs: Any) -> None: | ||||
|         """Set the temperature.""" | ||||
|         await self._zone.set_temperature(temperature=kwargs["temperature"]) | ||||
| @@ -1,132 +0,0 @@ | ||||
| """Setup config flow for Actron Air integration.""" | ||||
|  | ||||
| import asyncio | ||||
| from typing import Any | ||||
|  | ||||
| from actron_neo_api import ActronNeoAPI, ActronNeoAuthError | ||||
|  | ||||
| from homeassistant.config_entries import ConfigFlow, ConfigFlowResult | ||||
| from homeassistant.const import CONF_API_TOKEN | ||||
| from homeassistant.exceptions import HomeAssistantError | ||||
|  | ||||
| from .const import _LOGGER, DOMAIN | ||||
|  | ||||
|  | ||||
| class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN): | ||||
|     """Handle a config flow for Actron Air.""" | ||||
|  | ||||
|     def __init__(self) -> None: | ||||
|         """Initialize the config flow.""" | ||||
|         self._api: ActronNeoAPI | None = None | ||||
|         self._device_code: str | None = None | ||||
|         self._user_code: str = "" | ||||
|         self._verification_uri: str = "" | ||||
|         self._expires_minutes: str = "30" | ||||
|         self.login_task: asyncio.Task | None = None | ||||
|  | ||||
|     async def async_step_user( | ||||
|         self, user_input: dict[str, Any] | None = None | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Handle the initial step.""" | ||||
|         if self._api is None: | ||||
|             _LOGGER.debug("Initiating device authorization") | ||||
|             self._api = ActronNeoAPI() | ||||
|             try: | ||||
|                 device_code_response = await self._api.request_device_code() | ||||
|             except ActronNeoAuthError as err: | ||||
|                 _LOGGER.error("OAuth2 flow failed: %s", err) | ||||
|                 return self.async_abort(reason="oauth2_error") | ||||
|  | ||||
|             self._device_code = device_code_response["device_code"] | ||||
|             self._user_code = device_code_response["user_code"] | ||||
|             self._verification_uri = device_code_response["verification_uri_complete"] | ||||
|             self._expires_minutes = str(device_code_response["expires_in"] // 60) | ||||
|  | ||||
|         async def _wait_for_authorization() -> None: | ||||
|             """Wait for the user to authorize the device.""" | ||||
|             assert self._api is not None | ||||
|             assert self._device_code is not None | ||||
|             _LOGGER.debug("Waiting for device authorization") | ||||
|             try: | ||||
|                 await self._api.poll_for_token(self._device_code) | ||||
|                 _LOGGER.debug("Authorization successful") | ||||
|             except ActronNeoAuthError as ex: | ||||
|                 _LOGGER.exception("Error while waiting for device authorization") | ||||
|                 raise CannotConnect from ex | ||||
|  | ||||
|         _LOGGER.debug("Checking login task") | ||||
|         if self.login_task is None: | ||||
|             _LOGGER.debug("Creating task for device authorization") | ||||
|             self.login_task = self.hass.async_create_task(_wait_for_authorization()) | ||||
|  | ||||
|         if self.login_task.done(): | ||||
|             _LOGGER.debug("Login task is done, checking results") | ||||
|             if exception := self.login_task.exception(): | ||||
|                 if isinstance(exception, CannotConnect): | ||||
|                     return self.async_show_progress_done( | ||||
|                         next_step_id="connection_error" | ||||
|                     ) | ||||
|                 return self.async_show_progress_done(next_step_id="timeout") | ||||
|             return self.async_show_progress_done(next_step_id="finish_login") | ||||
|  | ||||
|         return self.async_show_progress( | ||||
|             step_id="user", | ||||
|             progress_action="wait_for_authorization", | ||||
|             description_placeholders={ | ||||
|                 "user_code": self._user_code, | ||||
|                 "verification_uri": self._verification_uri, | ||||
|                 "expires_minutes": self._expires_minutes, | ||||
|             }, | ||||
|             progress_task=self.login_task, | ||||
|         ) | ||||
|  | ||||
|     async def async_step_finish_login( | ||||
|         self, user_input: dict[str, Any] | None = None | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Handle the finalization of login.""" | ||||
|         _LOGGER.debug("Finalizing authorization") | ||||
|         assert self._api is not None | ||||
|  | ||||
|         try: | ||||
|             user_data = await self._api.get_user_info() | ||||
|         except ActronNeoAuthError as err: | ||||
|             _LOGGER.error("Error getting user info: %s", err) | ||||
|             return self.async_abort(reason="oauth2_error") | ||||
|  | ||||
|         unique_id = str(user_data["id"]) | ||||
|         await self.async_set_unique_id(unique_id) | ||||
|         self._abort_if_unique_id_configured() | ||||
|  | ||||
|         return self.async_create_entry( | ||||
|             title=user_data["email"], | ||||
|             data={CONF_API_TOKEN: self._api.refresh_token_value}, | ||||
|         ) | ||||
|  | ||||
|     async def async_step_timeout( | ||||
|         self, | ||||
|         user_input: dict[str, Any] | None = None, | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Handle issues that need to transition away from the progress step.""" | ||||
|         if user_input is None: | ||||
|             return self.async_show_form( | ||||
|                 step_id="timeout", | ||||
|             ) | ||||
|         del self.login_task | ||||
|         return await self.async_step_user() | ||||
|  | ||||
|     async def async_step_connection_error( | ||||
|         self, user_input: dict[str, Any] | None = None | ||||
|     ) -> ConfigFlowResult: | ||||
|         """Handle connection error from progress step.""" | ||||
|         if user_input is None: | ||||
|             return self.async_show_form(step_id="connection_error") | ||||
|  | ||||
|         # Reset state and try again | ||||
|         self._api = None | ||||
|         self._device_code = None | ||||
|         self.login_task = None | ||||
|         return await self.async_step_user() | ||||
|  | ||||
|  | ||||
| class CannotConnect(HomeAssistantError): | ||||
|     """Error to indicate we cannot connect.""" | ||||
| @@ -1,6 +0,0 @@ | ||||
| """Constants used by Actron Air integration.""" | ||||
|  | ||||
| import logging | ||||
|  | ||||
| _LOGGER = logging.getLogger(__package__) | ||||
| DOMAIN = "actron_air" | ||||
| @@ -1,69 +0,0 @@ | ||||
| """Coordinator for Actron Air integration.""" | ||||
|  | ||||
| from __future__ import annotations | ||||
|  | ||||
| from dataclasses import dataclass | ||||
| from datetime import timedelta | ||||
|  | ||||
| from actron_neo_api import ActronAirNeoACSystem, ActronAirNeoStatus, ActronNeoAPI | ||||
|  | ||||
| from homeassistant.config_entries import ConfigEntry | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.helpers.update_coordinator import DataUpdateCoordinator | ||||
| from homeassistant.util import dt as dt_util | ||||
|  | ||||
| from .const import _LOGGER | ||||
|  | ||||
| STALE_DEVICE_TIMEOUT = timedelta(hours=24) | ||||
| ERROR_NO_SYSTEMS_FOUND = "no_systems_found" | ||||
| ERROR_UNKNOWN = "unknown_error" | ||||
|  | ||||
|  | ||||
| @dataclass | ||||
| class ActronAirRuntimeData: | ||||
|     """Runtime data for the Actron Air integration.""" | ||||
|  | ||||
|     api: ActronNeoAPI | ||||
|     system_coordinators: dict[str, ActronAirSystemCoordinator] | ||||
|  | ||||
|  | ||||
| type ActronAirConfigEntry = ConfigEntry[ActronAirRuntimeData] | ||||
|  | ||||
| AUTH_ERROR_THRESHOLD = 3 | ||||
| SCAN_INTERVAL = timedelta(seconds=30) | ||||
|  | ||||
|  | ||||
| class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirNeoACSystem]): | ||||
|     """System coordinator for Actron Air integration.""" | ||||
|  | ||||
|     def __init__( | ||||
|         self, | ||||
|         hass: HomeAssistant, | ||||
|         entry: ActronAirConfigEntry, | ||||
|         api: ActronNeoAPI, | ||||
|         system: ActronAirNeoACSystem, | ||||
|     ) -> None: | ||||
|         """Initialize the coordinator.""" | ||||
|         super().__init__( | ||||
|             hass, | ||||
|             _LOGGER, | ||||
|             name="Actron Air Status", | ||||
|             update_interval=SCAN_INTERVAL, | ||||
|             config_entry=entry, | ||||
|         ) | ||||
|         self.system = system | ||||
|         self.serial_number = system["serial"] | ||||
|         self.api = api | ||||
|         self.status = self.api.state_manager.get_status(self.serial_number) | ||||
|         self.last_seen = dt_util.utcnow() | ||||
|  | ||||
|     async def _async_update_data(self) -> ActronAirNeoStatus: | ||||
|         """Fetch updates and merge incremental changes into the full state.""" | ||||
|         await self.api.update_status() | ||||
|         self.status = self.api.state_manager.get_status(self.serial_number) | ||||
|         self.last_seen = dt_util.utcnow() | ||||
|         return self.status | ||||
|  | ||||
|     def is_device_stale(self) -> bool: | ||||
|         """Check if a device is stale (not seen for a while).""" | ||||
|         return (dt_util.utcnow() - self.last_seen) > STALE_DEVICE_TIMEOUT | ||||
| @@ -1,16 +0,0 @@ | ||||
| { | ||||
|   "domain": "actron_air", | ||||
|   "name": "Actron Air", | ||||
|   "codeowners": ["@kclif9", "@JagadishDhanamjayam"], | ||||
|   "config_flow": true, | ||||
|   "dhcp": [ | ||||
|     { | ||||
|       "hostname": "neo-*", | ||||
|       "macaddress": "FC0FE7*" | ||||
|     } | ||||
|   ], | ||||
|   "documentation": "https://www.home-assistant.io/integrations/actron_air", | ||||
|   "iot_class": "cloud_polling", | ||||
|   "quality_scale": "bronze", | ||||
|   "requirements": ["actron-neo-api==0.1.84"] | ||||
| } | ||||
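The `dhcp` matchers in the manifest use shell-style wildcards against a discovered hostname and MAC address. A rough standalone illustration of how such patterns filter candidates (using `fnmatch`; Home Assistant's actual matching is more involved, so treat this as an approximation with hypothetical helper names):

```python
from fnmatch import fnmatch


def matches(hostname: str, macaddress: str) -> bool:
    """Approximate the manifest's DHCP matcher with fnmatch wildcards."""
    # Patterns taken from the manifest above; MACs are compared uppercase
    # and without separators, hostnames lowercase, for illustration.
    return fnmatch(hostname.lower(), "neo-*") and fnmatch(
        macaddress.upper().replace(":", ""), "FC0FE7*"
    )


assert matches("neo-livingroom", "fc:0f:e7:12:34:56")
assert not matches("other-device", "fc:0f:e7:12:34:56")
assert not matches("neo-livingroom", "aa:bb:cc:dd:ee:ff")
```

Both conditions must hold: the hostname prefix identifies the product line, while the MAC prefix (the OUI) identifies the vendor, which keeps false positives from unrelated devices low.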
| @@ -1,78 +0,0 @@ | ||||
| rules: | ||||
|   # Bronze | ||||
|   action-setup: | ||||
|     status: exempt | ||||
|     comment: This integration does not have custom service actions. | ||||
|   appropriate-polling: done | ||||
|   brands: done | ||||
|   common-modules: done | ||||
|   config-flow-test-coverage: done | ||||
|   config-flow: done | ||||
|   dependency-transparency: done | ||||
|   docs-actions: | ||||
|     status: exempt | ||||
|     comment: This integration does not have custom service actions. | ||||
|   docs-high-level-description: done | ||||
|   docs-installation-instructions: done | ||||
|   docs-removal-instructions: done | ||||
|   entity-event-setup: | ||||
|     status: exempt | ||||
|     comment: This integration does not subscribe to external events. | ||||
|   entity-unique-id: done | ||||
|   has-entity-name: done | ||||
|   runtime-data: done | ||||
|   test-before-configure: done | ||||
|   test-before-setup: done | ||||
|   unique-config-entry: done | ||||
|  | ||||
|   # Silver | ||||
|   action-exceptions: todo | ||||
|   config-entry-unloading: done | ||||
|   docs-configuration-parameters: | ||||
|     status: exempt | ||||
|     comment: No options flow | ||||
|   docs-installation-parameters: done | ||||
|   entity-unavailable: done | ||||
|   integration-owner: done | ||||
|   log-when-unavailable: done | ||||
|   parallel-updates: done | ||||
|   reauthentication-flow: todo | ||||
|   test-coverage: todo | ||||
|  | ||||
|   # Gold | ||||
|   devices: done | ||||
|   diagnostics: todo | ||||
|   discovery-update-info: | ||||
|     status: exempt | ||||
|     comment: This integration uses DHCP discovery but is cloud polling, so there is no information to update. | ||||
|   discovery: done | ||||
|   docs-data-update: done | ||||
|   docs-examples: done | ||||
|   docs-known-limitations: done | ||||
|   docs-supported-devices: done | ||||
|   docs-supported-functions: done | ||||
|   docs-troubleshooting: done | ||||
|   docs-use-cases: done | ||||
|   dynamic-devices: todo | ||||
|   entity-category: | ||||
|     status: exempt | ||||
|     comment: This integration does not use entity categories. | ||||
|   entity-device-class: | ||||
|     status: exempt | ||||
|     comment: This integration does not use entity device classes. | ||||
|   entity-disabled-by-default: | ||||
|     status: exempt | ||||
|     comment: Not required for this integration at this stage. | ||||
|   entity-translations: todo | ||||
|   exception-translations: todo | ||||
|   icon-translations: todo | ||||
|   reconfiguration-flow: todo | ||||
|   repair-issues: | ||||
|     status: exempt | ||||
|     comment: This integration does not have any known issues that require repair. | ||||
|   stale-devices: todo | ||||
|  | ||||
|   # Platinum | ||||
|   async-dependency: done | ||||
|   inject-websession: todo | ||||
|   strict-typing: todo | ||||
| @@ -1,29 +0,0 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_account%]", | ||||
|       "oauth2_error": "Failed to start OAuth2 flow" | ||||
|     }, | ||||
|     "error": { | ||||
|       "oauth2_error": "Failed to start OAuth2 flow. Please try again later." | ||||
|     }, | ||||
|     "progress": { | ||||
|       "wait_for_authorization": "To authenticate, open the following URL and login at Actron Air:\n{verification_uri}\nIf the code is not automatically copied, paste the following code to authorize the integration:\n\n```{user_code}```\n\n\nThe login attempt will time out after {expires_minutes} minutes." | ||||
|     }, | ||||
|     "step": { | ||||
|       "connection_error": { | ||||
|         "data": {}, | ||||
|         "description": "Failed to connect to Actron Air. Please check your internet connection and try again.", | ||||
|         "title": "Connection error" | ||||
|       }, | ||||
|       "timeout": { | ||||
|         "data": {}, | ||||
|         "description": "The authorization process timed out. Please try again.", | ||||
|         "title": "Authorization timeout" | ||||
|       }, | ||||
|       "user": { | ||||
|         "title": "Actron Air OAuth2 Authorization" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
| @@ -6,5 +6,5 @@ | ||||
|   "documentation": "https://www.home-assistant.io/integrations/adax", | ||||
|   "iot_class": "local_polling", | ||||
|   "loggers": ["adax", "adax_local"], | ||||
|   "requirements": ["adax==0.4.0", "Adax-local==0.2.0"] | ||||
|   "requirements": ["adax==0.4.0", "Adax-local==0.1.5"] | ||||
| } | ||||
|   | ||||
| @@ -1,34 +1,34 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", | ||||
|       "heater_not_available": "Heater not available. Try to reset the heater by pressing + and OK for some seconds.", | ||||
|       "heater_not_found": "Heater not found. Try to move the heater closer to Home Assistant computer.", | ||||
|       "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "step": { | ||||
|       "cloud": { | ||||
|         "data": { | ||||
|           "account_id": "Account ID", | ||||
|           "password": "[%key:common::config_flow::data::password%]" | ||||
|         } | ||||
|       }, | ||||
|       "local": { | ||||
|         "data": { | ||||
|           "wifi_pswd": "Wi-Fi password", | ||||
|           "wifi_ssid": "Wi-Fi SSID" | ||||
|         }, | ||||
|         "description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue LED starts blinking before pressing Submit. Configuring heater might take some minutes." | ||||
|       }, | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "connection_type": "Select connection type" | ||||
|         }, | ||||
|         "description": "Select connection type. Local requires heaters with Bluetooth" | ||||
|       }, | ||||
|       "local": { | ||||
|         "data": { | ||||
|           "wifi_ssid": "Wi-Fi SSID", | ||||
|           "wifi_pswd": "Wi-Fi password" | ||||
|         }, | ||||
|         "description": "Reset the heater by pressing + and OK until display shows 'Reset'. Then press and hold OK button on the heater until the blue LED starts blinking before pressing Submit. Configuring heater might take some minutes." | ||||
|       }, | ||||
|       "cloud": { | ||||
|         "data": { | ||||
|           "account_id": "Account ID", | ||||
|           "password": "[%key:common::config_flow::data::password%]" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", | ||||
|       "heater_not_available": "Heater not available. Try to reset the heater by pressing + and OK for some seconds.", | ||||
|       "heater_not_found": "Heater not found. Try to move the heater closer to Home Assistant computer.", | ||||
|       "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]" | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,9 +1,6 @@ | ||||
| { | ||||
|   "entity": { | ||||
|     "sensor": { | ||||
|       "average_processing_speed": { | ||||
|         "default": "mdi:speedometer" | ||||
|       }, | ||||
|       "dns_queries": { | ||||
|         "default": "mdi:magnify" | ||||
|       }, | ||||
| @@ -16,18 +13,21 @@ | ||||
|       "parental_control_blocked": { | ||||
|         "default": "mdi:human-male-girl" | ||||
|       }, | ||||
|       "rules_count": { | ||||
|         "default": "mdi:counter" | ||||
|       }, | ||||
|       "safe_browsing_blocked": { | ||||
|         "default": "mdi:shield-half-full" | ||||
|       }, | ||||
|       "safe_searches_enforced": { | ||||
|         "default": "mdi:shield-search" | ||||
|       }, | ||||
|       "average_processing_speed": { | ||||
|         "default": "mdi:speedometer" | ||||
|       }, | ||||
|       "rules_count": { | ||||
|         "default": "mdi:counter" | ||||
|       } | ||||
|     }, | ||||
|     "switch": { | ||||
|       "filtering": { | ||||
|       "protection": { | ||||
|         "default": "mdi:shield-check", | ||||
|         "state": { | ||||
|           "off": "mdi:shield-off" | ||||
| @@ -39,13 +39,7 @@ | ||||
|           "off": "mdi:shield-off" | ||||
|         } | ||||
|       }, | ||||
|       "protection": { | ||||
|         "default": "mdi:shield-check", | ||||
|         "state": { | ||||
|           "off": "mdi:shield-off" | ||||
|         } | ||||
|       }, | ||||
|       "query_log": { | ||||
|       "safe_search": { | ||||
|         "default": "mdi:shield-check", | ||||
|         "state": { | ||||
|           "off": "mdi:shield-off" | ||||
| @@ -57,7 +51,13 @@ | ||||
|           "off": "mdi:shield-off" | ||||
|         } | ||||
|       }, | ||||
|       "safe_search": { | ||||
|       "filtering": { | ||||
|         "default": "mdi:shield-check", | ||||
|         "state": { | ||||
|           "off": "mdi:shield-off" | ||||
|         } | ||||
|       }, | ||||
|       "query_log": { | ||||
|         "default": "mdi:shield-check", | ||||
|         "state": { | ||||
|           "off": "mdi:shield-off" | ||||
| @@ -69,17 +69,17 @@ | ||||
|     "add_url": { | ||||
|       "service": "mdi:link-plus" | ||||
|     }, | ||||
|     "disable_url": { | ||||
|       "service": "mdi:link-variant-off" | ||||
|     "remove_url": { | ||||
|       "service": "mdi:link-off" | ||||
|     }, | ||||
|     "enable_url": { | ||||
|       "service": "mdi:link-variant" | ||||
|     }, | ||||
|     "disable_url": { | ||||
|       "service": "mdi:link-variant-off" | ||||
|     }, | ||||
|     "refresh": { | ||||
|       "service": "mdi:refresh" | ||||
|     }, | ||||
|     "remove_url": { | ||||
|       "service": "mdi:link-off" | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,38 +1,35 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_service%]", | ||||
|       "existing_instance_updated": "Updated existing configuration." | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "step": { | ||||
|       "hassio_confirm": { | ||||
|         "description": "Do you want to configure Home Assistant to connect to the AdGuard Home provided by the add-on: {addon}?", | ||||
|         "title": "AdGuard Home via Home Assistant add-on" | ||||
|       }, | ||||
|       "user": { | ||||
|         "description": "Set up your AdGuard Home instance to allow monitoring and control.", | ||||
|         "data": { | ||||
|           "host": "[%key:common::config_flow::data::host%]", | ||||
|           "password": "[%key:common::config_flow::data::password%]", | ||||
|           "port": "[%key:common::config_flow::data::port%]", | ||||
|           "ssl": "[%key:common::config_flow::data::ssl%]", | ||||
|           "username": "[%key:common::config_flow::data::username%]", | ||||
|           "ssl": "[%key:common::config_flow::data::ssl%]", | ||||
|           "verify_ssl": "[%key:common::config_flow::data::verify_ssl%]" | ||||
|         }, | ||||
|         "data_description": { | ||||
|           "host": "The hostname or IP address of the device running your AdGuard Home." | ||||
|         }, | ||||
|         "description": "Set up your AdGuard Home instance to allow monitoring and control." | ||||
|         } | ||||
|       }, | ||||
|       "hassio_confirm": { | ||||
|         "title": "AdGuard Home via Home Assistant add-on", | ||||
|         "description": "Do you want to configure Home Assistant to connect to the AdGuard Home provided by the add-on: {addon}?" | ||||
|       } | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "existing_instance_updated": "Updated existing configuration.", | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_service%]" | ||||
|     } | ||||
|   }, | ||||
|   "entity": { | ||||
|     "sensor": { | ||||
|       "average_processing_speed": { | ||||
|         "name": "Average processing speed" | ||||
|       }, | ||||
|       "dns_queries": { | ||||
|         "name": "DNS queries" | ||||
|       }, | ||||
| @@ -45,91 +42,94 @@ | ||||
|       "parental_control_blocked": { | ||||
|         "name": "Parental control blocked" | ||||
|       }, | ||||
|       "rules_count": { | ||||
|         "name": "Rules count" | ||||
|       }, | ||||
|       "safe_browsing_blocked": { | ||||
|         "name": "Safe browsing blocked" | ||||
|       }, | ||||
|       "safe_searches_enforced": { | ||||
|         "name": "Safe searches enforced" | ||||
|       }, | ||||
|       "average_processing_speed": { | ||||
|         "name": "Average processing speed" | ||||
|       }, | ||||
|       "rules_count": { | ||||
|         "name": "Rules count" | ||||
|       } | ||||
|     }, | ||||
|     "switch": { | ||||
|       "filtering": { | ||||
|         "name": "Filtering" | ||||
|       "protection": { | ||||
|         "name": "Protection" | ||||
|       }, | ||||
|       "parental": { | ||||
|         "name": "Parental control" | ||||
|       }, | ||||
|       "protection": { | ||||
|         "name": "Protection" | ||||
|       }, | ||||
|       "query_log": { | ||||
|         "name": "Query log" | ||||
|       "safe_search": { | ||||
|         "name": "Safe search" | ||||
|       }, | ||||
|       "safe_browsing": { | ||||
|         "name": "Safe browsing" | ||||
|       }, | ||||
|       "safe_search": { | ||||
|         "name": "Safe search" | ||||
|       "filtering": { | ||||
|         "name": "Filtering" | ||||
|       }, | ||||
|       "query_log": { | ||||
|         "name": "Query log" | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "services": { | ||||
|     "add_url": { | ||||
|       "name": "Add URL", | ||||
|       "description": "Adds a new filter subscription to AdGuard Home.", | ||||
|       "fields": { | ||||
|         "name": { | ||||
|           "description": "The name of the filter subscription.", | ||||
|           "name": "[%key:common::config_flow::data::name%]" | ||||
|           "name": "[%key:common::config_flow::data::name%]", | ||||
|           "description": "The name of the filter subscription." | ||||
|         }, | ||||
|         "url": { | ||||
|           "description": "The filter URL to subscribe to, containing the filter rules.", | ||||
|           "name": "[%key:common::config_flow::data::url%]" | ||||
|           "name": "[%key:common::config_flow::data::url%]", | ||||
|           "description": "The filter URL to subscribe to, containing the filter rules." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Add URL" | ||||
|     }, | ||||
|     "disable_url": { | ||||
|       "description": "Disables a filter subscription in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "url": { | ||||
|           "description": "The filter subscription URL to disable.", | ||||
|           "name": "[%key:common::config_flow::data::url%]" | ||||
|         } | ||||
|       }, | ||||
|       "name": "Disable URL" | ||||
|     }, | ||||
|     "enable_url": { | ||||
|       "description": "Enables a filter subscription in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "url": { | ||||
|           "description": "The filter subscription URL to enable.", | ||||
|           "name": "[%key:common::config_flow::data::url%]" | ||||
|         } | ||||
|       }, | ||||
|       "name": "Enable URL" | ||||
|     }, | ||||
|     "refresh": { | ||||
|       "description": "Refreshes all filter subscriptions in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "force": { | ||||
|           "description": "Force update (bypasses AdGuard Home throttling), omit for a regular refresh.", | ||||
|           "name": "Force" | ||||
|         } | ||||
|       }, | ||||
|       "name": "Refresh" | ||||
|       } | ||||
|     }, | ||||
|     "remove_url": { | ||||
|       "name": "Remove URL", | ||||
|       "description": "Removes a filter subscription from AdGuard Home.", | ||||
|       "fields": { | ||||
|         "url": { | ||||
|           "description": "The filter subscription URL to remove.", | ||||
|           "name": "[%key:common::config_flow::data::url%]" | ||||
|           "name": "[%key:common::config_flow::data::url%]", | ||||
|           "description": "The filter subscription URL to remove." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Remove URL" | ||||
|       } | ||||
|     }, | ||||
|     "enable_url": { | ||||
|       "name": "Enable URL", | ||||
|       "description": "Enables a filter subscription in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "url": { | ||||
|           "name": "[%key:common::config_flow::data::url%]", | ||||
|           "description": "The filter subscription URL to enable." | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "disable_url": { | ||||
|       "name": "Disable URL", | ||||
|       "description": "Disables a filter subscription in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "url": { | ||||
|           "name": "[%key:common::config_flow::data::url%]", | ||||
|           "description": "The filter subscription URL to disable." | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "refresh": { | ||||
|       "name": "Refresh", | ||||
|       "description": "Refreshes all filter subscriptions in AdGuard Home.", | ||||
|       "fields": { | ||||
|         "force": { | ||||
|           "name": "Force", | ||||
|           "description": "Force update (bypasses AdGuard Home throttling), omit for a regular refresh." | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,22 +1,22 @@ | ||||
| { | ||||
|   "services": { | ||||
|     "write_data_by_name": { | ||||
|       "name": "Write data by name", | ||||
|       "description": "Write a value to the connected ADS device.", | ||||
|       "fields": { | ||||
|         "adstype": { | ||||
|           "description": "The data type of the variable to write to.", | ||||
|           "name": "ADS type" | ||||
|         }, | ||||
|         "adsvar": { | ||||
|           "description": "The name of the variable to write to.", | ||||
|           "name": "ADS variable" | ||||
|           "name": "ADS variable", | ||||
|           "description": "The name of the variable to write to." | ||||
|         }, | ||||
|         "adstype": { | ||||
|           "name": "ADS type", | ||||
|           "description": "The data type of the variable to write to." | ||||
|         }, | ||||
|         "value": { | ||||
|           "description": "The value to write to the variable.", | ||||
|           "name": "Value" | ||||
|           "name": "Value", | ||||
|           "description": "The value to write to the variable." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Write data by name" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,11 +1,11 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     }, | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "data": { | ||||
| @@ -19,14 +19,14 @@ | ||||
|   }, | ||||
|   "services": { | ||||
|     "set_time_to": { | ||||
|       "name": "Set time to", | ||||
|       "description": "Controls timers to turn the system on or off after a set number of minutes.", | ||||
|       "fields": { | ||||
|         "minutes": { | ||||
|           "description": "Minutes until action.", | ||||
|           "name": "Minutes" | ||||
|           "name": "Minutes", | ||||
|           "description": "Minutes until action." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Set time to" | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -71,14 +71,7 @@ class AemetConfigFlow(ConfigFlow, domain=DOMAIN): | ||||
|             } | ||||
|         ) | ||||
|  | ||||
|         return self.async_show_form( | ||||
|             step_id="user", | ||||
|             data_schema=schema, | ||||
|             errors=errors, | ||||
|             description_placeholders={ | ||||
|                 "api_key_url": "https://opendata.aemet.es/centrodedescargas/altaUsuario" | ||||
|             }, | ||||
|         ) | ||||
|         return self.async_show_form(step_id="user", data_schema=schema, errors=errors) | ||||
|  | ||||
|     @staticmethod | ||||
|     @callback | ||||
|   | ||||
| @@ -14,7 +14,7 @@ | ||||
|           "longitude": "[%key:common::config_flow::data::longitude%]", | ||||
|           "name": "Name of the integration" | ||||
|         }, | ||||
|         "description": "To generate API key go to {api_key_url}" | ||||
|         "description": "To generate API key go to https://opendata.aemet.es/centrodedescargas/altaUsuario" | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   | ||||
| @@ -1,57 +1,57 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "api_key": "[%key:common::config_flow::data::api_key%]" | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "issues": { | ||||
|     "deprecated_yaml_import_issue_cannot_connect": { | ||||
|       "description": "Configuring {integration_title} using YAML is being removed but there was a connection error importing your YAML configuration.\n\nEnsure connection to {integration_title} works and restart Home Assistant to try again or remove the {integration_title} YAML configuration from your configuration.yaml file and continue to [set up the integration]({url}) manually.", | ||||
|       "title": "The {integration_title} YAML configuration import failed" | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     } | ||||
|   }, | ||||
|   "services": { | ||||
|     "add_tracking": { | ||||
|       "name": "Add tracking", | ||||
|       "description": "Adds a new tracking number to Aftership.", | ||||
|       "fields": { | ||||
|         "slug": { | ||||
|           "description": "Slug (carrier) of the new tracking.", | ||||
|           "name": "Slug" | ||||
|         "tracking_number": { | ||||
|           "name": "Tracking number", | ||||
|           "description": "Tracking number for the new tracking." | ||||
|         }, | ||||
|         "title": { | ||||
|           "description": "A custom title for the new tracking.", | ||||
|           "name": "Title" | ||||
|           "name": "Title", | ||||
|           "description": "A custom title for the new tracking." | ||||
|         }, | ||||
|         "tracking_number": { | ||||
|           "description": "Tracking number for the new tracking.", | ||||
|           "name": "Tracking number" | ||||
|         "slug": { | ||||
|           "name": "Slug", | ||||
|           "description": "Slug (carrier) of the new tracking." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Add tracking" | ||||
|       } | ||||
|     }, | ||||
|     "remove_tracking": { | ||||
|       "name": "Remove tracking", | ||||
|       "description": "Removes a tracking number from Aftership.", | ||||
|       "fields": { | ||||
|         "slug": { | ||||
|           "description": "Slug (carrier) of the tracking to remove.", | ||||
|           "name": "[%key:component::aftership::services::add_tracking::fields::slug::name%]" | ||||
|         }, | ||||
|         "tracking_number": { | ||||
|           "description": "Tracking number of the tracking to remove.", | ||||
|           "name": "[%key:component::aftership::services::add_tracking::fields::tracking_number::name%]" | ||||
|           "name": "[%key:component::aftership::services::add_tracking::fields::tracking_number::name%]", | ||||
|           "description": "Tracking number of the tracking to remove." | ||||
|         }, | ||||
|         "slug": { | ||||
|           "name": "[%key:component::aftership::services::add_tracking::fields::slug::name%]", | ||||
|           "description": "Slug (carrier) of the tracking to remove." | ||||
|         } | ||||
|       }, | ||||
|       "name": "Remove tracking" | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "issues": { | ||||
|     "deprecated_yaml_import_issue_cannot_connect": { | ||||
|       "title": "The {integration_title} YAML configuration import failed", | ||||
|       "description": "Configuring {integration_title} using YAML is being removed but there was a connection error importing your YAML configuration.\n\nEnsure connection to {integration_title} works and restart Home Assistant to try again or remove the {integration_title} YAML configuration from your configuration.yaml file and continue to [set up the integration]({url}) manually." | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,19 +1,19 @@ | ||||
| { | ||||
|   "services": { | ||||
|     "disable_alerts": { | ||||
|       "service": "mdi:bell-off" | ||||
|     }, | ||||
|     "enable_alerts": { | ||||
|       "service": "mdi:bell-alert" | ||||
|     }, | ||||
|     "snapshot": { | ||||
|       "service": "mdi:camera" | ||||
|     }, | ||||
|     "start_recording": { | ||||
|       "service": "mdi:record-rec" | ||||
|     }, | ||||
|     "stop_recording": { | ||||
|       "service": "mdi:stop" | ||||
|     }, | ||||
|     "enable_alerts": { | ||||
|       "service": "mdi:bell-alert" | ||||
|     }, | ||||
|     "disable_alerts": { | ||||
|       "service": "mdi:bell-off" | ||||
|     }, | ||||
|     "snapshot": { | ||||
|       "service": "mdi:camera" | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,45 +1,45 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]", | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     }, | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "title": "Set up Agent DVR", | ||||
|         "data": { | ||||
|           "host": "[%key:common::config_flow::data::host%]", | ||||
|           "port": "[%key:common::config_flow::data::port%]" | ||||
|         }, | ||||
|         "data_description": { | ||||
|           "host": "The IP address of the Agent DVR server." | ||||
|         }, | ||||
|         "title": "Set up Agent DVR" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]", | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]" | ||||
|     } | ||||
|   }, | ||||
|   "services": { | ||||
|     "disable_alerts": { | ||||
|       "description": "Disables alerts.", | ||||
|       "name": "Disable alerts" | ||||
|     }, | ||||
|     "enable_alerts": { | ||||
|       "description": "Enables alerts.", | ||||
|       "name": "Enable alerts" | ||||
|     }, | ||||
|     "snapshot": { | ||||
|       "description": "Takes a photo.", | ||||
|       "name": "Snapshot" | ||||
|     }, | ||||
    "start_recording": {
      "description": "Enables continuous recording.",
      "name": "Start recording"
    },
    "stop_recording": {
      "description": "Disables continuous recording.",
      "name": "Stop recording"
    }
|   } | ||||
| } | ||||
|   | ||||
| @@ -20,7 +20,6 @@ from homeassistant.helpers.entity_component import EntityComponent | ||||
| from homeassistant.helpers.typing import UNDEFINED, ConfigType, UndefinedType | ||||
|  | ||||
| from .const import ( | ||||
|     ATTR_ATTACHMENTS, | ||||
|     ATTR_INSTRUCTIONS, | ||||
|     ATTR_REQUIRED, | ||||
|     ATTR_STRUCTURE, | ||||
| @@ -29,19 +28,11 @@ from .const import ( | ||||
|     DATA_PREFERENCES, | ||||
|     DOMAIN, | ||||
|     SERVICE_GENERATE_DATA, | ||||
|     SERVICE_GENERATE_IMAGE, | ||||
|     AITaskEntityFeature, | ||||
| ) | ||||
| from .entity import AITaskEntity | ||||
| from .http import async_setup as async_setup_http | ||||
| from .task import ( | ||||
|     GenDataTask, | ||||
|     GenDataTaskResult, | ||||
|     GenImageTask, | ||||
|     GenImageTaskResult, | ||||
|     async_generate_data, | ||||
|     async_generate_image, | ||||
| ) | ||||
| from .task import GenDataTask, GenDataTaskResult, async_generate_data | ||||
|  | ||||
| __all__ = [ | ||||
|     "DOMAIN", | ||||
| @@ -49,10 +40,10 @@ __all__ = [ | ||||
|     "AITaskEntityFeature", | ||||
|     "GenDataTask", | ||||
|     "GenDataTaskResult", | ||||
|     "GenImageTask", | ||||
|     "GenImageTaskResult", | ||||
|     "async_generate_data", | ||||
|     "async_generate_image", | ||||
|     "async_setup", | ||||
|     "async_setup_entry", | ||||
|     "async_unload_entry", | ||||
| ] | ||||
|  | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
| @@ -101,26 +92,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: | ||||
|                     vol.Schema({str: STRUCTURE_FIELD_SCHEMA}), | ||||
|                     _validate_structure_fields, | ||||
|                 ), | ||||
|                 vol.Optional(ATTR_ATTACHMENTS): vol.All( | ||||
|                     cv.ensure_list, [selector.MediaSelector({"accept": ["*/*"]})] | ||||
|                 ), | ||||
|             } | ||||
|         ), | ||||
|         supports_response=SupportsResponse.ONLY, | ||||
|         job_type=HassJobType.Coroutinefunction, | ||||
|     ) | ||||
|     hass.services.async_register( | ||||
|         DOMAIN, | ||||
|         SERVICE_GENERATE_IMAGE, | ||||
|         async_service_generate_image, | ||||
|         schema=vol.Schema( | ||||
|             { | ||||
|                 vol.Required(ATTR_TASK_NAME): cv.string, | ||||
|                 vol.Optional(ATTR_ENTITY_ID): cv.entity_id, | ||||
|                 vol.Required(ATTR_INSTRUCTIONS): cv.string, | ||||
|                 vol.Optional(ATTR_ATTACHMENTS): vol.All( | ||||
|                     cv.ensure_list, [selector.MediaSelector({"accept": ["*/*"]})] | ||||
|                 ), | ||||
|             } | ||||
|         ), | ||||
|         supports_response=SupportsResponse.ONLY, | ||||
| @@ -140,23 +111,17 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: | ||||
|  | ||||
|  | ||||
| async def async_service_generate_data(call: ServiceCall) -> ServiceResponse: | ||||
|     """Run the data task service.""" | ||||
|     """Run the run task service.""" | ||||
|     result = await async_generate_data(hass=call.hass, **call.data) | ||||
|     return result.as_dict() | ||||
|  | ||||
|  | ||||
| async def async_service_generate_image(call: ServiceCall) -> ServiceResponse: | ||||
|     """Run the image task service.""" | ||||
|     return await async_generate_image(hass=call.hass, **call.data) | ||||
|  | ||||
|  | ||||
| class AITaskPreferences: | ||||
|     """AI Task preferences.""" | ||||
|  | ||||
|     KEYS = ("gen_data_entity_id", "gen_image_entity_id") | ||||
|     KEYS = ("gen_data_entity_id",) | ||||
|  | ||||
|     gen_data_entity_id: str | None = None | ||||
|     gen_image_entity_id: str | None = None | ||||
|  | ||||
|     def __init__(self, hass: HomeAssistant) -> None: | ||||
|         """Initialize the preferences.""" | ||||
| @@ -170,21 +135,17 @@ class AITaskPreferences: | ||||
|         if data is None: | ||||
|             return | ||||
|         for key in self.KEYS: | ||||
|             setattr(self, key, data.get(key)) | ||||
|             setattr(self, key, data[key]) | ||||
|  | ||||
|     @callback | ||||
|     def async_set_preferences( | ||||
|         self, | ||||
|         *, | ||||
|         gen_data_entity_id: str | None | UndefinedType = UNDEFINED, | ||||
|         gen_image_entity_id: str | None | UndefinedType = UNDEFINED, | ||||
|     ) -> None: | ||||
|         """Set the preferences.""" | ||||
|         changed = False | ||||
|         for key, value in ( | ||||
|             ("gen_data_entity_id", gen_data_entity_id), | ||||
|             ("gen_image_entity_id", gen_image_entity_id), | ||||
|         ): | ||||
|         for key, value in (("gen_data_entity_id", gen_data_entity_id),): | ||||
|             if value is not UNDEFINED: | ||||
|                 if getattr(self, key) != value: | ||||
|                     setattr(self, key, value) | ||||
|   | ||||
| @@ -8,7 +8,6 @@ from typing import TYPE_CHECKING, Final | ||||
| from homeassistant.util.hass_dict import HassKey | ||||
|  | ||||
| if TYPE_CHECKING: | ||||
|     from homeassistant.components.media_source import local_source | ||||
|     from homeassistant.helpers.entity_component import EntityComponent | ||||
|  | ||||
|     from . import AITaskPreferences | ||||
| @@ -17,19 +16,13 @@ if TYPE_CHECKING: | ||||
| DOMAIN = "ai_task" | ||||
| DATA_COMPONENT: HassKey[EntityComponent[AITaskEntity]] = HassKey(DOMAIN) | ||||
| DATA_PREFERENCES: HassKey[AITaskPreferences] = HassKey(f"{DOMAIN}_preferences") | ||||
| DATA_MEDIA_SOURCE: HassKey[local_source.LocalSource] = HassKey(f"{DOMAIN}_media_source") | ||||
|  | ||||
| IMAGE_DIR: Final = "image" | ||||
| IMAGE_EXPIRY_TIME = 60 * 60  # 1 hour | ||||
|  | ||||
| SERVICE_GENERATE_DATA = "generate_data" | ||||
| SERVICE_GENERATE_IMAGE = "generate_image" | ||||
|  | ||||
| ATTR_INSTRUCTIONS: Final = "instructions" | ||||
| ATTR_TASK_NAME: Final = "task_name" | ||||
| ATTR_STRUCTURE: Final = "structure" | ||||
| ATTR_REQUIRED: Final = "required" | ||||
| ATTR_ATTACHMENTS: Final = "attachments" | ||||
|  | ||||
| DEFAULT_SYSTEM_PROMPT = ( | ||||
|     "You are a Home Assistant expert and help users with their tasks." | ||||
| @@ -41,9 +34,3 @@ class AITaskEntityFeature(IntFlag): | ||||
|  | ||||
|     GENERATE_DATA = 1 | ||||
|     """Generate data based on instructions.""" | ||||
|  | ||||
|     SUPPORT_ATTACHMENTS = 2 | ||||
|     """Support attachments with generate data.""" | ||||
|  | ||||
|     GENERATE_IMAGE = 4 | ||||
|     """Generate images based on instructions.""" | ||||
|   | ||||
| @@ -13,12 +13,12 @@ from homeassistant.components.conversation import ( | ||||
| ) | ||||
| from homeassistant.const import STATE_UNAVAILABLE, STATE_UNKNOWN | ||||
| from homeassistant.helpers import llm | ||||
| from homeassistant.helpers.chat_session import ChatSession | ||||
| from homeassistant.helpers.chat_session import async_get_chat_session | ||||
| from homeassistant.helpers.restore_state import RestoreEntity | ||||
| from homeassistant.util import dt as dt_util | ||||
|  | ||||
| from .const import DEFAULT_SYSTEM_PROMPT, DOMAIN, AITaskEntityFeature | ||||
| from .task import GenDataTask, GenDataTaskResult, GenImageTask, GenImageTaskResult | ||||
| from .task import GenDataTask, GenDataTaskResult | ||||
|  | ||||
|  | ||||
| class AITaskEntity(RestoreEntity): | ||||
| @@ -56,16 +56,12 @@ class AITaskEntity(RestoreEntity): | ||||
|     @contextlib.asynccontextmanager | ||||
|     async def _async_get_ai_task_chat_log( | ||||
|         self, | ||||
|         session: ChatSession, | ||||
|         task: GenDataTask | GenImageTask, | ||||
|         task: GenDataTask, | ||||
|     ) -> AsyncGenerator[ChatLog]: | ||||
|         """Context manager used to manage the ChatLog used during an AI Task.""" | ||||
|         user_llm_hass_api: llm.API | None = None | ||||
|         if isinstance(task, GenDataTask): | ||||
|             user_llm_hass_api = task.llm_api | ||||
|  | ||||
|         # pylint: disable-next=contextmanager-generator-missing-cleanup | ||||
|         with ( | ||||
|             async_get_chat_session(self.hass) as session, | ||||
|             async_get_chat_log( | ||||
|                 self.hass, | ||||
|                 session, | ||||
| @@ -81,25 +77,21 @@ class AITaskEntity(RestoreEntity): | ||||
|                     device_id=None, | ||||
|                 ), | ||||
|                 user_llm_prompt=DEFAULT_SYSTEM_PROMPT, | ||||
|                 user_llm_hass_api=user_llm_hass_api, | ||||
|             ) | ||||
|  | ||||
|             chat_log.async_add_user_content( | ||||
|                 UserContent(task.instructions, attachments=task.attachments) | ||||
|             ) | ||||
|             chat_log.async_add_user_content(UserContent(task.instructions)) | ||||
|  | ||||
|             yield chat_log | ||||
|  | ||||
|     @final | ||||
|     async def internal_async_generate_data( | ||||
|         self, | ||||
|         session: ChatSession, | ||||
|         task: GenDataTask, | ||||
|     ) -> GenDataTaskResult: | ||||
|         """Run a gen data task.""" | ||||
|         self.__last_activity = dt_util.utcnow().isoformat() | ||||
|         self.async_write_ha_state() | ||||
|         async with self._async_get_ai_task_chat_log(session, task) as chat_log: | ||||
|         async with self._async_get_ai_task_chat_log(task) as chat_log: | ||||
|             return await self._async_generate_data(task, chat_log) | ||||
|  | ||||
|     async def _async_generate_data( | ||||
| @@ -109,23 +101,3 @@ class AITaskEntity(RestoreEntity): | ||||
|     ) -> GenDataTaskResult: | ||||
|         """Handle a gen data task.""" | ||||
|         raise NotImplementedError | ||||
|  | ||||
|     @final | ||||
|     async def internal_async_generate_image( | ||||
|         self, | ||||
|         session: ChatSession, | ||||
|         task: GenImageTask, | ||||
|     ) -> GenImageTaskResult: | ||||
|         """Run a gen image task.""" | ||||
|         self.__last_activity = dt_util.utcnow().isoformat() | ||||
|         self.async_write_ha_state() | ||||
|         async with self._async_get_ai_task_chat_log(session, task) as chat_log: | ||||
|             return await self._async_generate_image(task, chat_log) | ||||
|  | ||||
|     async def _async_generate_image( | ||||
|         self, | ||||
|         task: GenImageTask, | ||||
|         chat_log: ChatLog, | ||||
|     ) -> GenImageTaskResult: | ||||
|         """Handle a gen image task.""" | ||||
|         raise NotImplementedError | ||||
|   | ||||
| @@ -37,7 +37,6 @@ def websocket_get_preferences( | ||||
|     { | ||||
|         vol.Required("type"): "ai_task/preferences/set", | ||||
|         vol.Optional("gen_data_entity_id"): vol.Any(str, None), | ||||
|         vol.Optional("gen_image_entity_id"): vol.Any(str, None), | ||||
|     } | ||||
| ) | ||||
| @websocket_api.require_admin | ||||
|   | ||||
| @@ -1,15 +1,7 @@ | ||||
| { | ||||
|   "entity_component": { | ||||
|     "_": { | ||||
|       "default": "mdi:star-four-points" | ||||
|     } | ||||
|   }, | ||||
|   "services": { | ||||
|     "generate_data": { | ||||
|       "service": "mdi:file-star-four-points-outline" | ||||
|     }, | ||||
|     "generate_image": { | ||||
|       "service": "mdi:star-four-points-box-outline" | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,10 +1,9 @@ | ||||
| { | ||||
|   "domain": "ai_task", | ||||
|   "name": "AI Task", | ||||
|   "after_dependencies": ["camera"], | ||||
|   "codeowners": ["@home-assistant/core"], | ||||
|   "dependencies": ["conversation", "media_source"], | ||||
|   "dependencies": ["conversation"], | ||||
|   "documentation": "https://www.home-assistant.io/integrations/ai_task", | ||||
|   "integration_type": "entity", | ||||
|   "integration_type": "system", | ||||
|   "quality_scale": "internal" | ||||
| } | ||||
|   | ||||
| @@ -1,32 +0,0 @@ | ||||
| """Expose images as media sources.""" | ||||
|  | ||||
| from __future__ import annotations | ||||
|  | ||||
| from pathlib import Path | ||||
|  | ||||
| from homeassistant.components.media_source import MediaSource, local_source | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.exceptions import HomeAssistantError | ||||
|  | ||||
| from .const import DATA_MEDIA_SOURCE, DOMAIN, IMAGE_DIR | ||||
|  | ||||
|  | ||||
| async def async_get_media_source(hass: HomeAssistant) -> MediaSource: | ||||
|     """Set up local media source.""" | ||||
|     media_dirs = list(hass.config.media_dirs.values()) | ||||
|  | ||||
|     if not media_dirs: | ||||
|         raise HomeAssistantError( | ||||
|             "AI Task media source requires at least one media directory configured" | ||||
|         ) | ||||
|  | ||||
|     media_dir = Path(media_dirs[0]) / DOMAIN / IMAGE_DIR | ||||
|  | ||||
|     hass.data[DATA_MEDIA_SOURCE] = source = local_source.LocalSource( | ||||
|         hass, | ||||
|         DOMAIN, | ||||
|         "AI Generated Images", | ||||
|         {IMAGE_DIR: str(media_dir)}, | ||||
|         f"/{DOMAIN}", | ||||
|     ) | ||||
|     return source | ||||
| @@ -10,50 +10,16 @@ generate_data: | ||||
|       required: true | ||||
|       selector: | ||||
|         text: | ||||
|           multiline: true | ||||
|     entity_id: | ||||
|       required: false | ||||
|       selector: | ||||
        entity:
          filter:
            domain: ai_task
            supported_features:
              - ai_task.AITaskEntityFeature.GENERATE_DATA
|     structure: | ||||
|       advanced: true | ||||
|       required: false | ||||
      example: '{ "name": { "selector": { "text": {} }, "description": "Name of the user", "required": "True" }, "age": { "selector": { "number": {} }, "description": "Age of the user" } }'
|       selector: | ||||
|         object: | ||||
|     attachments: | ||||
|       required: false | ||||
|       selector: | ||||
|         media: | ||||
|           accept: | ||||
|             - "*" | ||||
| generate_image: | ||||
|   fields: | ||||
|     task_name: | ||||
|       example: "picture of a dog" | ||||
|       required: true | ||||
|       selector: | ||||
|         text: | ||||
|     instructions: | ||||
|       example: "Generate a high quality square image of a dog on transparent background" | ||||
|       required: true | ||||
|       selector: | ||||
|         text: | ||||
|           multiline: true | ||||
|     entity_id: | ||||
|       required: true | ||||
|       selector: | ||||
|         entity: | ||||
|           filter: | ||||
|             domain: ai_task | ||||
|             supported_features: | ||||
|               - ai_task.AITaskEntityFeature.GENERATE_IMAGE | ||||
|     attachments: | ||||
|       required: false | ||||
|       selector: | ||||
|         media: | ||||
|           accept: | ||||
|             - "*" | ||||
|   | ||||
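The `structure` field described in the services.yaml above is easiest to read as a concrete payload. A hypothetical call body (every value here is invented for illustration; field names mirror the YAML fields):

```python
# Hypothetical payload for the generate_data service sketched above.
call_data = {
    "task_name": "profile_extract",
    "instructions": "Extract the user's name and age from the text.",
    # Optional: the preferred entity is used when omitted.
    "entity_id": "ai_task.example",
    # Structured-output spec: field name -> description/selector/required.
    "structure": {
        "name": {
            "description": "Name of the user",
            "required": True,
            "selector": {"text": {}},
        },
        "age": {
            "description": "Age of the user",
            "selector": {"number": {}},
        },
    },
}
```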
| @@ -1,52 +1,26 @@ | ||||
| { | ||||
|   "services": { | ||||
    "generate_data": {
      "name": "Generate data",
      "description": "Uses AI to run a task that generates data.",
      "fields": {
        "attachments": {
          "name": "Attachments",
          "description": "List of files to attach for multi-modal AI analysis."
        },
        "task_name": {
          "name": "Task name",
          "description": "Name of the task."
        },
        "instructions": {
          "name": "Instructions",
          "description": "Instructions on what needs to be done."
        },
        "entity_id": {
          "name": "Entity ID",
          "description": "Entity ID to run the task on. If not provided, the preferred entity will be used."
        },
        "structure": {
          "name": "Structured output",
          "description": "When set, the AI Task will output fields with this in structure. The structure is a dictionary where the keys are the field names and the values contain a 'description', a 'selector', and an optional 'required' field."
        }
      }
|     }, | ||||
|     "generate_image": { | ||||
      "description": "Uses AI to generate an image.",
|       "fields": { | ||||
|         "attachments": { | ||||
|           "description": "List of files to attach for using as references.", | ||||
|           "name": "Attachments" | ||||
|         }, | ||||
|         "entity_id": { | ||||
|           "description": "Entity ID to run the task on.", | ||||
|           "name": "Entity ID" | ||||
|         }, | ||||
|         "instructions": { | ||||
|           "description": "Instructions that explains the image to be generated.", | ||||
|           "name": "Instructions" | ||||
|         }, | ||||
|         "task_name": { | ||||
|           "description": "Name of the task.", | ||||
|           "name": "Task name" | ||||
|         } | ||||
|       }, | ||||
|       "name": "Generate image" | ||||
    }
  }
}
|   | ||||
| @@ -3,113 +3,14 @@ | ||||
| from __future__ import annotations | ||||
|  | ||||
| from dataclasses import dataclass | ||||
| from datetime import datetime, timedelta | ||||
| import io | ||||
| import mimetypes | ||||
| from pathlib import Path | ||||
| import tempfile | ||||
| from typing import Any | ||||
|  | ||||
| import voluptuous as vol | ||||
|  | ||||
| from homeassistant.components import camera, conversation, image, media_source | ||||
| from homeassistant.components.http.auth import async_sign_path | ||||
| from homeassistant.core import HomeAssistant, ServiceResponse, callback | ||||
| from homeassistant.core import HomeAssistant | ||||
| from homeassistant.exceptions import HomeAssistantError | ||||
| from homeassistant.helpers import llm | ||||
| from homeassistant.helpers.chat_session import ChatSession, async_get_chat_session | ||||
| from homeassistant.util import RE_SANITIZE_FILENAME, slugify | ||||
|  | ||||
| from .const import ( | ||||
|     DATA_COMPONENT, | ||||
|     DATA_MEDIA_SOURCE, | ||||
|     DATA_PREFERENCES, | ||||
|     DOMAIN, | ||||
|     IMAGE_DIR, | ||||
|     IMAGE_EXPIRY_TIME, | ||||
|     AITaskEntityFeature, | ||||
| ) | ||||
|  | ||||
|  | ||||
| def _save_camera_snapshot(image_data: camera.Image | image.Image) -> Path: | ||||
|     """Save camera snapshot to temp file.""" | ||||
|     with tempfile.NamedTemporaryFile( | ||||
|         mode="wb", | ||||
|         suffix=mimetypes.guess_extension(image_data.content_type, False), | ||||
|         delete=False, | ||||
|     ) as temp_file: | ||||
|         temp_file.write(image_data.content) | ||||
|         return Path(temp_file.name) | ||||
|  | ||||
|  | ||||
| async def _resolve_attachments( | ||||
|     hass: HomeAssistant, | ||||
|     session: ChatSession, | ||||
|     attachments: list[dict] | None = None, | ||||
| ) -> list[conversation.Attachment]: | ||||
|     """Resolve attachments for a task.""" | ||||
|     resolved_attachments: list[conversation.Attachment] = [] | ||||
|     created_files: list[Path] = [] | ||||
|  | ||||
|     for attachment in attachments or []: | ||||
|         media_content_id = attachment["media_content_id"] | ||||
|  | ||||
|         # Special case for certain media sources | ||||
|         for integration in camera, image: | ||||
|             media_source_prefix = f"media-source://{integration.DOMAIN}/" | ||||
|             if not media_content_id.startswith(media_source_prefix): | ||||
|                 continue | ||||
|  | ||||
|             # Extract entity_id from the media content ID | ||||
|             entity_id = media_content_id.removeprefix(media_source_prefix) | ||||
|  | ||||
|             # Get snapshot from entity | ||||
|             image_data = await integration.async_get_image(hass, entity_id) | ||||
|  | ||||
|             temp_filename = await hass.async_add_executor_job( | ||||
|                 _save_camera_snapshot, image_data | ||||
|             ) | ||||
|             created_files.append(temp_filename) | ||||
|  | ||||
|             resolved_attachments.append( | ||||
|                 conversation.Attachment( | ||||
|                     media_content_id=media_content_id, | ||||
|                     mime_type=image_data.content_type, | ||||
|                     path=temp_filename, | ||||
|                 ) | ||||
|             ) | ||||
|             break | ||||
|         else: | ||||
|             # Handle regular media sources | ||||
|             media = await media_source.async_resolve_media(hass, media_content_id, None) | ||||
|             if media.path is None: | ||||
|                 raise HomeAssistantError( | ||||
|                     "Only local attachments are currently supported" | ||||
|                 ) | ||||
|             resolved_attachments.append( | ||||
|                 conversation.Attachment( | ||||
|                     media_content_id=media_content_id, | ||||
|                     mime_type=media.mime_type, | ||||
|                     path=media.path, | ||||
|                 ) | ||||
|             ) | ||||
|  | ||||
|     if not created_files: | ||||
|         return resolved_attachments | ||||
|  | ||||
|     def cleanup_files() -> None: | ||||
|         """Cleanup temporary files.""" | ||||
|         for file in created_files: | ||||
|             file.unlink(missing_ok=True) | ||||
|  | ||||
|     @callback | ||||
|     def cleanup_files_callback() -> None: | ||||
|         """Cleanup temporary files.""" | ||||
|         hass.async_add_executor_job(cleanup_files) | ||||
|  | ||||
|     session.async_on_cleanup(cleanup_files_callback) | ||||
|  | ||||
|     return resolved_attachments | ||||
| from .const import DATA_COMPONENT, DATA_PREFERENCES, AITaskEntityFeature | ||||
|  | ||||
|  | ||||
| async def async_generate_data( | ||||
| @@ -119,10 +20,8 @@ async def async_generate_data( | ||||
|     entity_id: str | None = None, | ||||
|     instructions: str, | ||||
|     structure: vol.Schema | None = None, | ||||
|     attachments: list[dict] | None = None, | ||||
|     llm_api: llm.API | None = None, | ||||
| ) -> GenDataTaskResult: | ||||
|     """Run a data generation task in the AI Task integration.""" | ||||
|     """Run a task in the AI Task integration.""" | ||||
|     if entity_id is None: | ||||
|         entity_id = hass.data[DATA_PREFERENCES].gen_data_entity_id | ||||
|  | ||||
| @@ -138,109 +37,14 @@ async def async_generate_data( | ||||
|             f"AI Task entity {entity_id} does not support generating data" | ||||
|         ) | ||||
|  | ||||
|     if ( | ||||
|         attachments | ||||
|         and AITaskEntityFeature.SUPPORT_ATTACHMENTS not in entity.supported_features | ||||
|     ): | ||||
|         raise HomeAssistantError( | ||||
            f"AI Task entity {entity_id} does not support attachments"
        )
    return await entity.internal_async_generate_data(
|         GenDataTask( | ||||
|             name=task_name, | ||||
|             instructions=instructions, | ||||
|             structure=structure, | ||||
        )
    )
|  | ||||
|     with async_get_chat_session(hass) as session: | ||||
|         resolved_attachments = await _resolve_attachments(hass, session, attachments) | ||||
|  | ||||
|         return await entity.internal_async_generate_data( | ||||
|             session, | ||||
|             GenDataTask( | ||||
|                 name=task_name, | ||||
|                 instructions=instructions, | ||||
|                 structure=structure, | ||||
|                 attachments=resolved_attachments or None, | ||||
|                 llm_api=llm_api, | ||||
|             ), | ||||
|         ) | ||||
|  | ||||
|  | ||||
| async def async_generate_image( | ||||
|     hass: HomeAssistant, | ||||
|     *, | ||||
|     task_name: str, | ||||
|     entity_id: str | None = None, | ||||
|     instructions: str, | ||||
|     attachments: list[dict] | None = None, | ||||
| ) -> ServiceResponse: | ||||
|     """Run an image generation task in the AI Task integration.""" | ||||
|     if entity_id is None: | ||||
|         entity_id = hass.data[DATA_PREFERENCES].gen_image_entity_id | ||||
|  | ||||
|     if entity_id is None: | ||||
|         raise HomeAssistantError("No entity_id provided and no preferred entity set") | ||||
|  | ||||
|     entity = hass.data[DATA_COMPONENT].get_entity(entity_id) | ||||
|     if entity is None: | ||||
|         raise HomeAssistantError(f"AI Task entity {entity_id} not found") | ||||
|  | ||||
|     if AITaskEntityFeature.GENERATE_IMAGE not in entity.supported_features: | ||||
|         raise HomeAssistantError( | ||||
|             f"AI Task entity {entity_id} does not support generating images" | ||||
|         ) | ||||
|  | ||||
|     if ( | ||||
|         attachments | ||||
|         and AITaskEntityFeature.SUPPORT_ATTACHMENTS not in entity.supported_features | ||||
|     ): | ||||
|         raise HomeAssistantError( | ||||
|             f"AI Task entity {entity_id} does not support attachments" | ||||
|         ) | ||||
|  | ||||
|     with async_get_chat_session(hass) as session: | ||||
|         resolved_attachments = await _resolve_attachments(hass, session, attachments) | ||||
|  | ||||
|         task_result = await entity.internal_async_generate_image( | ||||
|             session, | ||||
|             GenImageTask( | ||||
|                 name=task_name, | ||||
|                 instructions=instructions, | ||||
|                 attachments=resolved_attachments or None, | ||||
|             ), | ||||
|         ) | ||||
|  | ||||
|     service_result = task_result.as_dict() | ||||
|     image_data = service_result.pop("image_data") | ||||
|     if service_result.get("revised_prompt") is None: | ||||
|         service_result["revised_prompt"] = instructions | ||||
|  | ||||
|     source = hass.data[DATA_MEDIA_SOURCE] | ||||
|  | ||||
|     current_time = datetime.now() | ||||
|     ext = mimetypes.guess_extension(task_result.mime_type, False) or ".png" | ||||
|     sanitized_task_name = RE_SANITIZE_FILENAME.sub("", slugify(task_name)) | ||||
|  | ||||
|     image_file = ImageData( | ||||
|         filename=f"{current_time.strftime('%Y-%m-%d_%H%M%S')}_{sanitized_task_name}{ext}", | ||||
|         file=io.BytesIO(image_data), | ||||
|         content_type=task_result.mime_type, | ||||
|     ) | ||||
|  | ||||
|     target_folder = media_source.MediaSourceItem.from_uri( | ||||
|         hass, f"media-source://{DOMAIN}/{IMAGE_DIR}", None | ||||
|     ) | ||||
|  | ||||
|     service_result["media_source_id"] = await source.async_upload_media( | ||||
|         target_folder, image_file | ||||
|     ) | ||||
|  | ||||
|     item = media_source.MediaSourceItem.from_uri( | ||||
|         hass, service_result["media_source_id"], None | ||||
|     ) | ||||
|     service_result["url"] = async_sign_path( | ||||
|         hass, | ||||
|         (await source.async_resolve_media(item)).url, | ||||
|         timedelta(seconds=IMAGE_EXPIRY_TIME), | ||||
|     ) | ||||
|  | ||||
|     return service_result | ||||
|  | ||||
|  | ||||
| @dataclass(slots=True) | ||||
| class GenDataTask: | ||||
| @@ -255,12 +59,6 @@ class GenDataTask: | ||||
|     structure: vol.Schema | None = None | ||||
|     """Optional structure for the data to be generated.""" | ||||
|  | ||||
|     attachments: list[conversation.Attachment] | None = None | ||||
|     """List of attachments to go along the instructions.""" | ||||
|  | ||||
|     llm_api: llm.API | None = None | ||||
|     """API to provide to the LLM.""" | ||||
|  | ||||
|     def __str__(self) -> str: | ||||
|         """Return task as a string.""" | ||||
|         return f"<GenDataTask {self.name}: {id(self)}>" | ||||
| @@ -282,68 +80,3 @@ class GenDataTaskResult: | ||||
|             "conversation_id": self.conversation_id, | ||||
|             "data": self.data, | ||||
|         } | ||||
|  | ||||
|  | ||||
| @dataclass(slots=True) | ||||
| class GenImageTask: | ||||
|     """Gen image task to be processed.""" | ||||
|  | ||||
|     name: str | ||||
|     """Name of the task.""" | ||||
|  | ||||
|     instructions: str | ||||
|     """Instructions on what needs to be done.""" | ||||
|  | ||||
|     attachments: list[conversation.Attachment] | None = None | ||||
|     """List of attachments to go along the instructions.""" | ||||
|  | ||||
|     def __str__(self) -> str: | ||||
|         """Return task as a string.""" | ||||
|         return f"<GenImageTask {self.name}: {id(self)}>" | ||||
|  | ||||
|  | ||||
| @dataclass(slots=True) | ||||
| class GenImageTaskResult: | ||||
|     """Result of gen image task.""" | ||||
|  | ||||
|     image_data: bytes | ||||
|     """Raw image data generated by the model.""" | ||||
|  | ||||
|     conversation_id: str | ||||
|     """Unique identifier for the conversation.""" | ||||
|  | ||||
|     mime_type: str | ||||
|     """MIME type of the generated image.""" | ||||
|  | ||||
|     width: int | None = None | ||||
|     """Width of the generated image, if available.""" | ||||
|  | ||||
|     height: int | None = None | ||||
|     """Height of the generated image, if available.""" | ||||
|  | ||||
|     model: str | None = None | ||||
|     """Model used to generate the image, if available.""" | ||||
|  | ||||
|     revised_prompt: str | None = None | ||||
|     """Revised prompt used to generate the image, if applicable.""" | ||||
|  | ||||
|     def as_dict(self) -> dict[str, Any]: | ||||
|         """Return result as a dict.""" | ||||
|         return { | ||||
|             "image_data": self.image_data, | ||||
|             "conversation_id": self.conversation_id, | ||||
|             "mime_type": self.mime_type, | ||||
|             "width": self.width, | ||||
|             "height": self.height, | ||||
|             "model": self.model, | ||||
|             "revised_prompt": self.revised_prompt, | ||||
|         } | ||||
|  | ||||
|  | ||||
| @dataclass(slots=True) | ||||
| class ImageData: | ||||
|     """Implementation of media_source.local_source.UploadedFile protocol.""" | ||||
|  | ||||
|     filename: str | ||||
|     file: io.IOBase | ||||
|     content_type: str | ||||
|   | ||||
| @@ -9,17 +9,14 @@ | ||||
|       } | ||||
|     }, | ||||
|     "number": { | ||||
      "display_brightness": {
        "default": "mdi:brightness-percent"
      },
      "led_bar_brightness": {
        "default": "mdi:brightness-percent"
      }
|     }, | ||||
|     "select": { | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "default": "mdi:molecule-co2" | ||||
|       }, | ||||
|       "configuration_control": { | ||||
|         "default": "mdi:cloud-cog" | ||||
|       }, | ||||
| @@ -34,11 +31,23 @@ | ||||
|       }, | ||||
|       "voc_index_learning_time_offset": { | ||||
|         "default": "mdi:clock-outline" | ||||
|       }, | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "default": "mdi:molecule-co2" | ||||
|       } | ||||
|     }, | ||||
|     "sensor": { | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "default": "mdi:molecule-co2" | ||||
|       "total_volatile_organic_component_index": { | ||||
|         "default": "mdi:molecule" | ||||
|       }, | ||||
|       "nitrogen_index": { | ||||
|         "default": "mdi:molecule" | ||||
|       }, | ||||
|       "pm003_count": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "led_bar_brightness": { | ||||
|         "default": "mdi:brightness-percent" | ||||
|       }, | ||||
|       "display_brightness": { | ||||
|         "default": "mdi:brightness-percent" | ||||
| @@ -46,26 +55,17 @@ | ||||
|       "display_temperature_unit": { | ||||
|         "default": "mdi:thermometer-lines" | ||||
|       }, | ||||
|       "led_bar_brightness": { | ||||
|         "default": "mdi:brightness-percent" | ||||
|       }, | ||||
|       "led_bar_mode": { | ||||
|         "default": "mdi:led-strip" | ||||
|       }, | ||||
|       "nitrogen_index": { | ||||
|         "default": "mdi:molecule" | ||||
|       }, | ||||
|       "nox_index_learning_time_offset": { | ||||
|         "default": "mdi:clock-outline" | ||||
|       }, | ||||
|       "pm003_count": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "total_volatile_organic_component_index": { | ||||
|         "default": "mdi:molecule" | ||||
|       }, | ||||
|       "voc_index_learning_time_offset": { | ||||
|         "default": "mdi:clock-outline" | ||||
|       }, | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "default": "mdi:molecule-co2" | ||||
|       } | ||||
|     }, | ||||
|     "switch": { | ||||
|   | ||||
| @@ -6,7 +6,6 @@ | ||||
|   "documentation": "https://www.home-assistant.io/integrations/airgradient", | ||||
|   "integration_type": "device", | ||||
|   "iot_class": "local_polling", | ||||
|   "quality_scale": "platinum", | ||||
|   "requirements": ["airgradient==0.9.2"], | ||||
|   "zeroconf": ["_airgradient._tcp.local."] | ||||
| } | ||||
|   | ||||
| @@ -14,9 +14,9 @@ rules: | ||||
|     status: exempt | ||||
|     comment: | | ||||
|       This integration does not provide additional actions. | ||||
|   docs-high-level-description: done | ||||
|   docs-installation-instructions: done | ||||
|   docs-removal-instructions: done | ||||
|   docs-high-level-description: todo | ||||
|   docs-installation-instructions: todo | ||||
|   docs-removal-instructions: todo | ||||
|   entity-event-setup: | ||||
|     status: exempt | ||||
|     comment: | | ||||
| @@ -34,7 +34,7 @@ rules: | ||||
|   docs-configuration-parameters: | ||||
|     status: exempt | ||||
|     comment: No options to configure | ||||
|   docs-installation-parameters: done | ||||
|   docs-installation-parameters: todo | ||||
|   entity-unavailable: done | ||||
|   integration-owner: done | ||||
|   log-when-unavailable: done | ||||
| @@ -43,19 +43,23 @@ rules: | ||||
|     status: exempt | ||||
|     comment: | | ||||
|       This integration does not require authentication. | ||||
|   test-coverage: done | ||||
|   test-coverage: todo | ||||
|   # Gold | ||||
|   devices: done | ||||
|   diagnostics: done | ||||
|   discovery-update-info: done | ||||
|   discovery: done | ||||
|   docs-data-update: done | ||||
|   docs-examples: done | ||||
|   docs-known-limitations: done | ||||
|   docs-supported-devices: done | ||||
|   docs-supported-functions: done | ||||
|   docs-troubleshooting: done | ||||
|   docs-use-cases: done | ||||
|   discovery-update-info: | ||||
|     status: todo | ||||
|     comment: DHCP is still possible | ||||
|   discovery: | ||||
|     status: todo | ||||
|     comment: DHCP is still possible | ||||
|   docs-data-update: todo | ||||
|   docs-examples: todo | ||||
|   docs-known-limitations: todo | ||||
|   docs-supported-devices: todo | ||||
|   docs-supported-functions: todo | ||||
|   docs-troubleshooting: todo | ||||
|   docs-use-cases: todo | ||||
|   dynamic-devices: | ||||
|     status: exempt | ||||
|     comment: | | ||||
|   | ||||
| @@ -1,5 +1,19 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "flow_title": "{model}", | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "host": "[%key:common::config_flow::data::host%]" | ||||
|         }, | ||||
|         "data_description": { | ||||
|           "host": "The hostname or IP address of the Airgradient device." | ||||
|         } | ||||
|       }, | ||||
|       "discovery_confirm": { | ||||
|         "description": "Do you want to set up {model}?" | ||||
|       } | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", | ||||
|       "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]", | ||||
| @@ -10,20 +24,6 @@ | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", | ||||
|       "unknown": "[%key:common::config_flow::error::unknown%]" | ||||
|     }, | ||||
|     "flow_title": "{model}", | ||||
|     "step": { | ||||
|       "discovery_confirm": { | ||||
|         "description": "Do you want to set up {model}?" | ||||
|       }, | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "host": "[%key:common::config_flow::data::host%]" | ||||
|         }, | ||||
|         "data_description": { | ||||
|           "host": "The hostname or IP address of the Airgradient device." | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "entity": { | ||||
| @@ -36,25 +36,14 @@ | ||||
|       } | ||||
|     }, | ||||
|     "number": { | ||||
|       "display_brightness": { | ||||
|         "name": "Display brightness" | ||||
|       }, | ||||
|       "led_bar_brightness": { | ||||
|         "name": "LED bar brightness" | ||||
|       }, | ||||
|       "display_brightness": { | ||||
|         "name": "Display brightness" | ||||
|       } | ||||
|     }, | ||||
|     "select": { | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "name": "CO2 automatic baseline duration", | ||||
|         "state": { | ||||
|           "0": "[%key:common::state::off%]", | ||||
|           "1": "1 day", | ||||
|           "8": "8 days", | ||||
|           "30": "30 days", | ||||
|           "90": "90 days", | ||||
|           "180": "180 days" | ||||
|         } | ||||
|       }, | ||||
|       "configuration_control": { | ||||
|         "name": "Configuration source", | ||||
|         "state": { | ||||
| @@ -62,13 +51,6 @@ | ||||
|           "local": "Local" | ||||
|         } | ||||
|       }, | ||||
|       "display_pm_standard": { | ||||
|         "name": "Display PM standard", | ||||
|         "state": { | ||||
|           "ugm3": "μg/m³", | ||||
|           "us_aqi": "US AQI" | ||||
|         } | ||||
|       }, | ||||
|       "display_temperature_unit": { | ||||
|         "name": "Display temperature unit", | ||||
|         "state": { | ||||
| @@ -76,11 +58,18 @@ | ||||
|           "f": "Fahrenheit" | ||||
|         } | ||||
|       }, | ||||
|       "display_pm_standard": { | ||||
|         "name": "Display PM standard", | ||||
|         "state": { | ||||
|           "ugm3": "µg/m³", | ||||
|           "us_aqi": "US AQI" | ||||
|         } | ||||
|       }, | ||||
|       "led_bar_mode": { | ||||
|         "name": "LED bar mode", | ||||
|         "state": { | ||||
|           "co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]", | ||||
|           "off": "[%key:common::state::off%]", | ||||
|           "co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]", | ||||
|           "pm": "Particulate matter" | ||||
|         } | ||||
|       }, | ||||
| @@ -103,14 +92,37 @@ | ||||
|           "360": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::state::360%]", | ||||
|           "720": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::state::720%]" | ||||
|         } | ||||
|       }, | ||||
|       "co2_automatic_baseline_calibration": { | ||||
|         "name": "CO2 automatic baseline duration", | ||||
|         "state": { | ||||
|           "1": "1 day", | ||||
|           "8": "8 days", | ||||
|           "30": "30 days", | ||||
|           "90": "90 days", | ||||
|           "180": "180 days", | ||||
|           "0": "[%key:common::state::off%]" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "sensor": { | ||||
|       "co2_automatic_baseline_calibration_days": { | ||||
|         "name": "Carbon dioxide automatic baseline calibration" | ||||
|       "total_volatile_organic_component_index": { | ||||
|         "name": "VOC index" | ||||
|       }, | ||||
|       "display_brightness": { | ||||
|         "name": "[%key:component::airgradient::entity::number::display_brightness::name%]" | ||||
|       "nitrogen_index": { | ||||
|         "name": "NOx index" | ||||
|       }, | ||||
|       "pm003_count": { | ||||
|         "name": "PM0.3" | ||||
|       }, | ||||
|       "raw_total_volatile_organic_component": { | ||||
|         "name": "Raw VOC" | ||||
|       }, | ||||
|       "raw_nitrogen": { | ||||
|         "name": "Raw NOx" | ||||
|       }, | ||||
|       "raw_pm02": { | ||||
|         "name": "Raw PM2.5" | ||||
|       }, | ||||
|       "display_pm_standard": { | ||||
|         "name": "[%key:component::airgradient::entity::select::display_pm_standard::name%]", | ||||
| @@ -119,6 +131,26 @@ | ||||
|           "us_aqi": "[%key:component::airgradient::entity::select::display_pm_standard::state::us_aqi%]" | ||||
|         } | ||||
|       }, | ||||
|       "co2_automatic_baseline_calibration_days": { | ||||
|         "name": "Carbon dioxide automatic baseline calibration" | ||||
|       }, | ||||
|       "nox_learning_offset": { | ||||
|         "name": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::name%]" | ||||
|       }, | ||||
|       "tvoc_learning_offset": { | ||||
|         "name": "[%key:component::airgradient::entity::select::voc_index_learning_time_offset::name%]" | ||||
|       }, | ||||
|       "led_bar_mode": { | ||||
|         "name": "[%key:component::airgradient::entity::select::led_bar_mode::name%]", | ||||
|         "state": { | ||||
|           "off": "[%key:common::state::off%]", | ||||
|           "co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]", | ||||
|           "pm": "[%key:component::airgradient::entity::select::led_bar_mode::state::pm%]" | ||||
|         } | ||||
|       }, | ||||
|       "led_bar_brightness": { | ||||
|         "name": "[%key:component::airgradient::entity::number::led_bar_brightness::name%]" | ||||
|       }, | ||||
|       "display_temperature_unit": { | ||||
|         "name": "[%key:component::airgradient::entity::select::display_temperature_unit::name%]", | ||||
|         "state": { | ||||
| @@ -126,40 +158,8 @@ | ||||
|           "f": "[%key:component::airgradient::entity::select::display_temperature_unit::state::f%]" | ||||
|         } | ||||
|       }, | ||||
|       "led_bar_brightness": { | ||||
|         "name": "[%key:component::airgradient::entity::number::led_bar_brightness::name%]" | ||||
|       }, | ||||
|       "led_bar_mode": { | ||||
|         "name": "[%key:component::airgradient::entity::select::led_bar_mode::name%]", | ||||
|         "state": { | ||||
|           "co2": "[%key:component::sensor::entity_component::carbon_dioxide::name%]", | ||||
|           "off": "[%key:common::state::off%]", | ||||
|           "pm": "[%key:component::airgradient::entity::select::led_bar_mode::state::pm%]" | ||||
|         } | ||||
|       }, | ||||
|       "nitrogen_index": { | ||||
|         "name": "NOx index" | ||||
|       }, | ||||
|       "nox_learning_offset": { | ||||
|         "name": "[%key:component::airgradient::entity::select::nox_index_learning_time_offset::name%]" | ||||
|       }, | ||||
|       "pm003_count": { | ||||
|         "name": "PM0.3" | ||||
|       }, | ||||
|       "raw_nitrogen": { | ||||
|         "name": "Raw NOx" | ||||
|       }, | ||||
|       "raw_pm02": { | ||||
|         "name": "Raw PM2.5" | ||||
|       }, | ||||
|       "raw_total_volatile_organic_component": { | ||||
|         "name": "Raw VOC" | ||||
|       }, | ||||
|       "total_volatile_organic_component_index": { | ||||
|         "name": "VOC index" | ||||
|       }, | ||||
|       "tvoc_learning_offset": { | ||||
|         "name": "[%key:component::airgradient::entity::select::voc_index_learning_time_offset::name%]" | ||||
|       "display_brightness": { | ||||
|         "name": "[%key:component::airgradient::entity::number::display_brightness::name%]" | ||||
|       } | ||||
|     }, | ||||
|     "switch": { | ||||
|   | ||||
| @@ -1,9 +1,7 @@ | ||||
| """Airgradient Update platform.""" | ||||
|  | ||||
| from datetime import timedelta | ||||
| import logging | ||||
|  | ||||
| from airgradient import AirGradientConnectionError | ||||
| from propcache.api import cached_property | ||||
|  | ||||
| from homeassistant.components.update import UpdateDeviceClass, UpdateEntity | ||||
| @@ -15,7 +13,6 @@ from .entity import AirGradientEntity | ||||
|  | ||||
| PARALLEL_UPDATES = 1 | ||||
| SCAN_INTERVAL = timedelta(hours=1) | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
|  | ||||
|  | ||||
| async def async_setup_entry( | ||||
| @@ -34,7 +31,6 @@ class AirGradientUpdate(AirGradientEntity, UpdateEntity): | ||||
|     """Representation of Airgradient Update.""" | ||||
|  | ||||
|     _attr_device_class = UpdateDeviceClass.FIRMWARE | ||||
|     _server_unreachable_logged = False | ||||
|  | ||||
|     def __init__(self, coordinator: AirGradientCoordinator) -> None: | ||||
|         """Initialize the entity.""" | ||||
| @@ -51,27 +47,10 @@ class AirGradientUpdate(AirGradientEntity, UpdateEntity): | ||||
|         """Return the installed version of the entity.""" | ||||
|         return self.coordinator.data.measures.firmware_version | ||||
|  | ||||
|     @property | ||||
|     def available(self) -> bool: | ||||
|         """Return if entity is available.""" | ||||
|         return super().available and self._attr_available | ||||
|  | ||||
|     async def async_update(self) -> None: | ||||
|         """Update the entity.""" | ||||
|         try: | ||||
|             self._attr_latest_version = ( | ||||
|                 await self.coordinator.client.get_latest_firmware_version( | ||||
|                     self.coordinator.serial_number | ||||
|                 ) | ||||
|             ) | ||||
|         except AirGradientConnectionError: | ||||
|             self._attr_latest_version = None | ||||
|             self._attr_available = False | ||||
|             if not self._server_unreachable_logged: | ||||
|                 _LOGGER.error( | ||||
|                     "Unable to connect to AirGradient server to check for updates" | ||||
|                 ) | ||||
|                 self._server_unreachable_logged = True | ||||
|         else: | ||||
|             self._server_unreachable_logged = False | ||||
|             self._attr_available = True | ||||
|         self._attr_latest_version = ( | ||||
|             await self.coordinator.client.get_latest_firmware_version( | ||||
|                 self.coordinator.serial_number | ||||
|             ) | ||||
|         ) | ||||
|   | ||||
| @@ -18,10 +18,6 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession | ||||
|  | ||||
| from .const import CONF_USE_NEAREST, DOMAIN, NO_AIRLY_SENSORS | ||||
|  | ||||
| DESCRIPTION_PLACEHOLDERS = { | ||||
|     "developer_registration_url": "https://developer.airly.eu/register", | ||||
| } | ||||
|  | ||||
|  | ||||
| class AirlyFlowHandler(ConfigFlow, domain=DOMAIN): | ||||
|     """Config flow for Airly.""" | ||||
| @@ -89,7 +85,6 @@ class AirlyFlowHandler(ConfigFlow, domain=DOMAIN): | ||||
|                 } | ||||
|             ), | ||||
|             errors=errors, | ||||
|             description_placeholders=DESCRIPTION_PLACEHOLDERS, | ||||
|         ) | ||||
|  | ||||
|  | ||||
|   | ||||
| @@ -1,23 +1,30 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "description": "To generate API key go to https://developer.airly.eu/register", | ||||
|         "data": { | ||||
|           "name": "[%key:common::config_flow::data::name%]", | ||||
|           "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
|           "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
|           "longitude": "[%key:common::config_flow::data::longitude%]" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "error": { | ||||
|       "wrong_location": "No Airly measuring stations in this area.", | ||||
|       "invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_location%]", | ||||
|       "wrong_location": "[%key:component::airly::config::error::wrong_location%]" | ||||
|     }, | ||||
|     "error": { | ||||
|       "invalid_api_key": "[%key:common::config_flow::error::invalid_api_key%]", | ||||
|       "wrong_location": "No Airly measuring stations in this area." | ||||
|     }, | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "data": { | ||||
|           "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
|           "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
|           "longitude": "[%key:common::config_flow::data::longitude%]", | ||||
|           "name": "[%key:common::config_flow::data::name%]" | ||||
|         }, | ||||
|         "description": "To generate API key go to {developer_registration_url}" | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "system_health": { | ||||
|     "info": { | ||||
|       "can_reach_server": "Reach Airly server", | ||||
|       "requests_remaining": "Remaining allowed requests", | ||||
|       "requests_per_day": "Allowed requests per day" | ||||
|     } | ||||
|   }, | ||||
|   "entity": { | ||||
| @@ -31,18 +38,11 @@ | ||||
|     } | ||||
|   }, | ||||
|   "exceptions": { | ||||
|     "no_station": { | ||||
|       "message": "An error occurred while retrieving data from the Airly API for {entry}: no measuring stations in this area" | ||||
|     }, | ||||
|     "update_error": { | ||||
|       "message": "An error occurred while retrieving data from the Airly API for {entry}: {error}" | ||||
|     } | ||||
|   }, | ||||
|   "system_health": { | ||||
|     "info": { | ||||
|       "can_reach_server": "Reach Airly server", | ||||
|       "requests_per_day": "Allowed requests per day", | ||||
|       "requests_remaining": "Remaining allowed requests" | ||||
|     }, | ||||
|     "no_station": { | ||||
|       "message": "An error occurred while retrieving data from the Airly API for {entry}: no measuring stations in this area" | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -45,6 +45,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirNowConfigEntry) -> bo | ||||
|     # Store Entity and Initialize Platforms | ||||
|     entry.runtime_data = coordinator | ||||
|  | ||||
|     # Listen for option changes | ||||
|     entry.async_on_unload(entry.add_update_listener(update_listener)) | ||||
|  | ||||
|     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS) | ||||
|  | ||||
|     # Clean up unused device entries with no entities | ||||
| @@ -85,3 +88,8 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: | ||||
| async def async_unload_entry(hass: HomeAssistant, entry: AirNowConfigEntry) -> bool: | ||||
|     """Unload a config entry.""" | ||||
|     return await hass.config_entries.async_unload_platforms(entry, PLATFORMS) | ||||
|  | ||||
|  | ||||
| async def update_listener(hass: HomeAssistant, entry: ConfigEntry) -> None: | ||||
|     """Handle options update.""" | ||||
|     await hass.config_entries.async_reload(entry.entry_id) | ||||
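The listener registered above reloads the config entry whenever its options change, and `async_on_unload` disposes of it when the entry unloads. The wiring can be sketched with plain Python stand-ins (`FakeEntry` and the synchronous calls are illustrative, not the Home Assistant API):

```python
class FakeEntry:
    """Illustrative stand-in for a config entry with update listeners."""

    def __init__(self) -> None:
        self._listeners = []
        self.reload_count = 0

    def add_update_listener(self, listener):
        """Register a listener; return the unsubscribe callable."""
        self._listeners.append(listener)
        return lambda: self._listeners.remove(listener)

    def notify_options_updated(self) -> None:
        """Fire all listeners, as an options-flow save would."""
        for listener in list(self._listeners):
            listener(self)


def update_listener(entry: FakeEntry) -> None:
    """Handle options update by reloading the entry (sketch)."""
    entry.reload_count += 1


entry = FakeEntry()
unsub = entry.add_update_listener(update_listener)  # kept for teardown
entry.notify_options_updated()
```

Holding on to the unsubscribe callable mirrors passing it to `entry.async_on_unload`: once it is called, later option changes no longer trigger a reload.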
|   | ||||
| @@ -13,7 +13,7 @@ from homeassistant.config_entries import ( | ||||
|     ConfigEntry, | ||||
|     ConfigFlow, | ||||
|     ConfigFlowResult, | ||||
|     OptionsFlowWithReload, | ||||
|     OptionsFlow, | ||||
| ) | ||||
| from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, CONF_RADIUS | ||||
| from homeassistant.core import HomeAssistant, callback | ||||
| @@ -26,10 +26,6 @@ from .const import DOMAIN | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
|  | ||||
|  | ||||
| # Documentation URL for API key generation | ||||
| _API_KEY_URL = "https://docs.airnowapi.org/account/request/" | ||||
|  | ||||
|  | ||||
| async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> bool: | ||||
|     """Validate the user input allows us to connect. | ||||
|  | ||||
| @@ -118,7 +114,6 @@ class AirNowConfigFlow(ConfigFlow, domain=DOMAIN): | ||||
|                     ), | ||||
|                 } | ||||
|             ), | ||||
|             description_placeholders={"api_key_url": _API_KEY_URL}, | ||||
|             errors=errors, | ||||
|         ) | ||||
|  | ||||
| @@ -131,7 +126,7 @@ class AirNowConfigFlow(ConfigFlow, domain=DOMAIN): | ||||
|         return AirNowOptionsFlowHandler() | ||||
|  | ||||
|  | ||||
| class AirNowOptionsFlowHandler(OptionsFlowWithReload): | ||||
| class AirNowOptionsFlowHandler(OptionsFlow): | ||||
|     """Handle an options flow for AirNow.""" | ||||
|  | ||||
|     async def async_step_init( | ||||
|   | ||||
| @@ -4,15 +4,15 @@ | ||||
|       "aqi": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "o3": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "pm10": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "pm25": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "o3": { | ||||
|         "default": "mdi:blur" | ||||
|       }, | ||||
|       "station": { | ||||
|         "default": "mdi:blur" | ||||
|       } | ||||
|   | ||||
| @@ -1,7 +1,15 @@ | ||||
| { | ||||
|   "config": { | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     "step": { | ||||
|       "user": { | ||||
|         "description": "To generate API key go to https://docs.airnowapi.org/account/request/", | ||||
|         "data": { | ||||
|           "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
|           "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
|           "longitude": "[%key:common::config_flow::data::longitude%]", | ||||
|           "radius": "Station radius (miles; optional)" | ||||
|         } | ||||
|       } | ||||
|     }, | ||||
|     "error": { | ||||
|       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]", | ||||
| @@ -9,15 +17,16 @@ | ||||
|       "invalid_location": "No results found for that location, try changing the location or station radius.", | ||||
|       "unknown": "[%key:common::config_flow::error::unknown%]" | ||||
|     }, | ||||
|     "abort": { | ||||
|       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]" | ||||
|     } | ||||
|   }, | ||||
|   "options": { | ||||
|     "step": { | ||||
|       "user": { | ||||
|       "init": { | ||||
|         "data": { | ||||
|           "api_key": "[%key:common::config_flow::data::api_key%]", | ||||
|           "latitude": "[%key:common::config_flow::data::latitude%]", | ||||
|           "longitude": "[%key:common::config_flow::data::longitude%]", | ||||
|           "radius": "Station radius (miles; optional)" | ||||
|         }, | ||||
|         "description": "To generate API key go to {api_key_url}" | ||||
|           "radius": "Station radius (miles)" | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
| @@ -34,14 +43,5 @@ | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   }, | ||||
|   "options": { | ||||
|     "step": { | ||||
|       "init": { | ||||
|         "data": { | ||||
|           "radius": "Station radius (miles)" | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|   | ||||
| @@ -1,132 +0,0 @@ | ||||
| """The Ubiquiti airOS integration.""" | ||||
|  | ||||
| from __future__ import annotations | ||||
|  | ||||
| import logging | ||||
|  | ||||
| from airos.airos8 import AirOS8 | ||||
|  | ||||
| from homeassistant.const import ( | ||||
|     CONF_HOST, | ||||
|     CONF_PASSWORD, | ||||
|     CONF_SSL, | ||||
|     CONF_USERNAME, | ||||
|     CONF_VERIFY_SSL, | ||||
|     Platform, | ||||
| ) | ||||
| from homeassistant.core import HomeAssistant, callback | ||||
| from homeassistant.helpers import device_registry as dr, entity_registry as er | ||||
| from homeassistant.helpers.aiohttp_client import async_get_clientsession | ||||
|  | ||||
| from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS | ||||
| from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator | ||||
|  | ||||
| _PLATFORMS: list[Platform] = [ | ||||
|     Platform.BINARY_SENSOR, | ||||
|     Platform.SENSOR, | ||||
| ] | ||||
|  | ||||
| _LOGGER = logging.getLogger(__name__) | ||||
|  | ||||
|  | ||||
| async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool: | ||||
|     """Set up Ubiquiti airOS from a config entry.""" | ||||
|  | ||||
|     # By default airOS 8 comes with self-signed SSL certificates, | ||||
|     # with no option in the web UI to change or upload a custom certificate. | ||||
|     session = async_get_clientsession( | ||||
|         hass, verify_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL] | ||||
|     ) | ||||
|  | ||||
|     airos_device = AirOS8( | ||||
|         host=entry.data[CONF_HOST], | ||||
|         username=entry.data[CONF_USERNAME], | ||||
|         password=entry.data[CONF_PASSWORD], | ||||
|         session=session, | ||||
|         use_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL], | ||||
|     ) | ||||
|  | ||||
|     coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device) | ||||
|     await coordinator.async_config_entry_first_refresh() | ||||
|  | ||||
|     entry.runtime_data = coordinator | ||||
|  | ||||
|     await hass.config_entries.async_forward_entry_setups(entry, _PLATFORMS) | ||||
|  | ||||
|     return True | ||||
|  | ||||
|  | ||||
| async def async_migrate_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool: | ||||
|     """Migrate old config entry.""" | ||||
|  | ||||
|     # This means the user has downgraded from a future version | ||||
|     if entry.version > 2: | ||||
|         return False | ||||
|  | ||||
|     # 1.1 Migrate config_entry to add advanced ssl settings | ||||
|     if entry.version == 1 and entry.minor_version == 1: | ||||
|         new_minor_version = 2 | ||||
|         new_data = {**entry.data} | ||||
|         advanced_data = { | ||||
|             CONF_SSL: DEFAULT_SSL, | ||||
|             CONF_VERIFY_SSL: DEFAULT_VERIFY_SSL, | ||||
|         } | ||||
|         new_data[SECTION_ADVANCED_SETTINGS] = advanced_data | ||||
|  | ||||
|         hass.config_entries.async_update_entry( | ||||
|             entry, | ||||
|             data=new_data, | ||||
|             minor_version=new_minor_version, | ||||
|         ) | ||||
|  | ||||
|     # 2.1 Migrate binary_sensor entity unique_id from device_id to mac_address | ||||
|     #     Step 1 - migrate binary_sensor entity unique_id | ||||
|     #     Step 2 - migrate device entity identifier | ||||
|     if entry.version == 1: | ||||
|         new_version = 2 | ||||
|         new_minor_version = 1 | ||||
|  | ||||
|         mac_address = dr.format_mac(entry.unique_id) | ||||
|  | ||||
|         device_registry = dr.async_get(hass) | ||||
|         if device_entry := device_registry.async_get_device( | ||||
|             connections={(dr.CONNECTION_NETWORK_MAC, mac_address)} | ||||
|         ): | ||||
|             old_device_id = next( | ||||
|                 ( | ||||
|                     device_id | ||||
|                     for domain, device_id in device_entry.identifiers | ||||
|                     if domain == DOMAIN | ||||
|                 ), | ||||
|             ) | ||||
|  | ||||
|             @callback | ||||
|             def update_unique_id( | ||||
|                 entity_entry: er.RegistryEntry, | ||||
|             ) -> dict[str, str] | None: | ||||
|                 """Update unique id from device_id to mac address.""" | ||||
|                 if old_device_id and entity_entry.unique_id.startswith(old_device_id): | ||||
|                     suffix = entity_entry.unique_id.removeprefix(old_device_id) | ||||
|                     new_unique_id = f"{mac_address}{suffix}" | ||||
|                     return {"new_unique_id": new_unique_id} | ||||
|                 return None | ||||
|  | ||||
|             await er.async_migrate_entries(hass, entry.entry_id, update_unique_id) | ||||
|  | ||||
|             new_identifiers = device_entry.identifiers.copy() | ||||
|             new_identifiers.discard((DOMAIN, old_device_id)) | ||||
|             new_identifiers.add((DOMAIN, mac_address)) | ||||
|             device_registry.async_update_device( | ||||
|                 device_entry.id, new_identifiers=new_identifiers | ||||
|             ) | ||||
|  | ||||
|         hass.config_entries.async_update_entry( | ||||
|             entry, version=new_version, minor_version=new_minor_version | ||||
|         ) | ||||
|  | ||||
|     return True | ||||
|  | ||||
|  | ||||
| async def async_unload_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> bool: | ||||
|     """Unload a config entry.""" | ||||
|     return await hass.config_entries.async_unload_platforms(entry, _PLATFORMS) | ||||
Some files were not shown because too many files have changed in this diff.