Compare commits


3 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Samuel Sieb | d014a95674 | lint | 2025-04-11 20:50:32 -07:00 |
| Samuel Sieb | b6dce21154 | check if the select is configured | 2025-04-11 20:48:24 -07:00 |
| Samuel Sieb | 865157afba | [ld2420] only use select if available | 2025-04-11 20:41:54 -07:00 |
2521 changed files with 38722 additions and 123831 deletions

View File

@@ -1,225 +0,0 @@
# ESPHome AI Collaboration Guide
This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.
## 1. Project Overview & Purpose
* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.
## 2. Core Technologies & Stack
* **Languages:** Python (>=3.11), C++ (gnu++20)
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
* **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
* **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Manager(s):** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for native API), MQTT, HTTP.
## 3. Architectural Patterns
* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.
* **Directory Structure Philosophy:**
* `/esphome`: Contains the core Python source code for the ESPHome application.
* `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
* `/tests`: Contains all unit and integration tests for the Python code.
* `/docker`: Contains Docker-related files for building and running ESPHome in a container.
* `/script`: Contains helper scripts for development and maintenance.
* **Core Architectural Components:**
1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python to C++ code generation, template processing, and build flag management.
3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.
* **Platform Support:**
1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (Original, C2, C3, C5, C6, H2, P4, S2, S3) with ESP-IDF framework. Arduino framework supports only a subset of the variants (Original, C3, S2, S3).
2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.
## 4. Coding Conventions & Style Guide
* **Formatting:**
* **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
* **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.
* **Naming Conventions:**
* **Python:** Follows PEP 8. Use clear, descriptive names following snake_case.
* **C++:** Follows the Google C++ Style Guide.
* **Component Structure:**
* **Standard Files:**
```
components/[component_name]/
├── __init__.py # Component configuration schema and code generation
├── [component].h # C++ header file (if needed)
├── [component].cpp # C++ implementation (if needed)
└── [platform]/ # Platform-specific implementations
├── __init__.py # Platform-specific configuration
├── [platform].h # Platform C++ header
└── [platform].cpp # Platform C++ implementation
```
* **Component Metadata** (see the sketch after this list):
- `DEPENDENCIES`: List of required components
- `AUTO_LOAD`: Components to automatically load
- `CONFLICTS_WITH`: Incompatible components
- `CODEOWNERS`: GitHub usernames responsible for maintenance
- `MULTI_CONF`: Whether multiple instances are allowed
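A minimal sketch of how these constants might appear in a component's `__init__.py` (the constant names come from the list above; all values are hypothetical):
```python
# components/my_component/__init__.py (hypothetical example)
DEPENDENCIES = ["i2c"]  # components that must be present in the config
AUTO_LOAD = ["sensor"]  # components loaded automatically with this one
CONFLICTS_WITH = ["some_other_component"]  # components that cannot coexist
CODEOWNERS = ["@your-github-handle"]  # maintainers notified about issues/PRs
MULTI_CONF = True  # allow more than one instance of this component
```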
* **Code Generation & Common Patterns:**
* **Configuration Schema Pattern:**
```python
import esphome.codegen as cg
import esphome.config_validation as cv
from esphome.const import CONF_KEY, CONF_ID
CONF_PARAM = "param" # A constant that does not yet exist in esphome/const.py
my_component_ns = cg.esphome_ns.namespace("my_component")
MyComponent = my_component_ns.class_("MyComponent", cg.Component)
CONFIG_SCHEMA = cv.Schema({
cv.GenerateID(): cv.declare_id(MyComponent),
cv.Required(CONF_KEY): cv.string,
cv.Optional(CONF_PARAM, default=42): cv.int_,
}).extend(cv.COMPONENT_SCHEMA)
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)
cg.add(var.set_key(config[CONF_KEY]))
cg.add(var.set_param(config[CONF_PARAM]))
```
* **C++ Class Pattern:**
```cpp
namespace esphome {
namespace my_component {
class MyComponent : public Component {
public:
void setup() override;
void loop() override;
void dump_config() override;
void set_key(const std::string &key) { this->key_ = key; }
void set_param(int param) { this->param_ = param; }
protected:
std::string key_;
int param_{0};
};
} // namespace my_component
} // namespace esphome
```
* **Common Component Examples:**
- **Sensor:**
```python
from esphome.components import sensor
CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(cv.polling_component_schema("60s"))
async def to_code(config):
var = await sensor.new_sensor(config)
await cg.register_component(var, config)
```
- **Binary Sensor:**
```python
from esphome.components import binary_sensor
CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })
async def to_code(config):
var = await binary_sensor.new_binary_sensor(config)
```
- **Switch:**
```python
from esphome.components import switch
CONFIG_SCHEMA = switch.switch_schema().extend({ ... })
async def to_code(config):
var = await switch.new_switch(config)
```
* **Configuration Validation** (see the sketch after this list):
* **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
* **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
* **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `esp32.only_on_variant(...)`, `cv.only_on_esp32`, `cv.only_on_esp8266`, `cv.only_on_rp2040`.
* **Framework-Specific:** `cv.only_with_framework(...)`, `cv.only_with_arduino`, `cv.only_with_esp_idf`.
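A minimal sketch combining several of the validators listed above in one schema; the option names are hypothetical, and wrapping the schema in `cv.All(...)` together with `cv.only_on(...)` is one assumed way to apply a platform restriction:
```python
import esphome.config_validation as cv

CONFIG_SCHEMA = cv.All(
    cv.Schema({
        # Required string, length-checked via a composed validator.
        cv.Required("name"): cv.All(cv.string, cv.Length(min=1, max=50)),
        # Integer constrained to a range, with a default.
        cv.Optional("level", default=50): cv.int_range(min=0, max=100),
        # Percentage values like "50%" are normalized to a float.
        cv.Optional("duty"): cv.percentage,
    }).extend(cv.COMPONENT_SCHEMA),
    cv.only_on(["esp32", "esp8266"]),  # restrict to specific platforms
)
```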
* **Schema Extensions:**
```python
CONFIG_SCHEMA = cv.Schema({ ... })
.extend(cv.COMPONENT_SCHEMA)
.extend(uart.UART_DEVICE_SCHEMA)
.extend(i2c.i2c_device_schema(0x48))
.extend(spi.spi_device_schema(cs_pin_required=True))
```
## 5. Key Files & Entrypoints
* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
* `pyproject.toml`: Defines the Python project metadata and dependencies.
* `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
* `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.
* **Static Analysis & Development:**
* `esphome/core/defines.h`: A comprehensive header file containing all `#define` directives that components can add using `cg.add_define()` in Python. This file is used exclusively for development, static analysis tools, and CI testing; it is not used during runtime compilation. When developing components that add new defines, add them to this file to ensure proper IDE support and static analysis coverage. The file includes feature flags, build configurations, and platform-specific defines that help static analyzers understand the complete codebase without needing to compile for specific platforms.
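A minimal sketch of the Python side, assuming a hypothetical feature flag name:
```python
import esphome.codegen as cg

async def to_code(config):
    # Emits -DUSE_MY_FEATURE into the generated build; the same name must
    # also be listed in esphome/core/defines.h so IDEs and static analysis
    # tools can see it. "USE_MY_FEATURE" is a hypothetical define name.
    cg.add_define("USE_MY_FEATURE")
```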
## 6. Development & Testing Workflow
* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
* **Python:** Run unit tests with `pytest`.
* **C++:** Use `clang-tidy` for static analysis.
* **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:
```
tests/
├── test_build_components/ # Base test configurations
└── components/[component]/ # Component-specific tests
```
Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
* **Debugging and Troubleshooting:**
* **Debug Tools:**
- `esphome config <file>.yaml` to validate configuration.
- `esphome compile <file>.yaml` to compile without uploading.
- Check the Dashboard for real-time logs.
- Use component-specific debug logging.
* **Common Issues:**
- **Import Errors**: Check component dependencies and `PYTHONPATH`.
- **Validation Errors**: Review configuration schema definitions.
- **Build Errors**: Check platform compatibility and library versions.
- **Runtime Errors**: Review generated C++ code and component logic.
## 7. Specific Instructions for AI Collaboration
* **Contribution Workflow (Pull Request Process):**
1. **Fork & Branch:** Create a new branch in your fork.
2. **Make Changes:** Adhere to all coding conventions and patterns.
3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
4. **Lint:** Run `pre-commit` to ensure code is compliant.
5. **Commit:** Commit your changes. There is no strict format for commit messages.
6. **Pull Request:** Submit a PR against the `dev` branch. Prefix the PR title with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation and examples, and add `CODEOWNERS` entries as needed. Pull requests should always be made with the PULL_REQUEST_TEMPLATE.md template filled out correctly.
* **Documentation Contributions:**
* Documentation is hosted in the separate `esphome/esphome-docs` repository.
* The contribution workflow is the same as for the codebase.
* **Best Practices:**
* **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
* **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
* **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.
* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.
* **Dependencies & Build System Integration** (see the sketch after this list):
* **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and `pyproject.toml`.
* **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
* **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags.
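A minimal sketch of these calls inside a component's `to_code`; the library version and the compiler flag are hypothetical placeholders:
```python
import esphome.codegen as cg

async def to_code(config):
    # Declare a PlatformIO library dependency for the generated firmware.
    # ArduinoJson is mentioned above; the pinned version is a placeholder.
    cg.add_library("ArduinoJson", "6.18.5")
    # Pass an extra compiler flag to the build; the flag name is hypothetical.
    cg.add_build_flag("-DMY_COMPONENT_BUFFER_SIZE=512")
```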

View File

@@ -1 +0,0 @@
4368db58e8f884aff245996b1e8b644cc0796c0bb2fa706d5740d40b823d3ac9

View File

@@ -1,4 +1,2 @@
[run]
omit =
esphome/components/*
tests/integration/*
omit = esphome/components/*

View File

@@ -1,37 +0,0 @@
ARG BUILD_BASE_VERSION=2025.04.0
FROM ghcr.io/esphome/docker-base:debian-${BUILD_BASE_VERSION} AS base
RUN git config --system --add safe.directory "*"
RUN apt update \
&& apt install -y \
protobuf-compiler
RUN pip install uv
RUN useradd esphome -m
USER esphome
ENV VIRTUAL_ENV=/home/esphome/.local/esphome-venv
RUN uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# Override this; it is set to true in the docker-base image
ENV UV_SYSTEM_PYTHON=false
WORKDIR /tmp
COPY requirements.txt ./
RUN uv pip install -r requirements.txt
COPY requirements_dev.txt requirements_test.txt ./
RUN uv pip install -r requirements_dev.txt -r requirements_test.txt
RUN \
platformio settings set enable_telemetry No \
&& platformio settings set check_platformio_interval 1000000
COPY script/platformio_install_deps.py platformio.ini ./
RUN ./platformio_install_deps.py platformio.ini --libraries --platforms --tools
WORKDIR /workspaces

View File

@@ -1,17 +1,18 @@
{
"name": "ESPHome Dev",
"context": "..",
"dockerFile": "Dockerfile",
"image": "ghcr.io/esphome/esphome-lint:dev",
"postCreateCommand": [
"script/devcontainer-post-create"
],
"features": {
"ghcr.io/devcontainers/features/github-cli:1": {}
"containerEnv": {
"DEVCONTAINER": "1",
"PIP_BREAK_SYSTEM_PACKAGES": "1",
"PIP_ROOT_USER_ACTION": "ignore"
},
"runArgs": [
"--privileged",
"-e",
"GIT_EDITOR=code --wait"
"ESPHOME_DASHBOARD_USE_PING=1"
// uncomment and edit the path in order to pass through local USB serial to the container
// , "--device=/dev/ttyACM0"
],

View File

@@ -114,5 +114,4 @@ config/
examples/
Dockerfile
.git/
tests/
.*
tests/build/

View File

@@ -1,92 +0,0 @@
name: Report an issue with ESPHome
description: Report an issue with ESPHome.
body:
- type: markdown
attributes:
value: |
This issue form is for reporting bugs only!
If you have a feature request or enhancement, please [request them here instead][fr].
[fr]: https://github.com/orgs/esphome/discussions
- type: textarea
validations:
required: true
id: problem
attributes:
label: The problem
description: >-
Describe the issue you are experiencing here to communicate to the
maintainers. Tell us what you were trying to do and what happened.
Provide a clear and concise description of what the problem is.
- type: markdown
attributes:
value: |
## Environment
- type: input
id: version
validations:
required: true
attributes:
label: Which version of ESPHome has the issue?
description: >
ESPHome version like 1.19, 2025.6.0 or 2025.XX.X-dev.
- type: dropdown
validations:
required: true
id: installation
attributes:
label: What type of installation are you using?
options:
- Home Assistant Add-on
- Docker
- pip
- type: dropdown
validations:
required: true
id: platform
attributes:
label: What platform are you using?
options:
- ESP8266
- ESP32
- RP2040
- BK72XX
- RTL87XX
- LN882X
- Host
- Other
- type: input
id: component_name
attributes:
label: Component causing the issue
description: >
The name of the component or platform. For example, api/i2c or ultrasonic.
- type: markdown
attributes:
value: |
# Details
- type: textarea
id: config
attributes:
label: YAML Config
description: |
Include a complete YAML configuration file demonstrating the problem here. Preferably post the *entire* file - don't make assumptions about what is unimportant. However, if it's a large or complicated config then you will need to reduce it to the smallest possible file *that still demonstrates the problem*. If you don't provide enough information to *easily* reproduce the problem, it's unlikely your bug report will get any attention. Logs do not belong here; attach them below.
render: yaml
- type: textarea
id: logs
attributes:
label: Anything in the logs that might be useful for us?
description: For example, error message, or stack traces. Serial or USB logs are much more useful than WiFi logs.
render: txt
- type: textarea
id: additional
attributes:
label: Additional information
description: >
If you have any additional information for us, use the field below.
Please note, you can attach screenshots or screen recordings here, by
dragging and dropping files in the field below.

View File

@@ -1,21 +1,15 @@
---
blank_issues_enabled: false
contact_links:
- name: Report an issue with the ESPHome documentation
url: https://github.com/esphome/esphome-docs/issues/new/choose
about: Report an issue with the ESPHome documentation.
- name: Report an issue with the ESPHome web server
url: https://github.com/esphome/esphome-webserver/issues/new/choose
about: Report an issue with the ESPHome web server.
- name: Report an issue with the ESPHome Builder / Dashboard
url: https://github.com/esphome/dashboard/issues/new/choose
about: Report an issue with the ESPHome Builder / Dashboard.
- name: Report an issue with the ESPHome API client
url: https://github.com/esphome/aioesphomeapi/issues/new/choose
about: Report an issue with the ESPHome API client.
- name: Make a Feature Request
url: https://github.com/orgs/esphome/discussions
about: Please create feature requests in the dedicated feature request tracker.
- name: Issue Tracker
url: https://github.com/esphome/issues
about: Please create bug reports in the dedicated issue tracker.
- name: Feature Request Tracker
url: https://github.com/esphome/feature-requests
about: |
Please create feature requests in the dedicated feature request tracker.
- name: Frequently Asked Question
url: https://esphome.io/guides/faq.html
about: Please view the FAQ for common questions and what to include in a bug report.
about: |
Please view the FAQ for common questions and what
to include in a bug report.

View File

@@ -26,7 +26,6 @@
- [ ] RP2040
- [ ] BK72xx
- [ ] RTL87xx
- [ ] nRF52840
## Example entry for `config.yaml`:

View File

@@ -1,11 +1,15 @@
name: Build Image
inputs:
platform:
description: "Platform to build for"
required: true
example: "linux/amd64"
target:
description: "Target to build"
required: true
example: "docker"
build_type:
description: "Build type"
baseimg:
description: "Base image type"
required: true
example: "docker"
suffix:
@@ -15,11 +19,6 @@ inputs:
description: "Version to build"
required: true
example: "2023.12.0"
base_os:
description: "Base OS to use"
required: false
default: "debian"
example: "debian"
runs:
using: "composite"
steps:
@@ -47,52 +46,52 @@ runs:
- name: Build and push to ghcr by digest
id: build-ghcr
uses: docker/build-push-action@v6.18.0
uses: docker/build-push-action@v6.15.0
env:
DOCKER_BUILD_SUMMARY: false
DOCKER_BUILD_RECORD_UPLOAD: false
with:
context: .
file: ./docker/Dockerfile
platforms: ${{ inputs.platform }}
target: ${{ inputs.target }}
cache-from: type=gha
cache-to: ${{ steps.cache-to.outputs.value }}
build-args: |
BUILD_TYPE=${{ inputs.build_type }}
BASEIMGTYPE=${{ inputs.baseimg }}
BUILD_VERSION=${{ inputs.version }}
BUILD_OS=${{ inputs.base_os }}
outputs: |
type=image,name=ghcr.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true
- name: Export ghcr digests
shell: bash
run: |
mkdir -p /tmp/digests/${{ inputs.build_type }}/ghcr
mkdir -p /tmp/digests/${{ inputs.target }}/ghcr
digest="${{ steps.build-ghcr.outputs.digest }}"
touch "/tmp/digests/${{ inputs.build_type }}/ghcr/${digest#sha256:}"
touch "/tmp/digests/${{ inputs.target }}/ghcr/${digest#sha256:}"
- name: Build and push to dockerhub by digest
id: build-dockerhub
uses: docker/build-push-action@v6.18.0
uses: docker/build-push-action@v6.15.0
env:
DOCKER_BUILD_SUMMARY: false
DOCKER_BUILD_RECORD_UPLOAD: false
with:
context: .
file: ./docker/Dockerfile
platforms: ${{ inputs.platform }}
target: ${{ inputs.target }}
cache-from: type=gha
cache-to: ${{ steps.cache-to.outputs.value }}
build-args: |
BUILD_TYPE=${{ inputs.build_type }}
BASEIMGTYPE=${{ inputs.baseimg }}
BUILD_VERSION=${{ inputs.version }}
BUILD_OS=${{ inputs.base_os }}
outputs: |
type=image,name=docker.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true
- name: Export dockerhub digests
shell: bash
run: |
mkdir -p /tmp/digests/${{ inputs.build_type }}/dockerhub
mkdir -p /tmp/digests/${{ inputs.target }}/dockerhub
digest="${{ steps.build-dockerhub.outputs.digest }}"
touch "/tmp/digests/${{ inputs.build_type }}/dockerhub/${digest#sha256:}"
touch "/tmp/digests/${{ inputs.target }}/dockerhub/${digest#sha256:}"

View File

@@ -17,12 +17,12 @@ runs:
steps:
- name: Set up Python ${{ inputs.python-version }}
id: python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: ${{ inputs.python-version }}
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache/restore@v4.2.4
uses: actions/cache/restore@v4.2.3
with:
path: venv
# yamllint disable-line rule:line-length
@@ -34,14 +34,14 @@ runs:
python -m venv venv
source venv/bin/activate
python --version
pip install -r requirements.txt -r requirements_test.txt
pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
pip install -e .
- name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true' && runner.os == 'Windows'
shell: bash
run: |
python -m venv venv
source ./venv/Scripts/activate
./venv/Scripts/activate
python --version
pip install -r requirements.txt -r requirements_test.txt
pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
pip install -e .

View File

@@ -1 +0,0 @@
../.ai/instructions.md

View File

@@ -9,9 +9,6 @@ updates:
# Hypothesis is only used for testing and is updated quite often
- dependency-name: hypothesis
- package-ecosystem: github-actions
labels:
- "dependencies"
- "github-actions"
directory: "/"
schedule:
interval: daily
@@ -20,20 +17,15 @@ updates:
docker-actions:
applies-to: version-updates
patterns:
- "docker/setup-qemu-action"
- "docker/login-action"
- "docker/setup-buildx-action"
- package-ecosystem: github-actions
labels:
- "dependencies"
- "github-actions"
directory: "/.github/actions/build-image"
schedule:
interval: daily
open-pull-requests-limit: 10
- package-ecosystem: github-actions
labels:
- "dependencies"
- "github-actions"
directory: "/.github/actions/restore-python"
schedule:
interval: daily

View File

@@ -1,662 +0,0 @@
name: Auto Label PR
on:
# Runs only on pull_request_target due to having access to an App token.
# This means PRs from forks will not be able to alter this workflow to get the tokens
pull_request_target:
types: [labeled, opened, reopened, synchronize, edited]
permissions:
pull-requests: write
contents: read
env:
SMALL_PR_THRESHOLD: 30
MAX_LABELS: 15
TOO_BIG_THRESHOLD: 1000
COMPONENT_LABEL_THRESHOLD: 10
jobs:
label:
runs-on: ubuntu-latest
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
steps:
- name: Checkout
uses: actions/checkout@v5.0.0
- name: Generate a token
id: generate-token
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}
- name: Auto Label PR
uses: actions/github-script@v8.0.0
with:
github-token: ${{ steps.generate-token.outputs.token }}
script: |
const fs = require('fs');
// Constants
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
const TOO_BIG_MARKER = '<!-- too-big-request -->';
const MANAGED_LABELS = [
'new-component',
'new-platform',
'new-target-platform',
'merging-to-release',
'merging-to-beta',
'core',
'small-pr',
'dashboard',
'github-actions',
'by-code-owner',
'has-tests',
'needs-tests',
'needs-docs',
'needs-codeowners',
'too-big',
'labeller-recheck',
'bugfix',
'new-feature',
'breaking-change',
'code-quality'
];
const DOCS_PR_PATTERNS = [
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
/esphome\/esphome-docs#\d+/
];
// Global state
const { owner, repo } = context.repo;
const pr_number = context.issue.number;
// Get current labels and PR data
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
owner,
repo,
issue_number: pr_number
});
const currentLabels = currentLabelsData.map(label => label.name);
const managedLabels = currentLabels.filter(label =>
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
);
// Check for mega-PR early - if present, skip most automatic labeling
const isMegaPR = currentLabels.includes('mega-pr');
// Get all PR files with automatic pagination
const prFiles = await github.paginate(
github.rest.pulls.listFiles,
{
owner,
repo,
pull_number: pr_number
}
);
// Calculate data from PR files
const changedFiles = prFiles.map(file => file.filename);
const totalAdditions = prFiles.reduce((sum, file) => sum + (file.additions || 0), 0);
const totalDeletions = prFiles.reduce((sum, file) => sum + (file.deletions || 0), 0);
const totalChanges = totalAdditions + totalDeletions;
console.log('Current labels:', currentLabels.join(', '));
console.log('Changed files:', changedFiles.length);
console.log('Total changes:', totalChanges);
if (isMegaPR) {
console.log('Mega-PR detected - applying limited labeling logic');
}
// Fetch API data
async function fetchApiData() {
try {
const response = await fetch('https://data.esphome.io/components.json');
const componentsData = await response.json();
return {
targetPlatforms: componentsData.target_platforms || [],
platformComponents: componentsData.platform_components || []
};
} catch (error) {
console.log('Failed to fetch components data from API:', error.message);
return { targetPlatforms: [], platformComponents: [] };
}
}
// Strategy: Merge branch detection
async function detectMergeBranch() {
const labels = new Set();
const baseRef = context.payload.pull_request.base.ref;
if (baseRef === 'release') {
labels.add('merging-to-release');
} else if (baseRef === 'beta') {
labels.add('merging-to-beta');
}
return labels;
}
// Strategy: Component and platform labeling
async function detectComponentPlatforms(apiData) {
const labels = new Set();
const componentRegex = /^esphome\/components\/([^\/]+)\//;
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
for (const file of changedFiles) {
const componentMatch = file.match(componentRegex);
if (componentMatch) {
labels.add(`component: ${componentMatch[1]}`);
}
const platformMatch = file.match(targetPlatformRegex);
if (platformMatch) {
labels.add(`platform: ${platformMatch[1]}`);
}
}
return labels;
}
// Strategy: New component detection
async function detectNewComponents() {
const labels = new Set();
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
for (const file of addedFiles) {
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
if (componentMatch) {
try {
const content = fs.readFileSync(file, 'utf8');
if (content.includes('IS_TARGET_PLATFORM = True')) {
labels.add('new-target-platform');
}
} catch (error) {
console.log(`Failed to read content of ${file}:`, error.message);
}
labels.add('new-component');
}
}
return labels;
}
// Strategy: New platform detection
async function detectNewPlatforms(apiData) {
const labels = new Set();
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
for (const file of addedFiles) {
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
if (platformFileMatch) {
const [, component, platform] = platformFileMatch;
if (apiData.platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
if (platformDirMatch) {
const [, component, platform] = platformDirMatch;
if (apiData.platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
}
return labels;
}
// Strategy: Core files detection
async function detectCoreChanges() {
const labels = new Set();
const coreFiles = changedFiles.filter(file =>
file.startsWith('esphome/core/') ||
(file.startsWith('esphome/') && file.split('/').length === 2)
);
if (coreFiles.length > 0) {
labels.add('core');
}
return labels;
}
// Strategy: PR size detection
async function detectPRSize() {
const labels = new Set();
if (totalChanges <= SMALL_PR_THRESHOLD) {
labels.add('small-pr');
return labels;
}
const testAdditions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.additions || 0), 0);
const testDeletions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.deletions || 0), 0);
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
// Don't add too-big if mega-pr label is already present
if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
labels.add('too-big');
}
return labels;
}
// Strategy: Dashboard changes
async function detectDashboardChanges() {
const labels = new Set();
const dashboardFiles = changedFiles.filter(file =>
file.startsWith('esphome/dashboard/') ||
file.startsWith('esphome/components/dashboard_import/')
);
if (dashboardFiles.length > 0) {
labels.add('dashboard');
}
return labels;
}
// Strategy: GitHub Actions changes
async function detectGitHubActionsChanges() {
const labels = new Set();
const githubActionsFiles = changedFiles.filter(file =>
file.startsWith('.github/workflows/')
);
if (githubActionsFiles.length > 0) {
labels.add('github-actions');
}
return labels;
}
// Strategy: Code owner detection
async function detectCodeOwner() {
const labels = new Set();
try {
const { data: codeownersFile } = await github.rest.repos.getContent({
owner,
repo,
path: 'CODEOWNERS',
});
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
const prAuthor = context.payload.pull_request.user.login;
const codeownersLines = codeownersContent.split('\n')
.map(line => line.trim())
.filter(line => line && !line.startsWith('#'));
const codeownersRegexes = codeownersLines.map(line => {
const parts = line.split(/\s+/);
const pattern = parts[0];
const owners = parts.slice(1);
let regex;
if (pattern.endsWith('*')) {
const dir = pattern.slice(0, -1);
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
} else if (pattern.includes('*')) {
// First escape all regex special chars except *, then replace * with .*
const regexPattern = pattern
.replace(/[.+?^${}()|[\]\\]/g, '\\$&')
.replace(/\*/g, '.*');
regex = new RegExp(`^${regexPattern}$`);
} else {
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
}
return { regex, owners };
});
for (const file of changedFiles) {
for (const { regex, owners } of codeownersRegexes) {
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
labels.add('by-code-owner');
return labels;
}
}
}
} catch (error) {
console.log('Failed to read or parse CODEOWNERS file:', error.message);
}
return labels;
}
// Strategy: Test detection
async function detectTests() {
const labels = new Set();
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
if (testFiles.length > 0) {
labels.add('has-tests');
}
return labels;
}
// Strategy: PR Template Checkbox detection
async function detectPRTemplateCheckboxes() {
const labels = new Set();
const prBody = context.payload.pull_request.body || '';
console.log('Checking PR template checkboxes...');
// Check for checked checkboxes in the "Types of changes" section
const checkboxPatterns = [
{ pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
{ pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
{ pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
{ pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
];
for (const { pattern, label } of checkboxPatterns) {
if (pattern.test(prBody)) {
console.log(`Found checked checkbox for: ${label}`);
labels.add(label);
}
}
return labels;
}
// Strategy: Requirements detection
async function detectRequirements(allLabels) {
const labels = new Set();
// Check for missing tests
if ((allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) && !allLabels.has('has-tests')) {
labels.add('needs-tests');
}
// Check for missing docs
if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
const prBody = context.payload.pull_request.body || '';
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
if (!hasDocsLink) {
labels.add('needs-docs');
}
}
// Check for missing CODEOWNERS
if (allLabels.has('new-component')) {
const codeownersModified = prFiles.some(file =>
file.filename === 'CODEOWNERS' &&
(file.status === 'modified' || file.status === 'added') &&
(file.additions || 0) > 0
);
if (!codeownersModified) {
labels.add('needs-codeowners');
}
}
return labels;
}
// Generate review messages
function generateReviewMessages(finalLabels) {
const messages = [];
const prAuthor = context.payload.pull_request.user.login;
// Too big message
if (finalLabels.includes('too-big')) {
const testAdditions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.additions || 0), 0);
const testDeletions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.deletions || 0), 0);
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
const tooManyLabels = finalLabels.length > MAX_LABELS;
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
if (tooManyLabels && tooManyChanges) {
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
} else if (tooManyLabels) {
message += `This PR affects ${finalLabels.length} different components/areas.`;
} else {
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
}
message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;
messages.push(message);
}
// CODEOWNERS message
if (finalLabels.includes('needs-codeowners')) {
const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
`Hey there @${prAuthor},\n` +
`Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
`This way we can notify you if a bug report for this integration is reported.\n\n` +
`In \`__init__.py\` of the integration, please add:\n\n` +
`\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
`And run \`script/build_codeowners.py\``;
messages.push(message);
}
return messages;
}
// Handle reviews
async function handleReviews(finalLabels) {
const reviewMessages = generateReviewMessages(finalLabels);
const hasReviewableLabels = finalLabels.some(label =>
['too-big', 'needs-codeowners'].includes(label)
);
const { data: reviews } = await github.rest.pulls.listReviews({
owner,
repo,
pull_number: pr_number
});
const botReviews = reviews.filter(review =>
review.user.type === 'Bot' &&
review.state === 'CHANGES_REQUESTED' &&
review.body && review.body.includes(BOT_COMMENT_MARKER)
);
if (hasReviewableLabels) {
const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;
if (botReviews.length > 0) {
// Update existing review
await github.rest.pulls.updateReview({
owner,
repo,
pull_number: pr_number,
review_id: botReviews[0].id,
body: reviewBody
});
console.log('Updated existing bot review');
} else {
// Create new review
await github.rest.pulls.createReview({
owner,
repo,
pull_number: pr_number,
body: reviewBody,
event: 'REQUEST_CHANGES'
});
console.log('Created new bot review');
}
} else if (botReviews.length > 0) {
// Dismiss existing reviews
for (const review of botReviews) {
try {
await github.rest.pulls.dismissReview({
owner,
repo,
pull_number: pr_number,
review_id: review.id,
message: 'Review dismissed: All requirements have been met'
});
console.log(`Dismissed bot review ${review.id}`);
} catch (error) {
console.log(`Failed to dismiss review ${review.id}:`, error.message);
}
}
}
}
// Main execution
const apiData = await fetchApiData();
const baseRef = context.payload.pull_request.base.ref;
// Early exit for non-dev branches
if (baseRef !== 'dev') {
const branchLabels = await detectMergeBranch();
const finalLabels = Array.from(branchLabels);
console.log('Computed labels (merge branch only):', finalLabels.join(', '));
// Apply labels
if (finalLabels.length > 0) {
await github.rest.issues.addLabels({
owner,
repo,
issue_number: pr_number,
labels: finalLabels
});
}
// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
for (const label of labelsToRemove) {
try {
await github.rest.issues.removeLabel({
owner,
repo,
issue_number: pr_number,
name: label
});
} catch (error) {
console.log(`Failed to remove label ${label}:`, error.message);
}
}
return;
}
// Run all strategies
const [
branchLabels,
componentLabels,
newComponentLabels,
newPlatformLabels,
coreLabels,
sizeLabels,
dashboardLabels,
actionsLabels,
codeOwnerLabels,
testLabels,
checkboxLabels
] = await Promise.all([
detectMergeBranch(),
detectComponentPlatforms(apiData),
detectNewComponents(),
detectNewPlatforms(apiData),
detectCoreChanges(),
detectPRSize(),
detectDashboardChanges(),
detectGitHubActionsChanges(),
detectCodeOwner(),
detectTests(),
detectPRTemplateCheckboxes()
]);
// Combine all labels
const allLabels = new Set([
...branchLabels,
...componentLabels,
...newComponentLabels,
...newPlatformLabels,
...coreLabels,
...sizeLabels,
...dashboardLabels,
...actionsLabels,
...codeOwnerLabels,
...testLabels,
...checkboxLabels
]);
// Detect requirements based on all other labels
const requirementLabels = await detectRequirements(allLabels);
for (const label of requirementLabels) {
allLabels.add(label);
}
let finalLabels = Array.from(allLabels);
// For mega-PRs, exclude component labels if there are too many
if (isMegaPR) {
const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
}
}
// Handle too many labels (only for non-mega PRs)
const tooManyLabels = finalLabels.length > MAX_LABELS;
if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
finalLabels = ['too-big'];
}
console.log('Computed labels:', finalLabels.join(', '));
// Handle reviews
await handleReviews(finalLabels);
// Apply labels
if (finalLabels.length > 0) {
console.log(`Adding labels: ${finalLabels.join(', ')}`);
await github.rest.issues.addLabels({
owner,
repo,
issue_number: pr_number,
labels: finalLabels
});
}
// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
for (const label of labelsToRemove) {
console.log(`Removing label: ${label}`);
try {
await github.rest.issues.removeLabel({
owner,
repo,
issue_number: pr_number,
name: label
});
} catch (error) {
console.log(`Failed to remove label ${label}:`, error.message);
}
}

View File

@@ -21,9 +21,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Set up Python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: "3.11"
@@ -47,7 +47,7 @@ jobs:
fi
- if: failure()
name: Review PR
uses: actions/github-script@v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
await github.rest.pulls.createReview({
@@ -57,20 +57,9 @@ jobs:
event: 'REQUEST_CHANGES',
body: 'You have altered the generated proto files but they do not match what is expected.\nPlease run "script/api_protobuf/api_protobuf.py" and commit the changes.'
})
- if: failure()
name: Show changes
run: git diff
- if: failure()
name: Archive artifacts
uses: actions/upload-artifact@v4.6.2
with:
name: generated-proto-files
path: |
esphome/components/api/api_pb2.*
esphome/components/api/api_pb2_service.*
- if: success()
name: Dismiss review
uses: actions/github-script@v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
let reviews = await github.rest.pulls.listReviews({

View File

@@ -1,75 +0,0 @@
name: Clang-tidy Hash CI
on:
pull_request:
paths:
- ".clang-tidy"
- "platformio.ini"
- "requirements_dev.txt"
- ".clang-tidy.hash"
- "script/clang_tidy_hash.py"
- ".github/workflows/ci-clang-tidy-hash.yml"
permissions:
contents: read
pull-requests: write
jobs:
verify-hash:
name: Verify clang-tidy hash
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v5.0.0
- name: Set up Python
uses: actions/setup-python@v6.0.0
with:
python-version: "3.11"
- name: Verify hash
run: |
python script/clang_tidy_hash.py --verify
- if: failure()
name: Show hash details
run: |
python script/clang_tidy_hash.py
echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
echo "You have modified clang-tidy configuration but have not updated the hash." | tee -a $GITHUB_STEP_SUMMARY
echo "Please run 'script/clang_tidy_hash.py --update' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY
- if: failure()
name: Request changes
uses: actions/github-script@v8.0.0
with:
script: |
await github.rest.pulls.createReview({
pull_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
event: 'REQUEST_CHANGES',
body: 'You have modified clang-tidy configuration but have not updated the hash.\nPlease run `script/clang_tidy_hash.py --update` and commit the changes.'
})
- if: success()
name: Dismiss review
uses: actions/github-script@v8.0.0
with:
script: |
let reviews = await github.rest.pulls.listReviews({
pull_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo
});
for (let review of reviews.data) {
if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
await github.rest.pulls.dismissReview({
pull_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
review_id: review.id,
message: 'Clang-tidy hash now matches configuration.'
});
}
}

View File

@@ -37,19 +37,16 @@ jobs:
strategy:
fail-fast: false
matrix:
os: ["ubuntu-24.04", "ubuntu-24.04-arm"]
build_type:
- "ha-addon"
- "docker"
# - "lint"
os: ["ubuntu-latest", "ubuntu-24.04-arm"]
build_type: ["ha-addon", "docker", "lint"]
steps:
- uses: actions/checkout@v5.0.0
- uses: actions/checkout@v4.1.7
- name: Set up Python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: "3.11"
python-version: "3.9"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.11.1
uses: docker/setup-buildx-action@v3.10.0
- name: Set TAG
run: |

View File

@@ -20,8 +20,8 @@ permissions:
contents: read
env:
DEFAULT_PYTHON: "3.11"
PYUPGRADE_TARGET: "--py311-plus"
DEFAULT_PYTHON: "3.9"
PYUPGRADE_TARGET: "--py39-plus"
concurrency:
# yamllint disable-line rule:line-length
@@ -36,18 +36,18 @@ jobs:
cache-key: ${{ steps.cache-key.outputs.key }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Generate cache-key
id: cache-key
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
run: echo key="${{ hashFiles('requirements.txt', 'requirements_optional.txt', 'requirements_test.txt') }}" >> $GITHUB_OUTPUT
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache@v4.2.4
uses: actions/cache@v4.2.3
with:
path: venv
# yamllint disable-line rule:line-length
@@ -58,19 +58,59 @@ jobs:
python -m venv venv
. venv/bin/activate
python --version
pip install -r requirements.txt -r requirements_test.txt pre-commit
pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
pip install -e .
ruff:
name: Check ruff
runs-on: ubuntu-24.04
needs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Run Ruff
run: |
. venv/bin/activate
ruff format esphome tests
- name: Suggested changes
run: script/ci-suggest-changes
if: always()
flake8:
name: Check flake8
runs-on: ubuntu-24.04
needs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Run flake8
run: |
. venv/bin/activate
flake8 esphome
- name: Suggested changes
run: script/ci-suggest-changes
if: always()
pylint:
name: Check pylint
runs-on: ubuntu-24.04
needs:
- common
- determine-jobs
if: needs.determine-jobs.outputs.python-linters == 'true'
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -84,6 +124,27 @@ jobs:
run: script/ci-suggest-changes
if: always()
pyupgrade:
name: Check pyupgrade
runs-on: ubuntu-24.04
needs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Run pyupgrade
run: |
. venv/bin/activate
pyupgrade ${{ env.PYUPGRADE_TARGET }} `find esphome -name "*.py" -type f`
- name: Suggested changes
run: script/ci-suggest-changes
if: always()
ci-custom:
name: Run script/ci-custom
runs-on: ubuntu-24.04
@@ -91,7 +152,7 @@ jobs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -104,8 +165,6 @@ jobs:
. venv/bin/activate
script/ci-custom.py
script/build_codeowners.py --check
script/build_language_schema.py --check
script/generate-esp32-boards.py --check
pytest:
name: Run pytest
@@ -113,9 +172,10 @@ jobs:
fail-fast: false
matrix:
python-version:
- "3.9"
- "3.10"
- "3.11"
- "3.12"
- "3.13"
os:
- ubuntu-latest
- macOS-latest
@@ -124,22 +184,25 @@ jobs:
# Minimize CI resource usage
# by only running the Python version
# version used for docker images on Windows and macOS
- python-version: "3.13"
os: windows-latest
- python-version: "3.12"
os: windows-latest
- python-version: "3.13"
- python-version: "3.10"
os: windows-latest
- python-version: "3.9"
os: windows-latest
- python-version: "3.12"
os: macOS-latest
- python-version: "3.12"
- python-version: "3.10"
os: macOS-latest
- python-version: "3.9"
os: macOS-latest
runs-on: ${{ matrix.os }}
needs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Restore Python
id: restore-python
uses: ./.github/actions/restore-python
with:
python-version: ${{ matrix.python-version }}
@@ -149,108 +212,56 @@ jobs:
- name: Run pytest
if: matrix.os == 'windows-latest'
run: |
. ./venv/Scripts/activate.ps1
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
./venv/Scripts/activate
pytest -vv --cov-report=xml --tb=native tests
- name: Run pytest
if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'
run: |
. venv/bin/activate
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
pytest -vv --cov-report=xml --tb=native tests
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v5.5.1
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
- name: Save Python virtual environment cache
if: github.ref == 'refs/heads/dev'
uses: actions/cache/save@v4.2.4
with:
path: venv
key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
determine-jobs:
name: Determine which jobs to run
clang-format:
name: Check clang-format
runs-on: ubuntu-24.04
needs:
- common
outputs:
integration-tests: ${{ steps.determine.outputs.integration-tests }}
clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
python-linters: ${{ steps.determine.outputs.python-linters }}
changed-components: ${{ steps.determine.outputs.changed-components }}
component-test-count: ${{ steps.determine.outputs.component-test-count }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
with:
# Fetch enough history to find the merge base
fetch-depth: 2
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Determine which tests to run
id: determine
env:
GH_TOKEN: ${{ github.token }}
- name: Install clang-format
run: |
. venv/bin/activate
output=$(python script/determine-jobs.py)
echo "Test determination output:"
echo "$output" | jq
# Extract individual fields
echo "integration-tests=$(echo "$output" | jq -r '.integration_tests')" >> $GITHUB_OUTPUT
echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
integration-tests:
name: Run integration tests
runs-on: ubuntu-latest
needs:
- common
- determine-jobs
if: needs.determine-jobs.outputs.integration-tests == 'true'
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
- name: Set up Python 3.13
id: python
uses: actions/setup-python@v6.0.0
with:
python-version: "3.13"
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache@v4.2.4
with:
path: venv
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
- name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
python -m venv venv
. venv/bin/activate
python --version
pip install -r requirements.txt -r requirements_test.txt
pip install -e .
- name: Register matcher
run: echo "::add-matcher::.github/workflows/matchers/pytest.json"
- name: Run integration tests
pip install clang-format -c requirements_dev.txt
- name: Run clang-format
run: |
. venv/bin/activate
pytest -vv --no-cov --tb=native -n auto tests/integration/
script/clang-format -i
git diff-index --quiet HEAD --
- name: Suggested changes
run: script/ci-suggest-changes
if: always()
clang-tidy:
name: ${{ matrix.name }}
runs-on: ubuntu-24.04
needs:
- common
- determine-jobs
if: needs.determine-jobs.outputs.clang-tidy == 'true'
env:
GH_TOKEN: ${{ github.token }}
- ruff
- ci-custom
- clang-format
- flake8
- pylint
- pytest
- pyupgrade
strategy:
fail-fast: false
max-parallel: 2
@@ -280,19 +291,10 @@ jobs:
name: Run script/clang-tidy for ESP32 IDF
options: --environment esp32-idf-tidy --grep USE_ESP_IDF
pio_cache_key: tidyesp32-idf
- id: clang-tidy
name: Run script/clang-tidy for ZEPHYR
options: --environment nrf52-tidy --grep USE_ZEPHYR --grep USE_NRF52
pio_cache_key: tidy-zephyr
ignore_errors: false
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
with:
# Need history for HEAD~1 to work for checking changed files
fetch-depth: 2
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -301,17 +303,17 @@ jobs:
- name: Cache platformio
if: github.ref == 'refs/heads/dev'
uses: actions/cache@v4.2.4
uses: actions/cache@v4.2.3
with:
path: ~/.platformio
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
key: platformio-${{ matrix.pio_cache_key }}
- name: Cache platformio
if: github.ref != 'refs/heads/dev'
uses: actions/cache/restore@v4.2.4
uses: actions/cache/restore@v4.2.3
with:
path: ~/.platformio
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
key: platformio-${{ matrix.pio_cache_key }}
- name: Register problem matchers
run: |
@@ -325,49 +327,72 @@ jobs:
mkdir -p .temp
pio run --list-targets -e esp32-idf-tidy
- name: Check if full clang-tidy scan needed
id: check_full_scan
run: |
. venv/bin/activate
if python script/clang_tidy_hash.py --check; then
echo "full_scan=true" >> $GITHUB_OUTPUT
echo "reason=hash_changed" >> $GITHUB_OUTPUT
else
echo "full_scan=false" >> $GITHUB_OUTPUT
echo "reason=normal" >> $GITHUB_OUTPUT
fi
- name: Run clang-tidy
run: |
. venv/bin/activate
if [ "${{ steps.check_full_scan.outputs.full_scan }}" = "true" ]; then
echo "Running FULL clang-tidy scan (hash changed)"
script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
else
echo "Running clang-tidy on changed files only"
script/clang-tidy --all-headers --fix --changed ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
fi
script/clang-tidy --all-headers --fix ${{ matrix.options }}
env:
# Also cache libdeps, store them in a ~/.platformio subfolder
PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps
- name: Suggested changes
run: script/ci-suggest-changes ${{ matrix.ignore_errors && '|| true' || '' }}
run: script/ci-suggest-changes
# yamllint disable-line rule:line-length
if: always()
list-components:
runs-on: ubuntu-24.04
needs:
- common
if: github.event_name == 'pull_request'
outputs:
components: ${{ steps.list-components.outputs.components }}
count: ${{ steps.list-components.outputs.count }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.1.7
with:
# Fetch enough history so `git merge-base refs/remotes/origin/dev HEAD` works.
fetch-depth: 500
- name: Get target branch
id: target-branch
run: |
echo "branch=${{ github.event.pull_request.base.ref }}" >> $GITHUB_OUTPUT
- name: Fetch ${{ steps.target-branch.outputs.branch }} branch
run: |
git -c protocol.version=2 fetch --no-tags --prune --no-recurse-submodules --depth=1 origin +refs/heads/${{ steps.target-branch.outputs.branch }}:refs/remotes/origin/${{ steps.target-branch.outputs.branch }}
git merge-base refs/remotes/origin/${{ steps.target-branch.outputs.branch }} HEAD
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Find changed components
id: list-components
run: |
. venv/bin/activate
components=$(script/list-components.py --changed --branch ${{ steps.target-branch.outputs.branch }})
output_components=$(echo "$components" | jq -R -s -c 'split("\n")[:-1] | map(select(length > 0))')
count=$(echo "$output_components" | jq length)
echo "components=$output_components" >> $GITHUB_OUTPUT
echo "count=$count" >> $GITHUB_OUTPUT
echo "$count Components:"
echo "$output_components" | jq
test-build-components:
name: Component test ${{ matrix.file }}
runs-on: ubuntu-24.04
needs:
- common
- determine-jobs
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0 && fromJSON(needs.determine-jobs.outputs.component-test-count) < 100
- list-components
if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) > 0 && fromJSON(needs.list-components.outputs.count) < 100
strategy:
fail-fast: false
max-parallel: 2
matrix:
file: ${{ fromJson(needs.determine-jobs.outputs.changed-components) }}
file: ${{ fromJson(needs.list-components.outputs.components) }}
steps:
- name: Install dependencies
run: |
@@ -375,7 +400,7 @@ jobs:
sudo apt-get install libsdl2-dev
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -395,17 +420,17 @@ jobs:
runs-on: ubuntu-24.04
needs:
- common
- determine-jobs
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
- list-components
if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
outputs:
matrix: ${{ steps.split.outputs.components }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Split components into 20 groups
id: split
run: |
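# Illustrative: 45 changed components shuffle into 3 groups via _nwise(20)
# (20 + 20 + 5), each emitted as one space-joined string.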
components=$(echo '${{ needs.determine-jobs.outputs.changed-components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
components=$(echo '${{ needs.list-components.outputs.components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
echo "components=$components" >> $GITHUB_OUTPUT
test-build-components-split:
@@ -413,9 +438,9 @@ jobs:
runs-on: ubuntu-24.04
needs:
- common
- determine-jobs
- list-components
- test-build-components-splitter
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
strategy:
fail-fast: false
max-parallel: 4
@@ -431,7 +456,7 @@ jobs:
sudo apt-get install libsdl2-dev
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -452,41 +477,23 @@ jobs:
./script/test_build_components -e compile -c $component
done
pre-commit-ci-lite:
name: pre-commit.ci lite
runs-on: ubuntu-latest
needs:
- common
if: github.event_name == 'pull_request' && github.base_ref != 'beta' && github.base_ref != 'release'
steps:
- name: Check out code from GitHub
uses: actions/checkout@v5.0.0
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- uses: pre-commit/action@v3.0.1
env:
SKIP: pylint,clang-tidy-hash
- uses: pre-commit-ci/lite-action@v1.1.0
if: always()
ci-status:
name: CI Status
runs-on: ubuntu-24.04
needs:
- common
- ruff
- ci-custom
- clang-format
- flake8
- pylint
- pytest
- integration-tests
- pyupgrade
- clang-tidy
- determine-jobs
- list-components
- test-build-components
- test-build-components-splitter
- test-build-components-split
- pre-commit-ci-lite
if: always()
steps:
- name: Success

View File

@@ -1,324 +0,0 @@
# This workflow automatically requests reviews from codeowners when:
# 1. A PR is opened, reopened, or synchronized (updated)
# 2. A PR is marked as ready for review
#
# It reads the CODEOWNERS file and matches all changed files in the PR against
# the codeowner patterns, then requests reviews from the appropriate owners
# while avoiding duplicate requests for users who have already been requested
# or have already reviewed the PR.
name: Request Codeowner Reviews
on:
# Needs to be pull_request_target to get write permissions
pull_request_target:
types: [opened, reopened, synchronize, ready_for_review]
permissions:
pull-requests: write
contents: read
jobs:
request-codeowner-reviews:
name: Run
if: ${{ !github.event.pull_request.draft }}
runs-on: ubuntu-latest
steps:
- name: Request reviews from component codeowners
uses: actions/github-script@v8.0.0
with:
script: |
const owner = context.repo.owner;
const repo = context.repo.repo;
const pr_number = context.payload.pull_request.number;
console.log(`Processing PR #${pr_number} for codeowner review requests`);
// Hidden marker to identify bot comments from this workflow
const BOT_COMMENT_MARKER = '<!-- codeowner-review-request-bot -->';
try {
// Get the list of changed files in this PR
const { data: files } = await github.rest.pulls.listFiles({
owner,
repo,
pull_number: pr_number
});
const changedFiles = files.map(file => file.filename);
console.log(`Found ${changedFiles.length} changed files`);
if (changedFiles.length === 0) {
console.log('No changed files found, skipping codeowner review requests');
return;
}
// Fetch CODEOWNERS file from root
const { data: codeownersFile } = await github.rest.repos.getContent({
owner,
repo,
path: 'CODEOWNERS',
ref: context.payload.pull_request.base.sha
});
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
// Parse CODEOWNERS file to extract all patterns and their owners
const codeownersLines = codeownersContent.split('\n')
.map(line => line.trim())
.filter(line => line && !line.startsWith('#'));
const codeownersPatterns = [];
// Convert CODEOWNERS pattern to regex (robust glob handling)
function globToRegex(pattern) {
// Escape regex special characters except for glob wildcards
let regexStr = pattern
.replace(/([.+^=!:${}()|[\]\\])/g, '\\$1') // escape regex chars
.replace(/\*\*/g, '.*') // globstar
.replace(/\*/g, '[^/]*') // single star
.replace(/\?/g, '.'); // question mark
return new RegExp('^' + regexStr + '$');
}
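// Illustrative example (not part of the workflow): globToRegex('esphome/components/*')
// yields /^esphome\/components\/[^/]*$/; '*' stays within one path segment,
// while '**' becomes '.*' and can cross '/' boundaries.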
// Helper function to create comment body
function createCommentBody(reviewersList, teamsList, matchedFileCount, isSuccessful = true) {
const reviewerMentions = reviewersList.map(r => `@${r}`);
const teamMentions = teamsList.map(t => `@${owner}/${t}`);
const allMentions = [...reviewerMentions, ...teamMentions].join(', ');
if (isSuccessful) {
return `${BOT_COMMENT_MARKER}\n👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
} else {
return `${BOT_COMMENT_MARKER}\n👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
}
}
for (const line of codeownersLines) {
const parts = line.split(/\s+/);
if (parts.length < 2) continue;
const pattern = parts[0];
const owners = parts.slice(1);
// Use robust glob-to-regex conversion
const regex = globToRegex(pattern);
codeownersPatterns.push({ pattern, regex, owners });
}
console.log(`Parsed ${codeownersPatterns.length} codeowner patterns`);
// Match changed files against CODEOWNERS patterns
const matchedOwners = new Set();
const matchedTeams = new Set();
const fileMatches = new Map(); // Track which files matched which patterns
for (const file of changedFiles) {
for (const { pattern, regex, owners } of codeownersPatterns) {
if (regex.test(file)) {
console.log(`File '${file}' matches pattern '${pattern}' with owners: ${owners.join(', ')}`);
if (!fileMatches.has(file)) {
fileMatches.set(file, []);
}
fileMatches.get(file).push({ pattern, owners });
// Add owners to the appropriate set (remove @ prefix)
for (const owner of owners) {
const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
if (cleanOwner.includes('/')) {
// Team mention (org/team-name)
const teamName = cleanOwner.split('/')[1];
matchedTeams.add(teamName);
} else {
// Individual user
matchedOwners.add(cleanOwner);
}
}
}
}
}
if (matchedOwners.size === 0 && matchedTeams.size === 0) {
console.log('No codeowners found for any changed files');
return;
}
// Remove the PR author from reviewers
const prAuthor = context.payload.pull_request.user.login;
matchedOwners.delete(prAuthor);
// Get current reviewers to avoid duplicate requests (but still mention them)
const { data: prData } = await github.rest.pulls.get({
owner,
repo,
pull_number: pr_number
});
const currentReviewers = new Set();
const currentTeams = new Set();
if (prData.requested_reviewers) {
prData.requested_reviewers.forEach(reviewer => {
currentReviewers.add(reviewer.login);
});
}
if (prData.requested_teams) {
prData.requested_teams.forEach(team => {
currentTeams.add(team.slug);
});
}
// Check for completed reviews to avoid re-requesting users who have already reviewed
const { data: reviews } = await github.rest.pulls.listReviews({
owner,
repo,
pull_number: pr_number
});
const reviewedUsers = new Set();
reviews.forEach(review => {
reviewedUsers.add(review.user.login);
});
// Check for previous comments from this workflow to avoid duplicate pings
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner,
repo,
issue_number: pr_number
}
);
const previouslyPingedUsers = new Set();
const previouslyPingedTeams = new Set();
// Look for comments from github-actions bot that contain our bot marker
const workflowComments = comments.filter(comment =>
comment.user.type === 'Bot' &&
comment.body.includes(BOT_COMMENT_MARKER)
);
// Extract previously mentioned users and teams from workflow comments
for (const comment of workflowComments) {
// Match @username patterns (not team mentions)
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
userMentions.forEach(mention => {
const username = mention.slice(1); // remove @
previouslyPingedUsers.add(username);
});
// Match @org/team patterns
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/([a-zA-Z0-9_.-]+)/g) || [];
teamMentions.forEach(mention => {
const teamName = mention.split('/')[1];
previouslyPingedTeams.add(teamName);
});
}
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams`);
// Remove users who have already been pinged in previous workflow comments
previouslyPingedUsers.forEach(user => {
matchedOwners.delete(user);
});
previouslyPingedTeams.forEach(team => {
matchedTeams.delete(team);
});
// Remove only users who have already submitted reviews (not just requested reviewers)
reviewedUsers.forEach(reviewer => {
matchedOwners.delete(reviewer);
});
// For teams, we'll still remove already requested teams to avoid API errors
currentTeams.forEach(team => {
matchedTeams.delete(team);
});
const reviewersList = Array.from(matchedOwners);
const teamsList = Array.from(matchedTeams);
if (reviewersList.length === 0 && teamsList.length === 0) {
console.log('No eligible reviewers found (all may already be requested, reviewed, or previously pinged)');
return;
}
const totalReviewers = reviewersList.length + teamsList.length;
console.log(`Requesting reviews from ${reviewersList.length} users and ${teamsList.length} teams for ${fileMatches.size} matched files`);
// Request reviews
try {
const requestParams = {
owner,
repo,
pull_number: pr_number
};
// Filter out users who are already requested reviewers for the API call
const newReviewers = reviewersList.filter(reviewer => !currentReviewers.has(reviewer));
const newTeams = teamsList.filter(team => !currentTeams.has(team));
if (newReviewers.length > 0) {
requestParams.reviewers = newReviewers;
}
if (newTeams.length > 0) {
requestParams.team_reviewers = newTeams;
}
// Only make the API call if there are new reviewers to request
if (newReviewers.length > 0 || newTeams.length > 0) {
await github.rest.pulls.requestReviewers(requestParams);
console.log(`Successfully requested reviews from ${newReviewers.length} new users and ${newTeams.length} new teams`);
} else {
console.log('All codeowners are already requested reviewers or have reviewed');
}
// Only add a comment if there are new codeowners to mention (not previously pinged)
if (reviewersList.length > 0 || teamsList.length > 0) {
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
console.log(`Added comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
} else {
console.log('No new codeowners to mention in comment (all previously pinged)');
}
} catch (error) {
if (error.status === 422) {
console.log('Some reviewers may already be requested or unavailable:', error.message);
// Only try to add a comment if there are new codeowners to mention
if (reviewersList.length > 0 || teamsList.length > 0) {
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);
try {
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
console.log(`Added fallback comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
} catch (commentError) {
console.log('Failed to add comment:', commentError.message);
}
} else {
console.log('No new codeowners to mention in fallback comment');
}
} else {
throw error;
}
}
} catch (error) {
console.log('Failed to process codeowner review requests:', error.message);
console.error(error);
}

View File

@@ -54,7 +54,7 @@ jobs:
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
steps:
- name: Checkout repository
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL

View File

@@ -1,157 +0,0 @@
name: Add External Component Comment
on:
pull_request_target:
types: [opened, synchronize]
permissions:
contents: read # Needed to fetch PR details
issues: write # Needed to create and update comments (PR comments are managed via the issues REST API)
pull-requests: write # also needed?
jobs:
external-comment:
name: External component comment
runs-on: ubuntu-latest
steps:
- name: Add external component comment
uses: actions/github-script@v8.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
// Generate external component usage instructions
function generateExternalComponentInstructions(prNumber, componentNames, owner, repo) {
let source;
if (owner === 'esphome' && repo === 'esphome')
source = `github://pr#${prNumber}`;
else
source = `github://${owner}/${repo}@pull/${prNumber}/head`;
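// Illustrative (hypothetical fork name): PR 1234 opened from someuser/esphome
// would get source 'github://someuser/esphome@pull/1234/head'.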
return `To use the changes from this PR as an external component, add the following to your ESPHome configuration YAML file:
\`\`\`yaml
external_components:
- source: ${source}
components: [${componentNames.join(', ')}]
refresh: 1h
\`\`\``;
}
// Generate repo clone instructions
function generateRepoInstructions(prNumber, owner, repo, branch) {
return `To use the changes in this PR:
\`\`\`bash
# Clone the repository:
git clone https://github.com/${owner}/${repo}
cd ${repo}
# Checkout the PR branch:
git fetch origin pull/${prNumber}/head:${branch}
git checkout ${branch}
# Install the development version:
script/setup
# Activate the development version:
source venv/bin/activate
\`\`\`
Now you can run \`esphome\` as usual to test the changes in this PR.
`;
}
async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
const commentMarker = "<!-- This comment was generated automatically by the external-component-bot workflow. -->";
const legacyCommentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
let commentBody;
if (esphomeChanges.length === 1) {
commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
} else {
commentBody = generateRepoInstructions(prNumber, owner, repo, context.payload.pull_request.head.ref);
}
commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;
// Check for existing bot comment
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner: owner,
repo: repo,
issue_number: prNumber,
per_page: 100,
}
);
const sorted = comments.sort((a, b) => new Date(b.updated_at) - new Date(a.updated_at));
const botComment = sorted.find(comment =>
(
comment.body.includes(commentMarker) ||
comment.body.includes(legacyCommentMarker)
) && comment.user.type === "Bot"
);
if (botComment && botComment.body === commentBody) {
// No changes in the comment, do nothing
return;
}
if (botComment) {
// Update existing comment
await github.rest.issues.updateComment({
owner: owner,
repo: repo,
comment_id: botComment.id,
body: commentBody,
});
} else {
// Create new comment
await github.rest.issues.createComment({
owner: owner,
repo: repo,
issue_number: prNumber,
body: commentBody,
});
}
}
async function getEsphomeAndComponentChanges(github, owner, repo, prNumber) {
const changedFiles = await github.rest.pulls.listFiles({
owner: owner,
repo: repo,
pull_number: prNumber,
});
const esphomeChanges = changedFiles.data
.filter(file => file.filename !== "esphome/core/defines.h" && file.filename.startsWith('esphome/'))
.map(file => {
const match = file.filename.match(/esphome\/([^/]+)/);
return match ? match[1] : null;
})
.filter(it => it !== null);
if (esphomeChanges.length === 0) {
return {esphomeChanges: [], componentChanges: []};
}
const uniqueEsphomeChanges = [...new Set(esphomeChanges)];
const componentChanges = changedFiles.data
.filter(file => file.filename.startsWith('esphome/components/'))
.map(file => {
const match = file.filename.match(/esphome\/components\/([^/]+)\//);
return match ? match[1] : null;
})
.filter(it => it !== null);
return {esphomeChanges: uniqueEsphomeChanges, componentChanges: [...new Set(componentChanges)]};
}
// Start of main code.
const prNumber = context.payload.pull_request.number;
const {owner, repo} = context.repo;
const {esphomeChanges, componentChanges} = await getEsphomeAndComponentChanges(github, owner, repo, prNumber);
if (componentChanges.length !== 0) {
await createComment(github, owner, repo, prNumber, esphomeChanges, componentChanges);
}

View File

@@ -1,163 +0,0 @@
# This workflow automatically notifies codeowners when an issue is labeled with component labels.
# It reads the CODEOWNERS file to find the maintainers for the labeled components
# and posts a comment mentioning them to ensure they're aware of the issue.
name: Notify Issue Codeowners
on:
issues:
types: [labeled]
permissions:
issues: write
contents: read
jobs:
notify-codeowners:
name: Run
if: ${{ startsWith(github.event.label.name, format('component{0} ', ':')) }}
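# (Likely why format() is used here: a literal ": " inside this plain YAML
# scalar would break parsing of the if: expression.)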
runs-on: ubuntu-latest
steps:
- name: Notify codeowners for component issues
uses: actions/github-script@v8.0.0
with:
script: |
const owner = context.repo.owner;
const repo = context.repo.repo;
const issue_number = context.payload.issue.number;
const labelName = context.payload.label.name;
console.log(`Processing issue #${issue_number} with label: ${labelName}`);
// Hidden marker to identify bot comments from this workflow
const BOT_COMMENT_MARKER = '<!-- issue-codeowner-notify-bot -->';
// Extract component name from label
const componentName = labelName.replace('component: ', '');
console.log(`Component: ${componentName}`);
try {
// Fetch CODEOWNERS file from root
const { data: codeownersFile } = await github.rest.repos.getContent({
owner,
repo,
path: 'CODEOWNERS'
});
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
// Parse CODEOWNERS file to extract component mappings
const codeownersLines = codeownersContent.split('\n')
.map(line => line.trim())
.filter(line => line && !line.startsWith('#'));
let componentOwners = null;
for (const line of codeownersLines) {
const parts = line.split(/\s+/);
if (parts.length < 2) continue;
const pattern = parts[0];
const owners = parts.slice(1);
// Look for component patterns: esphome/components/{component}/*
const componentMatch = pattern.match(/^esphome\/components\/([^\/]+)\/\*$/);
if (componentMatch && componentMatch[1] === componentName) {
componentOwners = owners;
break;
}
}
if (!componentOwners) {
console.log(`No codeowners found for component: ${componentName}`);
return;
}
console.log(`Found codeowners for '${componentName}': ${componentOwners.join(', ')}`);
// Separate users and teams
const userOwners = [];
const teamOwners = [];
for (const owner of componentOwners) {
const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
if (cleanOwner.includes('/')) {
// Team mention (org/team-name)
teamOwners.push(`@${cleanOwner}`);
} else {
// Individual user
userOwners.push(`@${cleanOwner}`);
}
}
// Remove issue author from mentions to avoid self-notification
const issueAuthor = context.payload.issue.user.login;
const filteredUserOwners = userOwners.filter(mention =>
mention !== `@${issueAuthor}`
);
// Check for previous comments from this workflow to avoid duplicate pings
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner,
repo,
issue_number: issue_number
}
);
const previouslyPingedUsers = new Set();
const previouslyPingedTeams = new Set();
// Look for comments from github-actions bot that contain codeowner pings for this component
const workflowComments = comments.filter(comment =>
comment.user.type === 'Bot' &&
comment.body.includes(BOT_COMMENT_MARKER) &&
comment.body.includes(`component: ${componentName}`)
);
// Extract previously mentioned users and teams from workflow comments
for (const comment of workflowComments) {
// Match @username patterns (not team mentions)
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
userMentions.forEach(mention => {
previouslyPingedUsers.add(mention); // Keep @ prefix for easy comparison
});
// Match @org/team patterns
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+/g) || [];
teamMentions.forEach(mention => {
previouslyPingedTeams.add(mention);
});
}
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams for component ${componentName}`);
// Remove previously pinged users and teams
const newUserOwners = filteredUserOwners.filter(mention => !previouslyPingedUsers.has(mention));
const newTeamOwners = teamOwners.filter(mention => !previouslyPingedTeams.has(mention));
const allMentions = [...newUserOwners, ...newTeamOwners];
if (allMentions.length === 0) {
console.log('No new codeowners to notify (all previously pinged or issue author is the only codeowner)');
return;
}
// Create comment body
const mentionString = allMentions.join(', ');
const commentBody = `${BOT_COMMENT_MARKER}\n👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;
// Post comment
await github.rest.issues.createComment({
owner,
repo,
issue_number: issue_number,
body: commentBody
});
console.log(`Successfully notified new codeowners: ${mentionString}`);
} catch (error) {
console.log('Failed to process codeowner notifications:', error.message);
console.error(error);
}

View File

@@ -1,11 +1,28 @@
---
name: Lock closed issues and PRs
name: Lock
on:
schedule:
- cron: "30 0 * * *" # Run daily at 00:30 UTC
- cron: "30 0 * * *"
workflow_dispatch:
permissions:
issues: write
pull-requests: write
concurrency:
group: lock
jobs:
lock:
uses: esphome/workflows/.github/workflows/lock.yml@main
runs-on: ubuntu-latest
steps:
- uses: dessant/lock-threads@v5.0.1
with:
pr-inactive-days: "1"
pr-lock-reason: ""
exclude-any-pr-labels: keep-open
issue-inactive-days: "7"
issue-lock-reason: ""
exclude-any-issue-labels: keep-open

.github/workflows/needs-docs.yml
View File

@@ -0,0 +1,24 @@
name: Needs Docs
on:
pull_request:
types: [labeled, unlabeled]
jobs:
check:
name: Check
runs-on: ubuntu-latest
steps:
- name: Check for needs-docs label
uses: actions/github-script@v7.0.1
with:
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number
});
const needsDocs = labels.find(label => label.name === 'needs-docs');
if (needsDocs) {
core.setFailed('Pull request needs docs');
}

View File

@@ -18,9 +18,8 @@ jobs:
outputs:
tag: ${{ steps.tag.outputs.tag }}
branch_build: ${{ steps.tag.outputs.branch_build }}
deploy_env: ${{ steps.tag.outputs.deploy_env }}
steps:
- uses: actions/checkout@v5.0.0
- uses: actions/checkout@v4.1.7
- name: Get tag
id: tag
# yamllint disable rule:line-length
@@ -28,11 +27,6 @@ jobs:
if [[ "${{ github.event_name }}" = "release" ]]; then
TAG="${{ github.event.release.tag_name}}"
BRANCH_BUILD="false"
if [[ "${{ github.event.release.prerelease }}" = "true" ]]; then
ENVIRONMENT="beta"
else
ENVIRONMENT="production"
fi
else
TAG=$(cat esphome/const.py | sed -n -E "s/^__version__\s+=\s+\"(.+)\"$/\1/p")
today="$(date --utc '+%Y%m%d')"
@@ -41,15 +35,12 @@ jobs:
if [[ "$BRANCH" != "dev" ]]; then
TAG="${TAG}-${BRANCH}"
BRANCH_BUILD="true"
ENVIRONMENT=""
else
BRANCH_BUILD="false"
ENVIRONMENT="dev"
fi
fi
echo "tag=${TAG}" >> $GITHUB_OUTPUT
echo "branch_build=${BRANCH_BUILD}" >> $GITHUB_OUTPUT
echo "deploy_env=${ENVIRONMENT}" >> $GITHUB_OUTPUT
# yamllint enable rule:line-length
deploy-pypi:
@@ -60,54 +51,56 @@ jobs:
contents: read
id-token: write
steps:
- uses: actions/checkout@v5.0.0
- uses: actions/checkout@v4.1.7
- name: Set up Python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: "3.x"
- name: Set up python environment
env:
ESPHOME_NO_VENV: 1
run: script/setup
- name: Build
run: |-
pip3 install build
python3 -m build
- name: Publish
uses: pypa/gh-action-pypi-publish@v1.13.0
with:
skip-existing: true
uses: pypa/gh-action-pypi-publish@v1.12.4
deploy-docker:
name: Build ESPHome ${{ matrix.platform.arch }}
name: Build ESPHome ${{ matrix.platform }}
if: github.repository == 'esphome/esphome'
permissions:
contents: read
packages: write
runs-on: ${{ matrix.platform.os }}
runs-on: ubuntu-latest
needs: [init]
strategy:
fail-fast: false
matrix:
platform:
- arch: amd64
os: "ubuntu-24.04"
- arch: arm64
os: "ubuntu-24.04-arm"
- linux/amd64
- linux/arm64
steps:
- uses: actions/checkout@v5.0.0
- uses: actions/checkout@v4.1.7
- name: Set up Python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: "3.11"
python-version: "3.9"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.11.1
uses: docker/setup-buildx-action@v3.10.0
- name: Set up QEMU
if: matrix.platform != 'linux/amd64'
uses: docker/setup-qemu-action@v3.6.0
- name: Log in to docker hub
uses: docker/login-action@v3.5.0
uses: docker/login-action@v3.4.0
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Log in to the GitHub container registry
uses: docker/login-action@v3.5.0
uses: docker/login-action@v3.4.0
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -116,36 +109,45 @@ jobs:
- name: Build docker
uses: ./.github/actions/build-image
with:
target: final
build_type: docker
platform: ${{ matrix.platform }}
target: docker
baseimg: docker
suffix: ""
version: ${{ needs.init.outputs.tag }}
- name: Build ha-addon
uses: ./.github/actions/build-image
with:
target: final
build_type: ha-addon
platform: ${{ matrix.platform }}
target: hassio
baseimg: hassio
suffix: "hassio"
version: ${{ needs.init.outputs.tag }}
# - name: Build lint
# uses: ./.github/actions/build-image
# with:
# target: lint
# build_type: lint
# suffix: lint
# version: ${{ needs.init.outputs.tag }}
- name: Build lint
uses: ./.github/actions/build-image
with:
platform: ${{ matrix.platform }}
target: lint
baseimg: docker
suffix: lint
version: ${{ needs.init.outputs.tag }}
- name: Sanitize platform name
id: sanitize
run: |
echo "${{ matrix.platform }}" | sed 's|/|-|g' > /tmp/platform
echo name=$(cat /tmp/platform) >> $GITHUB_OUTPUT
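# e.g. "linux/arm64" becomes "linux-arm64", a valid artifact-name suffix.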
- name: Upload digests
uses: actions/upload-artifact@v4.6.2
with:
name: digests-${{ matrix.platform.arch }}
name: digests-${{ steps.sanitize.outputs.name }}
path: /tmp/digests
retention-days: 1
deploy-manifest:
name: Publish ESPHome ${{ matrix.image.build_type }} to ${{ matrix.registry }}
name: Publish ESPHome ${{ matrix.image.title }} to ${{ matrix.registry }}
runs-on: ubuntu-latest
needs:
- init
@@ -158,37 +160,40 @@ jobs:
fail-fast: false
matrix:
image:
- build_type: "docker"
suffix: ""
- build_type: "ha-addon"
- title: "ha-addon"
target: "hassio"
suffix: "hassio"
# - build_type: "lint"
# suffix: "lint"
- title: "docker"
target: "docker"
suffix: ""
- title: "lint"
target: "lint"
suffix: "lint"
registry:
- ghcr
- dockerhub
steps:
- uses: actions/checkout@v5.0.0
- uses: actions/checkout@v4.1.7
- name: Download digests
uses: actions/download-artifact@v5.0.0
uses: actions/download-artifact@v4.2.1
with:
pattern: digests-*
path: /tmp/digests
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.11.1
uses: docker/setup-buildx-action@v3.10.0
- name: Log in to docker hub
if: matrix.registry == 'dockerhub'
uses: docker/login-action@v3.5.0
uses: docker/login-action@v3.4.0
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Log in to the GitHub container registry
if: matrix.registry == 'ghcr'
uses: docker/login-action@v3.5.0
uses: docker/login-action@v3.4.0
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -207,7 +212,7 @@ jobs:
done
- name: Create manifest list and push
working-directory: /tmp/digests/${{ matrix.image.build_type }}/${{ matrix.registry }}
working-directory: /tmp/digests/${{ matrix.image.target }}/${{ matrix.registry }}
run: |
docker buildx imagetools create $(jq -Rcnr 'inputs | . / "," | map("-t " + .) | join(" ")' <<< "${{ steps.tags.outputs.tags}}") \
$(printf '${{ steps.tags.outputs.image }}@sha256:%s ' *)
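# Illustrative: tags "ghcr.io/esphome/esphome:dev,esphome/esphome:dev" expand to
# "-t ghcr.io/esphome/esphome:dev -t esphome/esphome:dev" via the jq filter above.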
@@ -220,7 +225,7 @@ jobs:
- deploy-manifest
steps:
- name: Trigger Workflow
uses: actions/github-script@v8.0.0
uses: actions/github-script@v7.0.1
with:
github-token: ${{ secrets.DEPLOY_HA_ADDON_REPO_TOKEN }}
script: |
@@ -238,24 +243,3 @@ jobs:
content: description
}
})
deploy-esphome-schema:
if: github.repository == 'esphome/esphome' && needs.init.outputs.branch_build == 'false'
runs-on: ubuntu-latest
needs: [init]
environment: ${{ needs.init.outputs.deploy_env }}
steps:
- name: Trigger Workflow
uses: actions/github-script@v8.0.0
with:
github-token: ${{ secrets.DEPLOY_ESPHOME_SCHEMA_REPO_TOKEN }}
script: |
github.rest.actions.createWorkflowDispatch({
owner: "esphome",
repo: "esphome-schema",
workflow_id: "generate-schemas.yml",
ref: "main",
inputs: {
version: "${{ needs.init.outputs.tag }}",
}
})

View File

@@ -17,7 +17,7 @@ jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v10.0.0
- uses: actions/stale@v9.1.0
with:
days-before-pr-stale: 90
days-before-pr-close: 7
@@ -37,7 +37,7 @@ jobs:
close-issues:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v10.0.0
- uses: actions/stale@v9.1.0
with:
days-before-pr-stale: -1
days-before-pr-close: -1

View File

@@ -1,30 +0,0 @@
name: Status check labels
on:
pull_request:
types: [labeled, unlabeled]
jobs:
check:
name: Check ${{ matrix.label }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
label:
- needs-docs
- merge-after-release
steps:
- name: Check for ${{ matrix.label }} label
uses: actions/github-script@v8.0.0
with:
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number
});
const hasLabel = labels.find(label => label.name === '${{ matrix.label }}');
if (hasLabel) {
core.setFailed('Pull request cannot be merged, it is labeled as ${{ matrix.label }}');
}

View File

@@ -13,18 +13,18 @@ jobs:
if: github.repository == 'esphome/esphome'
steps:
- name: Checkout
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
- name: Checkout Home Assistant
uses: actions/checkout@v5.0.0
uses: actions/checkout@v4.1.7
with:
repository: home-assistant/core
path: lib/home-assistant
- name: Setup Python
uses: actions/setup-python@v6.0.0
uses: actions/setup-python@v5.5.0
with:
python-version: 3.13
python-version: 3.12
- name: Install Home Assistant
run: |

.github/workflows/yaml-lint.yml
View File

@@ -0,0 +1,25 @@
---
name: YAML lint
on:
push:
branches: [dev, beta, release]
paths:
- "**.yaml"
- "**.yml"
pull_request:
paths:
- "**.yaml"
- "**.yml"
jobs:
yamllint:
name: yamllint
runs-on: ubuntu-latest
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.1.7
- name: Run yamllint
uses: frenck/action-yamllint@v1.5.0
with:
strict: true

.gitignore
View File

@@ -143,4 +143,3 @@ sdkconfig.*
/components
/managed_components
api-docs/

View File

@@ -1,17 +1,10 @@
---
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
ci:
autoupdate_commit_msg: 'pre-commit: autoupdate'
autoupdate_schedule: off # Disabled until ruff versions are synced between deps and pre-commit
# Skip hooks that have issues in pre-commit CI environment
skip: [pylint, clang-tidy-hash]
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.13.0
rev: v0.11.0
hooks:
# Run the linter.
- id: ruff
@@ -19,7 +12,7 @@ repos:
# Run the formatter.
- id: ruff-format
- repo: https://github.com/PyCQA/flake8
rev: 7.3.0
rev: 7.2.0
hooks:
- id: flake8
additional_dependencies:
@@ -27,25 +20,22 @@ repos:
- pydocstyle==5.1.1
files: ^(esphome|tests)/.+\.py$
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
rev: v3.4.0
hooks:
- id: no-commit-to-branch
args:
- --branch=dev
- --branch=release
- --branch=beta
- id: end-of-file-fixer
- id: trailing-whitespace
- repo: https://github.com/asottile/pyupgrade
rev: v3.20.0
rev: v3.15.2
hooks:
- id: pyupgrade
args: [--py311-plus]
args: [--py39-plus]
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.37.1
rev: v1.35.1
hooks:
- id: yamllint
exclude: ^(\.clang-format|\.clang-tidy)$
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v13.0.1
hooks:
@@ -58,10 +48,3 @@ repos:
entry: python3 script/run-in-env.py pylint
language: system
types: [python]
- id: clang-tidy-hash
name: Update clang-tidy hash
entry: python script/clang_tidy_hash.py --update-if-changed
language: python
files: ^(\.clang-tidy|platformio\.ini|requirements_dev\.txt)$
pass_filenames: false
additional_dependencies: []

View File

@@ -1 +0,0 @@
.ai/instructions.md

View File

@@ -9,7 +9,6 @@
pyproject.toml @esphome/core
esphome/*.py @esphome/core
esphome/core/* @esphome/core
.github/** @esphome/core
# Integrations
esphome/components/a01nyub/* @MrSuicideParrot
@@ -29,7 +28,7 @@ esphome/components/aic3204/* @kbx81
esphome/components/airthings_ble/* @jeromelaban
esphome/components/airthings_wave_base/* @jeromelaban @kpfleming @ncareau
esphome/components/airthings_wave_mini/* @ncareau
esphome/components/airthings_wave_plus/* @jeromelaban @precurse
esphome/components/airthings_wave_plus/* @jeromelaban
esphome/components/alarm_control_panel/* @grahambrown11 @hwstar
esphome/components/alpha3/* @jan-hofmeier
esphome/components/am2315c/* @swoboda1337
@@ -40,11 +39,11 @@ esphome/components/analog_threshold/* @ianchi
esphome/components/animation/* @syndlex
esphome/components/anova/* @buxtronix
esphome/components/apds9306/* @aodrenah
esphome/components/api/* @esphome/core
esphome/components/api/* @OttoWinter
esphome/components/as5600/* @ammmze
esphome/components/as5600/sensor/* @ammmze
esphome/components/as7341/* @mrgnr
esphome/components/async_tcp/* @esphome/core
esphome/components/async_tcp/* @OttoWinter
esphome/components/at581x/* @X-Ryl669
esphome/components/atc_mithermometer/* @ahpohl
esphome/components/atm90e26/* @danieltwagner
@@ -66,10 +65,10 @@ esphome/components/binary_sensor/* @esphome/core
esphome/components/bk72xx/* @kuba2k2
esphome/components/bl0906/* @athom-tech @jesserockz @tarontop
esphome/components/bl0939/* @ziceva
esphome/components/bl0940/* @dan-s-github @tobias-
esphome/components/bl0940/* @tobias-
esphome/components/bl0942/* @dbuezas @dwmw2
esphome/components/ble_client/* @buxtronix @clydebarrow
esphome/components/bluetooth_proxy/* @bdraco @jesserockz
esphome/components/bluetooth_proxy/* @jesserockz
esphome/components/bme280_base/* @esphome/core
esphome/components/bme280_spi/* @apbodrov
esphome/components/bme680_bsec/* @trvrnrth
@@ -88,21 +87,17 @@ esphome/components/bp1658cj/* @Cossid
esphome/components/bp5758d/* @Cossid
esphome/components/button/* @esphome/core
esphome/components/bytebuffer/* @clydebarrow
esphome/components/camera/* @bdraco @DT-art1
esphome/components/camera_encoder/* @DT-art1
esphome/components/canbus/* @danielschramm @mvturnho
esphome/components/cap1188/* @mreditor97
esphome/components/captive_portal/* @esphome/core
esphome/components/captive_portal/* @OttoWinter
esphome/components/ccs811/* @habbie
esphome/components/cd74hc4067/* @asoehlke
esphome/components/ch422g/* @clydebarrow @jesterret
esphome/components/chsc6x/* @kkosik20
esphome/components/climate/* @esphome/core
esphome/components/climate_ir/* @glmnet
esphome/components/cm1106/* @andrewjswan
esphome/components/color_temperature/* @jesserockz
esphome/components/combination/* @Cat-Ion @kahrendt
esphome/components/const/* @esphome/core
esphome/components/coolix/* @glmnet
esphome/components/copy/* @OttoWinter
esphome/components/cover/* @esphome/core
@@ -119,7 +114,7 @@ esphome/components/dallas_temp/* @ssieb
esphome/components/daly_bms/* @s1lvi0
esphome/components/dashboard_import/* @esphome/core
esphome/components/datetime/* @jesserockz @rfdarter
esphome/components/debug/* @esphome/core
esphome/components/debug/* @OttoWinter
esphome/components/delonghi/* @grob6000
esphome/components/dfplayer/* @glmnet
esphome/components/dfrobot_sen0395/* @niklasweber
@@ -127,7 +122,6 @@ esphome/components/dht/* @OttoWinter
esphome/components/display_menu_base/* @numo68
esphome/components/dps310/* @kbx81
esphome/components/ds1307/* @badbadc0ffee
esphome/components/ds2484/* @mrk-its
esphome/components/dsmr/* @glmnet @zuidwijk
esphome/components/duty_time/* @dudanov
esphome/components/ee895/* @Stock-M
@@ -143,21 +137,16 @@ esphome/components/es7210/* @kahrendt
esphome/components/es7243e/* @kbx81
esphome/components/es8156/* @kbx81
esphome/components/es8311/* @kahrendt @kroimon
esphome/components/es8388/* @P4uLT
esphome/components/esp32/* @esphome/core
esphome/components/esp32_ble/* @bdraco @jesserockz @Rapsssito
esphome/components/esp32_ble_client/* @bdraco @jesserockz
esphome/components/esp32_ble_server/* @clydebarrow @jesserockz @Rapsssito
esphome/components/esp32_ble_tracker/* @bdraco
esphome/components/esp32_ble/* @Rapsssito @jesserockz
esphome/components/esp32_ble_client/* @jesserockz
esphome/components/esp32_ble_server/* @Rapsssito @clydebarrow @jesserockz
esphome/components/esp32_camera_web_server/* @ayufan
esphome/components/esp32_can/* @Sympatron
esphome/components/esp32_hosted/* @swoboda1337
esphome/components/esp32_improv/* @jesserockz
esphome/components/esp32_rmt/* @jesserockz
esphome/components/esp32_rmt_led_strip/* @jesserockz
esphome/components/esp8266/* @esphome/core
esphome/components/esp_ldo/* @clydebarrow
esphome/components/espnow/* @jesserockz
esphome/components/ethernet_info/* @gtjadsonsantos
esphome/components/event/* @nohat
esphome/components/event_emitter/* @Rapsssito
@@ -167,20 +156,19 @@ esphome/components/ezo_pmp/* @carlos-sarmiento
esphome/components/factory_reset/* @anatoly-savchenkov
esphome/components/fastled_base/* @OttoWinter
esphome/components/feedback/* @ianchi
esphome/components/fingerprint_grow/* @alexborro @loongyh @OnFreund
esphome/components/fingerprint_grow/* @OnFreund @alexborro @loongyh
esphome/components/font/* @clydebarrow @esphome/core
esphome/components/fs3000/* @kahrendt
esphome/components/ft5x06/* @clydebarrow
esphome/components/ft63x6/* @gpambrozio
esphome/components/gcja5/* @gcormier
esphome/components/gdk101/* @Szewcson
esphome/components/gl_r01_i2c/* @pkejval
esphome/components/globals/* @esphome/core
esphome/components/gp2y1010au0f/* @zry98
esphome/components/gp8403/* @jesserockz
esphome/components/gpio/* @esphome/core
esphome/components/gpio/one_wire/* @ssieb
esphome/components/gps/* @coogle @ximex
esphome/components/gps/* @coogle
esphome/components/graph/* @synco
esphome/components/graphical_display_menu/* @MrMDavidson
esphome/components/gree/* @orestismers
@@ -203,7 +191,7 @@ esphome/components/heatpumpir/* @rob-deutsch
esphome/components/hitachi_ac424/* @sourabhjaiswal
esphome/components/hm3301/* @freekode
esphome/components/hmac_md5/* @dwmw2
esphome/components/homeassistant/* @esphome/core @OttoWinter
esphome/components/homeassistant/* @OttoWinter @esphome/core
esphome/components/homeassistant/number/* @landonr
esphome/components/homeassistant/switch/* @Links2004
esphome/components/honeywell_hih_i2c/* @Benichou34
@@ -228,46 +216,40 @@ esphome/components/iaqcore/* @yozik04
esphome/components/ili9xxx/* @clydebarrow @nielsnl68
esphome/components/improv_base/* @esphome/core
esphome/components/improv_serial/* @esphome/core
esphome/components/ina226/* @latonita @Sergio303
esphome/components/ina226/* @Sergio303 @latonita
esphome/components/ina260/* @mreditor97
esphome/components/ina2xx_base/* @latonita
esphome/components/ina2xx_i2c/* @latonita
esphome/components/ina2xx_spi/* @latonita
esphome/components/inkbird_ibsth1_mini/* @fkirill
esphome/components/inkplate/* @jesserockz @JosipKuci
esphome/components/inkplate6/* @jesserockz
esphome/components/integration/* @OttoWinter
esphome/components/internal_temperature/* @Mat931
esphome/components/interval/* @esphome/core
esphome/components/jsn_sr04t/* @Mafus1
esphome/components/json/* @esphome/core
esphome/components/json/* @OttoWinter
esphome/components/kamstrup_kmp/* @cfeenstra1024
esphome/components/key_collector/* @ssieb
esphome/components/key_provider/* @ssieb
esphome/components/kuntze/* @ssieb
esphome/components/lc709203f/* @ilikecake
esphome/components/lcd_menu/* @numo68
esphome/components/ld2410/* @regevbr @sebcaps
esphome/components/ld2412/* @Rihan9
esphome/components/ld2420/* @descipher
esphome/components/ld2450/* @hareeshmu
esphome/components/ld24xx/* @kbx81
esphome/components/ledc/* @OttoWinter
esphome/components/libretiny/* @kuba2k2
esphome/components/libretiny_pwm/* @kuba2k2
esphome/components/light/* @esphome/core
esphome/components/lightwaverf/* @max246
esphome/components/lilygo_t5_47/touchscreen/* @jesserockz
esphome/components/ln882x/* @lamauny
esphome/components/lock/* @esphome/core
esphome/components/logger/* @esphome/core
esphome/components/logger/select/* @clydebarrow
esphome/components/lps22/* @nagisa
esphome/components/ltr390/* @latonita @sjtrny
esphome/components/ltr501/* @latonita
esphome/components/ltr_als_ps/* @latonita
esphome/components/lvgl/* @clydebarrow
esphome/components/m5stack_8angle/* @rnauber
esphome/components/mapping/* @clydebarrow
esphome/components/matrix_keypad/* @ssieb
esphome/components/max17043/* @blacknell
esphome/components/max31865/* @DAVe3283
@@ -277,8 +259,8 @@ esphome/components/max7219digit/* @rspaargaren
esphome/components/max9611/* @mckaymatthew
esphome/components/mcp23008/* @jesserockz
esphome/components/mcp23017/* @jesserockz
esphome/components/mcp23s08/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s17/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s08/* @SenexCrenshaw @jesserockz
esphome/components/mcp23s17/* @SenexCrenshaw @jesserockz
esphome/components/mcp23x08_base/* @jesserockz
esphome/components/mcp23x17_base/* @jesserockz
esphome/components/mcp23xxx_base/* @jesserockz
@@ -294,13 +276,10 @@ esphome/components/mdns/* @esphome/core
esphome/components/media_player/* @jesserockz
esphome/components/micro_wake_word/* @jesserockz @kahrendt
esphome/components/micronova/* @jorre05
esphome/components/microphone/* @jesserockz @kahrendt
esphome/components/microphone/* @jesserockz
esphome/components/mics_4514/* @jesserockz
esphome/components/midea/* @dudanov
esphome/components/midea_ir/* @dudanov
esphome/components/mipi_dsi/* @clydebarrow
esphome/components/mipi_rgb/* @clydebarrow
esphome/components/mipi_spi/* @clydebarrow
esphome/components/mitsubishi/* @RubyBailey
esphome/components/mixer/speaker/* @kahrendt
esphome/components/mlx90393/* @functionpointer
@@ -332,31 +311,24 @@ esphome/components/nextion/text_sensor/* @senexcrenshaw
esphome/components/nfc/* @jesserockz @kbx81
esphome/components/noblex/* @AGalfra
esphome/components/npi19/* @bakerkj
esphome/components/nrf52/* @tomaszduda23
esphome/components/number/* @esphome/core
esphome/components/one_wire/* @ssieb
esphome/components/online_image/* @clydebarrow @guillempages
esphome/components/opentherm/* @olegtarasov
esphome/components/openthread/* @mrene
esphome/components/opt3001/* @ccutrer
esphome/components/ota/* @esphome/core
esphome/components/output/* @esphome/core
esphome/components/packet_transport/* @clydebarrow
esphome/components/pca6416a/* @Mat931
esphome/components/pca9554/* @bdraco @clydebarrow @hwstar
esphome/components/pca9554/* @clydebarrow @hwstar
esphome/components/pcf85063/* @brogon
esphome/components/pcf8563/* @KoenBreeman
esphome/components/pi4ioe5v6408/* @jesserockz
esphome/components/pid/* @OttoWinter
esphome/components/pipsolar/* @andreashergert1984
esphome/components/pm1006/* @habbie
esphome/components/pm2005/* @andrewjswan
esphome/components/pmsa003i/* @sjtrny
esphome/components/pmsx003/* @ximex
esphome/components/pmwcs3/* @SeByDocKy
esphome/components/pn532/* @jesserockz @OttoWinter
esphome/components/pn532_i2c/* @jesserockz @OttoWinter
esphome/components/pn532_spi/* @jesserockz @OttoWinter
esphome/components/pn532/* @OttoWinter @jesserockz
esphome/components/pn532_i2c/* @OttoWinter @jesserockz
esphome/components/pn532_spi/* @OttoWinter @jesserockz
esphome/components/pn7150/* @jesserockz @kbx81
esphome/components/pn7150_i2c/* @jesserockz @kbx81
esphome/components/pn7160/* @jesserockz @kbx81
@@ -365,7 +337,7 @@ esphome/components/pn7160_spi/* @jesserockz @kbx81
esphome/components/power_supply/* @esphome/core
esphome/components/preferences/* @esphome/core
esphome/components/psram/* @esphome/core
esphome/components/pulse_meter/* @cstaahl @stevebaxter @TrentHouliston
esphome/components/pulse_meter/* @TrentHouliston @cstaahl @stevebaxter
esphome/components/pvvx_mithermometer/* @pasiz
esphome/components/pylontech/* @functionpointer
esphome/components/qmp6988/* @andrewpc
@@ -387,7 +359,6 @@ esphome/components/rp2040_pwm/* @jesserockz
esphome/components/rpi_dpi_rgb/* @clydebarrow
esphome/components/rtl87xx/* @kuba2k2
esphome/components/rtttl/* @glmnet
esphome/components/runtime_stats/* @bdraco
esphome/components/safe_mode/* @jsuanet @kbx81 @paulmonigatti
esphome/components/scd4x/* @martgras @sjtrny
esphome/components/script/* @esphome/core
@@ -406,7 +377,7 @@ esphome/components/sensirion_common/* @martgras
esphome/components/sensor/* @esphome/core
esphome/components/sfa30/* @ghsensdev
esphome/components/sgp40/* @SenexCrenshaw
esphome/components/sgp4x/* @martgras @SenexCrenshaw
esphome/components/sgp4x/* @SenexCrenshaw @martgras
esphome/components/shelly_dimmer/* @edge90 @rnauber
esphome/components/sht3xd/* @mrtoy-me
esphome/components/sht4x/* @sjtrny
@@ -422,7 +393,6 @@ esphome/components/smt100/* @piechade
esphome/components/sn74hc165/* @jesserockz
esphome/components/socket/* @esphome/core
esphome/components/sonoff_d1/* @anatoly-savchenkov
esphome/components/sound_level/* @kahrendt
esphome/components/speaker/* @jesserockz @kahrendt
esphome/components/speaker/media_player/* @kahrendt @synesthesiam
esphome/components/spi/* @clydebarrow @esphome/core
@@ -454,9 +424,6 @@ esphome/components/sun/* @OttoWinter
esphome/components/sun_gtil2/* @Mat931
esphome/components/switch/* @esphome/core
esphome/components/switch/binary_sensor/* @ssieb
esphome/components/sx126x/* @swoboda1337
esphome/components/sx127x/* @swoboda1337
esphome/components/syslog/* @clydebarrow
esphome/components/t6615/* @tylermenezes
esphome/components/tc74/* @sethgirvan
esphome/components/tca9548a/* @andreashergert1984
@@ -471,13 +438,13 @@ esphome/components/template/event/* @nohat
esphome/components/template/fan/* @ssieb
esphome/components/text/* @mauritskorse
esphome/components/thermostat/* @kbx81
esphome/components/time/* @esphome/core
esphome/components/time/* @OttoWinter
esphome/components/tlc5947/* @rnauber
esphome/components/tlc5971/* @IJIJI
esphome/components/tm1621/* @Philippe12
esphome/components/tm1637/* @glmnet
esphome/components/tm1638/* @skykingjwc
esphome/components/tm1651/* @mrtoy-me
esphome/components/tm1651/* @freekode
esphome/components/tmp102/* @timsavage
esphome/components/tmp1075/* @sybrenstuvel
esphome/components/tmp117/* @Azimath
@@ -496,26 +463,22 @@ esphome/components/tuya/switch/* @jesserockz
esphome/components/tuya/text_sensor/* @dentra
esphome/components/uart/* @esphome/core
esphome/components/uart/button/* @ssieb
esphome/components/uart/packet_transport/* @clydebarrow
esphome/components/udp/* @clydebarrow
esphome/components/ufire_ec/* @pvizeli
esphome/components/ufire_ise/* @pvizeli
esphome/components/ultrasonic/* @OttoWinter
esphome/components/update/* @jesserockz
esphome/components/uponor_smatrix/* @kroimon
esphome/components/usb_host/* @clydebarrow
esphome/components/usb_uart/* @clydebarrow
esphome/components/valve/* @esphome/core
esphome/components/vbus/* @ssieb
esphome/components/veml3235/* @kbx81
esphome/components/veml7700/* @latonita
esphome/components/version/* @esphome/core
esphome/components/voice_assistant/* @jesserockz @kahrendt
esphome/components/voice_assistant/* @jesserockz
esphome/components/wake_on_lan/* @clydebarrow @willwill2will54
esphome/components/watchdog/* @oarcher
esphome/components/waveshare_epaper/* @clydebarrow
esphome/components/web_server/ota/* @esphome/core
esphome/components/web_server_base/* @esphome/core
esphome/components/web_server_base/* @OttoWinter
esphome/components/web_server_idf/* @dentra
esphome/components/weikai/* @DrCoolZic
esphome/components/weikai_i2c/* @DrCoolZic
@@ -541,10 +504,8 @@ esphome/components/xiaomi_lywsd03mmc/* @ahpohl
esphome/components/xiaomi_mhoc303/* @drug123
esphome/components/xiaomi_mhoc401/* @vevsvevs
esphome/components/xiaomi_rtcgq02lm/* @jesserockz
esphome/components/xiaomi_xmwsdj04mmc/* @medusalix
esphome/components/xl9535/* @mreditor97
esphome/components/xpt2046/touchscreen/* @nielsnl68 @numo68
esphome/components/xxtea/* @clydebarrow
esphome/components/zephyr/* @tomaszduda23
esphome/components/zhlt01/* @cfeenstra1024
esphome/components/zio_ultrasonic/* @kahrendt

View File

@@ -7,7 +7,7 @@ project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).
**See also:**
[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
---

Doxyfile

File diff suppressed because it is too large

View File

@@ -1 +0,0 @@
.ai/instructions.md

View File

@@ -9,7 +9,7 @@
---
[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
---

View File

@@ -1,56 +1,131 @@
ARG BUILD_VERSION=dev
ARG BUILD_OS=alpine
ARG BUILD_BASE_VERSION=2025.04.0
ARG BUILD_TYPE=docker
# Build these with the build.py script
# Example:
# python3 docker/build.py --tag dev --arch amd64 --build-type docker build
FROM ghcr.io/esphome/docker-base:${BUILD_OS}-${BUILD_BASE_VERSION} AS base-source-docker
FROM ghcr.io/esphome/docker-base:${BUILD_OS}-ha-addon-${BUILD_BASE_VERSION} AS base-source-ha-addon
# One of "docker", "hassio"
ARG BASEIMGTYPE=docker
ARG BUILD_TYPE
FROM base-source-${BUILD_TYPE} AS base
RUN git config --system --add safe.directory "*"
# https://github.com/hassio-addons/addon-debian-base/releases
FROM ghcr.io/hassio-addons/debian-base:7.2.0 AS base-hassio
# https://hub.docker.com/_/debian?tab=tags&page=1&name=bookworm
FROM debian:12.2-slim AS base-docker
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
FROM base-${BASEIMGTYPE} AS base
RUN pip install --no-cache-dir -U pip uv==0.6.14
COPY requirements.txt /
ARG TARGETARCH
ARG TARGETVARIANT
# Note that --break-system-packages is used below because
# https://peps.python.org/pep-0668/ added a safety check that prevents
# pip from installing into an externally managed (system) Python
# environment. This is not a problem here: the image is an isolated
# container, so overwriting system packages is acceptable.
RUN \
uv pip install --no-cache-dir \
-r /requirements.txt
apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
python3-pip=23.0.1+dfsg-1 \
python3-setuptools=66.1.1-1+deb12u1 \
python3-venv=3.11.2-1+b1 \
python3-wheel=0.38.4-2 \
iputils-ping=3:20221126-1+deb12u1 \
git=1:2.39.5-0+deb12u2 \
curl=7.88.1-10+deb12u12 \
openssh-client=1:9.2p1-2+deb12u5 \
python3-cffi=1.15.1-5 \
libcairo2=1.16.0-7 \
libmagic1=1:5.44-3 \
patch=2.7.6-7 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*
ENV \
# Fix click python3 lang warning https://click.palletsprojects.com/en/7.x/python3/
LANG=C.UTF-8 LC_ALL=C.UTF-8 \
# Store globally installed pio libs in /piolibs
PLATFORMIO_GLOBALLIB_DIR=/piolibs
RUN \
platformio settings set enable_telemetry No \
pip3 install \
--break-system-packages --no-cache-dir \
# Keep platformio version in sync with requirements.txt
platformio==6.1.18 \
# Change some platformio settings
&& platformio settings set enable_telemetry No \
&& platformio settings set check_platformio_interval 1000000 \
&& mkdir -p /piolibs
# First install requirements to leverage caching when requirements don't change
# tmpfs is for https://github.com/rust-lang/cargo/issues/8719
COPY requirements.txt requirements_optional.txt /
RUN --mount=type=tmpfs,target=/root/.cargo <<END-OF-RUN
# Fail on any non-zero status
set -e
# install build tools in case wheels are not available
BUILD_DEPS="
build-essential=12.9
python3-dev=3.11.2-1+b1
zlib1g-dev=1:1.2.13.dfsg-1
libjpeg-dev=1:2.1.5-2
libfreetype-dev=2.12.1+dfsg-5+deb12u4
libssl-dev=3.0.15-1~deb12u1
libffi-dev=3.4.4-1
cargo=0.66.0+ds1-1
pkg-config=1.8.1-1
"
LIB_DEPS="
libtiff6=4.5.0-6+deb12u1
libopenjp2-7=2.5.0-2+deb12u1
"
if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ]
then
apt-get update
apt-get install -y --no-install-recommends $BUILD_DEPS $LIB_DEPS
fi
CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse CARGO_HOME=/root/.cargo
pip3 install --break-system-packages --no-cache-dir -r /requirements.txt -r /requirements_optional.txt
if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ]
then
apt-get remove -y --purge --auto-remove $BUILD_DEPS
rm -rf /tmp/* /var/{cache,log}/* /var/lib/apt/lists/*
fi
END-OF-RUN
COPY script/platformio_install_deps.py platformio.ini /
RUN /platformio_install_deps.py /platformio.ini --libraries
ARG BUILD_VERSION
LABEL \
org.opencontainers.image.authors="The ESPHome Authors" \
org.opencontainers.image.title="ESPHome" \
org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
org.opencontainers.image.url="https://esphome.io/" \
org.opencontainers.image.documentation="https://esphome.io/" \
org.opencontainers.image.source="https://github.com/esphome/esphome" \
org.opencontainers.image.licenses="ESPHome" \
org.opencontainers.image.version=${BUILD_VERSION}
# Avoid git's "unsafe repository" error when the container user does not own the files on the mounted config volume
RUN git config --system --add safe.directory '*'
# ======================= docker-type image =======================
FROM base AS base-docker
FROM base AS docker
# Copy esphome and install
COPY . /esphome
RUN pip3 install --break-system-packages --no-cache-dir -e /esphome
# Settings for dashboard
ENV USERNAME="" PASSWORD=""
# Expose the dashboard to Docker
EXPOSE 6052
# Run healthcheck (heartbeat)
HEALTHCHECK --interval=30s --timeout=30s \
CMD curl --fail http://localhost:6052/version -A "HealthCheck" || exit 1
CMD curl --fail http://localhost:6052/version -A "HealthCheck" || exit 1
COPY docker/docker_entrypoint.sh /entrypoint.sh
@@ -64,13 +139,43 @@ ENTRYPOINT ["/entrypoint.sh"]
CMD ["dashboard", "/config"]
# ======================= ha-addon-type image =======================
FROM base AS base-ha-addon
ARG BUILD_VERSION=dev
# Labels
LABEL \
org.opencontainers.image.authors="The ESPHome Authors" \
org.opencontainers.image.title="ESPHome" \
org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
org.opencontainers.image.url="https://esphome.io/" \
org.opencontainers.image.documentation="https://esphome.io/" \
org.opencontainers.image.source="https://github.com/esphome/esphome" \
org.opencontainers.image.licenses="ESPHome" \
org.opencontainers.image.version=${BUILD_VERSION}
# ======================= hassio-type image =======================
FROM base AS hassio
RUN \
apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
nginx-light=1.22.1-9+deb12u1 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*
ARG BUILD_VERSION=dev
# Copy root filesystem
COPY docker/ha-addon-rootfs/ /
ARG BUILD_VERSION
# Copy esphome and install
COPY . /esphome
RUN pip3 install --break-system-packages --no-cache-dir -e /esphome
# Labels
LABEL \
io.hass.name="ESPHome" \
io.hass.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
@@ -78,9 +183,35 @@ LABEL \
io.hass.version="${BUILD_VERSION}"
# io.hass.arch is inherited from addon-debian-base
ARG BUILD_TYPE
FROM base-${BUILD_TYPE} AS final
# Copy esphome and install
COPY . /esphome
RUN uv pip install --no-cache-dir -e /esphome
# ======================= lint-type image =======================
FROM base AS lint
ENV \
PLATFORMIO_CORE_DIR=/esphome/.temp/platformio
RUN \
curl -L https://apt.llvm.org/llvm-snapshot.gpg.key -o /etc/apt/trusted.gpg.d/apt.llvm.org.asc \
&& echo "deb http://apt.llvm.org/bookworm/ llvm-toolchain-bookworm-18 main" > /etc/apt/sources.list.d/llvm.sources.list \
&& apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
clang-format-13=1:13.0.1-11+b2 \
patch=2.7.6-7 \
software-properties-common=0.99.30-4.1~deb12u1 \
nano=7.2-1+deb12u1 \
build-essential=12.9 \
python3-dev=3.11.2-1+b1 \
clang-tidy-18=1:18.1.8~++20240731024826+3b5b5c1ec4a3-1~exp1~20240731144843.145 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*
COPY requirements_test.txt /
RUN pip3 install --break-system-packages --no-cache-dir -r /requirements_test.txt
VOLUME ["/esphome"]
WORKDIR /esphome


@@ -54,7 +54,7 @@ manifest_parser = subparsers.add_parser(
class DockerParams:
build_to: str
manifest_to: str
build_type: str
baseimgtype: str
platform: str
target: str
@@ -66,19 +66,24 @@ class DockerParams:
TYPE_LINT: "esphome/esphome-lint",
}[build_type]
build_to = f"{prefix}-{arch}"
baseimgtype = {
TYPE_DOCKER: "docker",
TYPE_HA_ADDON: "hassio",
TYPE_LINT: "docker",
}[build_type]
platform = {
ARCH_AMD64: "linux/amd64",
ARCH_AARCH64: "linux/arm64",
}[arch]
target = {
TYPE_DOCKER: "final",
TYPE_HA_ADDON: "final",
TYPE_DOCKER: "docker",
TYPE_HA_ADDON: "hassio",
TYPE_LINT: "lint",
}[build_type]
return cls(
build_to=build_to,
manifest_to=prefix,
build_type=build_type,
baseimgtype=baseimgtype,
platform=platform,
target=target,
)
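Everything in this constructor is a table lookup keyed by build type and architecture. A condensed sketch of how those fields end up on the buildx command line (the dictionary values are the ones shown above; passing platform and target via --platform/--target is an assumption based on the dataclass fields, since that part of the script is elided here):
```python
TYPE_DOCKER, TYPE_HA_ADDON, TYPE_LINT = "docker", "ha-addon", "lint"
ARCH_AMD64, ARCH_AARCH64 = "amd64", "aarch64"

BASEIMGTYPE = {TYPE_DOCKER: "docker", TYPE_HA_ADDON: "hassio", TYPE_LINT: "docker"}
TARGET = {TYPE_DOCKER: "docker", TYPE_HA_ADDON: "hassio", TYPE_LINT: "lint"}
PLATFORM = {ARCH_AMD64: "linux/amd64", ARCH_AARCH64: "linux/arm64"}

def buildx_args(build_type: str, arch: str, tag: str) -> list[str]:
    """Assemble (roughly) the docker buildx arguments for one image flavour."""
    return [
        "docker", "buildx", "build",
        "--build-arg", f"BASEIMGTYPE={BASEIMGTYPE[build_type]}",
        "--build-arg", f"BUILD_VERSION={tag}",
        "--platform", PLATFORM[arch],
        "--target", TARGET[build_type],
    ]

print(buildx_args(TYPE_HA_ADDON, ARCH_AARCH64, "dev"))
```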
@@ -90,7 +95,7 @@ def main():
def run_command(*cmd, ignore_error: bool = False):
print(f"$ {shlex.join(list(cmd))}")
if not args.dry_run:
rc = subprocess.call(list(cmd), close_fds=False)
rc = subprocess.call(list(cmd))
if rc != 0 and not ignore_error:
print("Command failed")
sys.exit(1)
@@ -140,7 +145,7 @@ def main():
"buildx",
"build",
"--build-arg",
f"BUILD_TYPE={params.build_type}",
f"BASEIMGTYPE={params.baseimgtype}",
"--build-arg",
f"BUILD_VERSION={args.tag}",
"--cache-from",


@@ -2,24 +2,20 @@
import argparse
from datetime import datetime
import functools
import getpass
import importlib
import logging
import os
import re
import sys
import time
from typing import Protocol
import argcomplete
from esphome import const, writer, yaml_util
import esphome.codegen as cg
from esphome.components.mqtt import CONF_DISCOVER_IP
from esphome.config import iter_component_configs, read_config, strip_default_ids
from esphome.const import (
ALLOWED_NAME_CHARS,
CONF_API,
CONF_BAUD_RATE,
CONF_BROKER,
CONF_DEASSERT_RTS_DTR,
@@ -38,17 +34,16 @@ from esphome.const import (
CONF_PORT,
CONF_SUBSTITUTIONS,
CONF_TOPIC,
ENV_NOGITIGNORE,
PLATFORM_BK72XX,
PLATFORM_ESP32,
PLATFORM_ESP8266,
PLATFORM_RP2040,
PLATFORM_RTL87XX,
SECRETS_FILES,
)
from esphome.core import CORE, EsphomeError, coroutine
from esphome.enum import StrEnum
from esphome.helpers import get_bool_env, indent, is_ip_address
from esphome.log import AnsiFore, color, setup_log
from esphome.types import ConfigType
from esphome.log import Fore, color, setup_log
from esphome.util import (
get_serial_ports,
list_yaml_files,
@@ -60,23 +55,6 @@ from esphome.util import (
_LOGGER = logging.getLogger(__name__)
class ArgsProtocol(Protocol):
device: list[str] | None
reset: bool
username: str | None
password: str | None
client_id: str | None
topic: str | None
file: str | None
no_logs: bool
only_generate: bool
show_secrets: bool
dashboard: bool
configuration: str
name: str
upload_speed: str | None
def choose_prompt(options, purpose: str = None):
if not options:
raise EsphomeError(
@@ -105,181 +83,57 @@ def choose_prompt(options, purpose: str = None):
raise ValueError
break
except ValueError:
safe_print(color(AnsiFore.RED, f"Invalid option: '{opt}'"))
safe_print(color(Fore.RED, f"Invalid option: '{opt}'"))
return options[opt - 1][1]
class Purpose(StrEnum):
UPLOADING = "uploading"
LOGGING = "logging"
def choose_upload_log_host(
default: list[str] | str | None,
check_default: str | None,
purpose: Purpose,
) -> list[str]:
# Convert to list for uniform handling
defaults = [default] if isinstance(default, str) else default or []
# If devices specified, resolve them
if defaults:
resolved: list[str] = []
for device in defaults:
if device == "SERIAL":
serial_ports = get_serial_ports()
if not serial_ports:
_LOGGER.warning("No serial ports found, skipping SERIAL device")
continue
options = [
(f"{port.path} ({port.description})", port.path)
for port in serial_ports
]
resolved.append(choose_prompt(options, purpose=purpose))
elif device == "OTA":
# ensure IP addresses are used first
if is_ip_address(CORE.address) and (
(purpose == Purpose.LOGGING and has_api())
or (purpose == Purpose.UPLOADING and has_ota())
):
resolved.append(CORE.address)
if purpose == Purpose.LOGGING:
if has_api() and has_mqtt_ip_lookup():
resolved.append("MQTTIP")
if has_mqtt_logging():
resolved.append("MQTT")
if has_api() and has_non_ip_address():
resolved.append(CORE.address)
elif purpose == Purpose.UPLOADING:
if has_ota() and has_mqtt_ip_lookup():
resolved.append("MQTTIP")
if has_ota() and has_non_ip_address():
resolved.append(CORE.address)
else:
resolved.append(device)
if not resolved:
_LOGGER.error("All specified devices: %s could not be resolved.", defaults)
return resolved
# No devices specified, show interactive chooser
options = [
(f"{port.path} ({port.description})", port.path) for port in get_serial_ports()
]
if purpose == Purpose.LOGGING:
if has_mqtt_logging():
mqtt_config = CORE.config[CONF_MQTT]
options.append((f"MQTT ({mqtt_config[CONF_BROKER]})", "MQTT"))
if has_api():
if has_resolvable_address():
options.append((f"Over The Air ({CORE.address})", CORE.address))
if has_mqtt_ip_lookup():
options.append(("Over The Air (MQTT IP lookup)", "MQTTIP"))
elif purpose == Purpose.UPLOADING and has_ota():
if has_resolvable_address():
options.append((f"Over The Air ({CORE.address})", CORE.address))
if has_mqtt_ip_lookup():
options.append(("Over The Air (MQTT IP lookup)", "MQTTIP"))
default, check_default, show_ota, show_mqtt, show_api, purpose: str = None
):
options = []
for port in get_serial_ports():
options.append((f"{port.path} ({port.description})", port.path))
if default == "SERIAL":
return choose_prompt(options, purpose=purpose)
if (show_ota and "ota" in CORE.config) or (show_api and "api" in CORE.config):
options.append((f"Over The Air ({CORE.address})", CORE.address))
if default == "OTA":
return CORE.address
if (
show_mqtt
and (mqtt_config := CORE.config.get(CONF_MQTT))
and mqtt_logging_enabled(mqtt_config)
):
options.append((f"MQTT ({mqtt_config[CONF_BROKER]})", "MQTT"))
if default == "OTA":
return "MQTT"
if default is not None:
return default
if check_default is not None and check_default in [opt[1] for opt in options]:
return [check_default]
return [choose_prompt(options, purpose=purpose)]
return check_default
return choose_prompt(options, purpose=purpose)
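The two versions differ in shape: the old function returned a single port, while the new one resolves the whole --device list, expanding the magic names SERIAL (prompt over detected serial ports) and OTA (direct address plus MQTT lookups) and passing explicit values through. A simplified, self-contained sketch of that expansion, with the prompting and MQTT plumbing stubbed out:
```python
def resolve_devices(
    requested: list[str], serial_ports: list[str], address: str | None
) -> list[str]:
    """Mimic the new choose_upload_log_host expansion, heavily simplified."""
    resolved: list[str] = []
    for device in requested:
        if device == "SERIAL":
            # the real code prompts the user; here we just take every detected port
            resolved.extend(serial_ports)
        elif device == "OTA":
            if address:
                resolved.append(address)
            resolved.append("MQTTIP")  # MQTT IP lookup as a network fallback
        else:
            resolved.append(device)  # explicit ports/addresses pass through
    return resolved

print(resolve_devices(["OTA", "/dev/ttyUSB0"], ["/dev/ttyUSB0"], "192.168.1.23"))
# ['192.168.1.23', 'MQTTIP', '/dev/ttyUSB0']
```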
def has_mqtt_logging() -> bool:
"""Check if MQTT logging is available."""
if CONF_MQTT not in CORE.config:
return False
mqtt_config = CORE.config[CONF_MQTT]
# enabled by default
if CONF_LOG_TOPIC not in mqtt_config:
return True
def mqtt_logging_enabled(mqtt_config):
log_topic = mqtt_config[CONF_LOG_TOPIC]
if log_topic is None:
return False
if CONF_TOPIC not in log_topic:
return False
return log_topic.get(CONF_LEVEL, None) != "NONE"
def has_mqtt() -> bool:
"""Check if MQTT is available."""
return CONF_MQTT in CORE.config
def has_api() -> bool:
"""Check if API is available."""
return CONF_API in CORE.config
def has_ota() -> bool:
"""Check if OTA is available."""
return CONF_OTA in CORE.config
def has_mqtt_ip_lookup() -> bool:
"""Check if MQTT is available and IP lookup is supported."""
if CONF_MQTT not in CORE.config:
if log_topic.get(CONF_LEVEL, None) == "NONE":
return False
# Default Enabled
if CONF_DISCOVER_IP not in CORE.config[CONF_MQTT]:
return True
return CORE.config[CONF_MQTT][CONF_DISCOVER_IP]
return True
def has_mdns() -> bool:
"""Check if MDNS is available."""
return CONF_MDNS not in CORE.config or not CORE.config[CONF_MDNS][CONF_DISABLED]
def has_non_ip_address() -> bool:
"""Check if CORE.address is set and is not an IP address."""
return CORE.address is not None and not is_ip_address(CORE.address)
def has_ip_address() -> bool:
"""Check if CORE.address is a valid IP address."""
return CORE.address is not None and is_ip_address(CORE.address)
def has_resolvable_address() -> bool:
"""Check if CORE.address is resolvable (via mDNS or is an IP address)."""
return has_mdns() or has_ip_address()
def mqtt_get_ip(config: ConfigType, username: str, password: str, client_id: str):
from esphome import mqtt
return mqtt.get_esphome_device_ip(config, username, password, client_id)
_PORT_TO_PORT_TYPE = {
"MQTT": "MQTT",
"MQTTIP": "MQTTIP",
}
def get_port_type(port: str) -> str:
def get_port_type(port):
if port.startswith("/") or port.startswith("COM"):
return "SERIAL"
return _PORT_TO_PORT_TYPE.get(port, "NETWORK")
if port == "MQTT":
return "MQTT"
return "NETWORK"
def run_miniterm(config: ConfigType, port: str, args) -> int:
from aioesphomeapi import LogParser
def run_miniterm(config, port, args):
import serial
from esphome import platformio_api
@@ -304,7 +158,6 @@ def run_miniterm(config: ConfigType, port: str, args) -> int:
ser.dtr = False
ser.rts = False
parser = LogParser()
tries = 0
while tries < 5:
try:
@@ -320,10 +173,9 @@ def run_miniterm(config: ConfigType, port: str, args) -> int:
.replace(b"\n", b"")
.decode("utf8", "backslashreplace")
)
time_ = datetime.now()
nanoseconds = time_.microsecond // 1000
time_str = f"[{time_.hour:02}:{time_.minute:02}:{time_.second:02}.{nanoseconds:03}]"
safe_print(parser.parse_line(line, time_str))
time_str = datetime.now().time().strftime("[%H:%M:%S]")
message = time_str + line
safe_print(message)
backtrace_state = platformio_api.process_stacktrace(
config, line, backtrace_state=backtrace_state
@@ -357,15 +209,12 @@ def wrap_to_code(name, comp):
return wrapped
def write_cpp(config: ConfigType) -> int:
if not get_bool_env(ENV_NOGITIGNORE):
writer.write_gitignore()
def write_cpp(config):
generate_cpp_contents(config)
return write_cpp_file()
def generate_cpp_contents(config: ConfigType) -> None:
def generate_cpp_contents(config):
_LOGGER.info("Generating C++ source...")
for name, component, conf in iter_component_configs(CORE.config):
@@ -376,18 +225,15 @@ def generate_cpp_contents(config: ConfigType) -> None:
CORE.flush_tasks()
def write_cpp_file() -> int:
def write_cpp_file():
writer.write_platformio_project()
code_s = indent(CORE.cpp_main_section)
writer.write_cpp(code_s)
from esphome.build_gen import platformio
platformio.write_project()
return 0
def compile_program(args: ArgsProtocol, config: ConfigType) -> int:
def compile_program(args, config):
from esphome import platformio_api
_LOGGER.info("Compiling app...")
@@ -398,9 +244,7 @@ def compile_program(args: ArgsProtocol, config: ConfigType) -> int:
return 0 if idedata is not None else 1
def upload_using_esptool(
config: ConfigType, port: str, file: str, speed: int
) -> str | int:
def upload_using_esptool(config, port, file, speed):
from esphome import platformio_api
first_baudrate = speed or config[CONF_ESPHOME][CONF_PLATFORMIO_OPTIONS].get(
@@ -428,20 +272,20 @@ def upload_using_esptool(
def run_esptool(baud_rate):
cmd = [
"esptool",
"esptool.py",
"--before",
"default-reset",
"default_reset",
"--after",
"hard-reset",
"hard_reset",
"--baud",
str(baud_rate),
"--port",
port,
"--chip",
mcu,
"write-flash",
"write_flash",
"-z",
"--flash-size",
"--flash_size",
"detect",
]
for img in flash_images:
@@ -465,7 +309,7 @@ def upload_using_esptool(
return run_esptool(115200)
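The retry logic around run_esptool (partially elided in this hunk) first tries the configured upload speed, then falls back to esptool's conservative 115200 baud, which is where the trailing return run_esptool(115200) comes from. A sketch of that shape, assuming run_esptool returns a process exit code:
```python
def upload_with_fallback(first_baudrate: int, run_esptool) -> int:
    """Try the configured baud rate first, then retry once at 115200 on failure."""
    rc = run_esptool(first_baudrate)
    if rc == 0 or first_baudrate == 115200:
        return rc
    # an overly aggressive baud rate is a common cause of flash failures
    return run_esptool(115200)

print(upload_with_fallback(460800, lambda baud: 0 if baud == 115200 else 1))  # 0
```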
def upload_using_platformio(config: ConfigType, port: str):
def upload_using_platformio(config, port):
from esphome import platformio_api
upload_args = ["-t", "upload", "-t", "nobuild"]
@@ -474,7 +318,7 @@ def upload_using_platformio(config: ConfigType, port: str):
return platformio_api.run_platformio_cli_run(config, CORE.verbose, *upload_args)
def check_permissions(port: str):
def check_permissions(port):
if os.name == "posix" and get_port_type(port) == "SERIAL":
# Check if we can open selected serial port
if not os.access(port, os.F_OK):
@@ -487,34 +331,32 @@ def check_permissions(port: str):
raise EsphomeError(
"You do not have read or write permission on the selected serial port. "
"To resolve this issue, you can add your user to the dialout group "
f"by running the following command: sudo usermod -a -G dialout {getpass.getuser()}. "
f"by running the following command: sudo usermod -a -G dialout {os.getlogin()}. "
"You will need to log out & back in or reboot to activate the new group access."
)
def upload_program(
config: ConfigType, args: ArgsProtocol, devices: list[str]
) -> tuple[int, str | None]:
host = devices[0]
def upload_program(config, args, host):
try:
module = importlib.import_module("esphome.components." + CORE.target_platform)
if getattr(module, "upload_program")(config, args, host):
return 0, host
return 0
except AttributeError:
pass
if get_port_type(host) == "SERIAL":
check_permissions(host)
exit_code = 1
if CORE.target_platform in (PLATFORM_ESP32, PLATFORM_ESP8266):
file = getattr(args, "file", None)
exit_code = upload_using_esptool(config, host, file, args.upload_speed)
elif CORE.target_platform == PLATFORM_RP2040 or CORE.is_libretiny:
exit_code = upload_using_platformio(config, host)
# else: Unknown target platform, exit_code remains 1
return upload_using_esptool(config, host, file, args.upload_speed)
return exit_code, host if exit_code == 0 else None
if CORE.target_platform in (PLATFORM_RP2040,):
return upload_using_platformio(config, args.device)
if CORE.target_platform in (PLATFORM_BK72XX, PLATFORM_RTL87XX):
return upload_using_platformio(config, host)
return 1 # Unknown target platform
ota_conf = {}
for ota_item in config.get(CONF_OTA, []):
@@ -531,53 +373,43 @@ def upload_program(
remote_port = int(ota_conf[CONF_PORT])
password = ota_conf.get(CONF_PASSWORD, "")
binary = args.file if getattr(args, "file", None) is not None else CORE.firmware_bin
# MQTT address resolution
if get_port_type(host) in ("MQTT", "MQTTIP"):
devices = mqtt_get_ip(config, args.username, args.password, args.client_id)
if (
not is_ip_address(CORE.address) # pylint: disable=too-many-boolean-expressions
and (get_port_type(host) == "MQTT" or config[CONF_MDNS][CONF_DISABLED])
and CONF_MQTT in config
and (not args.device or args.device in ("MQTT", "OTA"))
):
from esphome import mqtt
return espota2.run_ota(devices, remote_port, password, binary)
host = mqtt.get_esphome_device_ip(
config, args.username, args.password, args.client_id
)
if getattr(args, "file", None) is not None:
return espota2.run_ota(host, remote_port, password, args.file)
return espota2.run_ota(host, remote_port, password, CORE.firmware_bin)
def show_logs(config: ConfigType, args: ArgsProtocol, devices: list[str]) -> int | None:
try:
module = importlib.import_module("esphome.components." + CORE.target_platform)
if getattr(module, "show_logs")(config, args, devices):
return 0
except AttributeError:
pass
def show_logs(config, args, port):
if "logger" not in config:
raise EsphomeError("Logger is not configured!")
port = devices[0]
if get_port_type(port) == "SERIAL":
check_permissions(port)
return run_miniterm(config, port, args)
if get_port_type(port) == "NETWORK" and "api" in config:
if config[CONF_MDNS][CONF_DISABLED] and CONF_MQTT in config:
from esphome import mqtt
port_type = get_port_type(port)
# Check if we should use API for logging
if has_api():
addresses_to_use: list[str] | None = None
if port_type == "NETWORK" and (has_mdns() or is_ip_address(port)):
addresses_to_use = devices
elif port_type in ("NETWORK", "MQTT", "MQTTIP") and has_mqtt_ip_lookup():
# Only use MQTT IP lookup if the first condition didn't match
# (for MQTT/MQTTIP types, or for NETWORK when mdns/ip check fails)
addresses_to_use = mqtt_get_ip(
port = mqtt.get_esphome_device_ip(
config, args.username, args.password, args.client_id
)
)[0]
if addresses_to_use is not None:
from esphome.components.api.client import run_logs
from esphome.components.api.client import run_logs
return run_logs(config, addresses_to_use)
if port_type in ("NETWORK", "MQTT") and has_mqtt_logging():
return run_logs(config, port)
if get_port_type(port) == "MQTT" and "mqtt" in config:
from esphome import mqtt
return mqtt.show_logs(
@@ -587,7 +419,7 @@ def show_logs(config: ConfigType, args: ArgsProtocol, devices: list[str]) -> int
raise EsphomeError("No remote or local logging method configured (api/mqtt/logger)")
def clean_mqtt(config: ConfigType, args: ArgsProtocol) -> int | None:
def clean_mqtt(config, args):
from esphome import mqtt
return mqtt.clear_topic(
@@ -595,13 +427,13 @@ def clean_mqtt(config: ConfigType, args: ArgsProtocol) -> int | None:
)
def command_wizard(args: ArgsProtocol) -> int | None:
def command_wizard(args):
from esphome import wizard
return wizard.wizard(args.configuration)
def command_config(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_config(args, config):
if not CORE.verbose:
config = strip_default_ids(config)
output = yaml_util.dump(config, args.show_secrets)
@@ -616,7 +448,7 @@ def command_config(args: ArgsProtocol, config: ConfigType) -> int | None:
return 0
def command_vscode(args: ArgsProtocol) -> int | None:
def command_vscode(args):
from esphome import vscode
logging.disable(logging.INFO)
@@ -624,7 +456,7 @@ def command_vscode(args: ArgsProtocol) -> int | None:
vscode.read_config(args)
def command_compile(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_compile(args, config):
exit_code = write_cpp(config)
if exit_code != 0:
return exit_code
@@ -638,23 +470,23 @@ def command_compile(args: ArgsProtocol, config: ConfigType) -> int | None:
return 0
def command_upload(args: ArgsProtocol, config: ConfigType) -> int | None:
# Get devices, resolving special identifiers like OTA
devices = choose_upload_log_host(
def command_upload(args, config):
port = choose_upload_log_host(
default=args.device,
check_default=None,
purpose=Purpose.UPLOADING,
show_ota=True,
show_mqtt=False,
show_api=False,
purpose="uploading",
)
exit_code, _ = upload_program(config, args, devices)
if exit_code == 0:
_LOGGER.info("Successfully uploaded program.")
else:
_LOGGER.warning("Failed to upload to %s", devices)
return exit_code
exit_code = upload_program(config, args, port)
if exit_code != 0:
return exit_code
_LOGGER.info("Successfully uploaded program.")
return 0
def command_discover(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_discover(args, config):
if "mqtt" in config:
from esphome import mqtt
@@ -663,17 +495,19 @@ def command_discover(args: ArgsProtocol, config: ConfigType) -> int | None:
raise EsphomeError("No discover method configured (mqtt)")
def command_logs(args: ArgsProtocol, config: ConfigType) -> int | None:
# Get devices, resolving special identifiers like OTA
devices = choose_upload_log_host(
def command_logs(args, config):
port = choose_upload_log_host(
default=args.device,
check_default=None,
purpose=Purpose.LOGGING,
show_ota=False,
show_mqtt=True,
show_api=True,
purpose="logging",
)
return show_logs(config, args, devices)
return show_logs(config, args, port)
def command_run(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_run(args, config):
exit_code = write_cpp(config)
if exit_code != 0:
return exit_code
@@ -690,48 +524,47 @@ def command_run(args: ArgsProtocol, config: ConfigType) -> int | None:
program_path = idedata.raw["prog_path"]
return run_external_process(program_path)
# Get devices, resolving special identifiers like OTA
devices = choose_upload_log_host(
port = choose_upload_log_host(
default=args.device,
check_default=None,
purpose=Purpose.UPLOADING,
show_ota=True,
show_mqtt=False,
show_api=True,
purpose="uploading",
)
exit_code, successful_device = upload_program(config, args, devices)
if exit_code == 0:
_LOGGER.info("Successfully uploaded program.")
else:
_LOGGER.warning("Failed to upload to %s", devices)
exit_code = upload_program(config, args, port)
if exit_code != 0:
return exit_code
_LOGGER.info("Successfully uploaded program.")
if args.no_logs:
return 0
# For logs, prefer the device we successfully uploaded to
devices = choose_upload_log_host(
default=successful_device,
check_default=successful_device,
purpose=Purpose.LOGGING,
port = choose_upload_log_host(
default=args.device,
check_default=port,
show_ota=False,
show_mqtt=True,
show_api=True,
purpose="logging",
)
return show_logs(config, args, devices)
return show_logs(config, args, port)
def command_clean_mqtt(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_clean_mqtt(args, config):
return clean_mqtt(config, args)
def command_mqtt_fingerprint(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_mqtt_fingerprint(args, config):
from esphome import mqtt
return mqtt.get_fingerprint(config)
def command_version(args: ArgsProtocol) -> int | None:
def command_version(args):
safe_print(f"Version: {const.__version__}")
return 0
def command_clean(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_clean(args, config):
try:
writer.clean_build()
except OSError as err:
@@ -741,13 +574,13 @@ def command_clean(args: ArgsProtocol, config: ConfigType) -> int | None:
return 0
def command_dashboard(args: ArgsProtocol) -> int | None:
def command_dashboard(args):
from esphome.dashboard import dashboard
return dashboard.start_dashboard(args)
def command_update_all(args: ArgsProtocol) -> int | None:
def command_update_all(args):
import click
success = {}
@@ -758,43 +591,38 @@ def command_update_all(args: ArgsProtocol) -> int | None:
middle_text = f" {middle_text} "
width = len(click.unstyle(middle_text))
half_line = "=" * ((twidth - width) // 2)
safe_print(f"{half_line}{middle_text}{half_line}")
click.echo(f"{half_line}{middle_text}{half_line}")
for f in files:
safe_print(f"Updating {color(AnsiFore.CYAN, f)}")
safe_print("-" * twidth)
safe_print()
if CORE.dashboard:
rc = run_external_process(
"esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
)
else:
rc = run_external_process(
"esphome", "run", f, "--no-logs", "--device", "OTA"
)
print(f"Updating {color(Fore.CYAN, f)}")
print("-" * twidth)
print()
rc = run_external_process(
"esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
)
if rc == 0:
print_bar(f"[{color(AnsiFore.BOLD_GREEN, 'SUCCESS')}] {f}")
print_bar(f"[{color(Fore.BOLD_GREEN, 'SUCCESS')}] {f}")
success[f] = True
else:
print_bar(f"[{color(AnsiFore.BOLD_RED, 'ERROR')}] {f}")
print_bar(f"[{color(Fore.BOLD_RED, 'ERROR')}] {f}")
success[f] = False
safe_print()
safe_print()
safe_print()
print()
print()
print()
print_bar(f"[{color(AnsiFore.BOLD_WHITE, 'SUMMARY')}]")
print_bar(f"[{color(Fore.BOLD_WHITE, 'SUMMARY')}]")
failed = 0
for f in files:
if success[f]:
safe_print(f" - {f}: {color(AnsiFore.GREEN, 'SUCCESS')}")
print(f" - {f}: {color(Fore.GREEN, 'SUCCESS')}")
else:
safe_print(f" - {f}: {color(AnsiFore.BOLD_RED, 'FAILED')}")
print(f" - {f}: {color(Fore.BOLD_RED, 'FAILED')}")
failed += 1
return failed
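print_bar (defined just above this hunk) centers a message in a bar of '=' characters; click.unstyle is what keeps ANSI color codes from inflating the measured width. A standalone version of that arithmetic (twidth would normally come from the terminal size):
```python
import click  # already a dependency of command_update_all

def print_bar(middle_text: str, twidth: int = 80) -> None:
    """Center middle_text in an '=' bar, measuring width without ANSI escapes."""
    middle_text = f" {middle_text} "
    width = len(click.unstyle(middle_text))
    half_line = "=" * ((twidth - width) // 2)
    print(f"{half_line}{middle_text}{half_line}")

print_bar("[SUCCESS] kitchen.yaml", twidth=40)
# ======== [SUCCESS] kitchen.yaml ========
```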
def command_idedata(args: ArgsProtocol, config: ConfigType) -> int:
def command_idedata(args, config):
import json
from esphome import platformio_api
@@ -810,12 +638,12 @@ def command_idedata(args: ArgsProtocol, config: ConfigType) -> int:
return 0
def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
def command_rename(args, config):
for c in args.name:
if c not in ALLOWED_NAME_CHARS:
print(
color(
AnsiFore.BOLD_RED,
Fore.BOLD_RED,
f"'{c}' is an invalid character for names. Valid characters are: "
f"{ALLOWED_NAME_CHARS} (lowercase, no spaces)",
)
@@ -828,9 +656,7 @@ def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
yaml = yaml_util.load_yaml(CORE.config_path)
if CONF_ESPHOME not in yaml or CONF_NAME not in yaml[CONF_ESPHOME]:
print(
color(
AnsiFore.BOLD_RED, "Complex YAML files cannot be automatically renamed."
)
color(Fore.BOLD_RED, "Complex YAML files cannot be automatically renamed.")
)
return 1
old_name = yaml[CONF_ESPHOME][CONF_NAME]
@@ -853,7 +679,7 @@ def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
)
> 1
):
print(color(AnsiFore.BOLD_RED, "Too many matches in YAML to safely rename"))
print(color(Fore.BOLD_RED, "Too many matches in YAML to safely rename"))
return 1
new_raw = re.sub(
@@ -865,7 +691,7 @@ def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
new_path = os.path.join(CORE.config_dir, args.name + ".yaml")
print(
f"Updating {color(AnsiFore.CYAN, CORE.config_path)} to {color(AnsiFore.CYAN, new_path)}"
f"Updating {color(Fore.CYAN, CORE.config_path)} to {color(Fore.CYAN, new_path)}"
)
print()
@@ -874,7 +700,7 @@ def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
rc = run_external_process("esphome", "config", new_path)
if rc != 0:
print(color(AnsiFore.BOLD_RED, "Rename failed. Reverting changes."))
print(color(Fore.BOLD_RED, "Rename failed. Reverting changes."))
os.remove(new_path)
return 1
@@ -900,7 +726,7 @@ def command_rename(args: ArgsProtocol, config: ConfigType) -> int | None:
if CORE.config_path != new_path:
os.remove(CORE.config_path)
print(color(AnsiFore.BOLD_GREEN, "SUCCESS"))
print(color(Fore.BOLD_GREEN, "SUCCESS"))
print()
return 0
@@ -927,12 +753,6 @@ POST_CONFIG_ACTIONS = {
"discover": command_discover,
}
SIMPLE_CONFIG_ACTIONS = [
"clean",
"clean-mqtt",
"config",
]
def parse_args(argv):
options_parser = argparse.ArgumentParser(add_help=False)
@@ -1020,8 +840,7 @@ def parse_args(argv):
)
parser_upload.add_argument(
"--device",
action="append",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0. Can be specified multiple times for fallback addresses.",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
)
parser_upload.add_argument(
"--upload_speed",
@@ -1043,8 +862,7 @@ def parse_args(argv):
)
parser_logs.add_argument(
"--device",
action="append",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0. Can be specified multiple times for fallback addresses.",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
)
parser_logs.add_argument(
"--reset",
@@ -1073,8 +891,7 @@ def parse_args(argv):
)
parser_run.add_argument(
"--device",
action="append",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0. Can be specified multiple times for fallback addresses.",
help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
)
parser_run.add_argument(
"--upload_speed",
@@ -1201,13 +1018,6 @@ def parse_args(argv):
arguments = argv[1:]
argcomplete.autocomplete(parser)
if len(arguments) > 0 and arguments[0] in SIMPLE_CONFIG_ACTIONS:
args, unknown_args = parser.parse_known_args(arguments)
if unknown_args:
_LOGGER.warning("Ignored unrecognized arguments: %s", unknown_args)
return args
return parser.parse_args(arguments)
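The --device change from a plain option to action="append" is what makes the fallback lists above possible: argparse collects each repetition into a list and leaves the attribute as None when the flag is absent. A minimal demonstration:
```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--device", action="append",
                    help="May be given multiple times; values are collected in order.")

print(parser.parse_args([]).device)                     # None
print(parser.parse_args(["--device", "OTA"]).device)    # ['OTA']
print(parser.parse_args(
    ["--device", "OTA", "--device", "SERIAL"]).device)  # ['OTA', 'SERIAL']
```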


@@ -391,7 +391,8 @@ async def build_action(full_config, template_arg, args):
)
action_id = full_config[CONF_TYPE_ID]
builder = registry_entry.coroutine_fun
return await builder(config, action_id, template_arg, args)
ret = await builder(config, action_id, template_arg, args)
return ret
async def build_action_list(config, templ, arg_type):
@@ -408,7 +409,8 @@ async def build_condition(full_config, template_arg, args):
)
action_id = full_config[CONF_TYPE_ID]
builder = registry_entry.coroutine_fun
return await builder(config, action_id, template_arg, args)
ret = await builder(config, action_id, template_arg, args)
return ret
async def build_condition_list(config, templ, args):
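Both hunks are instances of the same registry-dispatch pattern: the config's type id selects a registry entry whose coroutine builder is then awaited (the only change is returning the awaited value directly). A self-contained sketch of the pattern; the names here are illustrative, not ESPHome's actual registry API:
```python
import asyncio

REGISTRY = {}

def register(type_id: str):
    """Associate a coroutine builder function with a type id."""
    def decorator(coro):
        REGISTRY[type_id] = coro
        return coro
    return decorator

@register("delay")
async def build_delay(config):
    return f"DelayAction({config['time']})"

async def build_action(full_config):
    type_id, config = next(iter(full_config.items()))
    builder = REGISTRY[type_id]   # look up the builder registered for this type
    return await builder(config)  # builders are coroutines, so await the result

print(asyncio.run(build_action({"delay": {"time": "1s"}})))  # DelayAction(1s)
```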


@@ -1,102 +0,0 @@
import os
from esphome.const import __version__
from esphome.core import CORE
from esphome.helpers import mkdir_p, read_file, write_file_if_changed
from esphome.writer import find_begin_end, update_storage_json
INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"
INI_BASE_FORMAT = (
"""; Auto generated code by esphome
[common]
lib_deps =
build_flags =
upload_flags =
""",
"""
""",
)
def format_ini(data: dict[str, str | list[str]]) -> str:
content = ""
for key, value in sorted(data.items()):
if isinstance(value, list):
content += f"{key} =\n"
for x in value:
content += f" {x}\n"
else:
content += f"{key} = {value}\n"
return content
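format_ini renders scalar options as key = value lines and list options as an indented block, with keys sorted so the generated file is stable across runs. For example:
```python
# Using the format_ini defined above:
print(format_ini({"build_flags": ["-DUSE_API", "-Os"], "board": "esp32dev"}))
# board = esp32dev
# build_flags =
#     -DUSE_API
#     -Os
```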
def get_ini_content():
CORE.add_platformio_option(
"lib_deps",
[x.as_lib_dep for x in CORE.platformio_libraries.values()]
+ ["${common.lib_deps}"],
)
# Sort to avoid changing build flags order
CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))
# Sort to avoid changing build unflags order
CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))
# Add extra script for C++ flags
CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])
content = "[platformio]\n"
content += f"description = ESPHome {__version__}\n"
content += f"[env:{CORE.name}]\n"
content += format_ini(CORE.platformio_options)
return content
def write_ini(content):
update_storage_json()
path = CORE.relative_build_path("platformio.ini")
if os.path.isfile(path):
text = read_file(path)
content_format = find_begin_end(
text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
)
else:
content_format = INI_BASE_FORMAT
full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
full_file += INI_AUTO_GENERATE_END + content_format[1]
write_file_if_changed(path, full_file)
def write_project():
mkdir_p(CORE.build_path)
content = get_ini_content()
write_ini(content)
# Write extra script for C++ specific flags
write_cxx_flags_script()
CXX_FLAGS_FILE_NAME = "cxx_flags.py"
CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
Import("env")
# Add C++ specific flags
"""
def write_cxx_flags_script() -> None:
path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
contents = CXX_FLAGS_FILE_CONTENTS
if not CORE.is_host:
contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
contents += "\n"
write_file_if_changed(path, contents)


@@ -22,7 +22,6 @@ from esphome.cpp_generator import ( # noqa: F401
TemplateArguments,
add,
add_build_flag,
add_build_unflag,
add_define,
add_global,
add_library,
@@ -35,7 +34,6 @@ from esphome.cpp_generator import ( # noqa: F401
process_lambda,
progmem_array,
safe_exp,
set_cpp_standard,
statement,
static_const_array,
templatable,


@@ -7,6 +7,7 @@ namespace a4988 {
static const char *const TAG = "a4988.stepper";
void A4988::setup() {
ESP_LOGCONFIG(TAG, "Setting up A4988...");
if (this->sleep_pin_ != nullptr) {
this->sleep_pin_->setup();
this->sleep_pin_->digital_write(false);


@@ -7,6 +7,8 @@ namespace absolute_humidity {
static const char *const TAG = "absolute_humidity.sensor";
void AbsoluteHumidityComponent::setup() {
ESP_LOGCONFIG(TAG, "Setting up absolute humidity '%s'...", this->get_name().c_str());
ESP_LOGD(TAG, " Added callback for temperature '%s'", this->temperature_sensor_->get_name().c_str());
this->temperature_sensor_->add_on_state_callback([this](float state) { this->temperature_callback_(state); });
if (this->temperature_sensor_->has_state()) {
@@ -38,11 +40,9 @@ void AbsoluteHumidityComponent::dump_config() {
break;
}
ESP_LOGCONFIG(TAG,
"Sources\n"
" Temperature: '%s'\n"
" Relative Humidity: '%s'",
this->temperature_sensor_->get_name().c_str(), this->humidity_sensor_->get_name().c_str());
ESP_LOGCONFIG(TAG, "Sources");
ESP_LOGCONFIG(TAG, " Temperature: '%s'", this->temperature_sensor_->get_name().c_str());
ESP_LOGCONFIG(TAG, " Relative Humidity: '%s'", this->humidity_sensor_->get_name().c_str());
}
float AbsoluteHumidityComponent::get_setup_priority() const { return setup_priority::DATA; }
@@ -61,10 +61,11 @@ void AbsoluteHumidityComponent::loop() {
ESP_LOGW(TAG, "No valid state from temperature sensor!");
}
if (no_humidity) {
ESP_LOGW(TAG, "No valid state from humidity sensor!");
ESP_LOGW(TAG, "No valid state from temperature sensor!");
}
ESP_LOGW(TAG, "Unable to calculate absolute humidity.");
this->publish_state(NAN);
this->status_set_warning(LOG_STR("Unable to calculate absolute humidity."));
this->status_set_warning();
return;
}
@@ -86,8 +87,9 @@ void AbsoluteHumidityComponent::loop() {
es = es_wobus(temperature_c);
break;
default:
ESP_LOGE(TAG, "Invalid saturation vapor pressure equation selection!");
this->publish_state(NAN);
this->status_set_error("Invalid saturation vapor pressure equation selection!");
this->status_set_error();
return;
}
ESP_LOGD(TAG, "Saturation vapor pressure %f kPa", es);


@@ -5,7 +5,7 @@ from esphome.const import (
CONF_EQUATION,
CONF_HUMIDITY,
CONF_TEMPERATURE,
DEVICE_CLASS_ABSOLUTE_HUMIDITY,
ICON_WATER,
STATE_CLASS_MEASUREMENT,
UNIT_GRAMS_PER_CUBIC_METER,
)
@@ -27,8 +27,8 @@ EQUATION = {
CONFIG_SCHEMA = (
sensor.sensor_schema(
unit_of_measurement=UNIT_GRAMS_PER_CUBIC_METER,
icon=ICON_WATER,
accuracy_decimals=2,
device_class=DEVICE_CLASS_ABSOLUTE_HUMIDITY,
state_class=STATE_CLASS_MEASUREMENT,
)
.extend(


@@ -4,7 +4,6 @@
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include <cmath>
#include <numbers>
#ifdef USE_ESP8266
#include <core_esp8266_waveform.h>
@@ -115,14 +114,13 @@ void IRAM_ATTR HOT AcDimmerDataStore::gpio_intr() {
// fully off, disable output immediately
this->gate_pin.digital_write(false);
} else {
auto min_us = this->cycle_time_us * this->min_power / 1000;
if (this->method == DIM_METHOD_TRAILING) {
this->enable_time_us = 1; // cannot be 0
// calculate time until disable in µs with integer arithmetic and take into account min_power
this->disable_time_us = std::max((uint32_t) 10, this->value * (this->cycle_time_us - min_us) / 65535 + min_us);
this->disable_time_us = std::max((uint32_t) 10, this->value * this->cycle_time_us / 65535);
} else {
// calculate time until enable in µs: (1.0-value)*cycle_time, but with integer arithmetic
// also take into account min_power
auto min_us = this->cycle_time_us * this->min_power / 1000;
this->enable_time_us = std::max((uint32_t) 1, ((65535 - this->value) * (this->cycle_time_us - min_us)) / 65535);
if (this->method == DIM_METHOD_LEADING_PULSE) {
@@ -194,17 +192,18 @@ void AcDimmer::setup() {
setTimer1Callback(&timer_interrupt);
#endif
#ifdef USE_ESP32
// timer frequency of 1mhz
dimmer_timer = timerBegin(1000000);
timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr);
// 80 Divider -> 1 count=1µs
dimmer_timer = timerBegin(0, 80, true);
timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr, true);
// For ESP32, we can't use dynamic interval calculation because the timerX functions
// are not callable from ISR (placed in flash storage).
// Here we just use an interrupt firing every 50 µs.
timerAlarm(dimmer_timer, 50, true, 0);
timerAlarmWrite(dimmer_timer, 50, true);
timerAlarmEnable(dimmer_timer);
#endif
}
void AcDimmer::write_state(float state) {
state = std::acos(1 - (2 * state)) / std::numbers::pi; // RMS power compensation
state = std::acos(1 - (2 * state)) / 3.14159; // RMS power compensation
auto new_value = static_cast<uint16_t>(roundf(state * 65535));
if (new_value != 0 && this->store_.value == 0)
this->store_.init_cycle = this->init_with_half_cycle_;
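The acos() in write_state is the RMS power compensation referred to in the comment: with phase-angle dimming, delivered power is not linear in the firing delay, and state' = arccos(1 - 2*state) / pi maps the requested power fraction onto a phase fraction. Combined with the leading-edge timing arithmetic from gpio_intr above (integer math, value scaled to 0..65535, min_power in per mille), a worked check:
```python
import math

def compensate(state: float) -> float:
    """write_state()'s RMS power compensation: brightness -> phase fraction."""
    return math.acos(1 - 2 * state) / math.pi

def leading_enable_time_us(value: int, cycle_time_us: int, min_power: int) -> int:
    """Delay before firing the gate, per the leading-edge branch of gpio_intr()."""
    min_us = cycle_time_us * min_power // 1000  # portion reserved by min_power
    return max(1, (65535 - value) * (cycle_time_us - min_us) // 65535)

print(compensate(0.5))  # 0.5: mid brightness stays at mid phase
value = round(compensate(0.25) * 65535)  # ~21845, i.e. one third of the cycle
print(leading_enable_time_us(value, cycle_time_us=10000, min_power=0))  # ~6666 us
```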
@@ -214,10 +213,8 @@ void AcDimmer::dump_config() {
ESP_LOGCONFIG(TAG, "AcDimmer:");
LOG_PIN(" Output Pin: ", this->gate_pin_);
LOG_PIN(" Zero-Cross Pin: ", this->zero_cross_pin_);
ESP_LOGCONFIG(TAG,
" Min Power: %.1f%%\n"
" Init with half cycle: %s",
this->store_.min_power / 10.0f, YESNO(this->init_with_half_cycle_));
ESP_LOGCONFIG(TAG, " Min Power: %.1f%%", this->store_.min_power / 10.0f);
ESP_LOGCONFIG(TAG, " Init with half cycle: %s", YESNO(this->init_with_half_cycle_));
if (method_ == DIM_METHOD_LEADING_PULSE) {
ESP_LOGCONFIG(TAG, " Method: leading pulse");
} else if (method_ == DIM_METHOD_LEADING) {


@@ -1,11 +1,10 @@
from esphome import pins
import esphome.codegen as cg
from esphome.components.esp32 import VARIANT_ESP32P4, get_esp32_variant
from esphome.components.esp32 import get_esp32_variant
from esphome.components.esp32.const import (
VARIANT_ESP32,
VARIANT_ESP32C2,
VARIANT_ESP32C3,
VARIANT_ESP32C5,
VARIANT_ESP32C6,
VARIANT_ESP32H2,
VARIANT_ESP32S2,
@@ -45,170 +44,122 @@ SAMPLING_MODES = {
"max": sampling_mode.MAX,
}
adc_unit_t = cg.global_ns.enum("adc_unit_t", is_class=True)
adc_channel_t = cg.global_ns.enum("adc_channel_t", is_class=True)
adc1_channel_t = cg.global_ns.enum("adc1_channel_t")
adc2_channel_t = cg.global_ns.enum("adc2_channel_t")
# From https://github.com/espressif/esp-idf/blob/master/components/driver/include/driver/adc_common.h
# pin to adc1 channel mapping
# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
VARIANT_ESP32: {
36: adc_channel_t.ADC_CHANNEL_0,
37: adc_channel_t.ADC_CHANNEL_1,
38: adc_channel_t.ADC_CHANNEL_2,
39: adc_channel_t.ADC_CHANNEL_3,
32: adc_channel_t.ADC_CHANNEL_4,
33: adc_channel_t.ADC_CHANNEL_5,
34: adc_channel_t.ADC_CHANNEL_6,
35: adc_channel_t.ADC_CHANNEL_7,
36: adc1_channel_t.ADC1_CHANNEL_0,
37: adc1_channel_t.ADC1_CHANNEL_1,
38: adc1_channel_t.ADC1_CHANNEL_2,
39: adc1_channel_t.ADC1_CHANNEL_3,
32: adc1_channel_t.ADC1_CHANNEL_4,
33: adc1_channel_t.ADC1_CHANNEL_5,
34: adc1_channel_t.ADC1_CHANNEL_6,
35: adc1_channel_t.ADC1_CHANNEL_7,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
VARIANT_ESP32C2: {
0: adc_channel_t.ADC_CHANNEL_0,
1: adc_channel_t.ADC_CHANNEL_1,
2: adc_channel_t.ADC_CHANNEL_2,
3: adc_channel_t.ADC_CHANNEL_3,
4: adc_channel_t.ADC_CHANNEL_4,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
VARIANT_ESP32C3: {
0: adc_channel_t.ADC_CHANNEL_0,
1: adc_channel_t.ADC_CHANNEL_1,
2: adc_channel_t.ADC_CHANNEL_2,
3: adc_channel_t.ADC_CHANNEL_3,
4: adc_channel_t.ADC_CHANNEL_4,
},
# ESP32-C5 ADC1 pin mapping - based on official ESP-IDF documentation
# https://docs.espressif.com/projects/esp-idf/en/latest/esp32c5/api-reference/peripherals/gpio.html
VARIANT_ESP32C5: {
1: adc_channel_t.ADC_CHANNEL_0,
2: adc_channel_t.ADC_CHANNEL_1,
3: adc_channel_t.ADC_CHANNEL_2,
4: adc_channel_t.ADC_CHANNEL_3,
5: adc_channel_t.ADC_CHANNEL_4,
6: adc_channel_t.ADC_CHANNEL_5,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
VARIANT_ESP32C6: {
0: adc_channel_t.ADC_CHANNEL_0,
1: adc_channel_t.ADC_CHANNEL_1,
2: adc_channel_t.ADC_CHANNEL_2,
3: adc_channel_t.ADC_CHANNEL_3,
4: adc_channel_t.ADC_CHANNEL_4,
5: adc_channel_t.ADC_CHANNEL_5,
6: adc_channel_t.ADC_CHANNEL_6,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
VARIANT_ESP32H2: {
1: adc_channel_t.ADC_CHANNEL_0,
2: adc_channel_t.ADC_CHANNEL_1,
3: adc_channel_t.ADC_CHANNEL_2,
4: adc_channel_t.ADC_CHANNEL_3,
5: adc_channel_t.ADC_CHANNEL_4,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
VARIANT_ESP32S2: {
1: adc_channel_t.ADC_CHANNEL_0,
2: adc_channel_t.ADC_CHANNEL_1,
3: adc_channel_t.ADC_CHANNEL_2,
4: adc_channel_t.ADC_CHANNEL_3,
5: adc_channel_t.ADC_CHANNEL_4,
6: adc_channel_t.ADC_CHANNEL_5,
7: adc_channel_t.ADC_CHANNEL_6,
8: adc_channel_t.ADC_CHANNEL_7,
9: adc_channel_t.ADC_CHANNEL_8,
10: adc_channel_t.ADC_CHANNEL_9,
1: adc1_channel_t.ADC1_CHANNEL_0,
2: adc1_channel_t.ADC1_CHANNEL_1,
3: adc1_channel_t.ADC1_CHANNEL_2,
4: adc1_channel_t.ADC1_CHANNEL_3,
5: adc1_channel_t.ADC1_CHANNEL_4,
6: adc1_channel_t.ADC1_CHANNEL_5,
7: adc1_channel_t.ADC1_CHANNEL_6,
8: adc1_channel_t.ADC1_CHANNEL_7,
9: adc1_channel_t.ADC1_CHANNEL_8,
10: adc1_channel_t.ADC1_CHANNEL_9,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
VARIANT_ESP32S3: {
1: adc_channel_t.ADC_CHANNEL_0,
2: adc_channel_t.ADC_CHANNEL_1,
3: adc_channel_t.ADC_CHANNEL_2,
4: adc_channel_t.ADC_CHANNEL_3,
5: adc_channel_t.ADC_CHANNEL_4,
6: adc_channel_t.ADC_CHANNEL_5,
7: adc_channel_t.ADC_CHANNEL_6,
8: adc_channel_t.ADC_CHANNEL_7,
9: adc_channel_t.ADC_CHANNEL_8,
10: adc_channel_t.ADC_CHANNEL_9,
1: adc1_channel_t.ADC1_CHANNEL_0,
2: adc1_channel_t.ADC1_CHANNEL_1,
3: adc1_channel_t.ADC1_CHANNEL_2,
4: adc1_channel_t.ADC1_CHANNEL_3,
5: adc1_channel_t.ADC1_CHANNEL_4,
6: adc1_channel_t.ADC1_CHANNEL_5,
7: adc1_channel_t.ADC1_CHANNEL_6,
8: adc1_channel_t.ADC1_CHANNEL_7,
9: adc1_channel_t.ADC1_CHANNEL_8,
10: adc1_channel_t.ADC1_CHANNEL_9,
},
VARIANT_ESP32P4: {
16: adc_channel_t.ADC_CHANNEL_0,
17: adc_channel_t.ADC_CHANNEL_1,
18: adc_channel_t.ADC_CHANNEL_2,
19: adc_channel_t.ADC_CHANNEL_3,
20: adc_channel_t.ADC_CHANNEL_4,
21: adc_channel_t.ADC_CHANNEL_5,
22: adc_channel_t.ADC_CHANNEL_6,
23: adc_channel_t.ADC_CHANNEL_7,
VARIANT_ESP32C3: {
0: adc1_channel_t.ADC1_CHANNEL_0,
1: adc1_channel_t.ADC1_CHANNEL_1,
2: adc1_channel_t.ADC1_CHANNEL_2,
3: adc1_channel_t.ADC1_CHANNEL_3,
4: adc1_channel_t.ADC1_CHANNEL_4,
},
VARIANT_ESP32C2: {
0: adc1_channel_t.ADC1_CHANNEL_0,
1: adc1_channel_t.ADC1_CHANNEL_1,
2: adc1_channel_t.ADC1_CHANNEL_2,
3: adc1_channel_t.ADC1_CHANNEL_3,
4: adc1_channel_t.ADC1_CHANNEL_4,
},
VARIANT_ESP32C6: {
0: adc1_channel_t.ADC1_CHANNEL_0,
1: adc1_channel_t.ADC1_CHANNEL_1,
2: adc1_channel_t.ADC1_CHANNEL_2,
3: adc1_channel_t.ADC1_CHANNEL_3,
4: adc1_channel_t.ADC1_CHANNEL_4,
5: adc1_channel_t.ADC1_CHANNEL_5,
6: adc1_channel_t.ADC1_CHANNEL_6,
},
VARIANT_ESP32H2: {
1: adc1_channel_t.ADC1_CHANNEL_0,
2: adc1_channel_t.ADC1_CHANNEL_1,
3: adc1_channel_t.ADC1_CHANNEL_2,
4: adc1_channel_t.ADC1_CHANNEL_3,
5: adc1_channel_t.ADC1_CHANNEL_4,
},
}
# pin to adc2 channel mapping
# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
# TODO: add other variants
VARIANT_ESP32: {
4: adc_channel_t.ADC_CHANNEL_0,
0: adc_channel_t.ADC_CHANNEL_1,
2: adc_channel_t.ADC_CHANNEL_2,
15: adc_channel_t.ADC_CHANNEL_3,
13: adc_channel_t.ADC_CHANNEL_4,
12: adc_channel_t.ADC_CHANNEL_5,
14: adc_channel_t.ADC_CHANNEL_6,
27: adc_channel_t.ADC_CHANNEL_7,
25: adc_channel_t.ADC_CHANNEL_8,
26: adc_channel_t.ADC_CHANNEL_9,
4: adc2_channel_t.ADC2_CHANNEL_0,
0: adc2_channel_t.ADC2_CHANNEL_1,
2: adc2_channel_t.ADC2_CHANNEL_2,
15: adc2_channel_t.ADC2_CHANNEL_3,
13: adc2_channel_t.ADC2_CHANNEL_4,
12: adc2_channel_t.ADC2_CHANNEL_5,
14: adc2_channel_t.ADC2_CHANNEL_6,
27: adc2_channel_t.ADC2_CHANNEL_7,
25: adc2_channel_t.ADC2_CHANNEL_8,
26: adc2_channel_t.ADC2_CHANNEL_9,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
VARIANT_ESP32C2: {
5: adc_channel_t.ADC_CHANNEL_0,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
VARIANT_ESP32C3: {
5: adc_channel_t.ADC_CHANNEL_0,
},
# ESP32-C5 has no ADC2 channels
VARIANT_ESP32C5: {}, # no ADC2
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
VARIANT_ESP32C6: {}, # no ADC2
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
VARIANT_ESP32H2: {}, # no ADC2
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
VARIANT_ESP32S2: {
11: adc_channel_t.ADC_CHANNEL_0,
12: adc_channel_t.ADC_CHANNEL_1,
13: adc_channel_t.ADC_CHANNEL_2,
14: adc_channel_t.ADC_CHANNEL_3,
15: adc_channel_t.ADC_CHANNEL_4,
16: adc_channel_t.ADC_CHANNEL_5,
17: adc_channel_t.ADC_CHANNEL_6,
18: adc_channel_t.ADC_CHANNEL_7,
19: adc_channel_t.ADC_CHANNEL_8,
20: adc_channel_t.ADC_CHANNEL_9,
11: adc2_channel_t.ADC2_CHANNEL_0,
12: adc2_channel_t.ADC2_CHANNEL_1,
13: adc2_channel_t.ADC2_CHANNEL_2,
14: adc2_channel_t.ADC2_CHANNEL_3,
15: adc2_channel_t.ADC2_CHANNEL_4,
16: adc2_channel_t.ADC2_CHANNEL_5,
17: adc2_channel_t.ADC2_CHANNEL_6,
18: adc2_channel_t.ADC2_CHANNEL_7,
19: adc2_channel_t.ADC2_CHANNEL_8,
20: adc2_channel_t.ADC2_CHANNEL_9,
},
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
VARIANT_ESP32S3: {
11: adc_channel_t.ADC_CHANNEL_0,
12: adc_channel_t.ADC_CHANNEL_1,
13: adc_channel_t.ADC_CHANNEL_2,
14: adc_channel_t.ADC_CHANNEL_3,
15: adc_channel_t.ADC_CHANNEL_4,
16: adc_channel_t.ADC_CHANNEL_5,
17: adc_channel_t.ADC_CHANNEL_6,
18: adc_channel_t.ADC_CHANNEL_7,
19: adc_channel_t.ADC_CHANNEL_8,
20: adc_channel_t.ADC_CHANNEL_9,
11: adc2_channel_t.ADC2_CHANNEL_0,
12: adc2_channel_t.ADC2_CHANNEL_1,
13: adc2_channel_t.ADC2_CHANNEL_2,
14: adc2_channel_t.ADC2_CHANNEL_3,
15: adc2_channel_t.ADC2_CHANNEL_4,
16: adc2_channel_t.ADC2_CHANNEL_5,
17: adc2_channel_t.ADC2_CHANNEL_6,
18: adc2_channel_t.ADC2_CHANNEL_7,
19: adc2_channel_t.ADC2_CHANNEL_8,
20: adc2_channel_t.ADC2_CHANNEL_9,
},
VARIANT_ESP32P4: {
49: adc_channel_t.ADC_CHANNEL_0,
50: adc_channel_t.ADC_CHANNEL_1,
51: adc_channel_t.ADC_CHANNEL_2,
52: adc_channel_t.ADC_CHANNEL_3,
53: adc_channel_t.ADC_CHANNEL_4,
54: adc_channel_t.ADC_CHANNEL_5,
VARIANT_ESP32C3: {
5: adc2_channel_t.ADC2_CHANNEL_0,
},
VARIANT_ESP32C2: {},
VARIANT_ESP32C6: {},
VARIANT_ESP32H2: {},
}
@@ -260,9 +211,4 @@ def validate_adc_pin(value):
{CONF_ANALOG: True, CONF_INPUT: True}, internal=True
)(value)
if CORE.is_nrf52:
return pins.gpio_pin_schema(
{CONF_ANALOG: True, CONF_INPUT: True}, internal=True
)(value)
raise NotImplementedError
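With the tables above, validating an ADC pin is a two-level lookup: the detected variant selects a table, and the GPIO number must be a key in it. An illustrative helper (simplified; the real validator also consults the ADC2 table and raises a configuration error rather than returning None):
```python
def pin_to_adc1_channel(variant: str, pin: int):
    """Return the ADC1 channel enum for a GPIO, or None if not ADC1-capable."""
    return ESP32_VARIANT_ADC1_PIN_TO_CHANNEL.get(variant, {}).get(pin)

# On the original ESP32, GPIO36 maps to ADC1 channel 0; GPIO5 is not on ADC1.
print(pin_to_adc1_channel(VARIANT_ESP32, 36))  # channel 0 (enum spelling per branch)
print(pin_to_adc1_channel(VARIANT_ESP32, 5))   # None
```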


@@ -3,26 +3,20 @@
#include "esphome/components/sensor/sensor.h"
#include "esphome/components/voltage_sampler/voltage_sampler.h"
#include "esphome/core/component.h"
#include "esphome/core/defines.h"
#include "esphome/core/hal.h"
#ifdef USE_ESP32
#include "esp_adc/adc_cali.h"
#include "esp_adc/adc_cali_scheme.h"
#include "esp_adc/adc_oneshot.h"
#include "hal/adc_types.h" // This defines ADC_CHANNEL_MAX
#endif // USE_ESP32
#ifdef USE_ZEPHYR
#include <zephyr/drivers/adc.h>
#endif
#include <esp_adc_cal.h>
#include "driver/adc.h"
#endif // USE_ESP32
namespace esphome {
namespace adc {
#ifdef USE_ESP32
// clang-format off
#if (ESP_IDF_VERSION_MAJOR == 5 && \
#if (ESP_IDF_VERSION_MAJOR == 4 && ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(4, 4, 7)) || \
(ESP_IDF_VERSION_MAJOR == 5 && \
((ESP_IDF_VERSION_MINOR == 0 && ESP_IDF_VERSION_PATCH >= 5) || \
(ESP_IDF_VERSION_MINOR == 1 && ESP_IDF_VERSION_PATCH >= 3) || \
(ESP_IDF_VERSION_MINOR >= 2)) \
@@ -34,136 +28,79 @@ static const adc_atten_t ADC_ATTEN_DB_12_COMPAT = ADC_ATTEN_DB_11;
#endif
#endif // USE_ESP32
enum class SamplingMode : uint8_t {
AVG = 0,
MIN = 1,
MAX = 2,
};
enum class SamplingMode : uint8_t { AVG = 0, MIN = 1, MAX = 2 };
const LogString *sampling_mode_to_str(SamplingMode mode);
template<typename T> class Aggregator {
class Aggregator {
public:
void add_sample(uint32_t value);
uint32_t aggregate();
Aggregator(SamplingMode mode);
void add_sample(T value);
T aggregate();
protected:
T aggr_{0};
uint8_t samples_{0};
SamplingMode mode_{SamplingMode::AVG};
uint32_t aggr_{0};
uint32_t samples_{0};
};
class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage_sampler::VoltageSampler {
public:
/// Update the sensor's state by reading the current ADC value.
/// This method is called periodically based on the update interval.
void update() override;
/// Set up the ADC sensor by initializing hardware and calibration parameters.
/// This method is called once during device initialization.
void setup() override;
/// Output the configuration details of the ADC sensor for debugging purposes.
/// This method is called during the ESPHome setup process to log the configuration.
void dump_config() override;
/// Return the setup priority for this component.
/// Components with higher priority are initialized earlier during setup.
/// @return A float representing the setup priority.
float get_setup_priority() const override;
#ifdef USE_ZEPHYR
/// Set the ADC channel to be used by the ADC sensor.
/// @param channel Pointer to an adc_dt_spec structure representing the ADC channel.
void set_adc_channel(const adc_dt_spec *channel) { this->channel_ = channel; }
#endif
/// Set the GPIO pin to be used by the ADC sensor.
/// @param pin Pointer to an InternalGPIOPin representing the ADC input pin.
void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
/// Enable or disable the output of raw ADC values (unprocessed data).
/// @param output_raw Boolean indicating whether to output raw ADC values (true) or processed values (false).
void set_output_raw(bool output_raw) { this->output_raw_ = output_raw; }
/// Set the number of samples to be taken for ADC readings to improve accuracy.
/// A higher sample count reduces noise but increases the reading time.
/// @param sample_count The number of samples (e.g., 1, 4, 8).
void set_sample_count(uint8_t sample_count);
/// Set the sampling mode for how multiple ADC samples are combined into a single measurement.
///
/// When multiple samples are taken (controlled by set_sample_count), they can be combined
/// in one of three ways:
/// - SamplingMode::AVG: Compute the average (default)
/// - SamplingMode::MIN: Use the lowest sample value
/// - SamplingMode::MAX: Use the highest sample value
/// @param sampling_mode The desired sampling mode to use for aggregating ADC samples.
void set_sampling_mode(SamplingMode sampling_mode);
/// Perform a single ADC sampling operation and return the measured value.
/// This function handles raw readings, calibration, and averaging as needed.
/// @return The sampled value as a float.
float sample() override;
#ifdef USE_ESP32
/// Set the ADC attenuation level to adjust the input voltage range.
/// This determines how the ADC interprets input voltages, allowing for greater precision
/// or the ability to measure higher voltages depending on the chosen attenuation level.
/// @param attenuation The desired ADC attenuation level (e.g., ADC_ATTEN_DB_0, ADC_ATTEN_DB_11).
/// Set the attenuation for this pin. Only available on the ESP32.
void set_attenuation(adc_atten_t attenuation) { this->attenuation_ = attenuation; }
/// Configure the ADC to use a specific channel on a specific ADC unit.
/// This sets the channel for single-shot or continuous ADC measurements.
/// @param unit The ADC unit to use (ADC_UNIT_1 or ADC_UNIT_2).
/// @param channel The ADC channel to configure, such as ADC_CHANNEL_0, ADC_CHANNEL_3, etc.
void set_channel(adc_unit_t unit, adc_channel_t channel) {
this->adc_unit_ = unit;
this->channel_ = channel;
void set_channel1(adc1_channel_t channel) {
this->channel1_ = channel;
this->channel2_ = ADC2_CHANNEL_MAX;
}
void set_channel2(adc2_channel_t channel) {
this->channel2_ = channel;
this->channel1_ = ADC1_CHANNEL_MAX;
}
/// Set whether autoranging should be enabled for the ADC.
/// Autoranging automatically adjusts the attenuation level to handle a wide range of input voltages.
/// @param autorange Boolean indicating whether to enable autoranging.
void set_autorange(bool autorange) { this->autorange_ = autorange; }
#endif // USE_ESP32
/// Update ADC values
void update() override;
/// Setup ADC
void setup() override;
void dump_config() override;
/// `HARDWARE_LATE` setup priority
float get_setup_priority() const override;
void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
void set_output_raw(bool output_raw) { this->output_raw_ = output_raw; }
void set_sample_count(uint8_t sample_count);
void set_sampling_mode(SamplingMode sampling_mode);
float sample() override;
#ifdef USE_ESP8266
std::string unique_id() override;
#endif // USE_ESP8266
#ifdef USE_RP2040
void set_is_temperature() { this->is_temperature_ = true; }
#endif // USE_RP2040
protected:
uint8_t sample_count_{1};
bool output_raw_{false};
InternalGPIOPin *pin_;
bool output_raw_{false};
uint8_t sample_count_{1};
SamplingMode sampling_mode_{SamplingMode::AVG};
#ifdef USE_ESP32
float sample_autorange_();
float sample_fixed_attenuation_();
bool autorange_{false};
adc_oneshot_unit_handle_t adc_handle_{nullptr};
adc_cali_handle_t calibration_handle_{nullptr};
adc_atten_t attenuation_{ADC_ATTEN_DB_0};
adc_channel_t channel_{};
adc_unit_t adc_unit_{};
struct SetupFlags {
uint8_t init_complete : 1;
uint8_t config_complete : 1;
uint8_t handle_init_complete : 1;
uint8_t calibration_complete : 1;
uint8_t reserved : 4;
} setup_flags_{};
static adc_oneshot_unit_handle_t shared_adc_handles[2];
#endif // USE_ESP32
#ifdef USE_RP2040
bool is_temperature_{false};
#endif // USE_RP2040
#ifdef USE_ZEPHYR
const struct adc_dt_spec *channel_ = nullptr;
#endif
#ifdef USE_ESP32
adc_atten_t attenuation_{ADC_ATTEN_DB_0};
adc1_channel_t channel1_{ADC1_CHANNEL_MAX};
adc2_channel_t channel2_{ADC2_CHANNEL_MAX};
bool autorange_{false};
#if ESP_IDF_VERSION_MAJOR >= 5
esp_adc_cal_characteristics_t cal_characteristics_[SOC_ADC_ATTEN_NUM] = {};
#else
esp_adc_cal_characteristics_t cal_characteristics_[ADC_ATTEN_MAX] = {};
#endif // ESP_IDF_VERSION_MAJOR
#endif // USE_ESP32
};
} // namespace adc


@@ -18,15 +18,15 @@ const LogString *sampling_mode_to_str(SamplingMode mode) {
return LOG_STR("unknown");
}
template<typename T> Aggregator<T>::Aggregator(SamplingMode mode) {
Aggregator::Aggregator(SamplingMode mode) {
this->mode_ = mode;
// set to max uint if mode is "min"
if (mode == SamplingMode::MIN) {
this->aggr_ = std::numeric_limits<T>::max();
this->aggr_ = UINT32_MAX;
}
}
template<typename T> void Aggregator<T>::add_sample(T value) {
void Aggregator::add_sample(uint32_t value) {
this->samples_ += 1;
switch (this->mode_) {
@@ -47,7 +47,7 @@ template<typename T> void Aggregator<T>::add_sample(T value) {
}
}
template<typename T> T Aggregator<T>::aggregate() {
uint32_t Aggregator::aggregate() {
if (this->mode_ == SamplingMode::AVG) {
if (this->samples_ == 0) {
return this->aggr_;
@@ -59,15 +59,9 @@ template<typename T> T Aggregator<T>::aggregate() {
return this->aggr_;
}
#ifdef USE_ZEPHYR
template class Aggregator<int32_t>;
#else
template class Aggregator<uint32_t>;
#endif
void ADCSensor::update() {
float value_v = this->sample();
ESP_LOGV(TAG, "'%s': Voltage=%.4fV", this->get_name().c_str(), value_v);
ESP_LOGV(TAG, "'%s': Got voltage=%.4fV", this->get_name().c_str(), value_v);
this->publish_state(value_v);
}
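Both the template and non-template Aggregator implement the same three-mode reduction: MIN starts from the type's maximum so the first sample always replaces it, MAX grows from zero, and AVG divides the running sum on aggregate(). The logic transcribed to Python for clarity:
```python
from enum import Enum

class SamplingMode(Enum):
    AVG = 0
    MIN = 1
    MAX = 2

class Aggregator:
    def __init__(self, mode: SamplingMode, max_value: int = 2**32 - 1):
        self.mode = mode
        self.samples = 0
        # MIN starts at the maximum representable value so any sample wins
        self.aggr = max_value if mode is SamplingMode.MIN else 0

    def add_sample(self, value: int) -> None:
        self.samples += 1
        if self.mode is SamplingMode.AVG:
            self.aggr += value
        elif self.mode is SamplingMode.MIN:
            self.aggr = min(self.aggr, value)
        else:
            self.aggr = max(self.aggr, value)

    def aggregate(self) -> int:
        if self.mode is SamplingMode.AVG and self.samples:
            return self.aggr // self.samples  # integer average, as in the C++
        return self.aggr

agg = Aggregator(SamplingMode.AVG)
for raw in (100, 120, 110):
    agg.add_sample(raw)
print(agg.aggregate())  # 110
```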


@@ -8,322 +8,135 @@ namespace adc {
static const char *const TAG = "adc.esp32";
adc_oneshot_unit_handle_t ADCSensor::shared_adc_handles[2] = {nullptr, nullptr};
static const adc_bits_width_t ADC_WIDTH_MAX_SOC_BITS = static_cast<adc_bits_width_t>(ADC_WIDTH_MAX - 1);
const LogString *attenuation_to_str(adc_atten_t attenuation) {
switch (attenuation) {
case ADC_ATTEN_DB_0:
return LOG_STR("0 dB");
case ADC_ATTEN_DB_2_5:
return LOG_STR("2.5 dB");
case ADC_ATTEN_DB_6:
return LOG_STR("6 dB");
case ADC_ATTEN_DB_12_COMPAT:
return LOG_STR("12 dB");
default:
return LOG_STR("Unknown Attenuation");
}
}
#ifndef SOC_ADC_RTC_MAX_BITWIDTH
#if USE_ESP32_VARIANT_ESP32S2
static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 13;
#else
static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 12;
#endif // USE_ESP32_VARIANT_ESP32S2
#endif // SOC_ADC_RTC_MAX_BITWIDTH
const LogString *adc_unit_to_str(adc_unit_t unit) {
switch (unit) {
case ADC_UNIT_1:
return LOG_STR("ADC1");
case ADC_UNIT_2:
return LOG_STR("ADC2");
default:
return LOG_STR("Unknown ADC Unit");
}
}
static const int ADC_MAX = (1 << SOC_ADC_RTC_MAX_BITWIDTH) - 1;
static const int ADC_HALF = (1 << SOC_ADC_RTC_MAX_BITWIDTH) >> 1;
void ADCSensor::setup() {
// Check if another sensor already initialized this ADC unit
if (ADCSensor::shared_adc_handles[this->adc_unit_] == nullptr) {
adc_oneshot_unit_init_cfg_t init_config = {}; // Zero initialize
init_config.unit_id = this->adc_unit_;
init_config.ulp_mode = ADC_ULP_MODE_DISABLE;
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || USE_ESP32_VARIANT_ESP32H2
init_config.clk_src = ADC_DIGI_CLK_SRC_DEFAULT;
#endif // USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 ||
// USE_ESP32_VARIANT_ESP32H2
esp_err_t err = adc_oneshot_new_unit(&init_config, &ADCSensor::shared_adc_handles[this->adc_unit_]);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Error initializing %s: %d", LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)), err);
this->mark_failed();
return;
ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
if (this->channel1_ != ADC1_CHANNEL_MAX) {
adc1_config_width(ADC_WIDTH_MAX_SOC_BITS);
if (!this->autorange_) {
adc1_config_channel_atten(this->channel1_, this->attenuation_);
}
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
if (!this->autorange_) {
adc2_config_channel_atten(this->channel2_, this->attenuation_);
}
}
this->adc_handle_ = ADCSensor::shared_adc_handles[this->adc_unit_];
this->setup_flags_.handle_init_complete = true;
adc_oneshot_chan_cfg_t config = {
.atten = this->attenuation_,
.bitwidth = ADC_BITWIDTH_DEFAULT,
};
esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Error configuring channel: %d", err);
this->mark_failed();
return;
}
this->setup_flags_.config_complete = true;
// Initialize ADC calibration
if (this->calibration_handle_ == nullptr) {
adc_cali_handle_t handle = nullptr;
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
// RISC-V variants and S3 use curve fitting calibration
adc_cali_curve_fitting_config_t cali_config = {}; // Zero initialize first
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
cali_config.chan = this->channel_;
#endif // ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
cali_config.unit_id = this->adc_unit_;
cali_config.atten = this->attenuation_;
cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
if (err == ESP_OK) {
this->calibration_handle_ = handle;
this->setup_flags_.calibration_complete = true;
ESP_LOGV(TAG, "Using curve fitting calibration");
} else {
ESP_LOGW(TAG, "Curve fitting calibration failed with error %d, will use uncalibrated readings", err);
this->setup_flags_.calibration_complete = false;
for (int32_t i = 0; i <= ADC_ATTEN_DB_12_COMPAT; i++) {
auto adc_unit = this->channel1_ != ADC1_CHANNEL_MAX ? ADC_UNIT_1 : ADC_UNIT_2;
auto cal_value = esp_adc_cal_characterize(adc_unit, (adc_atten_t) i, ADC_WIDTH_MAX_SOC_BITS,
1100, // default vref
&this->cal_characteristics_[i]);
switch (cal_value) {
case ESP_ADC_CAL_VAL_EFUSE_VREF:
ESP_LOGV(TAG, "Using eFuse Vref for calibration");
break;
case ESP_ADC_CAL_VAL_EFUSE_TP:
ESP_LOGV(TAG, "Using two-point eFuse Vref for calibration");
break;
case ESP_ADC_CAL_VAL_DEFAULT_VREF:
default:
break;
}
#else // Other ESP32 variants use line fitting calibration
adc_cali_line_fitting_config_t cali_config = {
.unit_id = this->adc_unit_,
.atten = this->attenuation_,
.bitwidth = ADC_BITWIDTH_DEFAULT,
#if !defined(USE_ESP32_VARIANT_ESP32S2)
.default_vref = 1100, // Default reference voltage in mV
#endif // !defined(USE_ESP32_VARIANT_ESP32S2)
};
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
if (err == ESP_OK) {
this->calibration_handle_ = handle;
this->setup_flags_.calibration_complete = true;
ESP_LOGV(TAG, "Using line fitting calibration");
} else {
ESP_LOGW(TAG, "Line fitting calibration failed with error %d, will use uncalibrated readings", err);
this->setup_flags_.calibration_complete = false;
}
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
}
this->setup_flags_.init_complete = true;
}
void ADCSensor::dump_config() {
LOG_SENSOR("", "ADC Sensor", this);
LOG_PIN(" Pin: ", this->pin_);
ESP_LOGCONFIG(TAG,
" Channel: %d\n"
" Unit: %s\n"
" Attenuation: %s\n"
" Samples: %i\n"
" Sampling mode: %s",
this->channel_, LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)),
this->autorange_ ? "Auto" : LOG_STR_ARG(attenuation_to_str(this->attenuation_)), this->sample_count_,
LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
ESP_LOGCONFIG(
TAG,
" Setup Status:\n"
" Handle Init: %s\n"
" Config: %s\n"
" Calibration: %s\n"
" Overall Init: %s",
this->setup_flags_.handle_init_complete ? "OK" : "FAILED", this->setup_flags_.config_complete ? "OK" : "FAILED",
this->setup_flags_.calibration_complete ? "OK" : "FAILED", this->setup_flags_.init_complete ? "OK" : "FAILED");
if (this->autorange_) {
ESP_LOGCONFIG(TAG, " Attenuation: auto");
} else {
switch (this->attenuation_) {
case ADC_ATTEN_DB_0:
ESP_LOGCONFIG(TAG, " Attenuation: 0db");
break;
case ADC_ATTEN_DB_2_5:
ESP_LOGCONFIG(TAG, " Attenuation: 2.5db");
break;
case ADC_ATTEN_DB_6:
ESP_LOGCONFIG(TAG, " Attenuation: 6db");
break;
case ADC_ATTEN_DB_12_COMPAT:
ESP_LOGCONFIG(TAG, " Attenuation: 12db");
break;
default: // This is to satisfy the unused ADC_ATTEN_MAX
break;
}
}
ESP_LOGCONFIG(TAG, " Samples: %i", this->sample_count_);
ESP_LOGCONFIG(TAG, " Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
if (this->autorange_) {
return this->sample_autorange_();
} else {
return this->sample_fixed_attenuation_();
}
}
if (!this->autorange_) {
auto aggr = Aggregator(this->sampling_mode_);
float ADCSensor::sample_fixed_attenuation_() {
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
int raw = -1;
if (this->channel1_ != ADC1_CHANNEL_MAX) {
raw = adc1_get_raw(this->channel1_);
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw);
}
if (raw == -1) {
return NAN;
}
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
int raw;
esp_err_t err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
if (err != ESP_OK) {
ESP_LOGW(TAG, "ADC read failed with error %d", err);
continue;
aggr.add_sample(raw);
}
if (raw == -1) {
ESP_LOGW(TAG, "Invalid ADC reading");
continue;
if (this->output_raw_) {
return aggr.aggregate();
}
aggr.add_sample(raw);
uint32_t mv =
esp_adc_cal_raw_to_voltage(aggr.aggregate(), &this->cal_characteristics_[(int32_t) this->attenuation_]);
return mv / 1000.0f;
}
uint32_t final_value = aggr.aggregate();
int raw12 = ADC_MAX, raw6 = ADC_MAX, raw2 = ADC_MAX, raw0 = ADC_MAX;
if (this->output_raw_) {
return final_value;
}
if (this->calibration_handle_ != nullptr) {
int voltage_mv;
esp_err_t err = adc_cali_raw_to_voltage(this->calibration_handle_, final_value, &voltage_mv);
if (err == ESP_OK) {
return voltage_mv / 1000.0f;
} else {
ESP_LOGW(TAG, "ADC calibration conversion failed with error %d, disabling calibration", err);
if (this->calibration_handle_ != nullptr) {
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
#else // Other ESP32 variants use line fitting calibration
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
this->calibration_handle_ = nullptr;
if (this->channel1_ != ADC1_CHANNEL_MAX) {
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_12_COMPAT);
raw12 = adc1_get_raw(this->channel1_);
if (raw12 < ADC_MAX) {
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_6);
raw6 = adc1_get_raw(this->channel1_);
if (raw6 < ADC_MAX) {
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_2_5);
raw2 = adc1_get_raw(this->channel1_);
if (raw2 < ADC_MAX) {
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_0);
raw0 = adc1_get_raw(this->channel1_);
}
}
}
}
return final_value * 3.3f / 4095.0f;
}
float ADCSensor::sample_autorange_() {
// Auto-range mode
auto read_atten = [this](adc_atten_t atten) -> std::pair<int, float> {
// First reconfigure the attenuation for this reading
adc_oneshot_chan_cfg_t config = {
.atten = atten,
.bitwidth = ADC_BITWIDTH_DEFAULT,
};
esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
if (err != ESP_OK) {
ESP_LOGW(TAG, "Error configuring ADC channel for autorange: %d", err);
return {-1, 0.0f};
}
// Need to recalibrate for the new attenuation
if (this->calibration_handle_ != nullptr) {
// Delete old calibration handle
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
#else
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
#endif
this->calibration_handle_ = nullptr;
}
// Create new calibration handle for this attenuation
adc_cali_handle_t handle = nullptr;
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
adc_cali_curve_fitting_config_t cali_config = {};
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
cali_config.chan = this->channel_;
#endif
cali_config.unit_id = this->adc_unit_;
cali_config.atten = atten;
cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
ESP_LOGVV(TAG, "Autorange atten=%d: Calibration handle creation %s (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", err);
#else
adc_cali_line_fitting_config_t cali_config = {
.unit_id = this->adc_unit_,
.atten = atten,
.bitwidth = ADC_BITWIDTH_DEFAULT,
#if !defined(USE_ESP32_VARIANT_ESP32S2)
.default_vref = 1100,
#endif
};
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
ESP_LOGVV(TAG, "Autorange atten=%d: Calibration handle creation %s (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", err);
#endif
int raw;
err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
ESP_LOGVV(TAG, "Autorange atten=%d: Raw ADC read %s, value=%d (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", raw, err);
if (err != ESP_OK) {
ESP_LOGW(TAG, "ADC read failed in autorange with error %d", err);
if (handle != nullptr) {
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
adc_cali_delete_scheme_curve_fitting(handle);
#else
adc_cali_delete_scheme_line_fitting(handle);
#endif
}
return {-1, 0.0f};
}
float voltage = 0.0f;
if (handle != nullptr) {
int voltage_mv;
err = adc_cali_raw_to_voltage(handle, raw, &voltage_mv);
if (err == ESP_OK) {
voltage = voltage_mv / 1000.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: CALIBRATED - raw=%d -> %dmV -> %.6fV", atten, raw, voltage_mv, voltage);
} else {
voltage = raw * 3.3f / 4095.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: UNCALIBRATED FALLBACK - raw=%d -> %.6fV (3.3V ref)", atten, raw, voltage);
}
// Clean up calibration handle
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
adc_cali_delete_scheme_curve_fitting(handle);
#else
adc_cali_delete_scheme_line_fitting(handle);
#endif
} else {
voltage = raw * 3.3f / 4095.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: NO CALIBRATION - raw=%d -> %.6fV (3.3V ref)", atten, raw, voltage);
}
return {raw, voltage};
};
auto [raw12, mv12] = read_atten(ADC_ATTEN_DB_12);
if (raw12 == -1) {
ESP_LOGE(TAG, "Failed to read ADC in autorange mode");
return NAN;
}
int raw6 = 4095, raw2 = 4095, raw0 = 4095;
float mv6 = 0, mv2 = 0, mv0 = 0;
if (raw12 < 4095) {
auto [raw6_val, mv6_val] = read_atten(ADC_ATTEN_DB_6);
raw6 = raw6_val;
mv6 = mv6_val;
if (raw6 < 4095 && raw6 != -1) {
auto [raw2_val, mv2_val] = read_atten(ADC_ATTEN_DB_2_5);
raw2 = raw2_val;
mv2 = mv2_val;
if (raw2 < 4095 && raw2 != -1) {
auto [raw0_val, mv0_val] = read_atten(ADC_ATTEN_DB_0);
raw0 = raw0_val;
mv0 = mv0_val;
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_12_COMPAT);
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw12);
if (raw12 < ADC_MAX) {
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_6);
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw6);
if (raw6 < ADC_MAX) {
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_2_5);
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw2);
if (raw2 < ADC_MAX) {
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_0);
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw0);
}
}
}
}
@@ -332,33 +145,19 @@ float ADCSensor::sample_autorange_() {
return NAN;
}
const int adc_half = 2048;
const uint32_t c12 = std::min(raw12, adc_half);
uint32_t mv12 = esp_adc_cal_raw_to_voltage(raw12, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_12_COMPAT]);
uint32_t mv6 = esp_adc_cal_raw_to_voltage(raw6, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_6]);
uint32_t mv2 = esp_adc_cal_raw_to_voltage(raw2, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_2_5]);
uint32_t mv0 = esp_adc_cal_raw_to_voltage(raw0, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_0]);
const int32_t c6_signed = adc_half - std::abs(raw6 - adc_half);
const uint32_t c6 = (c6_signed > 0) ? c6_signed : 0; // Clamp to prevent underflow
uint32_t c12 = std::min(raw12, ADC_HALF);
uint32_t c6 = ADC_HALF - std::abs(raw6 - ADC_HALF);
uint32_t c2 = ADC_HALF - std::abs(raw2 - ADC_HALF);
uint32_t c0 = std::min(ADC_MAX - raw0, ADC_HALF);
uint32_t csum = c12 + c6 + c2 + c0;
const int32_t c2_signed = adc_half - std::abs(raw2 - adc_half);
const uint32_t c2 = (c2_signed > 0) ? c2_signed : 0; // Clamp to prevent underflow
const uint32_t c0 = std::min(4095 - raw0, adc_half);
const uint32_t csum = c12 + c6 + c2 + c0;
ESP_LOGVV(TAG, "Autorange summary:");
ESP_LOGVV(TAG, " Raw readings: 12db=%d, 6db=%d, 2.5db=%d, 0db=%d", raw12, raw6, raw2, raw0);
ESP_LOGVV(TAG, " Voltages: 12db=%.6f, 6db=%.6f, 2.5db=%.6f, 0db=%.6f", mv12, mv6, mv2, mv0);
ESP_LOGVV(TAG, " Coefficients: c12=%u, c6=%u, c2=%u, c0=%u, sum=%u", c12, c6, c2, c0, csum);
if (csum == 0) {
ESP_LOGE(TAG, "Invalid weight sum in autorange calculation");
return NAN;
}
const float final_result = (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum;
ESP_LOGV(TAG, "Autorange final: (%.6f*%u + %.6f*%u + %.6f*%u + %.6f*%u)/%u = %.6fV", mv12, c12, mv6, c6, mv2, c2, mv0,
c0, csum, final_result);
return final_result;
uint32_t mv_scaled = (mv12 * c12) + (mv6 * c6) + (mv2 * c2) + (mv0 * c0);
return mv_scaled / (float) (csum * 1000U);
}
} // namespace adc
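Both autorange implementations in this file reduce to the same weighted average: each attenuation's reading is weighted by how far its raw code sits from the rails, so saturated ranges contribute nothing. A sketch of just that arithmetic (plain C++; input voltages are assumed to be already calibrated, and the example values are illustrative):

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Blend readings taken at 12 dB / 6 dB / 2.5 dB / 0 dB attenuation.
// A raw code near mid-scale earns full weight; one pinned at 0 or at
// full scale earns zero, which drops saturated ranges out of the mix.
float blend_autorange(int raw12, float v12, int raw6, float v6,
                      int raw2, float v2, int raw0, float v0) {
  const int adc_max = 4095;  // 12-bit full scale
  const int half = 2048;
  uint32_t c12 = std::min(raw12, half);                     // only high saturation matters
  uint32_t c6 = std::max(half - std::abs(raw6 - half), 0);  // clamp to avoid underflow
  uint32_t c2 = std::max(half - std::abs(raw2 - half), 0);
  uint32_t c0 = std::min(adc_max - raw0, half);             // only low saturation matters
  uint32_t csum = c12 + c6 + c2 + c0;
  if (csum == 0)
    return NAN;  // every range saturated: no usable reading
  return (v12 * c12 + v6 * c6 + v2 * c2 + v0 * c0) / csum;
}

int main() {
  // Mid-range signal: 0 dB pinned at full scale, 12 dB well below mid-scale.
  std::printf("%.3f V\n", blend_autorange(900, 0.93f, 1750, 0.94f, 3300, 0.95f, 4095, 1.10f));
}

The clamping on c6 and c2 matters because `half - abs(raw - half)` goes negative once a reading saturates; the plain unsigned subtraction on the other side of the hunk would convert that negative value to a huge weight instead.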


@@ -17,6 +17,7 @@ namespace adc {
static const char *const TAG = "adc.esp8266";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
#ifndef USE_ADC_SENSOR_VCC
this->pin_->setup();
#endif
@@ -29,15 +30,13 @@ void ADCSensor::dump_config() {
#else
LOG_PIN(" Pin: ", this->pin_);
#endif // USE_ADC_SENSOR_VCC
ESP_LOGCONFIG(TAG,
" Samples: %i\n"
" Sampling mode: %s",
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
ESP_LOGCONFIG(TAG, " Samples: %i", this->sample_count_);
ESP_LOGCONFIG(TAG, " Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
uint32_t raw = 0;
@@ -55,6 +54,8 @@ float ADCSensor::sample() {
return aggr.aggregate() / 1024.0f;
}
std::string ADCSensor::unique_id() { return get_mac_address() + "-adc"; }
} // namespace adc
} // namespace esphome
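The divisor here reflects the ESP8266's 10-bit ADC; with the TOUT pin's nominal 1.0 V full scale (an assumption from the hardware, not stated in this hunk) the raw count maps directly to volts:

#include <cstdint>

// ESP8266 ADC: 10-bit raw code 0..1023 -> approximately 0.0..1.0 V on TOUT.
inline float esp8266_raw_to_volts(uint32_t raw) { return raw / 1024.0f; }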


@@ -9,6 +9,7 @@ namespace adc {
static const char *const TAG = "adc.libretiny";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
#ifndef USE_ADC_SENSOR_VCC
this->pin_->setup();
#endif // !USE_ADC_SENSOR_VCC
@@ -21,16 +22,14 @@ void ADCSensor::dump_config() {
#else // USE_ADC_SENSOR_VCC
LOG_PIN(" Pin: ", this->pin_);
#endif // USE_ADC_SENSOR_VCC
ESP_LOGCONFIG(TAG,
" Samples: %i\n"
" Sampling mode: %s",
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
ESP_LOGCONFIG(TAG, " Samples: %i", this->sample_count_);
ESP_LOGCONFIG(TAG, " Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
uint32_t raw = 0;
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
if (this->output_raw_) {
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {


@@ -14,6 +14,7 @@ namespace adc {
static const char *const TAG = "adc.rp2040";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
static bool initialized = false;
if (!initialized) {
adc_init();
@@ -32,16 +33,14 @@ void ADCSensor::dump_config() {
LOG_PIN(" Pin: ", this->pin_);
#endif // USE_ADC_SENSOR_VCC
}
ESP_LOGCONFIG(TAG,
" Samples: %i\n"
" Sampling mode: %s",
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
ESP_LOGCONFIG(TAG, " Samples: %i", this->sample_count_);
ESP_LOGCONFIG(TAG, " Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
uint32_t raw = 0;
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
if (this->is_temperature_) {
adc_set_temp_sensor_enabled(true);
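For context, `adc_set_temp_sensor_enabled(true)` routes the RP2040's internal temperature sensor to the ADC; converting its voltage to degrees uses the datasheet formula (an assumption here, since the conversion itself is outside this hunk):

#include <cstdint>

// RP2040 datasheet: 27 degC at 0.706 V with a slope of -1.721 mV/degC,
// assuming a 12-bit ADC and a 3.3 V reference.
inline float rp2040_temp_c(uint16_t raw) {
  float volts = raw * 3.3f / 4096.0f;
  return 27.0f - (volts - 0.706f) / 0.001721f;
}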


@@ -1,207 +0,0 @@
#include "adc_sensor.h"
#ifdef USE_ZEPHYR
#include "esphome/core/log.h"
#include "hal/nrf_saadc.h"
namespace esphome {
namespace adc {
static const char *const TAG = "adc.zephyr";
void ADCSensor::setup() {
if (!adc_is_ready_dt(this->channel_)) {
ESP_LOGE(TAG, "ADC controller device %s not ready", this->channel_->dev->name);
return;
}
auto err = adc_channel_setup_dt(this->channel_);
if (err < 0) {
ESP_LOGE(TAG, "Could not setup channel %s (%d)", this->channel_->dev->name, err);
return;
}
}
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
static const LogString *gain_to_str(enum adc_gain gain) {
switch (gain) {
case ADC_GAIN_1_6:
return LOG_STR("1/6");
case ADC_GAIN_1_5:
return LOG_STR("1/5");
case ADC_GAIN_1_4:
return LOG_STR("1/4");
case ADC_GAIN_1_3:
return LOG_STR("1/3");
case ADC_GAIN_2_5:
return LOG_STR("2/5");
case ADC_GAIN_1_2:
return LOG_STR("1/2");
case ADC_GAIN_2_3:
return LOG_STR("2/3");
case ADC_GAIN_4_5:
return LOG_STR("4/5");
case ADC_GAIN_1:
return LOG_STR("1");
case ADC_GAIN_2:
return LOG_STR("2");
case ADC_GAIN_3:
return LOG_STR("3");
case ADC_GAIN_4:
return LOG_STR("4");
case ADC_GAIN_6:
return LOG_STR("6");
case ADC_GAIN_8:
return LOG_STR("8");
case ADC_GAIN_12:
return LOG_STR("12");
case ADC_GAIN_16:
return LOG_STR("16");
case ADC_GAIN_24:
return LOG_STR("24");
case ADC_GAIN_32:
return LOG_STR("32");
case ADC_GAIN_64:
return LOG_STR("64");
case ADC_GAIN_128:
return LOG_STR("128");
}
return LOG_STR("undefined gain");
}
static const LogString *reference_to_str(enum adc_reference reference) {
switch (reference) {
case ADC_REF_VDD_1:
return LOG_STR("VDD");
case ADC_REF_VDD_1_2:
return LOG_STR("VDD/2");
case ADC_REF_VDD_1_3:
return LOG_STR("VDD/3");
case ADC_REF_VDD_1_4:
return LOG_STR("VDD/4");
case ADC_REF_INTERNAL:
return LOG_STR("INTERNAL");
case ADC_REF_EXTERNAL0:
return LOG_STR("External, input 0");
case ADC_REF_EXTERNAL1:
return LOG_STR("External, input 1");
}
return LOG_STR("undefined reference");
}
static const LogString *input_to_str(uint8_t input) {
switch (input) {
case NRF_SAADC_INPUT_AIN0:
return LOG_STR("AIN0");
case NRF_SAADC_INPUT_AIN1:
return LOG_STR("AIN1");
case NRF_SAADC_INPUT_AIN2:
return LOG_STR("AIN2");
case NRF_SAADC_INPUT_AIN3:
return LOG_STR("AIN3");
case NRF_SAADC_INPUT_AIN4:
return LOG_STR("AIN4");
case NRF_SAADC_INPUT_AIN5:
return LOG_STR("AIN5");
case NRF_SAADC_INPUT_AIN6:
return LOG_STR("AIN6");
case NRF_SAADC_INPUT_AIN7:
return LOG_STR("AIN7");
case NRF_SAADC_INPUT_VDD:
return LOG_STR("VDD");
case NRF_SAADC_INPUT_VDDHDIV5:
return LOG_STR("VDDHDIV5");
}
return LOG_STR("undefined input");
}
#endif // ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
void ADCSensor::dump_config() {
LOG_SENSOR("", "ADC Sensor", this);
LOG_PIN(" Pin: ", this->pin_);
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
ESP_LOGV(TAG,
" Name: %s\n"
" Channel: %d\n"
" vref_mv: %d\n"
" Resolution %d\n"
" Oversampling %d",
this->channel_->dev->name, this->channel_->channel_id, this->channel_->vref_mv, this->channel_->resolution,
this->channel_->oversampling);
ESP_LOGV(TAG,
" Gain: %s\n"
" reference: %s\n"
" acquisition_time: %d\n"
" differential %s",
LOG_STR_ARG(gain_to_str(this->channel_->channel_cfg.gain)),
LOG_STR_ARG(reference_to_str(this->channel_->channel_cfg.reference)),
this->channel_->channel_cfg.acquisition_time, YESNO(this->channel_->channel_cfg.differential));
if (this->channel_->channel_cfg.differential) {
ESP_LOGV(TAG,
" Positive: %s\n"
" Negative: %s",
LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)),
LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_negative)));
} else {
ESP_LOGV(TAG, " Positive: %s", LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)));
}
#endif
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
auto aggr = Aggregator<int32_t>(this->sampling_mode_);
int err;
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
int16_t buf = 0;
struct adc_sequence sequence = {
.buffer = &buf,
/* buffer size in bytes, not number of samples */
.buffer_size = sizeof(buf),
};
int32_t val_raw;
err = adc_sequence_init_dt(this->channel_, &sequence);
if (err < 0) {
ESP_LOGE(TAG, "Could sequence init %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
err = adc_read(this->channel_->dev, &sequence);
if (err < 0) {
ESP_LOGE(TAG, "Could not read %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
val_raw = (int32_t) buf;
if (!this->channel_->channel_cfg.differential) {
// https://github.com/adafruit/Adafruit_nRF52_Arduino/blob/0ed4d9ffc674ae407be7cacf5696a02f5e789861/cores/nRF5/wiring_analog_nRF52.c#L222
if (val_raw < 0) {
val_raw = 0;
}
}
aggr.add_sample(val_raw);
}
int32_t val_mv = aggr.aggregate();
if (this->output_raw_) {
return val_mv;
}
err = adc_raw_to_millivolts_dt(this->channel_, &val_mv);
/* conversion to mV may not be supported, skip if not */
if (err < 0) {
ESP_LOGE(TAG, "Value in mV not available %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
return val_mv / 1000.0f;
}
} // namespace adc
} // namespace esphome
#endif
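Condensed, the removed Zephyr sampling path is the standard three-call sequence: initialize the sequence from the devicetree spec, run a blocking read into a 16-bit buffer, then convert using the channel's reference data. A minimal sketch with the same Zephyr APIs (error paths trimmed; `ch` is assumed to be an already-configured channel):

#include <zephyr/drivers/adc.h>

// One single-shot sample from a devicetree-described ADC channel.
int read_millivolts(const struct adc_dt_spec *ch, int32_t *out_mv) {
  int16_t buf = 0;
  struct adc_sequence seq = {
      .buffer = &buf,
      .buffer_size = sizeof(buf),  // size in bytes, not sample count
  };
  int err = adc_sequence_init_dt(ch, &seq);  // fills resolution and channel mask
  if (err == 0)
    err = adc_read(ch->dev, &seq);  // blocking conversion into buf
  if (err == 0) {
    *out_mv = buf;
    err = adc_raw_to_millivolts_dt(ch, out_mv);  // may return <0 if unsupported
  }
  return err;
}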


@@ -3,13 +3,6 @@ import logging
import esphome.codegen as cg
from esphome.components import sensor, voltage_sampler
from esphome.components.esp32 import get_esp32_variant
from esphome.components.nrf52.const import AIN_TO_GPIO, EXTRA_ADC
from esphome.components.zephyr import (
zephyr_add_overlay,
zephyr_add_prj_conf,
zephyr_add_user,
)
from esphome.config_helpers import filter_source_files_from_platform
import esphome.config_validation as cv
from esphome.const import (
CONF_ATTENUATION,
@@ -17,13 +10,13 @@ from esphome.const import (
CONF_NUMBER,
CONF_PIN,
CONF_RAW,
CONF_WIFI,
DEVICE_CLASS_VOLTAGE,
PLATFORM_NRF52,
STATE_CLASS_MEASUREMENT,
UNIT_VOLT,
PlatformFramework,
)
from esphome.core import CORE
import esphome.final_validate as fv
from . import (
ATTENUATION_MODES,
@@ -31,7 +24,6 @@ from . import (
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL,
SAMPLING_MODES,
adc_ns,
adc_unit_t,
validate_adc_pin,
)
@@ -65,14 +57,25 @@ def validate_config(config):
return config
def final_validate_config(config):
if CORE.is_esp32:
variant = get_esp32_variant()
if (
CONF_WIFI in fv.full_config.get()
and config[CONF_PIN][CONF_NUMBER]
in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
):
raise cv.Invalid(
f"{variant} doesn't support ADC on this pin when Wi-Fi is configured"
)
return config
ADCSensor = adc_ns.class_(
"ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
)
CONF_NRF_SAADC = "nrf_saadc"
adc_dt_spec = cg.global_ns.class_("adc_dt_spec")
CONFIG_SCHEMA = cv.All(
sensor.sensor_schema(
ADCSensor,
@@ -88,7 +91,6 @@ CONFIG_SCHEMA = cv.All(
cv.SplitDefault(CONF_ATTENUATION, esp32="0db"): cv.All(
cv.only_on_esp32, _attenuation
),
cv.OnlyWith(CONF_NRF_SAADC, PLATFORM_NRF52): cv.declare_id(adc_dt_spec),
cv.Optional(CONF_SAMPLES, default=1): cv.int_range(min=1, max=255),
cv.Optional(CONF_SAMPLING_MODE, default="avg"): _sampling_mode,
}
@@ -97,7 +99,7 @@ CONFIG_SCHEMA = cv.All(
validate_config,
)
CONF_ADC_CHANNEL_ID = "adc_channel_id"
FINAL_VALIDATE_SCHEMA = final_validate_config
async def to_code(config):
@@ -109,7 +111,7 @@ async def to_code(config):
cg.add_define("USE_ADC_SENSOR_VCC")
elif config[CONF_PIN] == "TEMPERATURE":
cg.add(var.set_is_temperature())
elif not CORE.is_nrf52 or config[CONF_PIN][CONF_NUMBER] not in EXTRA_ADC:
else:
pin = await cg.gpio_pin_expression(config[CONF_PIN])
cg.add(var.set_pin(pin))
@@ -117,13 +119,13 @@ async def to_code(config):
cg.add(var.set_sample_count(config[CONF_SAMPLES]))
cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
if CORE.is_esp32:
if attenuation := config.get(CONF_ATTENUATION):
if attenuation == "auto":
cg.add(var.set_autorange(cg.global_ns.true))
else:
cg.add(var.set_attenuation(attenuation))
if attenuation := config.get(CONF_ATTENUATION):
if attenuation == "auto":
cg.add(var.set_autorange(cg.global_ns.true))
else:
cg.add(var.set_attenuation(attenuation))
if CORE.is_esp32:
variant = get_esp32_variant()
pin_num = config[CONF_PIN][CONF_NUMBER]
if (
@@ -131,66 +133,10 @@ async def to_code(config):
and pin_num in ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant]
):
chan = ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin_num]
cg.add(var.set_channel(adc_unit_t.ADC_UNIT_1, chan))
cg.add(var.set_channel1(chan))
elif (
variant in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL
and pin_num in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
):
chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
cg.add(var.set_channel(adc_unit_t.ADC_UNIT_2, chan))
elif CORE.is_nrf52:
CORE.data.setdefault(CONF_ADC_CHANNEL_ID, 0)
channel_id = CORE.data[CONF_ADC_CHANNEL_ID]
CORE.data[CONF_ADC_CHANNEL_ID] = channel_id + 1
zephyr_add_prj_conf("ADC", True)
nrf_saadc = config[CONF_NRF_SAADC]
rhs = cg.RawExpression(
f"ADC_DT_SPEC_GET_BY_IDX(DT_PATH(zephyr_user), {channel_id})"
)
adc = cg.new_Pvariable(nrf_saadc, rhs)
cg.add(var.set_adc_channel(adc))
gain = "ADC_GAIN_1_6"
pin_number = config[CONF_PIN][CONF_NUMBER]
if pin_number == "VDDHDIV5":
gain = "ADC_GAIN_1_2"
if isinstance(pin_number, int):
GPIO_TO_AIN = {v: k for k, v in AIN_TO_GPIO.items()}
pin_number = GPIO_TO_AIN[pin_number]
zephyr_add_user("io-channels", f"<&adc {channel_id}>")
zephyr_add_overlay(
f"""
&adc {{
#address-cells = <1>;
#size-cells = <0>;
channel@{channel_id} {{
reg = <{channel_id}>;
zephyr,gain = "{gain}";
zephyr,reference = "ADC_REF_INTERNAL";
zephyr,acquisition-time = <ADC_ACQ_TIME_DEFAULT>;
zephyr,input-positive = <NRF_SAADC_{pin_number}>;
zephyr,resolution = <14>;
zephyr,oversampling = <8>;
}};
}};
"""
)
FILTER_SOURCE_FILES = filter_source_files_from_platform(
{
"adc_sensor_esp32.cpp": {
PlatformFramework.ESP32_ARDUINO,
PlatformFramework.ESP32_IDF,
},
"adc_sensor_esp8266.cpp": {PlatformFramework.ESP8266_ARDUINO},
"adc_sensor_rp2040.cpp": {PlatformFramework.RP2040_ARDUINO},
"adc_sensor_libretiny.cpp": {
PlatformFramework.BK72XX_ARDUINO,
PlatformFramework.RTL87XX_ARDUINO,
PlatformFramework.LN882X_ARDUINO,
},
"adc_sensor_zephyr.cpp": {PlatformFramework.NRF52_ZEPHYR},
}
)
cg.add(var.set_channel2(chan))


@@ -8,7 +8,10 @@ static const char *const TAG = "adc128s102";
float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
void ADC128S102::setup() { this->spi_setup(); }
void ADC128S102::setup() {
ESP_LOGCONFIG(TAG, "Setting up adc128s102");
this->spi_setup();
}
void ADC128S102::dump_config() {
ESP_LOGCONFIG(TAG, "ADC128S102:");


@@ -113,7 +113,7 @@ void ADE7880::update() {
if (this->channel_a_ != nullptr) {
auto *chan = this->channel_a_;
this->update_sensor_from_s24zp_register16_(chan->current, AIRMS, [](float val) { return val / 100000.0f; });
this->update_sensor_from_s24zp_register16_(chan->voltage, AVRMS, [](float val) { return val / 10000.0f; });
this->update_sensor_from_s24zp_register16_(chan->voltage, BVRMS, [](float val) { return val / 10000.0f; });
this->update_sensor_from_s24zp_register16_(chan->active_power, AWATT, [](float val) { return val / 100.0f; });
this->update_sensor_from_s24zp_register16_(chan->apparent_power, AVA, [](float val) { return val / 100.0f; });
this->update_sensor_from_s16_register16_(chan->power_factor, APF,
@@ -177,14 +177,11 @@ void ADE7880::dump_config() {
LOG_SENSOR(" ", "Power Factor", this->channel_a_->power_factor);
LOG_SENSOR(" ", "Forward Active Energy", this->channel_a_->forward_active_energy);
LOG_SENSOR(" ", "Reverse Active Energy", this->channel_a_->reverse_active_energy);
ESP_LOGCONFIG(TAG,
" Calibration:\n"
" Current: %" PRId32 "\n"
" Voltage: %" PRId32 "\n"
" Power: %" PRId32 "\n"
" Phase Angle: %u",
this->channel_a_->current_gain_calibration, this->channel_a_->voltage_gain_calibration,
this->channel_a_->power_gain_calibration, this->channel_a_->phase_angle_calibration);
ESP_LOGCONFIG(TAG, " Calibration:");
ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_a_->current_gain_calibration);
ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_a_->voltage_gain_calibration);
ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_a_->power_gain_calibration);
ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_a_->phase_angle_calibration);
}
if (this->channel_b_ != nullptr) {
@@ -196,14 +193,11 @@ void ADE7880::dump_config() {
LOG_SENSOR(" ", "Power Factor", this->channel_b_->power_factor);
LOG_SENSOR(" ", "Forward Active Energy", this->channel_b_->forward_active_energy);
LOG_SENSOR(" ", "Reverse Active Energy", this->channel_b_->reverse_active_energy);
ESP_LOGCONFIG(TAG,
" Calibration:\n"
" Current: %" PRId32 "\n"
" Voltage: %" PRId32 "\n"
" Power: %" PRId32 "\n"
" Phase Angle: %u",
this->channel_b_->current_gain_calibration, this->channel_b_->voltage_gain_calibration,
this->channel_b_->power_gain_calibration, this->channel_b_->phase_angle_calibration);
ESP_LOGCONFIG(TAG, " Calibration:");
ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_b_->current_gain_calibration);
ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_b_->voltage_gain_calibration);
ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_b_->power_gain_calibration);
ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_b_->phase_angle_calibration);
}
if (this->channel_c_ != nullptr) {
@@ -215,23 +209,18 @@ void ADE7880::dump_config() {
LOG_SENSOR(" ", "Power Factor", this->channel_c_->power_factor);
LOG_SENSOR(" ", "Forward Active Energy", this->channel_c_->forward_active_energy);
LOG_SENSOR(" ", "Reverse Active Energy", this->channel_c_->reverse_active_energy);
ESP_LOGCONFIG(TAG,
" Calibration:\n"
" Current: %" PRId32 "\n"
" Voltage: %" PRId32 "\n"
" Power: %" PRId32 "\n"
" Phase Angle: %u",
this->channel_c_->current_gain_calibration, this->channel_c_->voltage_gain_calibration,
this->channel_c_->power_gain_calibration, this->channel_c_->phase_angle_calibration);
ESP_LOGCONFIG(TAG, " Calibration:");
ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_c_->current_gain_calibration);
ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_c_->voltage_gain_calibration);
ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_c_->power_gain_calibration);
ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_c_->phase_angle_calibration);
}
if (this->channel_n_ != nullptr) {
ESP_LOGCONFIG(TAG, " Neutral:");
LOG_SENSOR(" ", "Current", this->channel_n_->current);
ESP_LOGCONFIG(TAG,
" Calibration:\n"
" Current: %" PRId32,
this->channel_n_->current_gain_calibration);
ESP_LOGCONFIG(TAG, " Calibration:");
ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_n_->current_gain_calibration);
}
LOG_I2C_DEVICE(this);


@@ -85,6 +85,8 @@ class ADE7880 : public i2c::I2CDevice, public PollingComponent {
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
protected:
ADE7880Store store_{};
InternalGPIOPin *irq0_pin_{nullptr};


@@ -36,7 +36,6 @@ from esphome.const import (
UNIT_WATT,
UNIT_WATT_HOURS,
)
from esphome.types import ConfigType
DEPENDENCIES = ["i2c"]
@@ -52,20 +51,6 @@ CONF_POWER_GAIN = "power_gain"
CONF_NEUTRAL = "neutral"
# Tuple of power channel phases
POWER_PHASES = (CONF_PHASE_A, CONF_PHASE_B, CONF_PHASE_C)
# Tuple of sensor types that can be configured for power channels
POWER_SENSOR_TYPES = (
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
)
NEUTRAL_CHANNEL_SCHEMA = cv.Schema(
{
cv.GenerateID(): cv.declare_id(NeutralChannel),
@@ -165,64 +150,7 @@ POWER_CHANNEL_SCHEMA = cv.Schema(
}
)
def prefix_sensor_name(
sensor_conf: ConfigType,
channel_name: str,
channel_config: ConfigType,
sensor_type: str,
) -> None:
"""Helper to prefix sensor name with channel name.
Args:
sensor_conf: The sensor configuration (dict or string)
channel_name: The channel name to prefix with
channel_config: The channel configuration to update
sensor_type: The sensor type key in the channel config
"""
if isinstance(sensor_conf, dict) and CONF_NAME in sensor_conf:
sensor_name = sensor_conf[CONF_NAME]
if sensor_name and not sensor_name.startswith(channel_name):
sensor_conf[CONF_NAME] = f"{channel_name} {sensor_name}"
elif isinstance(sensor_conf, str):
# Simple value case - convert to dict with prefixed name
channel_config[sensor_type] = {CONF_NAME: f"{channel_name} {sensor_conf}"}
def process_channel_sensors(
config: ConfigType, channel_key: str, sensor_types: tuple
) -> None:
"""Process sensors for a channel and prefix their names.
Args:
config: The main configuration
channel_key: The channel key (e.g., CONF_PHASE_A, CONF_NEUTRAL)
sensor_types: Tuple of sensor types to process for this channel
"""
if not (channel_config := config.get(channel_key)) or not (
channel_name := channel_config.get(CONF_NAME)
):
return
for sensor_type in sensor_types:
if sensor_conf := channel_config.get(sensor_type):
prefix_sensor_name(sensor_conf, channel_name, channel_config, sensor_type)
def preprocess_channels(config: ConfigType) -> ConfigType:
"""Preprocess channel configurations to add channel name prefix to sensor names."""
# Process power channels
for channel in POWER_PHASES:
process_channel_sensors(config, channel, POWER_SENSOR_TYPES)
# Process neutral channel
process_channel_sensors(config, CONF_NEUTRAL, (CONF_CURRENT,))
return config
CONFIG_SCHEMA = cv.All(
preprocess_channels,
CONFIG_SCHEMA = (
cv.Schema(
{
cv.GenerateID(): cv.declare_id(ADE7880),
@@ -239,7 +167,7 @@ CONFIG_SCHEMA = cv.All(
}
)
.extend(cv.polling_component_schema("60s"))
.extend(i2c.i2c_device_schema(0x38)),
.extend(i2c.i2c_device_schema(0x38))
)
@@ -260,7 +188,15 @@ async def neutral_channel(config):
async def power_channel(config):
var = cg.new_Pvariable(config[CONF_ID])
for sensor_type in POWER_SENSOR_TYPES:
for sensor_type in [
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
]:
if conf := config.get(sensor_type):
sens = await sensor.new_sensor(conf)
cg.add(getattr(var, f"set_{sensor_type}")(sens))
@@ -280,6 +216,44 @@ async def power_channel(config):
return var
def final_validate(config):
for channel in [CONF_PHASE_A, CONF_PHASE_B, CONF_PHASE_C]:
if channel := config.get(channel):
channel_name = channel.get(CONF_NAME)
for sensor_type in [
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
]:
if conf := channel.get(sensor_type):
sensor_name = conf.get(CONF_NAME)
if (
sensor_name
and channel_name
and not sensor_name.startswith(channel_name)
):
conf[CONF_NAME] = f"{channel_name} {sensor_name}"
if channel := config.get(CONF_NEUTRAL):
channel_name = channel.get(CONF_NAME)
if conf := channel.get(CONF_CURRENT):
sensor_name = conf.get(CONF_NAME)
if (
sensor_name
and channel_name
and not sensor_name.startswith(channel_name)
):
conf[CONF_NAME] = f"{channel_name} {sensor_name}"
FINAL_VALIDATE_SCHEMA = final_validate
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)


@@ -58,18 +58,15 @@ void ADE7953::dump_config() {
LOG_SENSOR(" ", "Active Power B Sensor", this->active_power_b_sensor_);
LOG_SENSOR(" ", "Rective Power A Sensor", this->reactive_power_a_sensor_);
LOG_SENSOR(" ", "Reactive Power B Sensor", this->reactive_power_b_sensor_);
ESP_LOGCONFIG(TAG,
" USE_ACC_ENERGY_REGS: %d\n"
" PGA_V_8: 0x%X\n"
" PGA_IA_8: 0x%X\n"
" PGA_IB_8: 0x%X\n"
" VGAIN_32: 0x%08jX\n"
" AIGAIN_32: 0x%08jX\n"
" BIGAIN_32: 0x%08jX\n"
" AWGAIN_32: 0x%08jX\n"
" BWGAIN_32: 0x%08jX",
this->use_acc_energy_regs_, pga_v_, pga_ia_, pga_ib_, (uintmax_t) vgain_, (uintmax_t) aigain_,
(uintmax_t) bigain_, (uintmax_t) awgain_, (uintmax_t) bwgain_);
ESP_LOGCONFIG(TAG, " USE_ACC_ENERGY_REGS: %d", this->use_acc_energy_regs_);
ESP_LOGCONFIG(TAG, " PGA_V_8: 0x%X", pga_v_);
ESP_LOGCONFIG(TAG, " PGA_IA_8: 0x%X", pga_ia_);
ESP_LOGCONFIG(TAG, " PGA_IB_8: 0x%X", pga_ib_);
ESP_LOGCONFIG(TAG, " VGAIN_32: 0x%08jX", (uintmax_t) vgain_);
ESP_LOGCONFIG(TAG, " AIGAIN_32: 0x%08jX", (uintmax_t) aigain_);
ESP_LOGCONFIG(TAG, " BIGAIN_32: 0x%08jX", (uintmax_t) bigain_);
ESP_LOGCONFIG(TAG, " AWGAIN_32: 0x%08jX", (uintmax_t) awgain_);
ESP_LOGCONFIG(TAG, " BWGAIN_32: 0x%08jX", (uintmax_t) bwgain_);
}
#define ADE_PUBLISH_(name, val, factor) \


@@ -1,6 +1,6 @@
#include "ade7953_i2c.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "esphome/core/helpers.h"
namespace esphome {
namespace ade7953_i2c {


@@ -1,6 +1,6 @@
#include "ade7953_spi.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "esphome/core/helpers.h"
namespace esphome {
namespace ade7953_spi {


@@ -10,12 +10,15 @@ static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
void ADS1115Component::setup() {
ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
uint16_t value;
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
this->mark_failed();
return;
}
ESP_LOGCONFIG(TAG, "Configuring ADS1115...");
uint16_t config = 0;
// Clear single-shot bit
// 0b0xxxxxxxxxxxxxxx
@@ -65,10 +68,10 @@ void ADS1115Component::setup() {
this->prev_config_ = config;
}
void ADS1115Component::dump_config() {
ESP_LOGCONFIG(TAG, "ADS1115:");
ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
LOG_I2C_DEVICE(this);
if (this->is_failed()) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with ADS1115 failed!");
}
}
float ADS1115Component::request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain,


@@ -49,6 +49,7 @@ class ADS1115Component : public Component, public i2c::I2CDevice {
void setup() override;
void dump_config() override;
/// HARDWARE_LATE setup priority
float get_setup_priority() const override { return setup_priority::DATA; }
void set_continuous_mode(bool continuous_mode) { continuous_mode_ = continuous_mode; }
/// Helper method to request a measurement from a sensor.


@@ -1,5 +1,4 @@
#include "ads1118.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
namespace esphome {
@@ -9,6 +8,7 @@ static const char *const TAG = "ads1118";
static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
void ADS1118::setup() {
ESP_LOGCONFIG(TAG, "Setting up ads1118");
this->spi_setup();
this->config_ = 0;


@@ -34,6 +34,7 @@ class ADS1118 : public Component,
ADS1118() = default;
void setup() override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
/// Helper method to request a measurement from a sensor.
float request_measurement(ADS1118Multiplexer multiplexer, ADS1118Gain gain, bool temperature_mode);


@@ -1,5 +1,4 @@
#include "ags10.h"
#include "esphome/core/helpers.h"
#include <cinttypes>
@@ -24,6 +23,8 @@ static const uint16_t ZP_CURRENT = 0x0000;
static const uint16_t ZP_DEFAULT = 0xFFFF;
void AGS10Component::setup() {
ESP_LOGCONFIG(TAG, "Setting up ags10...");
auto version = this->read_version_();
if (version) {
ESP_LOGD(TAG, "AGS10 Sensor Version: 0x%02X", *version);
@@ -43,6 +44,8 @@ void AGS10Component::setup() {
} else {
ESP_LOGE(TAG, "AGS10 Sensor Resistance: unknown");
}
ESP_LOGD(TAG, "Sensor initialized");
}
void AGS10Component::update() {
@@ -62,7 +65,7 @@ void AGS10Component::dump_config() {
case NONE:
break;
case COMMUNICATION_FAILED:
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AGS10 failed!");
break;
case CRC_CHECK_FAILED:
ESP_LOGE(TAG, "The crc check failed");
@@ -89,7 +92,7 @@ void AGS10Component::dump_config() {
bool AGS10Component::new_i2c_address(uint8_t newaddress) {
uint8_t rev_newaddress = ~newaddress;
std::array<uint8_t, 5> data{newaddress, rev_newaddress, newaddress, rev_newaddress, 0};
data[4] = crc8(data.data(), 4, 0xFF, 0x31, true);
data[4] = calc_crc8_(data, 4);
if (!this->write_bytes(REG_ADDRESS, data)) {
this->error_code_ = COMMUNICATION_FAILED;
this->status_set_warning();
@@ -109,7 +112,7 @@ bool AGS10Component::set_zero_point_with_current_resistance() { return this->set
bool AGS10Component::set_zero_point_with(uint16_t value) {
std::array<uint8_t, 5> data{0x00, 0x0C, (uint8_t) ((value >> 8) & 0xFF), (uint8_t) (value & 0xFF), 0};
data[4] = crc8(data.data(), 4, 0xFF, 0x31, true);
data[4] = calc_crc8_(data, 4);
if (!this->write_bytes(REG_CALIBRATION, data)) {
this->error_code_ = COMMUNICATION_FAILED;
this->status_set_warning();
@@ -184,7 +187,7 @@ template<size_t N> optional<std::array<uint8_t, N>> AGS10Component::read_and_che
auto res = *data;
auto crc_byte = res[len];
if (crc_byte != crc8(res.data(), len, 0xFF, 0x31, true)) {
if (crc_byte != calc_crc8_(res, len)) {
this->error_code_ = CRC_CHECK_FAILED;
ESP_LOGE(TAG, "Reading AGS10 version failed: crc error!");
return optional<std::array<uint8_t, N>>();
@@ -192,5 +195,20 @@ template<size_t N> optional<std::array<uint8_t, N>> AGS10Component::read_and_che
return data;
}
template<size_t N> uint8_t AGS10Component::calc_crc8_(std::array<uint8_t, N> dat, uint8_t num) {
uint8_t i, byte1, crc = 0xFF;
for (byte1 = 0; byte1 < num; byte1++) {
crc ^= (dat[byte1]);
for (i = 0; i < 8; i++) {
if (crc & 0x80) {
crc = (crc << 1) ^ 0x31;
} else {
crc = (crc << 1);
}
}
}
return crc;
}
} // namespace ags10
} // namespace esphome
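Both checksum paths in this hunk compute the same value: the `crc8(data, len, 0xFF, 0x31, true)` helper call and the local `calc_crc8_` are each CRC-8 with polynomial 0x31 (x^8 + x^5 + x^4 + 1) and initial value 0xFF. A standalone sketch of that algorithm:

#include <cstddef>
#include <cstdint>
#include <cstdio>

// CRC-8: polynomial 0x31, init 0xFF, MSB-first, no final XOR.
uint8_t crc8_0x31(const uint8_t *data, size_t len) {
  uint8_t crc = 0xFF;
  for (size_t i = 0; i < len; i++) {
    crc ^= data[i];
    for (int bit = 0; bit < 8; bit++)
      crc = (crc & 0x80) ? (crc << 1) ^ 0x31 : (crc << 1);
  }
  return crc;
}

int main() {
  // Same framing as set_zero_point_with(): 4-byte payload, CRC appended as byte 5.
  const uint8_t msg[4] = {0x00, 0x0C, 0x12, 0x34};
  std::printf("crc = 0x%02X\n", crc8_0x31(msg, sizeof(msg)));
}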


@@ -1,9 +1,9 @@
#pragma once
#include "esphome/components/i2c/i2c.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/core/automation.h"
#include "esphome/core/component.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/components/i2c/i2c.h"
namespace esphome {
namespace ags10 {
@@ -31,6 +31,8 @@ class AGS10Component : public PollingComponent, public i2c::I2CDevice {
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
/**
* Modifies target address of AGS10.
*
@@ -99,6 +101,16 @@ class AGS10Component : public PollingComponent, public i2c::I2CDevice {
* Read, checks and returns data from the sensor.
*/
template<size_t N> optional<std::array<uint8_t, N>> read_and_check_(uint8_t a_register);
/**
* Calculates CRC8 value.
*
* CRC8 calculation, initial value: 0xFF, polynomial: 0x31 (x8+ x5+ x4+1)
*
* @param[in] dat the data buffer
* @param num number of bytes in the buffer
*/
template<size_t N> uint8_t calc_crc8_(std::array<uint8_t, N> dat, uint8_t num);
};
template<typename... Ts> class AGS10NewI2cAddressAction : public Action<Ts...>, public Parented<AGS10Component> {


@@ -13,9 +13,8 @@
// results making successive requests; the current implementation makes 3 attempts with a delay of 30ms each time.
#include "aht10.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "esphome/core/hal.h"
namespace esphome {
namespace aht10 {
@@ -35,55 +34,57 @@ static const uint8_t AHT10_INIT_ATTEMPTS = 10;
static const uint8_t AHT10_STATUS_BUSY = 0x80;
static const float AHT10_DIVISOR = 1048576.0f; // 2^20, used for temperature and humidity calculations
void AHT10Component::setup() {
if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
ESP_LOGE(TAG, "Reset failed");
ESP_LOGE(TAG, "Reset AHT10 failed!");
}
delay(AHT10_SOFTRESET_DELAY);
i2c::ErrorCode error_code = i2c::ERROR_INVALID_ARGUMENT;
switch (this->variant_) {
case AHT10Variant::AHT20:
ESP_LOGCONFIG(TAG, "Setting up AHT20");
error_code = this->write(AHT20_INITIALIZE_CMD, sizeof(AHT20_INITIALIZE_CMD));
break;
case AHT10Variant::AHT10:
ESP_LOGCONFIG(TAG, "Setting up AHT10");
error_code = this->write(AHT10_INITIALIZE_CMD, sizeof(AHT10_INITIALIZE_CMD));
break;
}
if (error_code != i2c::ERROR_OK) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AHT10 failed!");
this->mark_failed();
return;
}
uint8_t cal_attempts = 0;
uint8_t data = AHT10_STATUS_BUSY;
int cal_attempts = 0;
while (data & AHT10_STATUS_BUSY) {
delay(AHT10_DEFAULT_DELAY);
if (this->read(&data, 1) != i2c::ERROR_OK) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AHT10 failed!");
this->mark_failed();
return;
}
++cal_attempts;
if (cal_attempts > AHT10_INIT_ATTEMPTS) {
ESP_LOGE(TAG, "Initialization timed out");
ESP_LOGE(TAG, "AHT10 initialization timed out!");
this->mark_failed();
return;
}
}
if ((data & 0x68) != 0x08) { // Bit[6:5] = 0b00, NORMAL mode and Bit[3] = 0b1, CALIBRATED
ESP_LOGE(TAG, "Initialization failed");
ESP_LOGE(TAG, "AHT10 initialization failed!");
this->mark_failed();
return;
}
ESP_LOGV(TAG, "AHT10 initialization");
}
void AHT10Component::restart_read_() {
if (this->read_count_ == AHT10_ATTEMPTS) {
this->read_count_ = 0;
this->status_set_error("Reading timed out");
this->status_set_error("Measurements reading timed-out!");
return;
}
this->read_count_++;
@@ -96,24 +97,24 @@ void AHT10Component::read_data_() {
ESP_LOGD(TAG, "Read attempt %d at %ums", this->read_count_, (unsigned) (millis() - this->start_time_));
}
if (this->read(data, 6) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR("Read failed, will retry"));
this->status_set_warning("AHT10 read failed, retrying soon");
this->restart_read_();
return;
}
if ((data[0] & 0x80) == 0x80) { // Bit[7] = 0b1, device is busy
ESP_LOGD(TAG, "Device busy, will retry");
ESP_LOGD(TAG, "AHT10 is busy, waiting...");
this->restart_read_();
return;
}
if (data[1] == 0x0 && data[2] == 0x0 && (data[3] >> 4) == 0x0) {
// Invalid humidity (0x0)
// Unrealistic humidity (0x0)
if (this->humidity_sensor_ == nullptr) {
ESP_LOGV(TAG, "Invalid humidity (reading not required)");
ESP_LOGV(TAG, "ATH10 Unrealistic humidity (0x0), but humidity is not required");
} else {
ESP_LOGD(TAG, "Invalid humidity, retrying");
ESP_LOGD(TAG, "ATH10 Unrealistic humidity (0x0), retrying...");
if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR(ESP_LOG_MSG_COMM_FAIL));
this->status_set_warning("Communication with AHT10 failed!");
}
this->restart_read_();
return;
@@ -122,17 +123,22 @@ void AHT10Component::read_data_() {
if (this->read_count_ > 1) {
ESP_LOGD(TAG, "Success at %ums", (unsigned) (millis() - this->start_time_));
}
uint32_t raw_temperature = encode_uint24(data[3] & 0xF, data[4], data[5]);
uint32_t raw_humidity = encode_uint24(data[1], data[2], data[3]) >> 4;
uint32_t raw_temperature = ((data[3] & 0x0F) << 16) | (data[4] << 8) | data[5];
uint32_t raw_humidity = ((data[1] << 16) | (data[2] << 8) | data[3]) >> 4;
if (this->temperature_sensor_ != nullptr) {
float temperature = ((200.0f * static_cast<float>(raw_temperature)) / AHT10_DIVISOR) - 50.0f;
float temperature = ((200.0f * (float) raw_temperature) / 1048576.0f) - 50.0f;
this->temperature_sensor_->publish_state(temperature);
}
if (this->humidity_sensor_ != nullptr) {
float humidity = raw_humidity == 0 ? NAN : static_cast<float>(raw_humidity) * 100.0f / AHT10_DIVISOR;
float humidity;
if (raw_humidity == 0) { // unrealistic value
humidity = NAN;
} else {
humidity = (float) raw_humidity * 100.0f / 1048576.0f;
}
if (std::isnan(humidity)) {
ESP_LOGW(TAG, "Invalid humidity reading (0%%), ");
ESP_LOGW(TAG, "Invalid humidity! Sensor reported 0%% Hum");
}
this->humidity_sensor_->publish_state(humidity);
}
@@ -144,7 +150,7 @@ void AHT10Component::update() {
return;
this->start_time_ = millis();
if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR(ESP_LOG_MSG_COMM_FAIL));
this->status_set_warning("Communication with AHT10 failed!");
return;
}
this->restart_read_();
@@ -156,7 +162,7 @@ void AHT10Component::dump_config() {
ESP_LOGCONFIG(TAG, "AHT10:");
LOG_I2C_DEVICE(this);
if (this->is_failed()) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AHT10 failed!");
}
LOG_SENSOR(" ", "Temperature", this->temperature_sensor_);
LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);


@@ -17,6 +17,8 @@ static const char *const TAG = "aic3204";
}
void AIC3204::setup() {
ESP_LOGCONFIG(TAG, "Setting up AIC3204...");
// Set register page to 0
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
// Initiate SW reset (PLL is powered off as part of reset)
@@ -111,7 +113,7 @@ void AIC3204::dump_config() {
LOG_I2C_DEVICE(this);
if (this->is_failed()) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AIC3204 failed");
}
}


@@ -66,6 +66,7 @@ class AIC3204 : public audio_dac::AudioDac, public Component, public i2c::I2CDev
public:
void setup() override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
bool set_mute_off() override;
bool set_mute_on() override;


@@ -18,6 +18,6 @@ CONFIG_SCHEMA = cv.Schema(
).extend(esp32_ble_tracker.ESP_BLE_DEVICE_SCHEMA)
async def to_code(config):
def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await esp32_ble_tracker.register_ble_device(var, config)
yield esp32_ble_tracker.register_ble_device(var, config)


@@ -34,7 +34,7 @@ AirthingsWaveBase = airthings_wave_base_ns.class_(
BASE_SCHEMA = (
cv.Schema(
sensor.SENSOR_SCHEMA.extend(
{
cv.Optional(CONF_HUMIDITY): sensor.sensor_schema(
unit_of_measurement=UNIT_PERCENT,


@@ -1 +1 @@
CODEOWNERS = ["@jeromelaban", "@precurse"]
CODEOWNERS = ["@jeromelaban"]


@@ -73,29 +73,11 @@ void AirthingsWavePlus::dump_config() {
LOG_SENSOR(" ", "Illuminance", this->illuminance_sensor_);
}
void AirthingsWavePlus::setup() {
const char *service_uuid;
const char *characteristic_uuid;
const char *access_control_point_characteristic_uuid;
// Change UUIDs for Wave Radon Gen2
switch (this->wave_device_type_) {
case WaveDeviceType::WAVE_GEN2:
service_uuid = SERVICE_UUID_WAVE_RADON_GEN2;
characteristic_uuid = CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
break;
default:
// Wave Plus
service_uuid = SERVICE_UUID;
characteristic_uuid = CHARACTERISTIC_UUID;
access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID;
}
this->service_uuid_ = espbt::ESPBTUUID::from_raw(service_uuid);
this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(characteristic_uuid);
AirthingsWavePlus::AirthingsWavePlus() {
this->service_uuid_ = espbt::ESPBTUUID::from_raw(SERVICE_UUID);
this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(CHARACTERISTIC_UUID);
this->access_control_point_characteristic_uuid_ =
espbt::ESPBTUUID::from_raw(access_control_point_characteristic_uuid);
espbt::ESPBTUUID::from_raw(ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID);
}
} // namespace airthings_wave_plus


@@ -9,20 +9,13 @@ namespace airthings_wave_plus {
namespace espbt = esphome::esp32_ble_tracker;
enum WaveDeviceType : uint8_t { WAVE_PLUS = 0, WAVE_GEN2 = 1 };
static const char *const SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba";
static const char *const CHARACTERISTIC_UUID = "b42e2a68-ade7-11e4-89d3-123b93f75cba";
static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID = "b42e2d06-ade7-11e4-89d3-123b93f75cba";
static const char *const SERVICE_UUID_WAVE_RADON_GEN2 = "b42e4a8e-ade7-11e4-89d3-123b93f75cba";
static const char *const CHARACTERISTIC_UUID_WAVE_RADON_GEN2 = "b42e4dcc-ade7-11e4-89d3-123b93f75cba";
static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2 =
"b42e50d8-ade7-11e4-89d3-123b93f75cba";
class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
public:
void setup() override;
AirthingsWavePlus();
void dump_config() override;
@@ -30,14 +23,12 @@ class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
void set_radon_long_term(sensor::Sensor *radon_long_term) { radon_long_term_sensor_ = radon_long_term; }
void set_co2(sensor::Sensor *co2) { co2_sensor_ = co2; }
void set_illuminance(sensor::Sensor *illuminance) { illuminance_sensor_ = illuminance; }
void set_device_type(WaveDeviceType wave_device_type) { wave_device_type_ = wave_device_type; }
protected:
bool is_valid_radon_value_(uint16_t radon);
bool is_valid_co2_value_(uint16_t co2);
void read_sensors(uint8_t *raw_value, uint16_t value_len) override;
WaveDeviceType wave_device_type_{WaveDeviceType::WAVE_PLUS};
sensor::Sensor *radon_sensor_{nullptr};
sensor::Sensor *radon_long_term_sensor_{nullptr};


@@ -7,7 +7,6 @@ from esphome.const import (
CONF_ILLUMINANCE,
CONF_RADON,
CONF_RADON_LONG_TERM,
CONF_TVOC,
DEVICE_CLASS_CARBON_DIOXIDE,
DEVICE_CLASS_ILLUMINANCE,
ICON_RADIOACTIVE,
@@ -16,7 +15,6 @@ from esphome.const import (
UNIT_LUX,
UNIT_PARTS_PER_MILLION,
)
from esphome.types import ConfigType
DEPENDENCIES = airthings_wave_base.DEPENDENCIES
@@ -27,59 +25,35 @@ AirthingsWavePlus = airthings_wave_plus_ns.class_(
"AirthingsWavePlus", airthings_wave_base.AirthingsWaveBase
)
CONF_DEVICE_TYPE = "device_type"
WaveDeviceType = airthings_wave_plus_ns.enum("WaveDeviceType")
DEVICE_TYPES = {
"WAVE_PLUS": WaveDeviceType.WAVE_PLUS,
"WAVE_GEN2": WaveDeviceType.WAVE_GEN2,
}
def validate_wave_gen2_config(config: ConfigType) -> ConfigType:
"""Validate that Wave Gen2 devices don't have CO2 or TVOC sensors."""
if config[CONF_DEVICE_TYPE] == "WAVE_GEN2":
if CONF_CO2 in config:
raise cv.Invalid("Wave Gen2 devices do not support CO2 sensor")
# Check for TVOC in the base schema config
if CONF_TVOC in config:
raise cv.Invalid("Wave Gen2 devices do not support TVOC sensor")
return config
CONFIG_SCHEMA = cv.All(
airthings_wave_base.BASE_SCHEMA.extend(
{
cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
cv.Optional(CONF_RADON): sensor.sensor_schema(
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
icon=ICON_RADIOACTIVE,
accuracy_decimals=0,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
icon=ICON_RADIOACTIVE,
accuracy_decimals=0,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_CO2): sensor.sensor_schema(
unit_of_measurement=UNIT_PARTS_PER_MILLION,
accuracy_decimals=0,
device_class=DEVICE_CLASS_CARBON_DIOXIDE,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
unit_of_measurement=UNIT_LUX,
accuracy_decimals=0,
device_class=DEVICE_CLASS_ILLUMINANCE,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_DEVICE_TYPE, default="WAVE_PLUS"): cv.enum(
DEVICE_TYPES, upper=True
),
}
),
validate_wave_gen2_config,
CONFIG_SCHEMA = airthings_wave_base.BASE_SCHEMA.extend(
{
cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
cv.Optional(CONF_RADON): sensor.sensor_schema(
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
icon=ICON_RADIOACTIVE,
accuracy_decimals=0,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
icon=ICON_RADIOACTIVE,
accuracy_decimals=0,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_CO2): sensor.sensor_schema(
unit_of_measurement=UNIT_PARTS_PER_MILLION,
accuracy_decimals=0,
device_class=DEVICE_CLASS_CARBON_DIOXIDE,
state_class=STATE_CLASS_MEASUREMENT,
),
cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
unit_of_measurement=UNIT_LUX,
accuracy_decimals=0,
device_class=DEVICE_CLASS_ILLUMINANCE,
state_class=STATE_CLASS_MEASUREMENT,
),
}
)
@@ -99,4 +73,3 @@ async def to_code(config):
if config_illuminance := config.get(CONF_ILLUMINANCE):
sens = await sensor.new_sensor(config_illuminance)
cg.add(var.set_illuminance(sens))
cg.add(var.set_device_type(config[CONF_DEVICE_TYPE]))


@@ -5,17 +5,14 @@ from esphome.components import mqtt, web_server
import esphome.config_validation as cv
from esphome.const import (
CONF_CODE,
CONF_ENTITY_CATEGORY,
CONF_ICON,
CONF_ID,
CONF_MQTT_ID,
CONF_ON_STATE,
CONF_TRIGGER_ID,
CONF_WEB_SERVER,
)
from esphome.core import CORE, CoroPriority, coroutine_with_priority
from esphome.core.entity_helpers import entity_duplicate_validator, setup_entity
from esphome.cpp_generator import MockObjClass
from esphome.core import CORE, coroutine_with_priority
from esphome.cpp_helpers import setup_entity
CODEOWNERS = ["@grahambrown11", "@hwstar"]
IS_PLATFORM_COMPONENT = True
@@ -81,11 +78,12 @@ AlarmControlPanelCondition = alarm_control_panel_ns.class_(
"AlarmControlPanelCondition", automation.Condition
)
_ALARM_CONTROL_PANEL_SCHEMA = (
ALARM_CONTROL_PANEL_SCHEMA = (
cv.ENTITY_BASE_SCHEMA.extend(web_server.WEBSERVER_SORTING_SCHEMA)
.extend(cv.MQTT_COMMAND_COMPONENT_SCHEMA)
.extend(
{
cv.GenerateID(): cv.declare_id(AlarmControlPanel),
cv.OnlyWith(CONF_MQTT_ID, "mqtt"): cv.declare_id(
mqtt.MQTTAlarmControlPanelComponent
),
@@ -148,36 +146,6 @@ _ALARM_CONTROL_PANEL_SCHEMA = (
)
)
_ALARM_CONTROL_PANEL_SCHEMA.add_extra(entity_duplicate_validator("alarm_control_panel"))
def alarm_control_panel_schema(
class_: MockObjClass,
*,
entity_category: str = cv.UNDEFINED,
icon: str = cv.UNDEFINED,
) -> cv.Schema:
schema = {
cv.GenerateID(): cv.declare_id(class_),
}
for key, default, validator in [
(CONF_ENTITY_CATEGORY, entity_category, cv.entity_category),
(CONF_ICON, icon, cv.icon),
]:
if default is not cv.UNDEFINED:
schema[cv.Optional(key, default=default)] = validator
return _ALARM_CONTROL_PANEL_SCHEMA.extend(schema)
# Remove before 2025.11.0
ALARM_CONTROL_PANEL_SCHEMA = alarm_control_panel_schema(AlarmControlPanel)
ALARM_CONTROL_PANEL_SCHEMA.add_extra(
cv.deprecated_schema_constant("alarm_control_panel")
)
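
The `alarm_control_panel_schema()` builder above only injects `entity_category`/`icon` entries when the calling platform actually supplies a default, using `cv.UNDEFINED` as a sentinel. A minimal sketch of that pattern with a plain-object sentinel (the names and the example icon string are illustrative stand-ins):

```python
UNDEFINED = object()  # stand-in for cv.UNDEFINED

def build_schema(base: dict, *, entity_category=UNDEFINED, icon=UNDEFINED) -> dict:
    """Only add an optional-with-default entry when the caller supplied a default."""
    extra = {}
    for key, default in (("entity_category", entity_category), ("icon", icon)):
        if default is not UNDEFINED:
            extra[key] = default
    return {**base, **extra}

print(build_schema({"id": "panel_1"}))                     # no extra keys injected
print(build_schema({"id": "panel_1"}, icon="mdi:shield"))  # icon default injected
```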
ALARM_CONTROL_PANEL_ACTION_SCHEMA = maybe_simple_id(
{
cv.GenerateID(): cv.use_id(AlarmControlPanel),
@@ -193,7 +161,7 @@ ALARM_CONTROL_PANEL_CONDITION_SCHEMA = maybe_simple_id(
async def setup_alarm_control_panel_core_(var, config):
await setup_entity(var, config, "alarm_control_panel")
await setup_entity(var, config)
for conf in config.get(CONF_ON_STATE, []):
trigger = cg.new_Pvariable(conf[CONF_TRIGGER_ID], var)
await automation.build_automation(trigger, [], conf)
@@ -238,16 +206,9 @@ async def register_alarm_control_panel(var, config):
if not CORE.has_id(config[CONF_ID]):
var = cg.Pvariable(config[CONF_ID], var)
cg.add(cg.App.register_alarm_control_panel(var))
CORE.register_platform_component("alarm_control_panel", var)
await setup_alarm_control_panel_core_(var, config)
async def new_alarm_control_panel(config, *args):
var = cg.new_Pvariable(config[CONF_ID], *args)
await register_alarm_control_panel(var, config)
return var
@automation.register_action(
"alarm_control_panel.arm_away", ArmAwayAction, ALARM_CONTROL_PANEL_ACTION_SCHEMA
)
@@ -301,7 +262,8 @@ async def alarm_action_disarm_to_code(config, action_id, template_arg, args):
)
async def alarm_action_pending_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -309,7 +271,8 @@ async def alarm_action_pending_to_code(config, action_id, template_arg, args):
)
async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -317,7 +280,8 @@ async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
)
async def alarm_action_chime_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -330,7 +294,8 @@ async def alarm_action_chime_to_code(config, action_id, template_arg, args):
)
async def alarm_action_ready_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_condition(
@@ -345,6 +310,7 @@ async def alarm_control_panel_is_armed_to_code(
return cg.new_Pvariable(condition_id, template_arg, paren)
@coroutine_with_priority(CoroPriority.CORE)
@coroutine_with_priority(100.0)
async def to_code(config):
cg.add_global(alarm_control_panel_ns.using)
cg.add_define("USE_ALARM_CONTROL_PANEL")

View File

@@ -41,6 +41,7 @@ class Alpha3 : public esphome::ble_client::BLEClientNode, public PollingComponen
void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
esp_ble_gattc_cb_param_t *param) override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
void set_flow_sensor(sensor::Sensor *sensor) { this->flow_sensor_ = sensor; }
void set_head_sensor(sensor::Sensor *sensor) { this->head_sensor_ = sensor; }
void set_power_sensor(sensor::Sensor *sensor) { this->power_sensor_ = sensor; }

View File

@@ -29,6 +29,22 @@ namespace am2315c {
static const char *const TAG = "am2315c";
uint8_t AM2315C::crc8_(uint8_t *data, uint8_t len) {
uint8_t crc = 0xFF;
while (len--) {
crc ^= *data++;
for (uint8_t i = 0; i < 8; i++) {
if (crc & 0x80) {
crc <<= 1;
crc ^= 0x31;
} else {
crc <<= 1;
}
}
}
return crc;
}
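
For reference, `crc8_` above is the common bitwise CRC-8 with polynomial 0x31 and initial value 0xFF (no reflection, no final XOR). A Python equivalent, offered as a sketch for cross-checking frames off-device:

```python
def crc8(data: bytes, init: int = 0xFF, poly: int = 0x31) -> int:
    """Bitwise CRC-8 matching AM2315C::crc8_ (poly 0x31, init 0xFF)."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # In C++ the uint8_t width truncates automatically; here we mask to 8 bits.
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

payload = bytes([0x1C, 0x27, 0x30, 0x4E, 0x9A, 0x20])  # arbitrary example bytes
print(hex(crc8(payload)))  # compare against the 7th byte of a real frame
```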
bool AM2315C::reset_register_(uint8_t reg) {
// code based on demo code sent by www.aosong.com
// no further documentation.
@@ -70,10 +86,12 @@ bool AM2315C::convert_(uint8_t *data, float &humidity, float &temperature) {
humidity = raw * 9.5367431640625e-5;
raw = ((data[3] & 0x0F) << 16) | (data[4] << 8) | data[5];
temperature = raw * 1.9073486328125e-4 - 50;
return crc8(data, 6, 0xFF, 0x31, true) == data[6];
return this->crc8_(data, 6) == data[6];
}
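
The scale factors in `convert_` are 100/2^20 and 200/2^20: both readings are 20-bit values packed into the 7-byte frame. A hedged decoding sketch follows; the temperature layout matches the hunk above, while the humidity bit layout is an assumption based on the usual AHT2x frame format:

```python
def decode_am2315c(frame: bytes) -> tuple[float, float]:
    """Unpack 20-bit humidity and temperature from a 7-byte status+data+CRC frame."""
    raw_h = (frame[1] << 12) | (frame[2] << 4) | (frame[3] >> 4)    # assumed layout
    raw_t = ((frame[3] & 0x0F) << 16) | (frame[4] << 8) | frame[5]  # as in convert_()
    humidity = raw_h * 100.0 / (1 << 20)           # == raw_h * 9.5367431640625e-5
    temperature = raw_t * 200.0 / (1 << 20) - 50   # == raw_t * 1.9073486328125e-4 - 50
    return humidity, temperature
```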
void AM2315C::setup() {
ESP_LOGCONFIG(TAG, "Setting up AM2315C...");
// get status
uint8_t status = 0;
if (this->read(&status, 1) != i2c::ERROR_OK) {
@@ -110,7 +128,7 @@ void AM2315C::update() {
data[2] = 0x00;
if (this->write(data, 3) != i2c::ERROR_OK) {
ESP_LOGE(TAG, "Write failed!");
this->status_set_warning();
this->mark_failed();
return;
}
@@ -120,12 +138,12 @@ void AM2315C::update() {
uint8_t status = 0;
if (this->read(&status, 1) != i2c::ERROR_OK) {
ESP_LOGE(TAG, "Read failed!");
this->status_set_warning();
this->mark_failed();
return;
}
if ((status & 0x80) == 0x80) {
ESP_LOGE(TAG, "HW still busy!");
this->status_set_warning();
this->mark_failed();
return;
}
@@ -133,7 +151,7 @@ void AM2315C::update() {
uint8_t data[7];
if (this->read(data, 7) != i2c::ERROR_OK) {
ESP_LOGE(TAG, "Read failed!");
this->status_set_warning();
this->mark_failed();
return;
}
@@ -170,7 +188,7 @@ void AM2315C::dump_config() {
ESP_LOGCONFIG(TAG, "AM2315C:");
LOG_I2C_DEVICE(this);
if (this->is_failed()) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AM2315C failed!");
}
LOG_SENSOR(" ", "Temperature", this->temperature_sensor_);
LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);

View File

@@ -21,9 +21,9 @@
// SOFTWARE.
#pragma once
#include "esphome/components/i2c/i2c.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/core/component.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/components/i2c/i2c.h"
namespace esphome {
namespace am2315c {
@@ -39,6 +39,7 @@ class AM2315C : public PollingComponent, public i2c::I2CDevice {
void set_humidity_sensor(sensor::Sensor *humidity_sensor) { this->humidity_sensor_ = humidity_sensor; }
protected:
uint8_t crc8_(uint8_t *data, uint8_t len);
bool convert_(uint8_t *data, float &humidity, float &temperature);
bool reset_register_(uint8_t reg);

View File

@@ -34,6 +34,7 @@ void AM2320Component::update() {
this->status_clear_warning();
}
void AM2320Component::setup() {
ESP_LOGCONFIG(TAG, "Setting up AM2320...");
uint8_t data[8];
data[0] = 0;
data[1] = 4;
@@ -46,7 +47,7 @@ void AM2320Component::dump_config() {
ESP_LOGD(TAG, "AM2320:");
LOG_I2C_DEVICE(this);
if (this->is_failed()) {
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with AM2320 failed!");
}
LOG_SENSOR(" ", "Temperature", this->temperature_sensor_);
LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);

View File

@@ -1,7 +1,7 @@
#pragma once
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "esphome/core/helpers.h"
namespace esphome {
namespace am43 {

View File

@@ -1,7 +1,7 @@
import esphome.codegen as cg
from esphome.components import ble_client, cover
import esphome.config_validation as cv
from esphome.const import CONF_PIN
from esphome.const import CONF_ID, CONF_PIN
CODEOWNERS = ["@buxtronix"]
DEPENDENCIES = ["ble_client"]
@@ -15,9 +15,9 @@ Am43Component = am43_ns.class_(
)
CONFIG_SCHEMA = (
cover.cover_schema(Am43Component)
.extend(
cover.COVER_SCHEMA.extend(
{
cv.GenerateID(): cv.declare_id(Am43Component),
cv.Optional(CONF_PIN, default=8888): cv.int_range(min=0, max=0xFFFF),
cv.Optional(CONF_INVERT_POSITION, default=False): cv.boolean,
}
@@ -28,8 +28,9 @@ CONFIG_SCHEMA = (
async def to_code(config):
var = await cover.new_cover(config)
var = cg.new_Pvariable(config[CONF_ID])
cg.add(var.set_pin(config[CONF_PIN]))
cg.add(var.set_invert_position(config[CONF_INVERT_POSITION]))
await cg.register_component(var, config)
await cover.register_cover(var, config)
await ble_client.register_ble_node(var, config)

View File

@@ -12,10 +12,8 @@ using namespace esphome::cover;
void Am43Component::dump_config() {
LOG_COVER("", "AM43 Cover", this);
ESP_LOGCONFIG(TAG,
" Device Pin: %d\n"
" Invert Position: %d",
this->pin_, (int) this->invert_position_);
ESP_LOGCONFIG(TAG, " Device Pin: %d", this->pin_);
ESP_LOGCONFIG(TAG, " Invert Position: %d", (int) this->invert_position_);
}
void Am43Component::setup() {

View File

@@ -22,6 +22,7 @@ class Am43Component : public cover::Cover, public esphome::ble_client::BLEClient
void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
esp_ble_gattc_cb_param_t *param) override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
cover::CoverTraits get_traits() override;
void set_pin(uint16_t pin) { this->pin_ = pin; }
void set_invert_position(bool invert_position) { this->invert_position_ = invert_position; }

View File

@@ -22,6 +22,7 @@ class Am43 : public esphome::ble_client::BLEClientNode, public PollingComponent
void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
esp_ble_gattc_cb_param_t *param) override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
void set_battery(sensor::Sensor *battery) { battery_ = battery; }
void set_illuminance(sensor::Sensor *illuminance) { illuminance_ = illuminance; }

View File

@@ -14,8 +14,7 @@ void AnalogThresholdBinarySensor::setup() {
if (std::isnan(sensor_value)) {
this->publish_initial_state(false);
} else {
this->publish_initial_state(sensor_value >=
(this->lower_threshold_.value() + this->upper_threshold_.value()) / 2.0f);
this->publish_initial_state(sensor_value >= (this->lower_threshold_ + this->upper_threshold_) / 2.0f);
}
}
@@ -25,8 +24,7 @@ void AnalogThresholdBinarySensor::set_sensor(sensor::Sensor *analog_sensor) {
this->sensor_->add_on_state_callback([this](float sensor_value) {
// if there is an invalid sensor reading, ignore the change and keep the current state
if (!std::isnan(sensor_value)) {
this->publish_state(sensor_value >=
(this->state ? this->lower_threshold_.value() : this->upper_threshold_.value()));
this->publish_state(sensor_value >= (this->state ? this->lower_threshold_ : this->upper_threshold_));
}
});
}
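
The state callback above is a hysteresis comparator: while the sensor is ON it compares against the lower threshold, while OFF against the upper one, so values between the two bounds never flip the state. A standalone sketch of the same rule:

```python
def next_state(current: bool, value: float, lower: float, upper: float) -> bool:
    """Threshold with hysteresis: turn ON at `upper`, turn OFF only below `lower`."""
    return value >= (lower if current else upper)

state = False
for reading in (4.0, 5.1, 4.5, 3.9, 4.5):  # arbitrary values, lower=4.0, upper=5.0
    state = next_state(state, reading, lower=4.0, upper=5.0)
    print(reading, state)  # 4.0->False, 5.1->True, 4.5->True, 3.9->False, 4.5->False
```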
@@ -34,10 +32,8 @@ void AnalogThresholdBinarySensor::set_sensor(sensor::Sensor *analog_sensor) {
void AnalogThresholdBinarySensor::dump_config() {
LOG_BINARY_SENSOR("", "Analog Threshold Binary Sensor", this);
LOG_SENSOR(" ", "Sensor", this->sensor_);
ESP_LOGCONFIG(TAG,
" Upper threshold: %.11f\n"
" Lower threshold: %.11f",
this->upper_threshold_.value(), this->lower_threshold_.value());
ESP_LOGCONFIG(TAG, " Upper threshold: %.11f", this->upper_threshold_);
ESP_LOGCONFIG(TAG, " Lower threshold: %.11f", this->lower_threshold_);
}
} // namespace analog_threshold

View File

@@ -12,14 +12,17 @@ class AnalogThresholdBinarySensor : public Component, public binary_sensor::Bina
void dump_config() override;
void setup() override;
float get_setup_priority() const override { return setup_priority::DATA; }
void set_sensor(sensor::Sensor *analog_sensor);
template<typename T> void set_upper_threshold(T upper_threshold) { this->upper_threshold_ = upper_threshold; }
template<typename T> void set_lower_threshold(T lower_threshold) { this->lower_threshold_ = lower_threshold; }
void set_upper_threshold(float threshold) { this->upper_threshold_ = threshold; }
void set_lower_threshold(float threshold) { this->lower_threshold_ = threshold; }
protected:
sensor::Sensor *sensor_{nullptr};
TemplatableValue<float> upper_threshold_{};
TemplatableValue<float> lower_threshold_{};
float upper_threshold_;
float lower_threshold_;
};
} // namespace analog_threshold

View File

@@ -18,11 +18,11 @@ CONFIG_SCHEMA = (
{
cv.Required(CONF_SENSOR_ID): cv.use_id(sensor.Sensor),
cv.Required(CONF_THRESHOLD): cv.Any(
cv.templatable(cv.float_),
cv.float_,
cv.Schema(
{
cv.Required(CONF_UPPER): cv.templatable(cv.float_),
cv.Required(CONF_LOWER): cv.templatable(cv.float_),
cv.Required(CONF_UPPER): cv.float_,
cv.Required(CONF_LOWER): cv.float_,
}
),
),
@@ -39,11 +39,9 @@ async def to_code(config):
sens = await cg.get_variable(config[CONF_SENSOR_ID])
cg.add(var.set_sensor(sens))
if isinstance(config[CONF_THRESHOLD], dict):
lower = await cg.templatable(config[CONF_THRESHOLD][CONF_LOWER], [], float)
upper = await cg.templatable(config[CONF_THRESHOLD][CONF_UPPER], [], float)
if isinstance(config[CONF_THRESHOLD], float):
cg.add(var.set_upper_threshold(config[CONF_THRESHOLD]))
cg.add(var.set_lower_threshold(config[CONF_THRESHOLD]))
else:
lower = await cg.templatable(config[CONF_THRESHOLD], [], float)
upper = lower
cg.add(var.set_upper_threshold(upper))
cg.add(var.set_lower_threshold(lower))
cg.add(var.set_upper_threshold(config[CONF_THRESHOLD][CONF_UPPER]))
cg.add(var.set_lower_threshold(config[CONF_THRESHOLD][CONF_LOWER]))
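
Both versions of `to_code` reduce to the same resolution rule: a scalar threshold serves as both bounds, while a mapping supplies upper and lower separately. A minimal sketch, with plain string keys standing in for CONF_LOWER/CONF_UPPER:

```python
def resolve_thresholds(threshold):
    """A single number doubles as both bounds; a mapping supplies them separately."""
    if isinstance(threshold, dict):
        return threshold["lower"], threshold["upper"]
    return threshold, threshold

print(resolve_thresholds(4.2))                           # (4.2, 4.2)
print(resolve_thresholds({"lower": 4.0, "upper": 5.0}))  # (4.0, 5.0)
```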

View File

@@ -34,20 +34,17 @@ SetFrameAction = animation_ns.class_(
"AnimationSetFrameAction", automation.Action, cg.Parented.template(Animation_)
)
CONFIG_SCHEMA = cv.All(
espImage.IMAGE_SCHEMA.extend(
{
cv.Required(CONF_ID): cv.declare_id(Animation_),
cv.Optional(CONF_LOOP): cv.All(
{
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
cv.Optional(CONF_END_FRAME): cv.positive_int,
cv.Optional(CONF_REPEAT): cv.positive_int,
}
),
},
),
espImage.validate_settings,
CONFIG_SCHEMA = espImage.IMAGE_SCHEMA.extend(
{
cv.Required(CONF_ID): cv.declare_id(Animation_),
cv.Optional(CONF_LOOP): cv.All(
{
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
cv.Optional(CONF_END_FRAME): cv.positive_int,
cv.Optional(CONF_REPEAT): cv.positive_int,
}
),
},
)

View File

@@ -17,11 +17,7 @@ void Anova::setup() {
this->current_request_ = 0;
}
void Anova::loop() {
// Parent BLEClientNode has a loop() method, but this component uses
// polling via update() and BLE callbacks so loop isn't needed
this->disable_loop();
}
void Anova::loop() {}
void Anova::control(const ClimateCall &call) {
if (call.get_mode().has_value()) {

View File

@@ -26,6 +26,7 @@ class Anova : public climate::Climate, public esphome::ble_client::BLEClientNode
void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
esp_ble_gattc_cb_param_t *param) override;
void dump_config() override;
float get_setup_priority() const override { return setup_priority::DATA; }
climate::ClimateTraits traits() override {
auto traits = climate::ClimateTraits();
traits.set_supports_current_temperature(true);

View File

@@ -1,7 +1,7 @@
import esphome.codegen as cg
from esphome.components import ble_client, climate
import esphome.config_validation as cv
from esphome.const import CONF_UNIT_OF_MEASUREMENT
from esphome.const import CONF_ID, CONF_UNIT_OF_MEASUREMENT
UNITS = {
"f": "f",
@@ -17,9 +17,9 @@ Anova = anova_ns.class_(
)
CONFIG_SCHEMA = (
climate.climate_schema(Anova)
.extend(
climate.CLIMATE_SCHEMA.extend(
{
cv.GenerateID(): cv.declare_id(Anova),
cv.Required(CONF_UNIT_OF_MEASUREMENT): cv.enum(UNITS),
}
)
@@ -29,7 +29,8 @@ CONFIG_SCHEMA = (
async def to_code(config):
var = await climate.new_climate(config)
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)
await climate.register_climate(var, config)
await ble_client.register_ble_node(var, config)
cg.add(var.set_unit_of_measurement(config[CONF_UNIT_OF_MEASUREMENT]))

View File

@@ -54,6 +54,8 @@ enum { // APDS9306 registers
}
void APDS9306::setup() {
ESP_LOGCONFIG(TAG, "Setting up APDS9306...");
uint8_t id;
if (!this->read_byte(APDS9306_PART_ID, &id)) { // Part ID register
this->error_code_ = COMMUNICATION_FAILED;
@@ -84,6 +86,8 @@ void APDS9306::setup() {
// Set to active mode
APDS9306_WRITE_BYTE(APDS9306_MAIN_CTRL, 0x02);
ESP_LOGCONFIG(TAG, "APDS9306 setup complete");
}
void APDS9306::dump_config() {
@@ -93,7 +97,7 @@ void APDS9306::dump_config() {
if (this->is_failed()) {
switch (this->error_code_) {
case COMMUNICATION_FAILED:
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with APDS9306 failed!");
break;
case WRONG_ID:
ESP_LOGE(TAG, "APDS9306 has invalid id!");
@@ -104,12 +108,9 @@ void APDS9306::dump_config() {
}
}
ESP_LOGCONFIG(TAG,
" Gain: %u\n"
" Measurement rate: %u\n"
" Measurement Resolution/Bit width: %d",
AMBIENT_LIGHT_GAIN_VALUES[this->gain_], MEASUREMENT_RATE_VALUES[this->measurement_rate_],
MEASUREMENT_BIT_WIDTH_VALUES[this->bit_width_]);
ESP_LOGCONFIG(TAG, " Gain: %u", AMBIENT_LIGHT_GAIN_VALUES[this->gain_]);
ESP_LOGCONFIG(TAG, " Measurement rate: %u", MEASUREMENT_RATE_VALUES[this->measurement_rate_]);
ESP_LOGCONFIG(TAG, " Measurement Resolution/Bit width: %d", MEASUREMENT_BIT_WIDTH_VALUES[this->bit_width_]);
LOG_UPDATE_INTERVAL(this);
}

View File

@@ -15,6 +15,7 @@ static const char *const TAG = "apds9960";
#define APDS9960_WRITE_BYTE(reg, value) APDS9960_ERROR_CHECK(this->write_byte(reg, value));
void APDS9960::setup() {
ESP_LOGCONFIG(TAG, "Setting up APDS9960...");
uint8_t id;
if (!this->read_byte(0x92, &id)) { // ID register
this->error_code_ = COMMUNICATION_FAILED;
@@ -22,7 +23,7 @@ void APDS9960::setup() {
return;
}
if (id != 0xAB && id != 0x9C && id != 0xA8 && id != 0x9E) { // APDS9960 all should have one of these IDs
if (id != 0xAB && id != 0x9C && id != 0xA8) { // APDS9960 all should have one of these IDs
this->error_code_ = WRONG_ID;
this->mark_failed();
return;
@@ -140,7 +141,7 @@ void APDS9960::dump_config() {
if (this->is_failed()) {
switch (this->error_code_) {
case COMMUNICATION_FAILED:
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
ESP_LOGE(TAG, "Communication with APDS9960 failed!");
break;
case WRONG_ID:
ESP_LOGE(TAG, "APDS9960 has invalid id!");

View File

@@ -3,7 +3,6 @@ import base64
from esphome import automation
from esphome.automation import Condition
import esphome.codegen as cg
from esphome.config_helpers import get_logger_level
import esphome.config_validation as cv
from esphome.const import (
CONF_ACTION,
@@ -24,12 +23,11 @@ from esphome.const import (
CONF_TRIGGER_ID,
CONF_VARIABLES,
)
from esphome.core import CORE, CoroPriority, coroutine_with_priority
from esphome.core import coroutine_with_priority
DOMAIN = "api"
DEPENDENCIES = ["network"]
AUTO_LOAD = ["socket"]
CODEOWNERS = ["@esphome/core"]
CODEOWNERS = ["@OttoWinter"]
api_ns = cg.esphome_ns.namespace("api")
APIServer = api_ns.class_("APIServer", cg.Component, cg.Controller)
@@ -51,10 +49,6 @@ SERVICE_ARG_NATIVE_TYPES = {
"string[]": cg.std_vector.template(cg.std_string),
}
CONF_ENCRYPTION = "encryption"
CONF_BATCH_DELAY = "batch_delay"
CONF_CUSTOM_SERVICES = "custom_services"
CONF_HOMEASSISTANT_SERVICES = "homeassistant_services"
CONF_HOMEASSISTANT_STATES = "homeassistant_states"
def validate_encryption_key(value):
@@ -88,19 +82,6 @@ ACTIONS_SCHEMA = automation.validate_automation(
),
)
ENCRYPTION_SCHEMA = cv.Schema(
{
cv.Optional(CONF_KEY): validate_encryption_key,
}
)
def _encryption_schema(config):
if config is None:
config = {}
return ENCRYPTION_SCHEMA(config)
CONFIG_SCHEMA = cv.All(
cv.Schema(
{
@@ -114,14 +95,11 @@ CONFIG_SCHEMA = cv.All(
CONF_SERVICES, group_of_exclusion=CONF_ACTIONS
): ACTIONS_SCHEMA,
cv.Exclusive(CONF_ACTIONS, group_of_exclusion=CONF_ACTIONS): ACTIONS_SCHEMA,
cv.Optional(CONF_ENCRYPTION): _encryption_schema,
cv.Optional(CONF_BATCH_DELAY, default="100ms"): cv.All(
cv.positive_time_period_milliseconds,
cv.Range(max=cv.TimePeriod(milliseconds=65535)),
cv.Optional(CONF_ENCRYPTION): cv.Schema(
{
cv.Required(CONF_KEY): validate_encryption_key,
}
),
cv.Optional(CONF_CUSTOM_SERVICES, default=False): cv.boolean,
cv.Optional(CONF_HOMEASSISTANT_SERVICES, default=False): cv.boolean,
cv.Optional(CONF_HOMEASSISTANT_STATES, default=False): cv.boolean,
cv.Optional(CONF_ON_CLIENT_CONNECTED): automation.validate_automation(
single=True
),
@@ -134,47 +112,32 @@ CONFIG_SCHEMA = cv.All(
)
@coroutine_with_priority(CoroPriority.WEB)
@coroutine_with_priority(40.0)
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)
cg.add(var.set_port(config[CONF_PORT]))
if config[CONF_PASSWORD]:
cg.add_define("USE_API_PASSWORD")
cg.add(var.set_password(config[CONF_PASSWORD]))
cg.add(var.set_password(config[CONF_PASSWORD]))
cg.add(var.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
cg.add(var.set_batch_delay(config[CONF_BATCH_DELAY]))
# Set USE_API_SERVICES if any services are enabled
if config.get(CONF_ACTIONS) or config[CONF_CUSTOM_SERVICES]:
cg.add_define("USE_API_SERVICES")
if config[CONF_HOMEASSISTANT_SERVICES]:
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
if config[CONF_HOMEASSISTANT_STATES]:
cg.add_define("USE_API_HOMEASSISTANT_STATES")
if actions := config.get(CONF_ACTIONS, []):
for conf in actions:
template_args = []
func_args = []
service_arg_names = []
for name, var_ in conf[CONF_VARIABLES].items():
native = SERVICE_ARG_NATIVE_TYPES[var_]
template_args.append(native)
func_args.append((native, name))
service_arg_names.append(name)
templ = cg.TemplateArguments(*template_args)
trigger = cg.new_Pvariable(
conf[CONF_TRIGGER_ID], templ, conf[CONF_ACTION], service_arg_names
)
cg.add(var.register_user_service(trigger))
await automation.build_automation(trigger, func_args, conf)
for conf in config.get(CONF_ACTIONS, []):
template_args = []
func_args = []
service_arg_names = []
for name, var_ in conf[CONF_VARIABLES].items():
native = SERVICE_ARG_NATIVE_TYPES[var_]
template_args.append(native)
func_args.append((native, name))
service_arg_names.append(name)
templ = cg.TemplateArguments(*template_args)
trigger = cg.new_Pvariable(
conf[CONF_TRIGGER_ID], templ, conf[CONF_ACTION], service_arg_names
)
cg.add(var.register_user_service(trigger))
await automation.build_automation(trigger, func_args, conf)
if CONF_ON_CLIENT_CONNECTED in config:
cg.add_define("USE_API_CLIENT_CONNECTED_TRIGGER")
await automation.build_automation(
var.get_client_connected_trigger(),
[(cg.std_string, "client_info"), (cg.std_string, "client_address")],
@@ -182,26 +145,17 @@ async def to_code(config):
)
if CONF_ON_CLIENT_DISCONNECTED in config:
cg.add_define("USE_API_CLIENT_DISCONNECTED_TRIGGER")
await automation.build_automation(
var.get_client_disconnected_trigger(),
[(cg.std_string, "client_info"), (cg.std_string, "client_address")],
config[CONF_ON_CLIENT_DISCONNECTED],
)
if (encryption_config := config.get(CONF_ENCRYPTION, None)) is not None:
if key := encryption_config.get(CONF_KEY):
decoded = base64.b64decode(key)
cg.add(var.set_noise_psk(list(decoded)))
else:
# No key provided, but encryption desired
# This will allow a plaintext client to provide a noise key,
# send it to the device, and then switch to noise.
# The key will be saved in flash and used for future connections
# and plaintext disabled. Only a factory reset can remove it.
cg.add_define("USE_API_PLAINTEXT")
if encryption_config := config.get(CONF_ENCRYPTION):
decoded = base64.b64decode(encryption_config[CONF_KEY])
cg.add(var.set_noise_psk(list(decoded)))
cg.add_define("USE_API_NOISE")
cg.add_library("esphome/noise-c", "0.1.10")
cg.add_library("esphome/noise-c", "0.1.6")
else:
cg.add_define("USE_API_PLAINTEXT")
@@ -245,7 +199,6 @@ HOMEASSISTANT_ACTION_ACTION_SCHEMA = cv.All(
HOMEASSISTANT_ACTION_ACTION_SCHEMA,
)
async def homeassistant_service_to_code(config, action_id, template_arg, args):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, False)
templ = await cg.templatable(config[CONF_ACTION], args, None)
@@ -289,7 +242,6 @@ HOMEASSISTANT_EVENT_ACTION_SCHEMA = cv.Schema(
HOMEASSISTANT_EVENT_ACTION_SCHEMA,
)
async def homeassistant_event_to_code(config, action_id, template_arg, args):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, True)
templ = await cg.templatable(config[CONF_EVENT], args, None)
@@ -321,7 +273,6 @@ HOMEASSISTANT_TAG_SCANNED_ACTION_SCHEMA = cv.maybe_simple_value(
HOMEASSISTANT_TAG_SCANNED_ACTION_SCHEMA,
)
async def homeassistant_tag_scanned_to_code(config, action_id, template_arg, args):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, True)
cg.add(var.set_service("esphome.tag_scanned"))
@@ -333,38 +284,3 @@ async def homeassistant_tag_scanned_to_code(config, action_id, template_arg, arg
@automation.register_condition("api.connected", APIConnectedCondition, {})
async def api_connected_to_code(config, condition_id, template_arg, args):
return cg.new_Pvariable(condition_id, template_arg)
def FILTER_SOURCE_FILES() -> list[str]:
"""Filter out api_pb2_dump.cpp when proto message dumping is not enabled,
user_services.cpp when no services are defined, and protocol-specific
implementations based on encryption configuration."""
files_to_filter: list[str] = []
# api_pb2_dump.cpp is only needed when HAS_PROTO_MESSAGE_DUMP is defined
# This is a particularly large file that still needs to be opened and read
# all the way to the end even when ifdef'd out
#
# HAS_PROTO_MESSAGE_DUMP is defined when ESPHOME_LOG_HAS_VERY_VERBOSE is set,
# which happens when the logger level is VERY_VERBOSE
if get_logger_level() != "VERY_VERBOSE":
files_to_filter.append("api_pb2_dump.cpp")
# user_services.cpp is only needed when services are defined
config = CORE.config.get(DOMAIN, {})
if config and not config.get(CONF_ACTIONS) and not config[CONF_CUSTOM_SERVICES]:
files_to_filter.append("user_services.cpp")
# Filter protocol-specific implementations based on encryption configuration
encryption_config = config.get(CONF_ENCRYPTION) if config else None
# If encryption is not configured at all, we only need plaintext
if encryption_config is None:
files_to_filter.append("api_frame_helper_noise.cpp")
# If encryption is configured with a key, we only need noise
elif encryption_config.get(CONF_KEY):
files_to_filter.append("api_frame_helper_plaintext.cpp")
# If encryption is configured but no key is provided, we need both
# (this allows a plaintext client to provide a noise key)
return files_to_filter
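
The protocol-filtering branch of `FILTER_SOURCE_FILES` has three outcomes. A standalone mirror of just that branch, with a plain "key" string standing in for CONF_KEY:

```python
def protocol_files_to_filter(encryption_config: dict | None) -> list[str]:
    """Mirror of the encryption branch above: decide which frame helper to drop."""
    if encryption_config is None:
        return ["api_frame_helper_noise.cpp"]      # no encryption: plaintext only
    if encryption_config.get("key"):
        return ["api_frame_helper_plaintext.cpp"]  # key configured: noise only
    return []  # encryption without a key: keep both so a key can be provisioned

print(protocol_files_to_filter(None))
print(protocol_files_to_filter({"key": "base64-psk"}))
print(protocol_files_to_filter({}))
```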

File diff suppressed because it is too large.

Some files were not shown because too many files have changed in this diff.