Mirror of https://github.com/esphome/esphome.git (synced 2025-07-25 20:56:38 +00:00)

Compare commits: no commits in common — "dev" and "2025.4.0" have entirely different histories.
**Removed file** (222 lines) — the ESPHome AI Collaboration Guide:

# ESPHome AI Collaboration Guide

This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.

## 1. Project Overview & Purpose

* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.
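To make the YAML-first workflow concrete, here is a minimal, hypothetical device configuration of the kind ESPHome consumes (the board, network credentials, and pin are placeholders, not recommendations):

```yaml
esphome:
  name: livingroom-node

esp32:
  board: esp32dev

wifi:
  ssid: "MyNetwork"
  password: "MyPassword"

# Expose a GPIO pin as a switch controllable from the home automation system
switch:
  - platform: gpio
    pin: GPIO5
    name: "Living Room Lamp"
```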
## 2. Core Technologies & Stack

* **Languages:** Python (>=3.10), C++ (gnu++20)
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
    * **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
    * **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Manager(s):** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for the native API), MQTT, HTTP.
## 3. Architectural Patterns

* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.
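This pipeline can be sketched in miniature with plain Python. Everything below is a toy illustration — `generate_cpp` and the config keys are invented for this sketch and are not part of ESPHome's real code-generation API:

```python
def generate_cpp(config: dict) -> str:
    """Render a C++ instantiation snippet from a validated config dict.

    In real ESPHome, the dict comes from YAML parsed by PyYAML and
    validated by voluptuous; the C++ is emitted via cpp_generator.py.
    """
    cid = config["id"]
    lines = [
        f'auto *{cid} = new my_component::MyComponent();',
        f'{cid}->set_key("{config["key"]}");',
        f'{cid}->set_param({config["param"]});',
    ]
    return "\n".join(lines)


# A validated config (as if parsed from YAML) becomes a C++ snippet:
cpp = generate_cpp({"id": "my_comp", "key": "abc", "param": 42})
print(cpp)
```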
* **Directory Structure Philosophy:**
    * `/esphome`: Contains the core Python source code for the ESPHome application.
    * `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
    * `/tests`: Contains all unit and integration tests for the Python code.
    * `/docker`: Contains Docker-related files for building and running ESPHome in a container.
    * `/script`: Contains helper scripts for development and maintenance.

* **Core Architectural Components:**
    1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
    2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python-to-C++ code generation, template processing, and build flag management.
    3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
    4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
    5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.

* **Platform Support:**
    1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (S2, S3, C3, etc.) and both IDF and Arduino frameworks.
    2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
    3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
    4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.
## 4. Coding Conventions & Style Guide

* **Formatting:**
    * **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
    * **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.

* **Naming Conventions:**
    * **Python:** Follows PEP 8. Use clear, descriptive names in snake_case.
    * **C++:** Follows the Google C++ Style Guide.

* **Component Structure:**
    * **Standard Files:**
        ```
        components/[component_name]/
        ├── __init__.py        # Component configuration schema and code generation
        ├── [component].h      # C++ header file (if needed)
        ├── [component].cpp    # C++ implementation (if needed)
        └── [platform]/        # Platform-specific implementations
            ├── __init__.py    # Platform-specific configuration
            ├── [platform].h   # Platform C++ header
            └── [platform].cpp # Platform C++ implementation
        ```

* **Component Metadata:**
    - `DEPENDENCIES`: List of required components
    - `AUTO_LOAD`: Components to automatically load
    - `CONFLICTS_WITH`: Incompatible components
    - `CODEOWNERS`: GitHub usernames responsible for maintenance
    - `MULTI_CONF`: Whether multiple instances are allowed
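As a sketch, a component's `__init__.py` might declare this metadata like so (every value here is hypothetical, chosen only to show the shape of each constant):

```python
# Hypothetical metadata for an imaginary component in
# esphome/components/my_component/__init__.py

DEPENDENCIES = ["i2c"]                    # this component requires the i2c component
AUTO_LOAD = ["sensor"]                    # pull in the sensor component automatically
CONFLICTS_WITH = ["my_legacy_component"]  # cannot coexist with this component
CODEOWNERS = ["@example-user"]            # GitHub users responsible for maintenance
MULTI_CONF = True                         # allow several instances in one config
```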
* **Code Generation & Common Patterns:**
    * **Configuration Schema Pattern:**
        ```python
        import esphome.codegen as cg
        import esphome.config_validation as cv
        from esphome.const import CONF_ID, CONF_KEY

        CONF_PARAM = "param"  # A constant that does not yet exist in esphome/const.py

        my_component_ns = cg.esphome_ns.namespace("my_component")
        MyComponent = my_component_ns.class_("MyComponent", cg.Component)

        CONFIG_SCHEMA = cv.Schema({
            cv.GenerateID(): cv.declare_id(MyComponent),
            cv.Required(CONF_KEY): cv.string,
            cv.Optional(CONF_PARAM, default=42): cv.int_,
        }).extend(cv.COMPONENT_SCHEMA)

        async def to_code(config):
            var = cg.new_Pvariable(config[CONF_ID])
            await cg.register_component(var, config)
            cg.add(var.set_key(config[CONF_KEY]))
            cg.add(var.set_param(config[CONF_PARAM]))
        ```
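For context, a user would enable the component sketched above with YAML along these lines (the `my_component` key and its options mirror the sketch — this is not a real published component):

```yaml
my_component:
  key: "some value"
  param: 7
```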
* **C++ Class Pattern:**
    ```cpp
    namespace esphome {
    namespace my_component {

    class MyComponent : public Component {
     public:
      void setup() override;
      void loop() override;
      void dump_config() override;

      void set_key(const std::string &key) { this->key_ = key; }
      void set_param(int param) { this->param_ = param; }

     protected:
      std::string key_;
      int param_{0};
    };

    }  // namespace my_component
    }  // namespace esphome
    ```
* **Common Component Examples:**
    - **Sensor:**
        ```python
        from esphome.components import sensor

        CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(
            cv.polling_component_schema("60s")
        )

        async def to_code(config):
            var = await sensor.new_sensor(config)
            await cg.register_component(var, config)
        ```

    - **Binary Sensor:**
        ```python
        from esphome.components import binary_sensor

        CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })

        async def to_code(config):
            var = await binary_sensor.new_binary_sensor(config)
        ```

    - **Switch:**
        ```python
        from esphome.components import switch

        CONFIG_SCHEMA = switch.switch_schema().extend({ ... })

        async def to_code(config):
            var = await switch.new_switch(config)
        ```
* **Configuration Validation:**
    * **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
    * **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
    * **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `cv.only_with_arduino`.
    * **Schema Extensions:**
        ```python
        CONFIG_SCHEMA = (
            cv.Schema({ ... })
            .extend(cv.COMPONENT_SCHEMA)
            .extend(uart.UART_DEVICE_SCHEMA)
            .extend(i2c.i2c_device_schema(0x48))
            .extend(spi.spi_device_schema(cs_pin_required=True))
        )
        ```
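For intuition, a validator such as `cv.int_range` behaves roughly like the plain-Python sketch below. ESPHome's real validators are built on voluptuous; this is only an illustration of the call-it-to-validate pattern, not the actual implementation:

```python
def int_range(min=None, max=None):
    """Return a validator that accepts integers within [min, max]."""
    def validator(value):
        if not isinstance(value, int) or isinstance(value, bool):
            raise ValueError(f"expected an integer, got {value!r}")
        if min is not None and value < min:
            raise ValueError(f"{value} is smaller than the minimum {min}")
        if max is not None and value > max:
            raise ValueError(f"{value} is larger than the maximum {max}")
        return value  # validators return the validated value
    return validator


percentage = int_range(min=0, max=100)
print(percentage(42))  # prints 42; out-of-range values raise ValueError
```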
## 5. Key Files & Entrypoints

* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
    * `pyproject.toml`: Defines the Python project metadata and dependencies.
    * `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
    * `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.
## 6. Development & Testing Workflow

* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
    * **Python:** Run unit tests with `pytest`.
    * **C++:** Use `clang-tidy` for static analysis.
    * **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:
        ```
        tests/
        ├── test_build_components/   # Base test configurations
        └── components/[component]/  # Component-specific tests
        ```
        Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
* **Debugging and Troubleshooting:**
    * **Debug Tools:**
        - `esphome config <file>.yaml` to validate configuration.
        - `esphome compile <file>.yaml` to compile without uploading.
        - Check the Dashboard for real-time logs.
        - Use component-specific debug logging.
    * **Common Issues:**
        - **Import Errors:** Check component dependencies and `PYTHONPATH`.
        - **Validation Errors:** Review configuration schema definitions.
        - **Build Errors:** Check platform compatibility and library versions.
        - **Runtime Errors:** Review generated C++ code and component logic.
## 7. Specific Instructions for AI Collaboration

* **Contribution Workflow (Pull Request Process):**
    1. **Fork & Branch:** Create a new branch in your fork.
    2. **Make Changes:** Adhere to all coding conventions and patterns.
    3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
    4. **Lint:** Run `pre-commit` to ensure the code is compliant.
    5. **Commit:** Commit your changes. There is no strict format for commit messages.
    6. **Pull Request:** Submit a PR against the `dev` branch. The pull request title should be prefixed with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation and examples, and add `CODEOWNERS` entries as needed. Pull requests should always be made with the `PULL_REQUEST_TEMPLATE.md` template filled out correctly.

* **Documentation Contributions:**
    * Documentation is hosted in the separate `esphome/esphome-docs` repository.
    * The contribution workflow is the same as for the codebase.

* **Best Practices:**
    * **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
    * **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
    * **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.

* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.

* **Dependencies & Build System Integration:**
    * **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and `pyproject.toml`.
    * **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
    * **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags.
**Removed file** (1 line) — a SHA-256 checksum:

```
32b0db73b3ae01ba18c9cbb1dabbd8156bc14dded500471919bd0a3dc33916e0
```
**Changed file** (coverage configuration):

```diff
@@ -1,4 +1,2 @@
 [run]
-omit =
-    esphome/components/*
-    tests/integration/*
+omit = esphome/components/*
```
**Removed file** (a Dockerfile, 37 lines):

```dockerfile
ARG BUILD_BASE_VERSION=2025.04.0


FROM ghcr.io/esphome/docker-base:debian-${BUILD_BASE_VERSION} AS base

RUN git config --system --add safe.directory "*"

RUN apt update \
    && apt install -y \
    protobuf-compiler

RUN pip install uv

RUN useradd esphome -m

USER esphome
ENV VIRTUAL_ENV=/home/esphome/.local/esphome-venv
RUN uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# Override this set to true in the docker-base image
ENV UV_SYSTEM_PYTHON=false

WORKDIR /tmp

COPY requirements.txt ./
RUN uv pip install -r requirements.txt
COPY requirements_dev.txt requirements_test.txt ./
RUN uv pip install -r requirements_dev.txt -r requirements_test.txt

RUN \
    platformio settings set enable_telemetry No \
    && platformio settings set check_platformio_interval 1000000

COPY script/platformio_install_deps.py platformio.ini ./
RUN ./platformio_install_deps.py platformio.ini --libraries --platforms --tools

WORKDIR /workspaces
```
**Changed file** (devcontainer configuration):

```diff
@@ -1,17 +1,18 @@
 {
   "name": "ESPHome Dev",
-  "context": "..",
-  "dockerFile": "Dockerfile",
+  "image": "ghcr.io/esphome/esphome-lint:dev",
   "postCreateCommand": [
     "script/devcontainer-post-create"
   ],
-  "features": {
-    "ghcr.io/devcontainers/features/github-cli:1": {}
+  "containerEnv": {
+    "DEVCONTAINER": "1",
+    "PIP_BREAK_SYSTEM_PACKAGES": "1",
+    "PIP_ROOT_USER_ACTION": "ignore"
   },
   "runArgs": [
     "--privileged",
     "-e",
-    "GIT_EDITOR=code --wait"
+    "ESPHOME_DASHBOARD_USE_PING=1"
     // uncomment and edit the path in order to pass through local USB serial to the container
     // , "--device=/dev/ttyACM0"
   ],
```
**Changed file** (Docker ignore list):

```diff
@@ -114,5 +114,4 @@ config/
 examples/
 Dockerfile
 .git/
-tests/
-.*
+tests/build/
```
**.github/ISSUE_TEMPLATE/bug_report.yml** (removed, 92 lines):

```yaml
name: Report an issue with ESPHome
description: Report an issue with ESPHome.
body:
  - type: markdown
    attributes:
      value: |
        This issue form is for reporting bugs only!

        If you have a feature request or enhancement, please [request them here instead][fr].

        [fr]: https://github.com/orgs/esphome/discussions
  - type: textarea
    validations:
      required: true
    id: problem
    attributes:
      label: The problem
      description: >-
        Describe the issue you are experiencing here to communicate to the
        maintainers. Tell us what you were trying to do and what happened.

        Provide a clear and concise description of what the problem is.

  - type: markdown
    attributes:
      value: |
        ## Environment
  - type: input
    id: version
    validations:
      required: true
    attributes:
      label: Which version of ESPHome has the issue?
      description: >
        ESPHome version like 1.19, 2025.6.0 or 2025.XX.X-dev.
  - type: dropdown
    validations:
      required: true
    id: installation
    attributes:
      label: What type of installation are you using?
      options:
        - Home Assistant Add-on
        - Docker
        - pip
  - type: dropdown
    validations:
      required: true
    id: platform
    attributes:
      label: What platform are you using?
      options:
        - ESP8266
        - ESP32
        - RP2040
        - BK72XX
        - RTL87XX
        - LN882X
        - Host
        - Other
  - type: input
    id: component_name
    attributes:
      label: Component causing the issue
      description: >
        The name of the component or platform. For example, api/i2c or ultrasonic.

  - type: markdown
    attributes:
      value: |
        # Details
  - type: textarea
    id: config
    attributes:
      label: YAML Config
      description: |
        Include a complete YAML configuration file demonstrating the problem here. Preferably post the *entire* file - don't make assumptions about what is unimportant. However, if it's a large or complicated config then you will need to reduce it to the smallest possible file *that still demonstrates the problem*. If you don't provide enough information to *easily* reproduce the problem, it's unlikely your bug report will get any attention. Logs do not belong here, attach them below.
      render: yaml
  - type: textarea
    id: logs
    attributes:
      label: Anything in the logs that might be useful for us?
      description: For example, error message, or stack traces. Serial or USB logs are much more useful than WiFi logs.
      render: txt
  - type: textarea
    id: additional
    attributes:
      label: Additional information
      description: >
        If you have any additional information for us, use the field below.
        Please note, you can attach screenshots or screen recordings here, by
        dragging and dropping files in the field below.
```
**.github/ISSUE_TEMPLATE/config.yml** (26 changes):

```diff
@@ -1,21 +1,15 @@
 ---
 blank_issues_enabled: false
 contact_links:
-  - name: Report an issue with the ESPHome documentation
-    url: https://github.com/esphome/esphome-docs/issues/new/choose
-    about: Report an issue with the ESPHome documentation.
-  - name: Report an issue with the ESPHome web server
-    url: https://github.com/esphome/esphome-webserver/issues/new/choose
-    about: Report an issue with the ESPHome web server.
-  - name: Report an issue with the ESPHome Builder / Dashboard
-    url: https://github.com/esphome/dashboard/issues/new/choose
-    about: Report an issue with the ESPHome Builder / Dashboard.
-  - name: Report an issue with the ESPHome API client
-    url: https://github.com/esphome/aioesphomeapi/issues/new/choose
-    about: Report an issue with the ESPHome API client.
-  - name: Make a Feature Request
-    url: https://github.com/orgs/esphome/discussions
-    about: Please create feature requests in the dedicated feature request tracker.
+  - name: Issue Tracker
+    url: https://github.com/esphome/issues
+    about: Please create bug reports in the dedicated issue tracker.
+  - name: Feature Request Tracker
+    url: https://github.com/esphome/feature-requests
+    about: |
+      Please create feature requests in the dedicated feature request tracker.
   - name: Frequently Asked Question
     url: https://esphome.io/guides/faq.html
-    about: Please view the FAQ for common questions and what to include in a bug report.
+    about: |
+      Please view the FAQ for common questions and what
+      to include in a bug report.
```
**.github/PULL_REQUEST_TEMPLATE.md** (1 change):

```diff
@@ -26,7 +26,6 @@
 - [ ] RP2040
 - [ ] BK72xx
 - [ ] RTL87xx
-- [ ] nRF52840

 ## Example entry for `config.yaml`:

```
**.github/actions/build-image/action.yaml** (33 changes):

```diff
@@ -1,11 +1,15 @@
 name: Build Image
 inputs:
+  platform:
+    description: "Platform to build for"
+    required: true
+    example: "linux/amd64"
   target:
     description: "Target to build"
     required: true
     example: "docker"
-  build_type:
-    description: "Build type"
+  baseimg:
+    description: "Base image type"
     required: true
     example: "docker"
   suffix:
@@ -15,11 +19,6 @@ inputs:
     description: "Version to build"
     required: true
     example: "2023.12.0"
-  base_os:
-    description: "Base OS to use"
-    required: false
-    default: "debian"
-    example: "debian"
 runs:
   using: "composite"
   steps:
@@ -47,52 +46,52 @@ runs:

     - name: Build and push to ghcr by digest
       id: build-ghcr
-      uses: docker/build-push-action@v6.18.0
+      uses: docker/build-push-action@v6.15.0
       env:
         DOCKER_BUILD_SUMMARY: false
         DOCKER_BUILD_RECORD_UPLOAD: false
       with:
         context: .
         file: ./docker/Dockerfile
+        platforms: ${{ inputs.platform }}
         target: ${{ inputs.target }}
         cache-from: type=gha
         cache-to: ${{ steps.cache-to.outputs.value }}
         build-args: |
-          BUILD_TYPE=${{ inputs.build_type }}
+          BASEIMGTYPE=${{ inputs.baseimg }}
           BUILD_VERSION=${{ inputs.version }}
-          BUILD_OS=${{ inputs.base_os }}
         outputs: |
           type=image,name=ghcr.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true

     - name: Export ghcr digests
       shell: bash
       run: |
-        mkdir -p /tmp/digests/${{ inputs.build_type }}/ghcr
+        mkdir -p /tmp/digests/${{ inputs.target }}/ghcr
         digest="${{ steps.build-ghcr.outputs.digest }}"
-        touch "/tmp/digests/${{ inputs.build_type }}/ghcr/${digest#sha256:}"
+        touch "/tmp/digests/${{ inputs.target }}/ghcr/${digest#sha256:}"

     - name: Build and push to dockerhub by digest
       id: build-dockerhub
-      uses: docker/build-push-action@v6.18.0
+      uses: docker/build-push-action@v6.15.0
       env:
         DOCKER_BUILD_SUMMARY: false
         DOCKER_BUILD_RECORD_UPLOAD: false
       with:
         context: .
         file: ./docker/Dockerfile
+        platforms: ${{ inputs.platform }}
         target: ${{ inputs.target }}
         cache-from: type=gha
         cache-to: ${{ steps.cache-to.outputs.value }}
         build-args: |
-          BUILD_TYPE=${{ inputs.build_type }}
+          BASEIMGTYPE=${{ inputs.baseimg }}
           BUILD_VERSION=${{ inputs.version }}
-          BUILD_OS=${{ inputs.base_os }}
         outputs: |
           type=image,name=docker.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true

     - name: Export dockerhub digests
       shell: bash
       run: |
-        mkdir -p /tmp/digests/${{ inputs.build_type }}/dockerhub
+        mkdir -p /tmp/digests/${{ inputs.target }}/dockerhub
         digest="${{ steps.build-dockerhub.outputs.digest }}"
-        touch "/tmp/digests/${{ inputs.build_type }}/dockerhub/${digest#sha256:}"
+        touch "/tmp/digests/${{ inputs.target }}/dockerhub/${digest#sha256:}"
```
**.github/actions/restore-python/action.yml** (8 changes):

```diff
@@ -17,7 +17,7 @@ runs:
   steps:
     - name: Set up Python ${{ inputs.python-version }}
       id: python
-      uses: actions/setup-python@v5.6.0
+      uses: actions/setup-python@v5.5.0
       with:
         python-version: ${{ inputs.python-version }}
     - name: Restore Python virtual environment
@@ -34,14 +34,14 @@ runs:
         python -m venv venv
         source venv/bin/activate
         python --version
-        pip install -r requirements.txt -r requirements_test.txt
+        pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
         pip install -e .
     - name: Create Python virtual environment
       if: steps.cache-venv.outputs.cache-hit != 'true' && runner.os == 'Windows'
       shell: bash
       run: |
         python -m venv venv
-        source ./venv/Scripts/activate
+        ./venv/Scripts/activate
         python --version
-        pip install -r requirements.txt -r requirements_test.txt
+        pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
         pip install -e .
```
**.github/copilot-instructions.md** (removed, 1 line) — a relative link to the shared AI instructions file:

```
../.ai/instructions.md
```
**.github/dependabot.yml** (10 changes):

```diff
@@ -9,9 +9,6 @@ updates:
       # Hypothesis is only used for testing and is updated quite often
       - dependency-name: hypothesis
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/"
     schedule:
       interval: daily
@@ -20,20 +17,15 @@ updates:
       docker-actions:
         applies-to: version-updates
         patterns:
+          - "docker/setup-qemu-action"
          - "docker/login-action"
          - "docker/setup-buildx-action"
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/build-image"
     schedule:
       interval: daily
     open-pull-requests-limit: 10
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/restore-python"
     schedule:
       interval: daily
```
620
.github/workflows/auto-label-pr.yml
vendored
620
.github/workflows/auto-label-pr.yml
vendored
@ -1,620 +0,0 @@
name: Auto Label PR

on:
  # Runs only on pull_request_target due to having access to a App token.
  # This means PRs from forks will not be able to alter this workflow to get the tokens
  pull_request_target:
    types: [labeled, opened, reopened, synchronize, edited]

permissions:
  pull-requests: write
  contents: read

env:
  SMALL_PR_THRESHOLD: 30
  MAX_LABELS: 15
  TOO_BIG_THRESHOLD: 1000
  COMPONENT_LABEL_THRESHOLD: 10

jobs:
  label:
    runs-on: ubuntu-latest
    if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
    steps:
      - name: Checkout
        uses: actions/checkout@v4.2.2

      - name: Generate a token
        id: generate-token
        uses: actions/create-github-app-token@v2
        with:
          app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
          private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}

      - name: Auto Label PR
        uses: actions/github-script@v7.0.1
        with:
          github-token: ${{ steps.generate-token.outputs.token }}
          script: |
            const fs = require('fs');

            // Constants
            const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
            const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
            const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
            const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
            const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
            const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
            const TOO_BIG_MARKER = '<!-- too-big-request -->';

            const MANAGED_LABELS = [
              'new-component',
              'new-platform',
              'new-target-platform',
              'merging-to-release',
              'merging-to-beta',
              'core',
              'small-pr',
              'dashboard',
              'github-actions',
              'by-code-owner',
              'has-tests',
              'needs-tests',
              'needs-docs',
              'needs-codeowners',
              'too-big',
              'labeller-recheck'
            ];

            const DOCS_PR_PATTERNS = [
              /https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
              /esphome\/esphome-docs#\d+/
            ];

            // Global state
            const { owner, repo } = context.repo;
            const pr_number = context.issue.number;

            // Get current labels and PR data
            const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
              owner,
              repo,
              issue_number: pr_number
            });
            const currentLabels = currentLabelsData.map(label => label.name);
            const managedLabels = currentLabels.filter(label =>
              label.startsWith('component: ') || MANAGED_LABELS.includes(label)
            );

            // Check for mega-PR early - if present, skip most automatic labeling
            const isMegaPR = currentLabels.includes('mega-pr');

            // Get all PR files with automatic pagination
            const prFiles = await github.paginate(
              github.rest.pulls.listFiles,
              {
                owner,
                repo,
                pull_number: pr_number
              }
            );

            // Calculate data from PR files
            const changedFiles = prFiles.map(file => file.filename);
            const totalChanges = prFiles.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);

            console.log('Current labels:', currentLabels.join(', '));
            console.log('Changed files:', changedFiles.length);
            console.log('Total changes:', totalChanges);
            if (isMegaPR) {
              console.log('Mega-PR detected - applying limited labeling logic');
            }

            // Fetch API data
            async function fetchApiData() {
              try {
                const response = await fetch('https://data.esphome.io/components.json');
                const componentsData = await response.json();
                return {
                  targetPlatforms: componentsData.target_platforms || [],
                  platformComponents: componentsData.platform_components || []
                };
              } catch (error) {
                console.log('Failed to fetch components data from API:', error.message);
                return { targetPlatforms: [], platformComponents: [] };
              }
            }

            // Strategy: Merge branch detection
            async function detectMergeBranch() {
              const labels = new Set();
              const baseRef = context.payload.pull_request.base.ref;

              if (baseRef === 'release') {
                labels.add('merging-to-release');
              } else if (baseRef === 'beta') {
                labels.add('merging-to-beta');
              }

              return labels;
            }

            // Strategy: Component and platform labeling
            async function detectComponentPlatforms(apiData) {
              const labels = new Set();
              const componentRegex = /^esphome\/components\/([^\/]+)\//;
              const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);

              for (const file of changedFiles) {
                const componentMatch = file.match(componentRegex);
                if (componentMatch) {
                  labels.add(`component: ${componentMatch[1]}`);
                }

                const platformMatch = file.match(targetPlatformRegex);
                if (platformMatch) {
                  labels.add(`platform: ${platformMatch[1]}`);
                }
              }

              return labels;
            }

            // Strategy: New component detection
            async function detectNewComponents() {
              const labels = new Set();
              const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);

              for (const file of addedFiles) {
                const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
                if (componentMatch) {
                  try {
                    const content = fs.readFileSync(file, 'utf8');
                    if (content.includes('IS_TARGET_PLATFORM = True')) {
                      labels.add('new-target-platform');
                    }
                  } catch (error) {
                    console.log(`Failed to read content of ${file}:`, error.message);
                  }
                  labels.add('new-component');
                }
              }

              return labels;
            }

            // Strategy: New platform detection
            async function detectNewPlatforms(apiData) {
              const labels = new Set();
              const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);

              for (const file of addedFiles) {
                const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
                if (platformFileMatch) {
                  const [, component, platform] = platformFileMatch;
                  if (apiData.platformComponents.includes(platform)) {
                    labels.add('new-platform');
                  }
                }

                const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
                if (platformDirMatch) {
                  const [, component, platform] = platformDirMatch;
                  if (apiData.platformComponents.includes(platform)) {
                    labels.add('new-platform');
                  }
                }
              }

              return labels;
            }

            // Strategy: Core files detection
            async function detectCoreChanges() {
              const labels = new Set();
              const coreFiles = changedFiles.filter(file =>
                file.startsWith('esphome/core/') ||
                (file.startsWith('esphome/') && file.split('/').length === 2)
              );

              if (coreFiles.length > 0) {
                labels.add('core');
              }

              return labels;
            }

            // Strategy: PR size detection
            async function detectPRSize() {
              const labels = new Set();
              const testChanges = prFiles
                .filter(file => file.filename.startsWith('tests/'))
                .reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);

              const nonTestChanges = totalChanges - testChanges;

              if (totalChanges <= SMALL_PR_THRESHOLD) {
                labels.add('small-pr');
              }

              // Don't add too-big if mega-pr label is already present
              if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
                labels.add('too-big');
              }

              return labels;
            }

            // Strategy: Dashboard changes
            async function detectDashboardChanges() {
              const labels = new Set();
              const dashboardFiles = changedFiles.filter(file =>
                file.startsWith('esphome/dashboard/') ||
                file.startsWith('esphome/components/dashboard_import/')
              );

              if (dashboardFiles.length > 0) {
                labels.add('dashboard');
              }

              return labels;
            }

            // Strategy: GitHub Actions changes
            async function detectGitHubActionsChanges() {
              const labels = new Set();
              const githubActionsFiles = changedFiles.filter(file =>
                file.startsWith('.github/workflows/')
              );

              if (githubActionsFiles.length > 0) {
                labels.add('github-actions');
              }

              return labels;
            }

            // Strategy: Code owner detection
            async function detectCodeOwner() {
              const labels = new Set();

              try {
                const { data: codeownersFile } = await github.rest.repos.getContent({
                  owner,
                  repo,
                  path: 'CODEOWNERS',
                });

                const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
                const prAuthor = context.payload.pull_request.user.login;

                const codeownersLines = codeownersContent.split('\n')
                  .map(line => line.trim())
                  .filter(line => line && !line.startsWith('#'));

                const codeownersRegexes = codeownersLines.map(line => {
                  const parts = line.split(/\s+/);
                  const pattern = parts[0];
                  const owners = parts.slice(1);

                  let regex;
                  if (pattern.endsWith('*')) {
                    const dir = pattern.slice(0, -1);
                    regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
                  } else if (pattern.includes('*')) {
                    // First escape all regex special chars except *, then replace * with .*
                    const regexPattern = pattern
                      .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
                      .replace(/\*/g, '.*');
                    regex = new RegExp(`^${regexPattern}$`);
                  } else {
                    regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
                  }

                  return { regex, owners };
                });

                for (const file of changedFiles) {
                  for (const { regex, owners } of codeownersRegexes) {
                    if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
                      labels.add('by-code-owner');
                      return labels;
                    }
                  }
                }
              } catch (error) {
                console.log('Failed to read or parse CODEOWNERS file:', error.message);
              }

              return labels;
            }

            // Strategy: Test detection
            async function detectTests() {
              const labels = new Set();
              const testFiles = changedFiles.filter(file => file.startsWith('tests/'));

              if (testFiles.length > 0) {
                labels.add('has-tests');
              }

              return labels;
            }

            // Strategy: Requirements detection
            async function detectRequirements(allLabels) {
              const labels = new Set();

              // Check for missing tests
              if ((allLabels.has('new-component') || allLabels.has('new-platform')) && !allLabels.has('has-tests')) {
                labels.add('needs-tests');
              }

              // Check for missing docs
              if (allLabels.has('new-component') || allLabels.has('new-platform')) {
                const prBody = context.payload.pull_request.body || '';
                const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));

                if (!hasDocsLink) {
                  labels.add('needs-docs');
                }
              }

              // Check for missing CODEOWNERS
              if (allLabels.has('new-component')) {
                const codeownersModified = prFiles.some(file =>
                  file.filename === 'CODEOWNERS' &&
                  (file.status === 'modified' || file.status === 'added') &&
                  (file.additions || 0) > 0
                );

                if (!codeownersModified) {
                  labels.add('needs-codeowners');
                }
              }

              return labels;
            }

            // Generate review messages
            function generateReviewMessages(finalLabels) {
              const messages = [];
              const prAuthor = context.payload.pull_request.user.login;

              // Too big message
              if (finalLabels.includes('too-big')) {
                const testChanges = prFiles
                  .filter(file => file.filename.startsWith('tests/'))
                  .reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
                const nonTestChanges = totalChanges - testChanges;

                const tooManyLabels = finalLabels.length > MAX_LABELS;
                const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;

                let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;

                if (tooManyLabels && tooManyChanges) {
                  message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
                } else if (tooManyLabels) {
                  message += `This PR affects ${finalLabels.length} different components/areas.`;
                } else {
                  message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
                }

                message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
                message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;

                messages.push(message);
              }

              // CODEOWNERS message
              if (finalLabels.includes('needs-codeowners')) {
                const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
                  `Hey there @${prAuthor},\n` +
                  `Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
                  `This way we can notify you if a bug report for this integration is reported.\n\n` +
                  `In \`__init__.py\` of the integration, please add:\n\n` +
                  `\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
                  `And run \`script/build_codeowners.py\``;

                messages.push(message);
              }

              return messages;
            }

            // Handle reviews
            async function handleReviews(finalLabels) {
              const reviewMessages = generateReviewMessages(finalLabels);
              const hasReviewableLabels = finalLabels.some(label =>
                ['too-big', 'needs-codeowners'].includes(label)
              );

              const { data: reviews } = await github.rest.pulls.listReviews({
                owner,
                repo,
                pull_number: pr_number
              });

              const botReviews = reviews.filter(review =>
                review.user.type === 'Bot' &&
                review.state === 'CHANGES_REQUESTED' &&
                review.body && review.body.includes(BOT_COMMENT_MARKER)
              );

              if (hasReviewableLabels) {
                const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;

                if (botReviews.length > 0) {
                  // Update existing review
                  await github.rest.pulls.updateReview({
                    owner,
                    repo,
                    pull_number: pr_number,
                    review_id: botReviews[0].id,
                    body: reviewBody
                  });
                  console.log('Updated existing bot review');
                } else {
                  // Create new review
                  await github.rest.pulls.createReview({
                    owner,
                    repo,
                    pull_number: pr_number,
                    body: reviewBody,
                    event: 'REQUEST_CHANGES'
                  });
                  console.log('Created new bot review');
                }
              } else if (botReviews.length > 0) {
                // Dismiss existing reviews
                for (const review of botReviews) {
                  try {
                    await github.rest.pulls.dismissReview({
                      owner,
                      repo,
                      pull_number: pr_number,
                      review_id: review.id,
                      message: 'Review dismissed: All requirements have been met'
                    });
                    console.log(`Dismissed bot review ${review.id}`);
                  } catch (error) {
                    console.log(`Failed to dismiss review ${review.id}:`, error.message);
                  }
                }
              }
            }

            // Main execution
            const apiData = await fetchApiData();
            const baseRef = context.payload.pull_request.base.ref;

            // Early exit for non-dev branches
            if (baseRef !== 'dev') {
              const branchLabels = await detectMergeBranch();
              const finalLabels = Array.from(branchLabels);

              console.log('Computed labels (merge branch only):', finalLabels.join(', '));

              // Apply labels
              if (finalLabels.length > 0) {
                await github.rest.issues.addLabels({
                  owner,
                  repo,
                  issue_number: pr_number,
                  labels: finalLabels
                });
              }

              // Remove old managed labels
              const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
              for (const label of labelsToRemove) {
                try {
                  await github.rest.issues.removeLabel({
                    owner,
                    repo,
                    issue_number: pr_number,
                    name: label
                  });
                } catch (error) {
                  console.log(`Failed to remove label ${label}:`, error.message);
                }
              }

              return;
            }

            // Run all strategies
            const [
              branchLabels,
              componentLabels,
              newComponentLabels,
              newPlatformLabels,
              coreLabels,
              sizeLabels,
              dashboardLabels,
              actionsLabels,
              codeOwnerLabels,
              testLabels
            ] = await Promise.all([
              detectMergeBranch(),
              detectComponentPlatforms(apiData),
              detectNewComponents(),
              detectNewPlatforms(apiData),
              detectCoreChanges(),
              detectPRSize(),
              detectDashboardChanges(),
              detectGitHubActionsChanges(),
              detectCodeOwner(),
              detectTests()
            ]);

            // Combine all labels
            const allLabels = new Set([
              ...branchLabels,
              ...componentLabels,
              ...newComponentLabels,
              ...newPlatformLabels,
              ...coreLabels,
              ...sizeLabels,
              ...dashboardLabels,
              ...actionsLabels,
              ...codeOwnerLabels,
              ...testLabels
            ]);

            // Detect requirements based on all other labels
            const requirementLabels = await detectRequirements(allLabels);
            for (const label of requirementLabels) {
              allLabels.add(label);
            }

            let finalLabels = Array.from(allLabels);

            // For mega-PRs, exclude component labels if there are too many
            if (isMegaPR) {
              const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
              if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
                finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
                console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
              }
            }

            // Handle too many labels (only for non-mega PRs)
            const tooManyLabels = finalLabels.length > MAX_LABELS;

            if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
              finalLabels = ['too-big'];
            }

            console.log('Computed labels:', finalLabels.join(', '));

            // Handle reviews
            await handleReviews(finalLabels);

            // Apply labels
            if (finalLabels.length > 0) {
              console.log(`Adding labels: ${finalLabels.join(', ')}`);
              await github.rest.issues.addLabels({
                owner,
                repo,
                issue_number: pr_number,
                labels: finalLabels
              });
            }

            // Remove old managed labels
            const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
            for (const label of labelsToRemove) {
              console.log(`Removing label: ${label}`);
              try {
                await github.rest.issues.removeLabel({
                  owner,
                  repo,
                  issue_number: pr_number,
                  name: label
                });
              } catch (error) {
                console.log(`Failed to remove label ${label}:`, error.message);
              }
            }
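The CODEOWNERS matching in the workflow script above converts each glob pattern into a RegExp before testing changed file paths. As a standalone sketch of that conversion logic (the function name `codeownersRegex` is introduced here for illustration and is not part of the workflow), the three pattern cases can be exercised directly:

```javascript
// Sketch of the glob-to-RegExp conversion used by the detectCodeOwner
// strategy: a trailing '*' means prefix match, an embedded '*' becomes
// '.*', and anything else is an exact (fully escaped) match.
function codeownersRegex(pattern) {
  if (pattern.endsWith('*')) {
    // "dir/*" -> anchored prefix match on the escaped directory
    const dir = pattern.slice(0, -1);
    return new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
  } else if (pattern.includes('*')) {
    // Escape regex specials except '*', then turn each '*' into '.*'
    const regexPattern = pattern
      .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
      .replace(/\*/g, '.*');
    return new RegExp(`^${regexPattern}$`);
  }
  // Literal path: exact match on the escaped pattern
  return new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
}

console.log(codeownersRegex('esphome/components/wifi/*')
  .test('esphome/components/wifi/wifi_component.cpp')); // prints true
```

Note that, as in the workflow, the embedded-`*` case lets `.*` cross directory separators, so `*.py` matches files at any depth.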

15  .github/workflows/ci-api-proto.yml  vendored
@@ -21,9 +21,9 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
           python-version: "3.11"

@@ -57,17 +57,6 @@ jobs:
               event: 'REQUEST_CHANGES',
               body: 'You have altered the generated proto files but they do not match what is expected.\nPlease run "script/api_protobuf/api_protobuf.py" and commit the changes.'
             })
-      - if: failure()
-        name: Show changes
-        run: git diff
-      - if: failure()
-        name: Archive artifacts
-        uses: actions/upload-artifact@v4.6.2
-        with:
-          name: generated-proto-files
-          path: |
-            esphome/components/api/api_pb2.*
-            esphome/components/api/api_pb2_service.*
       - if: success()
         name: Dismiss review
         uses: actions/github-script@v7.0.1

75  .github/workflows/ci-clang-tidy-hash.yml  vendored
@@ -1,75 +0,0 @@
name: Clang-tidy Hash CI

on:
  pull_request:
    paths:
      - ".clang-tidy"
      - "platformio.ini"
      - "requirements_dev.txt"
      - ".clang-tidy.hash"
      - "script/clang_tidy_hash.py"
      - ".github/workflows/ci-clang-tidy-hash.yml"

permissions:
  contents: read
  pull-requests: write

jobs:
  verify-hash:
    name: Verify clang-tidy hash
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4.2.2

      - name: Set up Python
        uses: actions/setup-python@v5.6.0
        with:
          python-version: "3.11"

      - name: Verify hash
        run: |
          python script/clang_tidy_hash.py --verify

      - if: failure()
        name: Show hash details
        run: |
          python script/clang_tidy_hash.py
          echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
          echo "You have modified clang-tidy configuration but have not updated the hash." | tee -a $GITHUB_STEP_SUMMARY
          echo "Please run 'script/clang_tidy_hash.py --update' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY

      - if: failure()
        name: Request changes
        uses: actions/github-script@v7.0.1
        with:
          script: |
            await github.rest.pulls.createReview({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              event: 'REQUEST_CHANGES',
              body: 'You have modified clang-tidy configuration but have not updated the hash.\nPlease run `script/clang_tidy_hash.py --update` and commit the changes.'
            })

      - if: success()
        name: Dismiss review
        uses: actions/github-script@v7.0.1
        with:
          script: |
            let reviews = await github.rest.pulls.listReviews({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo
            });
            for (let review of reviews.data) {
              if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
                await github.rest.pulls.dismissReview({
                  pull_number: context.issue.number,
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  review_id: review.id,
                  message: 'Clang-tidy hash now matches configuration.'
                });
              }
            }

15  .github/workflows/ci-docker.yml  vendored
@@ -37,19 +37,16 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        os: ["ubuntu-24.04", "ubuntu-24.04-arm"]
-        build_type:
-          - "ha-addon"
-          - "docker"
-          # - "lint"
+        os: ["ubuntu-latest", "ubuntu-24.04-arm"]
+        build_type: ["ha-addon", "docker", "lint"]
     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
-          python-version: "3.11"
+          python-version: "3.9"
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3.11.1
+        uses: docker/setup-buildx-action@v3.10.0

       - name: Set TAG
         run: |

324  .github/workflows/ci.yml  vendored
@@ -20,8 +20,8 @@ permissions:
 contents: read

 env:
-  DEFAULT_PYTHON: "3.11"
-  PYUPGRADE_TARGET: "--py311-plus"
+  DEFAULT_PYTHON: "3.9"
+  PYUPGRADE_TARGET: "--py39-plus"

 concurrency:
   # yamllint disable-line rule:line-length
@@ -36,13 +36,13 @@ jobs:
       cache-key: ${{ steps.cache-key.outputs.key }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Generate cache-key
         id: cache-key
-        run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
+        run: echo key="${{ hashFiles('requirements.txt', 'requirements_optional.txt', 'requirements_test.txt') }}" >> $GITHUB_OUTPUT
       - name: Set up Python ${{ env.DEFAULT_PYTHON }}
         id: python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
           python-version: ${{ env.DEFAULT_PYTHON }}
       - name: Restore Python virtual environment
@@ -58,19 +58,59 @@ jobs:
           python -m venv venv
           . venv/bin/activate
           python --version
-          pip install -r requirements.txt -r requirements_test.txt pre-commit
+          pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
           pip install -e .

+  ruff:
+    name: Check ruff
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run Ruff
+        run: |
+          . venv/bin/activate
+          ruff format esphome tests
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
+  flake8:
+    name: Check flake8
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run flake8
+        run: |
+          . venv/bin/activate
+          flake8 esphome
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
   pylint:
     name: Check pylint
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.python-linters == 'true'
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
@@ -84,6 +124,27 @@ jobs:
         run: script/ci-suggest-changes
         if: always()

+  pyupgrade:
+    name: Check pyupgrade
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run pyupgrade
+        run: |
+          . venv/bin/activate
+          pyupgrade ${{ env.PYUPGRADE_TARGET }} `find esphome -name "*.py" -type f`
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
   ci-custom:
     name: Run script/ci-custom
     runs-on: ubuntu-24.04
@@ -91,7 +152,7 @@ jobs:
       - common
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
@@ -104,7 +165,6 @@ jobs:
           . venv/bin/activate
           script/ci-custom.py
           script/build_codeowners.py --check
-          script/build_language_schema.py --check

   pytest:
     name: Run pytest
@@ -112,9 +172,10 @@ jobs:
       fail-fast: false
       matrix:
         python-version:
+          - "3.9"
+          - "3.10"
           - "3.11"
           - "3.12"
-          - "3.13"
         os:
           - ubuntu-latest
           - macOS-latest
@@ -123,22 +184,25 @@ jobs:
           # Minimize CI resource usage
           # by only running the Python version
           # version used for docker images on Windows and macOS
-          - python-version: "3.13"
-            os: windows-latest
           - python-version: "3.12"
             os: windows-latest
-          - python-version: "3.13"
+          - python-version: "3.10"
+            os: windows-latest
+          - python-version: "3.9"
+            os: windows-latest
+          - python-version: "3.12"
             os: macOS-latest
-          - python-version: "3.12"
+          - python-version: "3.10"
+            os: macOS-latest
+          - python-version: "3.9"
             os: macOS-latest
|
os: macOS-latest
|
||||||
runs-on: ${{ matrix.os }}
|
runs-on: ${{ matrix.os }}
|
||||||
needs:
|
needs:
|
||||||
- common
|
- common
|
||||||
steps:
|
steps:
|
||||||
- name: Check out code from GitHub
|
- name: Check out code from GitHub
|
||||||
uses: actions/checkout@v4.2.2
|
uses: actions/checkout@v4.1.7
|
||||||
- name: Restore Python
|
- name: Restore Python
|
||||||
id: restore-python
|
|
||||||
uses: ./.github/actions/restore-python
|
uses: ./.github/actions/restore-python
|
||||||
with:
|
with:
|
||||||
python-version: ${{ matrix.python-version }}
|
python-version: ${{ matrix.python-version }}
|
||||||
@ -148,108 +212,56 @@ jobs:
|
|||||||
- name: Run pytest
|
- name: Run pytest
|
||||||
if: matrix.os == 'windows-latest'
|
if: matrix.os == 'windows-latest'
|
||||||
run: |
|
run: |
|
||||||
. ./venv/Scripts/activate.ps1
|
./venv/Scripts/activate
|
||||||
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
|
pytest -vv --cov-report=xml --tb=native tests
|
||||||
- name: Run pytest
|
- name: Run pytest
|
||||||
if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'
|
if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'
|
||||||
run: |
|
run: |
|
||||||
. venv/bin/activate
|
. venv/bin/activate
|
||||||
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
|
pytest -vv --cov-report=xml --tb=native tests
|
||||||
- name: Upload coverage to Codecov
|
- name: Upload coverage to Codecov
|
||||||
uses: codecov/codecov-action@v5.4.3
|
uses: codecov/codecov-action@v5
|
||||||
with:
|
with:
|
||||||
token: ${{ secrets.CODECOV_TOKEN }}
|
token: ${{ secrets.CODECOV_TOKEN }}
|
||||||
- name: Save Python virtual environment cache
|
|
||||||
if: github.ref == 'refs/heads/dev'
|
|
||||||
uses: actions/cache/save@v4.2.3
|
|
||||||
with:
|
|
||||||
path: venv
|
|
||||||
key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
|
|
||||||
|
|
||||||
-  determine-jobs:
-    name: Determine which jobs to run
+  clang-format:
+    name: Check clang-format
     runs-on: ubuntu-24.04
     needs:
       - common
-    outputs:
-      integration-tests: ${{ steps.determine.outputs.integration-tests }}
-      clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
-      python-linters: ${{ steps.determine.outputs.python-linters }}
-      changed-components: ${{ steps.determine.outputs.changed-components }}
-      component-test-count: ${{ steps.determine.outputs.component-test-count }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-        with:
-          # Fetch enough history to find the merge base
-          fetch-depth: 2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
           python-version: ${{ env.DEFAULT_PYTHON }}
           cache-key: ${{ needs.common.outputs.cache-key }}
-      - name: Determine which tests to run
-        id: determine
-        env:
-          GH_TOKEN: ${{ github.token }}
+      - name: Install clang-format
         run: |
           . venv/bin/activate
-          output=$(python script/determine-jobs.py)
-          echo "Test determination output:"
-          echo "$output" | jq
-
-          # Extract individual fields
-          echo "integration-tests=$(echo "$output" | jq -r '.integration_tests')" >> $GITHUB_OUTPUT
-          echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
-          echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
-          echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
-          echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
+          pip install clang-format -c requirements_dev.txt
+      - name: Run clang-format

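The dev-branch `determine-jobs` step above has a Python helper emit a JSON summary, then copies individual fields into `$GITHUB_OUTPUT` with jq. A rough Python sketch of that extraction step (the field names are taken from the jq calls in the workflow; the exact shape of `determine-jobs.py`'s output is otherwise an assumption):

```python
import json

def to_github_output(payload: str) -> list[str]:
    """Turn the determine-jobs JSON blob into GITHUB_OUTPUT lines,
    mirroring the jq extraction in the workflow (key names assumed)."""
    data = json.loads(payload)
    return [
        # jq -r prints booleans as bare true/false
        f"integration-tests={str(data['integration_tests']).lower()}",
        f"clang-tidy={str(data['clang_tidy']).lower()}",
        f"python-linters={str(data['python_linters']).lower()}",
        # jq -c keeps the component list as compact JSON
        f"changed-components={json.dumps(data['changed_components'], separators=(',', ':'))}",
        f"component-test-count={data['component_test_count']}",
    ]

example = ('{"integration_tests": true, "clang_tidy": false, '
           '"python_linters": true, "changed_components": ["api", "wifi"], '
           '"component_test_count": 2}')
print(to_github_output(example))
```

Each returned line is what the workflow appends to `$GITHUB_OUTPUT` for downstream jobs to consume via `needs.determine-jobs.outputs.*`.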
-  integration-tests:
-    name: Run integration tests
-    runs-on: ubuntu-latest
-    needs:
-      - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.integration-tests == 'true'
-    steps:
-      - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-      - name: Set up Python 3.13
-        id: python
-        uses: actions/setup-python@v5.6.0
-        with:
-          python-version: "3.13"
-      - name: Restore Python virtual environment
-        id: cache-venv
-        uses: actions/cache@v4.2.3
-        with:
-          path: venv
-          key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
-      - name: Create Python virtual environment
-        if: steps.cache-venv.outputs.cache-hit != 'true'
-        run: |
-          python -m venv venv
-          . venv/bin/activate
-          python --version
-          pip install -r requirements.txt -r requirements_test.txt
-          pip install -e .
-      - name: Register matcher
-        run: echo "::add-matcher::.github/workflows/matchers/pytest.json"
-      - name: Run integration tests
         run: |
           . venv/bin/activate
-          pytest -vv --no-cov --tb=native -n auto tests/integration/
+          script/clang-format -i
+          git diff-index --quiet HEAD --
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()

   clang-tidy:
     name: ${{ matrix.name }}
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.clang-tidy == 'true'
-    env:
-      GH_TOKEN: ${{ github.token }}
+      - ruff
+      - ci-custom
+      - clang-format
+      - flake8
+      - pylint
+      - pytest
+      - pyupgrade
     strategy:
       fail-fast: false
       max-parallel: 2
@@ -279,19 +291,10 @@ jobs:
           name: Run script/clang-tidy for ESP32 IDF
           options: --environment esp32-idf-tidy --grep USE_ESP_IDF
           pio_cache_key: tidyesp32-idf
-        - id: clang-tidy
-          name: Run script/clang-tidy for ZEPHYR
-          options: --environment nrf52-tidy --grep USE_ZEPHYR
-          pio_cache_key: tidy-zephyr
-          ignore_errors: false

     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-        with:
-          # Need history for HEAD~1 to work for checking changed files
-          fetch-depth: 2
+        uses: actions/checkout@v4.1.7

       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
@@ -303,14 +306,14 @@ jobs:
         uses: actions/cache@v4.2.3
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}

       - name: Cache platformio
         if: github.ref != 'refs/heads/dev'
         uses: actions/cache/restore@v4.2.3
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}

       - name: Register problem matchers
         run: |
@@ -324,49 +327,72 @@ jobs:
           mkdir -p .temp
           pio run --list-targets -e esp32-idf-tidy

-      - name: Check if full clang-tidy scan needed
-        id: check_full_scan
-        run: |
-          . venv/bin/activate
-          if python script/clang_tidy_hash.py --check; then
-            echo "full_scan=true" >> $GITHUB_OUTPUT
-            echo "reason=hash_changed" >> $GITHUB_OUTPUT
-          else
-            echo "full_scan=false" >> $GITHUB_OUTPUT
-            echo "reason=normal" >> $GITHUB_OUTPUT
-          fi
-
       - name: Run clang-tidy
         run: |
           . venv/bin/activate
-          if [ "${{ steps.check_full_scan.outputs.full_scan }}" = "true" ]; then
-            echo "Running FULL clang-tidy scan (hash changed)"
-            script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          else
-            echo "Running clang-tidy on changed files only"
-            script/clang-tidy --all-headers --fix --changed ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          fi
+          script/clang-tidy --all-headers --fix ${{ matrix.options }}
         env:
           # Also cache libdeps, store them in a ~/.platformio subfolder
           PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps

       - name: Suggested changes
-        run: script/ci-suggest-changes ${{ matrix.ignore_errors && '|| true' || '' }}
+        run: script/ci-suggest-changes
         # yamllint disable-line rule:line-length
         if: always()

+  list-components:
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    if: github.event_name == 'pull_request'
+    outputs:
+      components: ${{ steps.list-components.outputs.components }}
+      count: ${{ steps.list-components.outputs.count }}
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+        with:
+          # Fetch enough history so `git merge-base refs/remotes/origin/dev HEAD` works.
+          fetch-depth: 500
+      - name: Get target branch
+        id: target-branch
+        run: |
+          echo "branch=${{ github.event.pull_request.base.ref }}" >> $GITHUB_OUTPUT
+      - name: Fetch ${{ steps.target-branch.outputs.branch }} branch
+        run: |
+          git -c protocol.version=2 fetch --no-tags --prune --no-recurse-submodules --depth=1 origin +refs/heads/${{ steps.target-branch.outputs.branch }}:refs/remotes/origin/${{ steps.target-branch.outputs.branch }}
+          git merge-base refs/remotes/origin/${{ steps.target-branch.outputs.branch }} HEAD
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Find changed components
+        id: list-components
+        run: |
+          . venv/bin/activate
+          components=$(script/list-components.py --changed --branch ${{ steps.target-branch.outputs.branch }})
+          output_components=$(echo "$components" | jq -R -s -c 'split("\n")[:-1] | map(select(length > 0))')
+          count=$(echo "$output_components" | jq length)
+
+          echo "components=$output_components" >> $GITHUB_OUTPUT
+          echo "count=$count" >> $GITHUB_OUTPUT
+
+          echo "$count Components:"
+          echo "$output_components" | jq

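The `list-components` step above turns the newline-separated output of `script/list-components.py` into a compact JSON array plus a count via jq. The same transformation can be sketched in Python (blank-line filtering folds in jq's trailing `[:-1]` drop for newline-terminated input):

```python
import json

def components_to_outputs(raw: str) -> tuple[str, int]:
    """Mirror the jq pipeline: split the script's newline-separated
    component list, drop blanks, and emit a compact JSON array + count."""
    names = [line for line in raw.split("\n") if line]
    return json.dumps(names, separators=(",", ":")), len(names)

print(components_to_outputs("api\nwifi\n\nsensor\n"))
```

The JSON string is what lands in the `components` output (later fed to `fromJson` in the matrix), and the count drives the `< 100` / `>= 100` job-selection conditions.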
   test-build-components:
     name: Component test ${{ matrix.file }}
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0 && fromJSON(needs.determine-jobs.outputs.component-test-count) < 100
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) > 0 && fromJSON(needs.list-components.outputs.count) < 100
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        file: ${{ fromJson(needs.determine-jobs.outputs.changed-components) }}
+        file: ${{ fromJson(needs.list-components.outputs.components) }}
     steps:
       - name: Install dependencies
         run: |
@@ -374,7 +400,7 @@ jobs:
           sudo apt-get install libsdl2-dev

       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
@@ -394,17 +420,17 @@ jobs:
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     outputs:
       matrix: ${{ steps.split.outputs.components }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Split components into 20 groups
         id: split
         run: |
-          components=$(echo '${{ needs.determine-jobs.outputs.changed-components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
+          components=$(echo '${{ needs.list-components.outputs.components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
           echo "components=$components" >> $GITHUB_OUTPUT

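The splitter step above shuffles the component list and packs it into groups of 20 with jq's `_nwise(20) | join(" ")`. A small Python sketch of the same grouping:

```python
import random

def split_into_groups(components: list[str], size: int = 20) -> list[str]:
    """Mirror the jq `_nwise(20) | join(" ")` step: shuffle the
    component list, then pack it into space-joined groups of up to
    `size` names, one group per matrix job."""
    shuffled = components[:]
    random.shuffle(shuffled)
    return [" ".join(shuffled[i:i + size])
            for i in range(0, len(shuffled), size)]
```

Each resulting string becomes one matrix entry for `test-build-components-split`, which then iterates over the space-separated names inside the job.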
   test-build-components-split:
@@ -412,9 +438,9 @@ jobs:
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
+      - list-components
       - test-build-components-splitter
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     strategy:
       fail-fast: false
       max-parallel: 4
@@ -430,7 +456,7 @@ jobs:
           sudo apt-get install libsdl2-dev

       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
@@ -451,41 +477,23 @@ jobs:
           ./script/test_build_components -e compile -c $component
         done

-  pre-commit-ci-lite:
-    name: pre-commit.ci lite
-    runs-on: ubuntu-latest
-    needs:
-      - common
-    if: github.event_name == 'pull_request' && github.base_ref != 'beta' && github.base_ref != 'release'
-    steps:
-      - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-      - name: Restore Python
-        uses: ./.github/actions/restore-python
-        with:
-          python-version: ${{ env.DEFAULT_PYTHON }}
-          cache-key: ${{ needs.common.outputs.cache-key }}
-      - uses: pre-commit/action@v3.0.1
-        env:
-          SKIP: pylint,clang-tidy-hash
-      - uses: pre-commit-ci/lite-action@v1.1.0
-        if: always()
-
   ci-status:
     name: CI Status
     runs-on: ubuntu-24.04
     needs:
       - common
+      - ruff
       - ci-custom
+      - clang-format
+      - flake8
       - pylint
       - pytest
-      - integration-tests
+      - pyupgrade
       - clang-tidy
-      - determine-jobs
+      - list-components
       - test-build-components
       - test-build-components-splitter
       - test-build-components-split
-      - pre-commit-ci-lite
     if: always()
     steps:
       - name: Success
.github/workflows/codeowner-review-request.yml (vendored, 324 lines; present only on dev)
@@ -1,324 +0,0 @@
-# This workflow automatically requests reviews from codeowners when:
-# 1. A PR is opened, reopened, or synchronized (updated)
-# 2. A PR is marked as ready for review
-#
-# It reads the CODEOWNERS file and matches all changed files in the PR against
-# the codeowner patterns, then requests reviews from the appropriate owners
-# while avoiding duplicate requests for users who have already been requested
-# or have already reviewed the PR.
-
-name: Request Codeowner Reviews
-
-on:
-  # Needs to be pull_request_target to get write permissions
-  pull_request_target:
-    types: [opened, reopened, synchronize, ready_for_review]
-
-permissions:
-  pull-requests: write
-  contents: read
-
-jobs:
-  request-codeowner-reviews:
-    name: Run
-    if: ${{ !github.event.pull_request.draft }}
-    runs-on: ubuntu-latest
-    steps:
-      - name: Request reviews from component codeowners
-        uses: actions/github-script@v7.0.1
-        with:
-          script: |
-            const owner = context.repo.owner;
-            const repo = context.repo.repo;
-            const pr_number = context.payload.pull_request.number;
-
-            console.log(`Processing PR #${pr_number} for codeowner review requests`);
-
-            // Hidden marker to identify bot comments from this workflow
-            const BOT_COMMENT_MARKER = '<!-- codeowner-review-request-bot -->';
-
-            try {
-              // Get the list of changed files in this PR
-              const { data: files } = await github.rest.pulls.listFiles({
-                owner,
-                repo,
-                pull_number: pr_number
-              });
-
-              const changedFiles = files.map(file => file.filename);
-              console.log(`Found ${changedFiles.length} changed files`);
-
-              if (changedFiles.length === 0) {
-                console.log('No changed files found, skipping codeowner review requests');
-                return;
-              }
-
-              // Fetch CODEOWNERS file from root
-              const { data: codeownersFile } = await github.rest.repos.getContent({
-                owner,
-                repo,
-                path: 'CODEOWNERS',
-                ref: context.payload.pull_request.base.sha
-              });
-              const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
-
-              // Parse CODEOWNERS file to extract all patterns and their owners
-              const codeownersLines = codeownersContent.split('\n')
-                .map(line => line.trim())
-                .filter(line => line && !line.startsWith('#'));
-
-              const codeownersPatterns = [];
-
-              // Convert CODEOWNERS pattern to regex (robust glob handling)
-              function globToRegex(pattern) {
-                // Escape regex special characters except for glob wildcards
-                let regexStr = pattern
-                  .replace(/([.+^=!:${}()|[\]\\])/g, '\\$1') // escape regex chars
-                  .replace(/\*\*/g, '.*') // globstar
-                  .replace(/\*/g, '[^/]*') // single star
-                  .replace(/\?/g, '.'); // question mark
-                return new RegExp('^' + regexStr + '$');
-              }
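The `globToRegex` helper above translates CODEOWNERS glob patterns into anchored regexes: escape regex metacharacters first, then expand `**`, `*`, and `?`. The same translation can be sketched in Python; note one deliberate divergence, a placeholder for `**`, so the later `*` substitution cannot rewrite the `.*` that `**` just produced:

```python
import re

def glob_to_regex(pattern: str) -> re.Pattern:
    """CODEOWNERS-style glob -> anchored regex, mirroring the JS helper."""
    # Escape regex metacharacters except the glob wildcards themselves.
    escaped = re.sub(r"([.+^=!:${}()|\[\]\\])", r"\\\1", pattern)
    escaped = escaped.replace("**", "\0")       # placeholder: protect globstar
    escaped = (escaped.replace("*", "[^/]*")    # single star: no slash
                      .replace("\0", ".*")      # globstar: any path
                      .replace("?", "."))       # question mark: any char
    return re.compile("^" + escaped + "$")
```

Matching a changed file against every compiled pattern then yields the owners to request, just as the loop over `codeownersPatterns` does below.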
-
-              // Helper function to create comment body
-              function createCommentBody(reviewersList, teamsList, matchedFileCount, isSuccessful = true) {
-                const reviewerMentions = reviewersList.map(r => `@${r}`);
-                const teamMentions = teamsList.map(t => `@${owner}/${t}`);
-                const allMentions = [...reviewerMentions, ...teamMentions].join(', ');
-
-                if (isSuccessful) {
-                  return `${BOT_COMMENT_MARKER}\n👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
-                } else {
-                  return `${BOT_COMMENT_MARKER}\n👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
-                }
-              }
-
-              for (const line of codeownersLines) {
-                const parts = line.split(/\s+/);
-                if (parts.length < 2) continue;
-
-                const pattern = parts[0];
-                const owners = parts.slice(1);
-
-                // Use robust glob-to-regex conversion
-                const regex = globToRegex(pattern);
-                codeownersPatterns.push({ pattern, regex, owners });
-              }
-
-              console.log(`Parsed ${codeownersPatterns.length} codeowner patterns`);
-
-              // Match changed files against CODEOWNERS patterns
-              const matchedOwners = new Set();
-              const matchedTeams = new Set();
-              const fileMatches = new Map(); // Track which files matched which patterns
-
-              for (const file of changedFiles) {
-                for (const { pattern, regex, owners } of codeownersPatterns) {
-                  if (regex.test(file)) {
-                    console.log(`File '${file}' matches pattern '${pattern}' with owners: ${owners.join(', ')}`);
-
-                    if (!fileMatches.has(file)) {
-                      fileMatches.set(file, []);
-                    }
-                    fileMatches.get(file).push({ pattern, owners });
-
-                    // Add owners to the appropriate set (remove @ prefix)
-                    for (const owner of owners) {
-                      const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
-                      if (cleanOwner.includes('/')) {
-                        // Team mention (org/team-name)
-                        const teamName = cleanOwner.split('/')[1];
-                        matchedTeams.add(teamName);
-                      } else {
-                        // Individual user
-                        matchedOwners.add(cleanOwner);
-                      }
-                    }
-                  }
-                }
-              }
-
-              if (matchedOwners.size === 0 && matchedTeams.size === 0) {
-                console.log('No codeowners found for any changed files');
-                return;
-              }
-
-              // Remove the PR author from reviewers
-              const prAuthor = context.payload.pull_request.user.login;
-              matchedOwners.delete(prAuthor);
-
-              // Get current reviewers to avoid duplicate requests (but still mention them)
-              const { data: prData } = await github.rest.pulls.get({
-                owner,
-                repo,
-                pull_number: pr_number
-              });
-
-              const currentReviewers = new Set();
-              const currentTeams = new Set();
-
-              if (prData.requested_reviewers) {
-                prData.requested_reviewers.forEach(reviewer => {
-                  currentReviewers.add(reviewer.login);
-                });
-              }
-
-              if (prData.requested_teams) {
-                prData.requested_teams.forEach(team => {
-                  currentTeams.add(team.slug);
-                });
-              }
-
-              // Check for completed reviews to avoid re-requesting users who have already reviewed
-              const { data: reviews } = await github.rest.pulls.listReviews({
-                owner,
-                repo,
-                pull_number: pr_number
-              });
-
-              const reviewedUsers = new Set();
-              reviews.forEach(review => {
-                reviewedUsers.add(review.user.login);
-              });
-
-              // Check for previous comments from this workflow to avoid duplicate pings
-              const comments = await github.paginate(
-                github.rest.issues.listComments,
-                {
-                  owner,
-                  repo,
-                  issue_number: pr_number
-                }
-              );
-
-              const previouslyPingedUsers = new Set();
-              const previouslyPingedTeams = new Set();
-
-              // Look for comments from github-actions bot that contain our bot marker
-              const workflowComments = comments.filter(comment =>
-                comment.user.type === 'Bot' &&
-                comment.body.includes(BOT_COMMENT_MARKER)
-              );
-
-              // Extract previously mentioned users and teams from workflow comments
-              for (const comment of workflowComments) {
-                // Match @username patterns (not team mentions)
-                const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
-                userMentions.forEach(mention => {
-                  const username = mention.slice(1); // remove @
-                  previouslyPingedUsers.add(username);
-                });
-
-                // Match @org/team patterns
-                const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/([a-zA-Z0-9_.-]+)/g) || [];
-                teamMentions.forEach(mention => {
-                  const teamName = mention.split('/')[1];
-                  previouslyPingedTeams.add(teamName);
-                });
-              }
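The loop above re-parses the bot's earlier comments to recover which users and teams were already pinged, using one regex for `@username` (guarded against team mentions) and one for `@org/team`. A Python sketch of the same extraction; the user-mention lookahead here is slightly stricter than the JS original, so a team mention can never be misread as a truncated username:

```python
import re

def extract_mentions(body: str) -> tuple[set[str], set[str]]:
    """Pull @user and @org/team mentions out of a previous bot comment,
    mirroring the two regexes in the workflow script."""
    # A username match must not be followed by a "/" anywhere in the
    # remaining name characters (that would make it an @org/team).
    users = {m.group(1)
             for m in re.finditer(r"@([A-Za-z0-9_.-]+)(?![\w.-]*/)", body)}
    teams = {m.group(2)
             for m in re.finditer(r"@([A-Za-z0-9_.-]+)/([A-Za-z0-9_.-]+)", body)}
    return users, teams
```

The sets returned here correspond to `previouslyPingedUsers` and `previouslyPingedTeams`, which the script subtracts from the freshly matched codeowners before requesting reviews again.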
-
-              console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams`);
-
-              // Remove users who have already been pinged in previous workflow comments
-              previouslyPingedUsers.forEach(user => {
-                matchedOwners.delete(user);
-              });
-
-              previouslyPingedTeams.forEach(team => {
-                matchedTeams.delete(team);
-              });
-
-              // Remove only users who have already submitted reviews (not just requested reviewers)
-              reviewedUsers.forEach(reviewer => {
-                matchedOwners.delete(reviewer);
-              });
-
-              // For teams, we'll still remove already requested teams to avoid API errors
-              currentTeams.forEach(team => {
-                matchedTeams.delete(team);
-              });
-
-              const reviewersList = Array.from(matchedOwners);
-              const teamsList = Array.from(matchedTeams);
-
-              if (reviewersList.length === 0 && teamsList.length === 0) {
-                console.log('No eligible reviewers found (all may already be requested, reviewed, or previously pinged)');
-                return;
-              }
-
-              const totalReviewers = reviewersList.length + teamsList.length;
-              console.log(`Requesting reviews from ${reviewersList.length} users and ${teamsList.length} teams for ${fileMatches.size} matched files`);
-
-              // Request reviews
-              try {
-                const requestParams = {
-                  owner,
-                  repo,
-                  pull_number: pr_number
-                };
-
-                // Filter out users who are already requested reviewers for the API call
-                const newReviewers = reviewersList.filter(reviewer => !currentReviewers.has(reviewer));
-                const newTeams = teamsList.filter(team => !currentTeams.has(team));
-
-                if (newReviewers.length > 0) {
-                  requestParams.reviewers = newReviewers;
-                }
-
-                if (newTeams.length > 0) {
-                  requestParams.team_reviewers = newTeams;
-                }
-
-                // Only make the API call if there are new reviewers to request
-                if (newReviewers.length > 0 || newTeams.length > 0) {
-                  await github.rest.pulls.requestReviewers(requestParams);
-                  console.log(`Successfully requested reviews from ${newReviewers.length} new users and ${newTeams.length} new teams`);
-                } else {
-                  console.log('All codeowners are already requested reviewers or have reviewed');
-                }
-
-                // Only add a comment if there are new codeowners to mention (not previously pinged)
-                if (reviewersList.length > 0 || teamsList.length > 0) {
-                  const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);
-
-                  await github.rest.issues.createComment({
-                    owner,
-                    repo,
-                    issue_number: pr_number,
-                    body: commentBody
-                  });
-                  console.log(`Added comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
-                } else {
-                  console.log('No new codeowners to mention in comment (all previously pinged)');
-                }
-              } catch (error) {
-                if (error.status === 422) {
-                  console.log('Some reviewers may already be requested or unavailable:', error.message);
-
-                  // Only try to add a comment if there are new codeowners to mention
-                  if (reviewersList.length > 0 || teamsList.length > 0) {
-                    const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);
-
-                    try {
-                      await github.rest.issues.createComment({
-                        owner,
-                        repo,
-                        issue_number: pr_number,
-                        body: commentBody
-                      });
-                      console.log(`Added fallback comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
-                    } catch (commentError) {
-                      console.log('Failed to add comment:', commentError.message);
-                    }
-                  } else {
-                    console.log('No new codeowners to mention in fallback comment');
-                  }
-                } else {
-                  throw error;
-                }
-              }
-
-            } catch (error) {
-              console.log('Failed to process codeowner review requests:', error.message);
-              console.error(error);
-            }
157  .github/workflows/external-component-bot.yml  (vendored)
@@ -1,157 +0,0 @@
name: Add External Component Comment

on:
  pull_request_target:
    types: [opened, synchronize]

permissions:
  contents: read  # Needed to fetch PR details
  issues: write  # Needed to create and update comments (PR comments are managed via the issues REST API)
  pull-requests: write  # also needed?

jobs:
  external-comment:
    name: External component comment
    runs-on: ubuntu-latest
    steps:
      - name: Add external component comment
        uses: actions/github-script@v7.0.1
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            // Generate external component usage instructions
            function generateExternalComponentInstructions(prNumber, componentNames, owner, repo) {
              let source;
              if (owner === 'esphome' && repo === 'esphome')
                source = `github://pr#${prNumber}`;
              else
                source = `github://${owner}/${repo}@pull/${prNumber}/head`;
              return `To use the changes from this PR as an external component, add the following to your ESPHome configuration YAML file:

            \`\`\`yaml
            external_components:
              - source: ${source}
                components: [${componentNames.join(', ')}]
                refresh: 1h
            \`\`\``;
            }

            // Generate repo clone instructions
            function generateRepoInstructions(prNumber, owner, repo, branch) {
              return `To use the changes in this PR:

            \`\`\`bash
            # Clone the repository:
            git clone https://github.com/${owner}/${repo}
            cd ${repo}

            # Checkout the PR branch:
            git fetch origin pull/${prNumber}/head:${branch}
            git checkout ${branch}

            # Install the development version:
            script/setup

            # Activate the development version:
            source venv/bin/activate
            \`\`\`

            Now you can run \`esphome\` as usual to test the changes in this PR.
            `;
            }

            async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
              const commentMarker = "<!-- This comment was generated automatically by the external-component-bot workflow. -->";
              const legacyCommentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
              let commentBody;
              if (esphomeChanges.length === 1) {
                commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
              } else {
                commentBody = generateRepoInstructions(prNumber, owner, repo, context.payload.pull_request.head.ref);
              }
              commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;

              // Check for existing bot comment
              const comments = await github.paginate(
                github.rest.issues.listComments,
                {
                  owner: owner,
                  repo: repo,
                  issue_number: prNumber,
                  per_page: 100,
                }
              );

              const sorted = comments.sort((a, b) => new Date(b.updated_at) - new Date(a.updated_at));

              const botComment = sorted.find(comment =>
                (
                  comment.body.includes(commentMarker) ||
                  comment.body.includes(legacyCommentMarker)
                ) && comment.user.type === "Bot"
              );

              if (botComment && botComment.body === commentBody) {
                // No changes in the comment, do nothing
                return;
              }

              if (botComment) {
                // Update existing comment
                await github.rest.issues.updateComment({
                  owner: owner,
                  repo: repo,
                  comment_id: botComment.id,
                  body: commentBody,
                });
              } else {
                // Create new comment
                await github.rest.issues.createComment({
                  owner: owner,
                  repo: repo,
                  issue_number: prNumber,
                  body: commentBody,
                });
              }
            }

            async function getEsphomeAndComponentChanges(github, owner, repo, prNumber) {
              const changedFiles = await github.rest.pulls.listFiles({
                owner: owner,
                repo: repo,
                pull_number: prNumber,
              });

              const esphomeChanges = changedFiles.data
                .filter(file => file.filename !== "esphome/core/defines.h" && file.filename.startsWith('esphome/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/([^/]+)/);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              if (esphomeChanges.length === 0) {
                return {esphomeChanges: [], componentChanges: []};
              }

              const uniqueEsphomeChanges = [...new Set(esphomeChanges)];
              const componentChanges = changedFiles.data
                .filter(file => file.filename.startsWith('esphome/components/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/components\/([^/]+)\//);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              return {esphomeChanges: uniqueEsphomeChanges, componentChanges: [...new Set(componentChanges)]};
            }

            // Start of main code.

            const prNumber = context.payload.pull_request.number;
            const {owner, repo} = context.repo;

            const {esphomeChanges, componentChanges} = await getEsphomeAndComponentChanges(github, owner, repo, prNumber);
            if (componentChanges.length !== 0) {
              await createComment(github, owner, repo, prNumber, esphomeChanges, componentChanges);
            }
163  .github/workflows/issue-codeowner-notify.yml  (vendored)
@@ -1,163 +0,0 @@
# This workflow automatically notifies codeowners when an issue is labeled with component labels.
# It reads the CODEOWNERS file to find the maintainers for the labeled components
# and posts a comment mentioning them to ensure they're aware of the issue.

name: Notify Issue Codeowners

on:
  issues:
    types: [labeled]

permissions:
  issues: write
  contents: read

jobs:
  notify-codeowners:
    name: Run
    if: ${{ startsWith(github.event.label.name, format('component{0} ', ':')) }}
    runs-on: ubuntu-latest
    steps:
      - name: Notify codeowners for component issues
        uses: actions/github-script@v7.0.1
        with:
          script: |
            const owner = context.repo.owner;
            const repo = context.repo.repo;
            const issue_number = context.payload.issue.number;
            const labelName = context.payload.label.name;

            console.log(`Processing issue #${issue_number} with label: ${labelName}`);

            // Hidden marker to identify bot comments from this workflow
            const BOT_COMMENT_MARKER = '<!-- issue-codeowner-notify-bot -->';

            // Extract component name from label
            const componentName = labelName.replace('component: ', '');
            console.log(`Component: ${componentName}`);

            try {
              // Fetch CODEOWNERS file from root
              const { data: codeownersFile } = await github.rest.repos.getContent({
                owner,
                repo,
                path: 'CODEOWNERS'
              });
              const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');

              // Parse CODEOWNERS file to extract component mappings
              const codeownersLines = codeownersContent.split('\n')
                .map(line => line.trim())
                .filter(line => line && !line.startsWith('#'));

              let componentOwners = null;

              for (const line of codeownersLines) {
                const parts = line.split(/\s+/);
                if (parts.length < 2) continue;

                const pattern = parts[0];
                const owners = parts.slice(1);

                // Look for component patterns: esphome/components/{component}/*
                const componentMatch = pattern.match(/^esphome\/components\/([^\/]+)\/\*$/);
                if (componentMatch && componentMatch[1] === componentName) {
                  componentOwners = owners;
                  break;
                }
              }

              if (!componentOwners) {
                console.log(`No codeowners found for component: ${componentName}`);
                return;
              }

              console.log(`Found codeowners for '${componentName}': ${componentOwners.join(', ')}`);

              // Separate users and teams
              const userOwners = [];
              const teamOwners = [];

              for (const owner of componentOwners) {
                const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
                if (cleanOwner.includes('/')) {
                  // Team mention (org/team-name)
                  teamOwners.push(`@${cleanOwner}`);
                } else {
                  // Individual user
                  userOwners.push(`@${cleanOwner}`);
                }
              }

              // Remove issue author from mentions to avoid self-notification
              const issueAuthor = context.payload.issue.user.login;
              const filteredUserOwners = userOwners.filter(mention =>
                mention !== `@${issueAuthor}`
              );

              // Check for previous comments from this workflow to avoid duplicate pings
              const comments = await github.paginate(
                github.rest.issues.listComments,
                {
                  owner,
                  repo,
                  issue_number: issue_number
                }
              );

              const previouslyPingedUsers = new Set();
              const previouslyPingedTeams = new Set();

              // Look for comments from github-actions bot that contain codeowner pings for this component
              const workflowComments = comments.filter(comment =>
                comment.user.type === 'Bot' &&
                comment.body.includes(BOT_COMMENT_MARKER) &&
                comment.body.includes(`component: ${componentName}`)
              );

              // Extract previously mentioned users and teams from workflow comments
              for (const comment of workflowComments) {
                // Match @username patterns (not team mentions)
                const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
                userMentions.forEach(mention => {
                  previouslyPingedUsers.add(mention); // Keep @ prefix for easy comparison
                });

                // Match @org/team patterns
                const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+/g) || [];
                teamMentions.forEach(mention => {
                  previouslyPingedTeams.add(mention);
                });
              }

              console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams for component ${componentName}`);

              // Remove previously pinged users and teams
              const newUserOwners = filteredUserOwners.filter(mention => !previouslyPingedUsers.has(mention));
              const newTeamOwners = teamOwners.filter(mention => !previouslyPingedTeams.has(mention));

              const allMentions = [...newUserOwners, ...newTeamOwners];

              if (allMentions.length === 0) {
                console.log('No new codeowners to notify (all previously pinged or issue author is the only codeowner)');
                return;
              }

              // Create comment body
              const mentionString = allMentions.join(', ');
              const commentBody = `${BOT_COMMENT_MARKER}\n👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;

              // Post comment
              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: issue_number,
                body: commentBody
              });

              console.log(`Successfully notified new codeowners: ${mentionString}`);

            } catch (error) {
              console.log('Failed to process codeowner notifications:', error.message);
              console.error(error);
            }
23  .github/workflows/lock.yml  (vendored)
@@ -1,11 +1,28 @@
 ---
-name: Lock closed issues and PRs
+name: Lock

 on:
   schedule:
-    - cron: "30 0 * * *"  # Run daily at 00:30 UTC
+    - cron: "30 0 * * *"
   workflow_dispatch:

+permissions:
+  issues: write
+  pull-requests: write
+
+concurrency:
+  group: lock
+
 jobs:
   lock:
-    uses: esphome/workflows/.github/workflows/lock.yml@main
+    runs-on: ubuntu-latest
+    steps:
+      - uses: dessant/lock-threads@v5.0.1
+        with:
+          pr-inactive-days: "1"
+          pr-lock-reason: ""
+          exclude-any-pr-labels: keep-open
+
+          issue-inactive-days: "7"
+          issue-lock-reason: ""
+          exclude-any-issue-labels: keep-open
120  .github/workflows/release.yml  (vendored)
@@ -18,9 +18,8 @@ jobs:
     outputs:
       tag: ${{ steps.tag.outputs.tag }}
       branch_build: ${{ steps.tag.outputs.branch_build }}
-      deploy_env: ${{ steps.tag.outputs.deploy_env }}
     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7
       - name: Get tag
         id: tag
         # yamllint disable rule:line-length
@@ -28,11 +27,6 @@ jobs:
           if [[ "${{ github.event_name }}" = "release" ]]; then
             TAG="${{ github.event.release.tag_name}}"
             BRANCH_BUILD="false"
-            if [[ "${{ github.event.release.prerelease }}" = "true" ]]; then
-              ENVIRONMENT="beta"
-            else
-              ENVIRONMENT="production"
-            fi
           else
             TAG=$(cat esphome/const.py | sed -n -E "s/^__version__\s+=\s+\"(.+)\"$/\1/p")
             today="$(date --utc '+%Y%m%d')"
@@ -41,15 +35,12 @@ jobs:
             if [[ "$BRANCH" != "dev" ]]; then
               TAG="${TAG}-${BRANCH}"
               BRANCH_BUILD="true"
-              ENVIRONMENT=""
             else
               BRANCH_BUILD="false"
-              ENVIRONMENT="dev"
             fi
           fi
           echo "tag=${TAG}" >> $GITHUB_OUTPUT
           echo "branch_build=${BRANCH_BUILD}" >> $GITHUB_OUTPUT
-          echo "deploy_env=${ENVIRONMENT}" >> $GITHUB_OUTPUT
         # yamllint enable rule:line-length

   deploy-pypi:
@@ -60,46 +51,48 @@ jobs:
       contents: read
       id-token: write
     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
           python-version: "3.x"
+      - name: Set up python environment
+        env:
+          ESPHOME_NO_VENV: 1
+        run: script/setup
       - name: Build
         run: |-
           pip3 install build
           python3 -m build
       - name: Publish
         uses: pypa/gh-action-pypi-publish@v1.12.4
-        with:
-          skip-existing: true

   deploy-docker:
-    name: Build ESPHome ${{ matrix.platform.arch }}
+    name: Build ESPHome ${{ matrix.platform }}
     if: github.repository == 'esphome/esphome'
     permissions:
       contents: read
       packages: write
-    runs-on: ${{ matrix.platform.os }}
+    runs-on: ubuntu-latest
     needs: [init]
     strategy:
      fail-fast: false
      matrix:
        platform:
-          - arch: amd64
-            os: "ubuntu-24.04"
-          - arch: arm64
-            os: "ubuntu-24.04-arm"
+          - linux/amd64
+          - linux/arm64

     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
-          python-version: "3.11"
+          python-version: "3.9"

       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3.11.1
+        uses: docker/setup-buildx-action@v3.10.0
+      - name: Set up QEMU
+        if: matrix.platform != 'linux/amd64'
+        uses: docker/setup-qemu-action@v3.6.0

       - name: Log in to docker hub
         uses: docker/login-action@v3.4.0
@@ -116,36 +109,45 @@ jobs:
       - name: Build docker
         uses: ./.github/actions/build-image
         with:
-          target: final
-          build_type: docker
+          platform: ${{ matrix.platform }}
+          target: docker
+          baseimg: docker
           suffix: ""
           version: ${{ needs.init.outputs.tag }}

       - name: Build ha-addon
         uses: ./.github/actions/build-image
         with:
-          target: final
-          build_type: ha-addon
+          platform: ${{ matrix.platform }}
+          target: hassio
+          baseimg: hassio
           suffix: "hassio"
           version: ${{ needs.init.outputs.tag }}

-      # - name: Build lint
-      #   uses: ./.github/actions/build-image
-      #   with:
-      #     target: lint
-      #     build_type: lint
-      #     suffix: lint
-      #     version: ${{ needs.init.outputs.tag }}
+      - name: Build lint
+        uses: ./.github/actions/build-image
+        with:
+          platform: ${{ matrix.platform }}
+          target: lint
+          baseimg: docker
+          suffix: lint
+          version: ${{ needs.init.outputs.tag }}
+
+      - name: Sanitize platform name
+        id: sanitize
+        run: |
+          echo "${{ matrix.platform }}" | sed 's|/|-|g' > /tmp/platform
+          echo name=$(cat /tmp/platform) >> $GITHUB_OUTPUT

       - name: Upload digests
         uses: actions/upload-artifact@v4.6.2
         with:
-          name: digests-${{ matrix.platform.arch }}
+          name: digests-${{ steps.sanitize.outputs.name }}
           path: /tmp/digests
           retention-days: 1

   deploy-manifest:
-    name: Publish ESPHome ${{ matrix.image.build_type }} to ${{ matrix.registry }}
+    name: Publish ESPHome ${{ matrix.image.title }} to ${{ matrix.registry }}
     runs-on: ubuntu-latest
     needs:
       - init
@@ -158,27 +160,30 @@ jobs:
       fail-fast: false
       matrix:
         image:
-          - build_type: "docker"
-            suffix: ""
-          - build_type: "ha-addon"
+          - title: "ha-addon"
+            target: "hassio"
             suffix: "hassio"
-          # - build_type: "lint"
-          #   suffix: "lint"
+          - title: "docker"
+            target: "docker"
+            suffix: ""
+          - title: "lint"
+            target: "lint"
+            suffix: "lint"
         registry:
           - ghcr
           - dockerhub
     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7

       - name: Download digests
-        uses: actions/download-artifact@v4.3.0
+        uses: actions/download-artifact@v4.2.1
         with:
           pattern: digests-*
           path: /tmp/digests
           merge-multiple: true

       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3.11.1
+        uses: docker/setup-buildx-action@v3.10.0

       - name: Log in to docker hub
         if: matrix.registry == 'dockerhub'
@@ -207,7 +212,7 @@ jobs:
             done

       - name: Create manifest list and push
-        working-directory: /tmp/digests/${{ matrix.image.build_type }}/${{ matrix.registry }}
+        working-directory: /tmp/digests/${{ matrix.image.target }}/${{ matrix.registry }}
         run: |
           docker buildx imagetools create $(jq -Rcnr 'inputs | . / "," | map("-t " + .) | join(" ")' <<< "${{ steps.tags.outputs.tags}}") \
             $(printf '${{ steps.tags.outputs.image }}@sha256:%s ' *)
@@ -238,24 +243,3 @@ jobs:
               content: description
             }
           })

-  deploy-esphome-schema:
-    if: github.repository == 'esphome/esphome' && needs.init.outputs.branch_build == 'false'
-    runs-on: ubuntu-latest
-    needs: [init]
-    environment: ${{ needs.init.outputs.deploy_env }}
-    steps:
-      - name: Trigger Workflow
-        uses: actions/github-script@v7.0.1
-        with:
-          github-token: ${{ secrets.DEPLOY_ESPHOME_SCHEMA_REPO_TOKEN }}
-          script: |
-            github.rest.actions.createWorkflowDispatch({
-              owner: "esphome",
-              repo: "esphome-schema",
-              workflow_id: "generate-schemas.yml",
-              ref: "main",
-              inputs: {
-                version: "${{ needs.init.outputs.tag }}",
-              }
-            })
8  .github/workflows/sync-device-classes.yml  (vendored)
@@ -13,18 +13,18 @@ jobs:
     if: github.repository == 'esphome/esphome'
     steps:
       - name: Checkout
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7

       - name: Checkout Home Assistant
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
         with:
           repository: home-assistant/core
           path: lib/home-assistant

       - name: Setup Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.5.0
         with:
-          python-version: 3.13
+          python-version: 3.12

       - name: Install Home Assistant
         run: |
25  .github/workflows/yaml-lint.yml  (vendored, new file)
@@ -0,0 +1,25 @@
+---
+name: YAML lint
+
+on:
+  push:
+    branches: [dev, beta, release]
+    paths:
+      - "**.yaml"
+      - "**.yml"
+  pull_request:
+    paths:
+      - "**.yaml"
+      - "**.yml"
+
+jobs:
+  yamllint:
+    name: yamllint
+    runs-on: ubuntu-latest
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Run yamllint
+        uses: frenck/action-yamllint@v1.5.0
+        with:
+          strict: true
1  .gitignore  (vendored)
@@ -143,4 +143,3 @@ sdkconfig.*
 /components
 /managed_components

-api-docs/
@@ -1,17 +1,10 @@
 ---
 # See https://pre-commit.com for more information
 # See https://pre-commit.com/hooks.html for more hooks

-ci:
-  autoupdate_commit_msg: 'pre-commit: autoupdate'
-  autoupdate_schedule: off  # Disabled until ruff versions are synced between deps and pre-commit
-  # Skip hooks that have issues in pre-commit CI environment
-  skip: [pylint, clang-tidy-hash]
-
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: v0.12.5
+    rev: v0.11.0
     hooks:
       # Run the linter.
       - id: ruff
@@ -19,7 +12,7 @@ repos:
       # Run the formatter.
       - id: ruff-format
   - repo: https://github.com/PyCQA/flake8
-    rev: 7.3.0
+    rev: 7.2.0
     hooks:
       - id: flake8
         additional_dependencies:
@@ -27,25 +20,22 @@ repos:
           - pydocstyle==5.1.1
         files: ^(esphome|tests)/.+\.py$
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v5.0.0
+    rev: v3.4.0
     hooks:
       - id: no-commit-to-branch
         args:
           - --branch=dev
           - --branch=release
           - --branch=beta
-      - id: end-of-file-fixer
-      - id: trailing-whitespace
   - repo: https://github.com/asottile/pyupgrade
-    rev: v3.20.0
+    rev: v3.15.2
     hooks:
       - id: pyupgrade
-        args: [--py311-plus]
+        args: [--py39-plus]
   - repo: https://github.com/adrienverge/yamllint.git
-    rev: v1.37.1
+    rev: v1.35.1
     hooks:
       - id: yamllint
-        exclude: ^(\.clang-format|\.clang-tidy)$
   - repo: https://github.com/pre-commit/mirrors-clang-format
     rev: v13.0.1
     hooks:
@@ -58,10 +48,3 @@ repos:
         entry: python3 script/run-in-env.py pylint
         language: system
         types: [python]
-      - id: clang-tidy-hash
-        name: Update clang-tidy hash
-        entry: python script/clang_tidy_hash.py --update-if-changed
-        language: python
-        files: ^(\.clang-tidy|platformio\.ini|requirements_dev\.txt)$
-        pass_filenames: false
-        additional_dependencies: []
42  CODEOWNERS
@@ -9,7 +9,6 @@
 pyproject.toml @esphome/core
 esphome/*.py @esphome/core
 esphome/core/* @esphome/core
-.github/** @esphome/core

 # Integrations
 esphome/components/a01nyub/* @MrSuicideParrot
@@ -29,7 +28,7 @@ esphome/components/aic3204/* @kbx81
 esphome/components/airthings_ble/* @jeromelaban
 esphome/components/airthings_wave_base/* @jeromelaban @kpfleming @ncareau
 esphome/components/airthings_wave_mini/* @ncareau
-esphome/components/airthings_wave_plus/* @jeromelaban @precurse
+esphome/components/airthings_wave_plus/* @jeromelaban
 esphome/components/alarm_control_panel/* @grahambrown11 @hwstar
 esphome/components/alpha3/* @jan-hofmeier
 esphome/components/am2315c/* @swoboda1337
@@ -88,7 +87,6 @@ esphome/components/bp1658cj/* @Cossid
 esphome/components/bp5758d/* @Cossid
 esphome/components/button/* @esphome/core
 esphome/components/bytebuffer/* @clydebarrow
-esphome/components/camera/* @DT-art1 @bdraco
 esphome/components/canbus/* @danielschramm @mvturnho
 esphome/components/cap1188/* @mreditor97
 esphome/components/captive_portal/* @OttoWinter
@@ -98,10 +96,8 @@ esphome/components/ch422g/* @clydebarrow @jesterret
 esphome/components/chsc6x/* @kkosik20
 esphome/components/climate/* @esphome/core
 esphome/components/climate_ir/* @glmnet
-esphome/components/cm1106/* @andrewjswan
 esphome/components/color_temperature/* @jesserockz
 esphome/components/combination/* @Cat-Ion @kahrendt
-esphome/components/const/* @esphome/core
 esphome/components/coolix/* @glmnet
 esphome/components/copy/* @OttoWinter
 esphome/components/cover/* @esphome/core
@@ -126,7 +122,6 @@ esphome/components/dht/* @OttoWinter
 esphome/components/display_menu_base/* @numo68
 esphome/components/dps310/* @kbx81
|
||||||
esphome/components/ds1307/* @badbadc0ffee
|
esphome/components/ds1307/* @badbadc0ffee
|
||||||
esphome/components/ds2484/* @mrk-its
|
|
||||||
esphome/components/dsmr/* @glmnet @zuidwijk
|
esphome/components/dsmr/* @glmnet @zuidwijk
|
||||||
esphome/components/duty_time/* @dudanov
|
esphome/components/duty_time/* @dudanov
|
||||||
esphome/components/ee895/* @Stock-M
|
esphome/components/ee895/* @Stock-M
|
||||||
@ -142,19 +137,16 @@ esphome/components/es7210/* @kahrendt
|
|||||||
esphome/components/es7243e/* @kbx81
|
esphome/components/es7243e/* @kbx81
|
||||||
esphome/components/es8156/* @kbx81
|
esphome/components/es8156/* @kbx81
|
||||||
esphome/components/es8311/* @kahrendt @kroimon
|
esphome/components/es8311/* @kahrendt @kroimon
|
||||||
esphome/components/es8388/* @P4uLT
|
|
||||||
esphome/components/esp32/* @esphome/core
|
esphome/components/esp32/* @esphome/core
|
||||||
esphome/components/esp32_ble/* @Rapsssito @jesserockz
|
esphome/components/esp32_ble/* @Rapsssito @jesserockz
|
||||||
esphome/components/esp32_ble_client/* @jesserockz
|
esphome/components/esp32_ble_client/* @jesserockz
|
||||||
esphome/components/esp32_ble_server/* @Rapsssito @clydebarrow @jesserockz
|
esphome/components/esp32_ble_server/* @Rapsssito @clydebarrow @jesserockz
|
||||||
esphome/components/esp32_camera_web_server/* @ayufan
|
esphome/components/esp32_camera_web_server/* @ayufan
|
||||||
esphome/components/esp32_can/* @Sympatron
|
esphome/components/esp32_can/* @Sympatron
|
||||||
esphome/components/esp32_hosted/* @swoboda1337
|
|
||||||
esphome/components/esp32_improv/* @jesserockz
|
esphome/components/esp32_improv/* @jesserockz
|
||||||
esphome/components/esp32_rmt/* @jesserockz
|
esphome/components/esp32_rmt/* @jesserockz
|
||||||
esphome/components/esp32_rmt_led_strip/* @jesserockz
|
esphome/components/esp32_rmt_led_strip/* @jesserockz
|
||||||
esphome/components/esp8266/* @esphome/core
|
esphome/components/esp8266/* @esphome/core
|
||||||
esphome/components/esp_ldo/* @clydebarrow
|
|
||||||
esphome/components/ethernet_info/* @gtjadsonsantos
|
esphome/components/ethernet_info/* @gtjadsonsantos
|
||||||
esphome/components/event/* @nohat
|
esphome/components/event/* @nohat
|
||||||
esphome/components/event_emitter/* @Rapsssito
|
esphome/components/event_emitter/* @Rapsssito
|
||||||
@ -171,13 +163,12 @@ esphome/components/ft5x06/* @clydebarrow
|
|||||||
esphome/components/ft63x6/* @gpambrozio
|
esphome/components/ft63x6/* @gpambrozio
|
||||||
esphome/components/gcja5/* @gcormier
|
esphome/components/gcja5/* @gcormier
|
||||||
esphome/components/gdk101/* @Szewcson
|
esphome/components/gdk101/* @Szewcson
|
||||||
esphome/components/gl_r01_i2c/* @pkejval
|
|
||||||
esphome/components/globals/* @esphome/core
|
esphome/components/globals/* @esphome/core
|
||||||
esphome/components/gp2y1010au0f/* @zry98
|
esphome/components/gp2y1010au0f/* @zry98
|
||||||
esphome/components/gp8403/* @jesserockz
|
esphome/components/gp8403/* @jesserockz
|
||||||
esphome/components/gpio/* @esphome/core
|
esphome/components/gpio/* @esphome/core
|
||||||
esphome/components/gpio/one_wire/* @ssieb
|
esphome/components/gpio/one_wire/* @ssieb
|
||||||
esphome/components/gps/* @coogle @ximex
|
esphome/components/gps/* @coogle
|
||||||
esphome/components/graph/* @synco
|
esphome/components/graph/* @synco
|
||||||
esphome/components/graphical_display_menu/* @MrMDavidson
|
esphome/components/graphical_display_menu/* @MrMDavidson
|
||||||
esphome/components/gree/* @orestismers
|
esphome/components/gree/* @orestismers
|
||||||
@ -241,29 +232,24 @@ esphome/components/kamstrup_kmp/* @cfeenstra1024
|
|||||||
esphome/components/key_collector/* @ssieb
|
esphome/components/key_collector/* @ssieb
|
||||||
esphome/components/key_provider/* @ssieb
|
esphome/components/key_provider/* @ssieb
|
||||||
esphome/components/kuntze/* @ssieb
|
esphome/components/kuntze/* @ssieb
|
||||||
esphome/components/lc709203f/* @ilikecake
|
|
||||||
esphome/components/lcd_menu/* @numo68
|
esphome/components/lcd_menu/* @numo68
|
||||||
esphome/components/ld2410/* @regevbr @sebcaps
|
esphome/components/ld2410/* @regevbr @sebcaps
|
||||||
esphome/components/ld2420/* @descipher
|
esphome/components/ld2420/* @descipher
|
||||||
esphome/components/ld2450/* @hareeshmu
|
esphome/components/ld2450/* @hareeshmu
|
||||||
esphome/components/ld24xx/* @kbx81
|
|
||||||
esphome/components/ledc/* @OttoWinter
|
esphome/components/ledc/* @OttoWinter
|
||||||
esphome/components/libretiny/* @kuba2k2
|
esphome/components/libretiny/* @kuba2k2
|
||||||
esphome/components/libretiny_pwm/* @kuba2k2
|
esphome/components/libretiny_pwm/* @kuba2k2
|
||||||
esphome/components/light/* @esphome/core
|
esphome/components/light/* @esphome/core
|
||||||
esphome/components/lightwaverf/* @max246
|
esphome/components/lightwaverf/* @max246
|
||||||
esphome/components/lilygo_t5_47/touchscreen/* @jesserockz
|
esphome/components/lilygo_t5_47/touchscreen/* @jesserockz
|
||||||
esphome/components/ln882x/* @lamauny
|
|
||||||
esphome/components/lock/* @esphome/core
|
esphome/components/lock/* @esphome/core
|
||||||
esphome/components/logger/* @esphome/core
|
esphome/components/logger/* @esphome/core
|
||||||
esphome/components/logger/select/* @clydebarrow
|
esphome/components/logger/select/* @clydebarrow
|
||||||
esphome/components/lps22/* @nagisa
|
|
||||||
esphome/components/ltr390/* @latonita @sjtrny
|
esphome/components/ltr390/* @latonita @sjtrny
|
||||||
esphome/components/ltr501/* @latonita
|
esphome/components/ltr501/* @latonita
|
||||||
esphome/components/ltr_als_ps/* @latonita
|
esphome/components/ltr_als_ps/* @latonita
|
||||||
esphome/components/lvgl/* @clydebarrow
|
esphome/components/lvgl/* @clydebarrow
|
||||||
esphome/components/m5stack_8angle/* @rnauber
|
esphome/components/m5stack_8angle/* @rnauber
|
||||||
esphome/components/mapping/* @clydebarrow
|
|
||||||
esphome/components/matrix_keypad/* @ssieb
|
esphome/components/matrix_keypad/* @ssieb
|
||||||
esphome/components/max17043/* @blacknell
|
esphome/components/max17043/* @blacknell
|
||||||
esphome/components/max31865/* @DAVe3283
|
esphome/components/max31865/* @DAVe3283
|
||||||
@ -290,12 +276,10 @@ esphome/components/mdns/* @esphome/core
|
|||||||
esphome/components/media_player/* @jesserockz
|
esphome/components/media_player/* @jesserockz
|
||||||
esphome/components/micro_wake_word/* @jesserockz @kahrendt
|
esphome/components/micro_wake_word/* @jesserockz @kahrendt
|
||||||
esphome/components/micronova/* @jorre05
|
esphome/components/micronova/* @jorre05
|
||||||
esphome/components/microphone/* @jesserockz @kahrendt
|
esphome/components/microphone/* @jesserockz
|
||||||
esphome/components/mics_4514/* @jesserockz
|
esphome/components/mics_4514/* @jesserockz
|
||||||
esphome/components/midea/* @dudanov
|
esphome/components/midea/* @dudanov
|
||||||
esphome/components/midea_ir/* @dudanov
|
esphome/components/midea_ir/* @dudanov
|
||||||
esphome/components/mipi_dsi/* @clydebarrow
|
|
||||||
esphome/components/mipi_spi/* @clydebarrow
|
|
||||||
esphome/components/mitsubishi/* @RubyBailey
|
esphome/components/mitsubishi/* @RubyBailey
|
||||||
esphome/components/mixer/speaker/* @kahrendt
|
esphome/components/mixer/speaker/* @kahrendt
|
||||||
esphome/components/mlx90393/* @functionpointer
|
esphome/components/mlx90393/* @functionpointer
|
||||||
@ -327,27 +311,20 @@ esphome/components/nextion/text_sensor/* @senexcrenshaw
|
|||||||
esphome/components/nfc/* @jesserockz @kbx81
|
esphome/components/nfc/* @jesserockz @kbx81
|
||||||
esphome/components/noblex/* @AGalfra
|
esphome/components/noblex/* @AGalfra
|
||||||
esphome/components/npi19/* @bakerkj
|
esphome/components/npi19/* @bakerkj
|
||||||
esphome/components/nrf52/* @tomaszduda23
|
|
||||||
esphome/components/number/* @esphome/core
|
esphome/components/number/* @esphome/core
|
||||||
esphome/components/one_wire/* @ssieb
|
esphome/components/one_wire/* @ssieb
|
||||||
esphome/components/online_image/* @clydebarrow @guillempages
|
esphome/components/online_image/* @clydebarrow @guillempages
|
||||||
esphome/components/opentherm/* @olegtarasov
|
esphome/components/opentherm/* @olegtarasov
|
||||||
esphome/components/openthread/* @mrene
|
|
||||||
esphome/components/opt3001/* @ccutrer
|
|
||||||
esphome/components/ota/* @esphome/core
|
esphome/components/ota/* @esphome/core
|
||||||
esphome/components/output/* @esphome/core
|
esphome/components/output/* @esphome/core
|
||||||
esphome/components/packet_transport/* @clydebarrow
|
|
||||||
esphome/components/pca6416a/* @Mat931
|
esphome/components/pca6416a/* @Mat931
|
||||||
esphome/components/pca9554/* @clydebarrow @hwstar
|
esphome/components/pca9554/* @clydebarrow @hwstar
|
||||||
esphome/components/pcf85063/* @brogon
|
esphome/components/pcf85063/* @brogon
|
||||||
esphome/components/pcf8563/* @KoenBreeman
|
esphome/components/pcf8563/* @KoenBreeman
|
||||||
esphome/components/pi4ioe5v6408/* @jesserockz
|
|
||||||
esphome/components/pid/* @OttoWinter
|
esphome/components/pid/* @OttoWinter
|
||||||
esphome/components/pipsolar/* @andreashergert1984
|
esphome/components/pipsolar/* @andreashergert1984
|
||||||
esphome/components/pm1006/* @habbie
|
esphome/components/pm1006/* @habbie
|
||||||
esphome/components/pm2005/* @andrewjswan
|
|
||||||
esphome/components/pmsa003i/* @sjtrny
|
esphome/components/pmsa003i/* @sjtrny
|
||||||
esphome/components/pmsx003/* @ximex
|
|
||||||
esphome/components/pmwcs3/* @SeByDocKy
|
esphome/components/pmwcs3/* @SeByDocKy
|
||||||
esphome/components/pn532/* @OttoWinter @jesserockz
|
esphome/components/pn532/* @OttoWinter @jesserockz
|
||||||
esphome/components/pn532_i2c/* @OttoWinter @jesserockz
|
esphome/components/pn532_i2c/* @OttoWinter @jesserockz
|
||||||
@ -382,7 +359,6 @@ esphome/components/rp2040_pwm/* @jesserockz
|
|||||||
esphome/components/rpi_dpi_rgb/* @clydebarrow
|
esphome/components/rpi_dpi_rgb/* @clydebarrow
|
||||||
esphome/components/rtl87xx/* @kuba2k2
|
esphome/components/rtl87xx/* @kuba2k2
|
||||||
esphome/components/rtttl/* @glmnet
|
esphome/components/rtttl/* @glmnet
|
||||||
esphome/components/runtime_stats/* @bdraco
|
|
||||||
esphome/components/safe_mode/* @jsuanet @kbx81 @paulmonigatti
|
esphome/components/safe_mode/* @jsuanet @kbx81 @paulmonigatti
|
||||||
esphome/components/scd4x/* @martgras @sjtrny
|
esphome/components/scd4x/* @martgras @sjtrny
|
||||||
esphome/components/script/* @esphome/core
|
esphome/components/script/* @esphome/core
|
||||||
@ -417,7 +393,6 @@ esphome/components/smt100/* @piechade
|
|||||||
esphome/components/sn74hc165/* @jesserockz
|
esphome/components/sn74hc165/* @jesserockz
|
||||||
esphome/components/socket/* @esphome/core
|
esphome/components/socket/* @esphome/core
|
||||||
esphome/components/sonoff_d1/* @anatoly-savchenkov
|
esphome/components/sonoff_d1/* @anatoly-savchenkov
|
||||||
esphome/components/sound_level/* @kahrendt
|
|
||||||
esphome/components/speaker/* @jesserockz @kahrendt
|
esphome/components/speaker/* @jesserockz @kahrendt
|
||||||
esphome/components/speaker/media_player/* @kahrendt @synesthesiam
|
esphome/components/speaker/media_player/* @kahrendt @synesthesiam
|
||||||
esphome/components/spi/* @clydebarrow @esphome/core
|
esphome/components/spi/* @clydebarrow @esphome/core
|
||||||
@ -449,9 +424,6 @@ esphome/components/sun/* @OttoWinter
|
|||||||
esphome/components/sun_gtil2/* @Mat931
|
esphome/components/sun_gtil2/* @Mat931
|
||||||
esphome/components/switch/* @esphome/core
|
esphome/components/switch/* @esphome/core
|
||||||
esphome/components/switch/binary_sensor/* @ssieb
|
esphome/components/switch/binary_sensor/* @ssieb
|
||||||
esphome/components/sx126x/* @swoboda1337
|
|
||||||
esphome/components/sx127x/* @swoboda1337
|
|
||||||
esphome/components/syslog/* @clydebarrow
|
|
||||||
esphome/components/t6615/* @tylermenezes
|
esphome/components/t6615/* @tylermenezes
|
||||||
esphome/components/tc74/* @sethgirvan
|
esphome/components/tc74/* @sethgirvan
|
||||||
esphome/components/tca9548a/* @andreashergert1984
|
esphome/components/tca9548a/* @andreashergert1984
|
||||||
@ -491,25 +463,21 @@ esphome/components/tuya/switch/* @jesserockz
|
|||||||
esphome/components/tuya/text_sensor/* @dentra
|
esphome/components/tuya/text_sensor/* @dentra
|
||||||
esphome/components/uart/* @esphome/core
|
esphome/components/uart/* @esphome/core
|
||||||
esphome/components/uart/button/* @ssieb
|
esphome/components/uart/button/* @ssieb
|
||||||
esphome/components/uart/packet_transport/* @clydebarrow
|
|
||||||
esphome/components/udp/* @clydebarrow
|
esphome/components/udp/* @clydebarrow
|
||||||
esphome/components/ufire_ec/* @pvizeli
|
esphome/components/ufire_ec/* @pvizeli
|
||||||
esphome/components/ufire_ise/* @pvizeli
|
esphome/components/ufire_ise/* @pvizeli
|
||||||
esphome/components/ultrasonic/* @OttoWinter
|
esphome/components/ultrasonic/* @OttoWinter
|
||||||
esphome/components/update/* @jesserockz
|
esphome/components/update/* @jesserockz
|
||||||
esphome/components/uponor_smatrix/* @kroimon
|
esphome/components/uponor_smatrix/* @kroimon
|
||||||
esphome/components/usb_host/* @clydebarrow
|
|
||||||
esphome/components/usb_uart/* @clydebarrow
|
|
||||||
esphome/components/valve/* @esphome/core
|
esphome/components/valve/* @esphome/core
|
||||||
esphome/components/vbus/* @ssieb
|
esphome/components/vbus/* @ssieb
|
||||||
esphome/components/veml3235/* @kbx81
|
esphome/components/veml3235/* @kbx81
|
||||||
esphome/components/veml7700/* @latonita
|
esphome/components/veml7700/* @latonita
|
||||||
esphome/components/version/* @esphome/core
|
esphome/components/version/* @esphome/core
|
||||||
esphome/components/voice_assistant/* @jesserockz @kahrendt
|
esphome/components/voice_assistant/* @jesserockz
|
||||||
esphome/components/wake_on_lan/* @clydebarrow @willwill2will54
|
esphome/components/wake_on_lan/* @clydebarrow @willwill2will54
|
||||||
esphome/components/watchdog/* @oarcher
|
esphome/components/watchdog/* @oarcher
|
||||||
esphome/components/waveshare_epaper/* @clydebarrow
|
esphome/components/waveshare_epaper/* @clydebarrow
|
||||||
esphome/components/web_server/ota/* @esphome/core
|
|
||||||
esphome/components/web_server_base/* @OttoWinter
|
esphome/components/web_server_base/* @OttoWinter
|
||||||
esphome/components/web_server_idf/* @dentra
|
esphome/components/web_server_idf/* @dentra
|
||||||
esphome/components/weikai/* @DrCoolZic
|
esphome/components/weikai/* @DrCoolZic
|
||||||
@ -536,10 +504,8 @@ esphome/components/xiaomi_lywsd03mmc/* @ahpohl
|
|||||||
esphome/components/xiaomi_mhoc303/* @drug123
|
esphome/components/xiaomi_mhoc303/* @drug123
|
||||||
esphome/components/xiaomi_mhoc401/* @vevsvevs
|
esphome/components/xiaomi_mhoc401/* @vevsvevs
|
||||||
esphome/components/xiaomi_rtcgq02lm/* @jesserockz
|
esphome/components/xiaomi_rtcgq02lm/* @jesserockz
|
||||||
esphome/components/xiaomi_xmwsdj04mmc/* @medusalix
|
|
||||||
esphome/components/xl9535/* @mreditor97
|
esphome/components/xl9535/* @mreditor97
|
||||||
esphome/components/xpt2046/touchscreen/* @nielsnl68 @numo68
|
esphome/components/xpt2046/touchscreen/* @nielsnl68 @numo68
|
||||||
esphome/components/xxtea/* @clydebarrow
|
esphome/components/xxtea/* @clydebarrow
|
||||||
esphome/components/zephyr/* @tomaszduda23
|
|
||||||
esphome/components/zhlt01/* @cfeenstra1024
|
esphome/components/zhlt01/* @cfeenstra1024
|
||||||
esphome/components/zio_ultrasonic/* @kahrendt
|
esphome/components/zio_ultrasonic/* @kahrendt
|
||||||
|
@@ -7,7 +7,7 @@ project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).
 
 **See also:**
 
-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
 
 ---
 
@@ -9,7 +9,7 @@
 
 ---
 
-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
 
 ---
 
@@ -1,56 +1,131 @@
-ARG BUILD_VERSION=dev
-ARG BUILD_OS=alpine
-ARG BUILD_BASE_VERSION=2025.04.0
-ARG BUILD_TYPE=docker
+# Build these with the build.py script
+# Example:
+#   python3 docker/build.py --tag dev --arch amd64 --build-type docker build
 
-FROM ghcr.io/esphome/docker-base:${BUILD_OS}-${BUILD_BASE_VERSION} AS base-source-docker
-FROM ghcr.io/esphome/docker-base:${BUILD_OS}-ha-addon-${BUILD_BASE_VERSION} AS base-source-ha-addon
+# One of "docker", "hassio"
+ARG BASEIMGTYPE=docker
 
-ARG BUILD_TYPE
-FROM base-source-${BUILD_TYPE} AS base
 
-RUN git config --system --add safe.directory "*"
+# https://github.com/hassio-addons/addon-debian-base/releases
+FROM ghcr.io/hassio-addons/debian-base:7.2.0 AS base-hassio
+# https://hub.docker.com/_/debian?tab=tags&page=1&name=bookworm
+FROM debian:12.2-slim AS base-docker
 
-ENV PIP_DISABLE_PIP_VERSION_CHECK=1
+FROM base-${BASEIMGTYPE} AS base
 
-RUN pip install --no-cache-dir -U pip uv==0.6.14
 
-COPY requirements.txt /
+ARG TARGETARCH
+ARG TARGETVARIANT
 
 
+# Note that --break-system-packages is used below because
+# https://peps.python.org/pep-0668/ added a safety check that prevents
+# installing packages with the same name as a system package. This is
+# not a problem for us because we are not concerned about overwriting
+# system packages because we are running in an isolated container.
 
 RUN \
-    uv pip install --no-cache-dir \
-    -r /requirements.txt
+    apt-get update \
+    # Use pinned versions so that we get updates with build caching
+    && apt-get install -y --no-install-recommends \
+        python3-pip=23.0.1+dfsg-1 \
+        python3-setuptools=66.1.1-1+deb12u1 \
+        python3-venv=3.11.2-1+b1 \
+        python3-wheel=0.38.4-2 \
+        iputils-ping=3:20221126-1+deb12u1 \
+        git=1:2.39.5-0+deb12u2 \
+        curl=7.88.1-10+deb12u12 \
+        openssh-client=1:9.2p1-2+deb12u5 \
+        python3-cffi=1.15.1-5 \
+        libcairo2=1.16.0-7 \
+        libmagic1=1:5.44-3 \
+        patch=2.7.6-7 \
+    && rm -rf \
+        /tmp/* \
+        /var/{cache,log}/* \
+        /var/lib/apt/lists/*
 
+ENV \
+    # Fix click python3 lang warning https://click.palletsprojects.com/en/7.x/python3/
+    LANG=C.UTF-8 LC_ALL=C.UTF-8 \
+    # Store globally installed pio libs in /piolibs
+    PLATFORMIO_GLOBALLIB_DIR=/piolibs
 
 RUN \
-    platformio settings set enable_telemetry No \
+    pip3 install \
+    --break-system-packages --no-cache-dir \
+    # Keep platformio version in sync with requirements.txt
+    platformio==6.1.18 \
+    # Change some platformio settings
+    && platformio settings set enable_telemetry No \
     && platformio settings set check_platformio_interval 1000000 \
     && mkdir -p /piolibs
 
 
+# First install requirements to leverage caching when requirements don't change
+# tmpfs is for https://github.com/rust-lang/cargo/issues/8719
 
+COPY requirements.txt requirements_optional.txt /
+RUN --mount=type=tmpfs,target=/root/.cargo <<END-OF-RUN
+    # Fail on any non-zero status
+    set -e
+
+    # install build tools in case wheels are not available
+    BUILD_DEPS="
+        build-essential=12.9
+        python3-dev=3.11.2-1+b1
+        zlib1g-dev=1:1.2.13.dfsg-1
+        libjpeg-dev=1:2.1.5-2
+        libfreetype-dev=2.12.1+dfsg-5+deb12u4
+        libssl-dev=3.0.15-1~deb12u1
+        libffi-dev=3.4.4-1
+        cargo=0.66.0+ds1-1
+        pkg-config=1.8.1-1
+    "
+    LIB_DEPS="
+        libtiff6=4.5.0-6+deb12u1
+        libopenjp2-7=2.5.0-2+deb12u1
+    "
+    if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ]
+    then
+        apt-get update
+        apt-get install -y --no-install-recommends $BUILD_DEPS $LIB_DEPS
+    fi
+
+    CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse CARGO_HOME=/root/.cargo
+    pip3 install --break-system-packages --no-cache-dir -r /requirements.txt -r /requirements_optional.txt
+
+    if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ]
+    then
+        apt-get remove -y --purge --auto-remove $BUILD_DEPS
+        rm -rf /tmp/* /var/{cache,log}/* /var/lib/apt/lists/*
+    fi
+END-OF-RUN
 
 
 COPY script/platformio_install_deps.py platformio.ini /
 RUN /platformio_install_deps.py /platformio.ini --libraries
 
-ARG BUILD_VERSION
-LABEL \
-    org.opencontainers.image.authors="The ESPHome Authors" \
-    org.opencontainers.image.title="ESPHome" \
-    org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
-    org.opencontainers.image.url="https://esphome.io/" \
-    org.opencontainers.image.documentation="https://esphome.io/" \
-    org.opencontainers.image.source="https://github.com/esphome/esphome" \
-    org.opencontainers.image.licenses="ESPHome" \
-    org.opencontainers.image.version=${BUILD_VERSION}
+# Avoid unsafe git error when container user and file config volume permissions don't match
+RUN git config --system --add safe.directory '*'
 
 
 # ======================= docker-type image =======================
-FROM base AS base-docker
+FROM base AS docker
 
+# Copy esphome and install
+COPY . /esphome
+RUN pip3 install --break-system-packages --no-cache-dir -e /esphome
 
+# Settings for dashboard
+ENV USERNAME="" PASSWORD=""
 
 # Expose the dashboard to Docker
 EXPOSE 6052
 
 # Run healthcheck (heartbeat)
 HEALTHCHECK --interval=30s --timeout=30s \
     CMD curl --fail http://localhost:6052/version -A "HealthCheck" || exit 1
 
 COPY docker/docker_entrypoint.sh /entrypoint.sh
 
@@ -64,13 +139,43 @@ ENTRYPOINT ["/entrypoint.sh"]
 CMD ["dashboard", "/config"]
 
 
-# ======================= ha-addon-type image =======================
-FROM base AS base-ha-addon
+ARG BUILD_VERSION=dev
+
+# Labels
+LABEL \
+    org.opencontainers.image.authors="The ESPHome Authors" \
+    org.opencontainers.image.title="ESPHome" \
+    org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
+    org.opencontainers.image.url="https://esphome.io/" \
+    org.opencontainers.image.documentation="https://esphome.io/" \
+    org.opencontainers.image.source="https://github.com/esphome/esphome" \
+    org.opencontainers.image.licenses="ESPHome" \
+    org.opencontainers.image.version=${BUILD_VERSION}
 
 
+# ======================= hassio-type image =======================
+FROM base AS hassio
 
+RUN \
+    apt-get update \
+    # Use pinned versions so that we get updates with build caching
+    && apt-get install -y --no-install-recommends \
+        nginx-light=1.22.1-9+deb12u1 \
+    && rm -rf \
+        /tmp/* \
+        /var/{cache,log}/* \
+        /var/lib/apt/lists/*
 
+ARG BUILD_VERSION=dev
 
 # Copy root filesystem
 COPY docker/ha-addon-rootfs/ /
 
-ARG BUILD_VERSION
+# Copy esphome and install
+COPY . /esphome
+RUN pip3 install --break-system-packages --no-cache-dir -e /esphome
 
+# Labels
 LABEL \
     io.hass.name="ESPHome" \
     io.hass.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
@@ -78,9 +183,35 @@ LABEL \
     io.hass.version="${BUILD_VERSION}"
 # io.hass.arch is inherited from addon-debian-base
 
-ARG BUILD_TYPE
-FROM base-${BUILD_TYPE} AS final
 
-# Copy esphome and install
-COPY . /esphome
-RUN uv pip install --no-cache-dir -e /esphome
+# ======================= lint-type image =======================
+FROM base AS lint
 
+ENV \
+    PLATFORMIO_CORE_DIR=/esphome/.temp/platformio
+
+RUN \
+    curl -L https://apt.llvm.org/llvm-snapshot.gpg.key -o /etc/apt/trusted.gpg.d/apt.llvm.org.asc \
+    && echo "deb http://apt.llvm.org/bookworm/ llvm-toolchain-bookworm-18 main" > /etc/apt/sources.list.d/llvm.sources.list \
+    && apt-get update \
+    # Use pinned versions so that we get updates with build caching
+    && apt-get install -y --no-install-recommends \
+        clang-format-13=1:13.0.1-11+b2 \
+        patch=2.7.6-7 \
+        software-properties-common=0.99.30-4.1~deb12u1 \
+        nano=7.2-1+deb12u1 \
+        build-essential=12.9 \
+        python3-dev=3.11.2-1+b1 \
+        clang-tidy-18=1:18.1.8~++20240731024826+3b5b5c1ec4a3-1~exp1~20240731144843.145 \
+    && rm -rf \
+        /tmp/* \
+        /var/{cache,log}/* \
+        /var/lib/apt/lists/*
+
+COPY requirements_test.txt /
+RUN pip3 install --break-system-packages --no-cache-dir -r /requirements_test.txt
+
+VOLUME ["/esphome"]
+WORKDIR /esphome
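The `FROM base-${BASEIMGTYPE} AS base` line on the 2025.4.0 side selects the Debian or Home Assistant add-on base by string-formatting the build arg into a stage name. A minimal sketch of that stage-name resolution (the helper function and error handling are illustrative, not part of the Dockerfile; only the stage names come from the diff above):

```python
# Sketch: how the BASEIMGTYPE build arg resolves to a build stage name.
# Stage names base-docker / base-hassio are taken from the Dockerfile diff;
# the function itself is a hypothetical illustration.
def base_stage(baseimgtype: str = "docker") -> str:
    stage = f"base-{baseimgtype}"
    if stage not in ("base-docker", "base-hassio"):
        raise ValueError(f"unknown base image type: {baseimgtype}")
    return stage

print(base_stage())          # -> base-docker
print(base_stage("hassio"))  # -> base-hassio
```

Passing `--build-arg BASEIMGTYPE=hassio` to `docker buildx build` would therefore make every later stage inherit from the add-on base image instead of plain Debian.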
@@ -54,7 +54,7 @@ manifest_parser = subparsers.add_parser(
 class DockerParams:
     build_to: str
     manifest_to: str
-    build_type: str
+    baseimgtype: str
     platform: str
     target: str
 
@@ -66,19 +66,24 @@ class DockerParams:
             TYPE_LINT: "esphome/esphome-lint",
         }[build_type]
         build_to = f"{prefix}-{arch}"
+        baseimgtype = {
+            TYPE_DOCKER: "docker",
+            TYPE_HA_ADDON: "hassio",
+            TYPE_LINT: "docker",
+        }[build_type]
         platform = {
             ARCH_AMD64: "linux/amd64",
             ARCH_AARCH64: "linux/arm64",
         }[arch]
         target = {
-            TYPE_DOCKER: "final",
-            TYPE_HA_ADDON: "final",
+            TYPE_DOCKER: "docker",
+            TYPE_HA_ADDON: "hassio",
             TYPE_LINT: "lint",
         }[build_type]
         return cls(
             build_to=build_to,
             manifest_to=prefix,
-            build_type=build_type,
+            baseimgtype=baseimgtype,
             platform=platform,
             target=target,
         )
@@ -140,7 +145,7 @@ def main():
             "buildx",
             "build",
             "--build-arg",
-            f"BUILD_TYPE={params.build_type}",
+            f"BASEIMGTYPE={params.baseimgtype}",
             "--build-arg",
             f"BUILD_VERSION={args.tag}",
             "--cache-from",
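The `docker/build.py` hunks above replace the single `build_type` parameter with two independent values: `baseimgtype` (which base image the Dockerfile starts from) and `target` (which buildx stage to build). A standalone sketch of the 2025.4.0-side mapping, assuming the `TYPE_*` constants are plain strings as elsewhere in the script (their exact values here are illustrative):

```python
# Sketch of the 2025.4.0-side DockerParams mapping from the diff above.
# The TYPE_* string values are assumptions for illustration.
TYPE_DOCKER = "docker"
TYPE_HA_ADDON = "ha-addon"
TYPE_LINT = "lint"

def docker_params(build_type: str) -> tuple[str, str]:
    """Return (baseimgtype, buildx target) for a build type."""
    baseimgtype = {
        TYPE_DOCKER: "docker",
        TYPE_HA_ADDON: "hassio",
        TYPE_LINT: "docker",  # lint image reuses the plain Debian base
    }[build_type]
    target = {
        TYPE_DOCKER: "docker",
        TYPE_HA_ADDON: "hassio",
        TYPE_LINT: "lint",
    }[build_type]
    return baseimgtype, target
```

Note why two mappings are needed: the lint image shares the docker base (`baseimgtype == "docker"`) but builds a different stage (`target == "lint"`), so one value cannot be derived from the other.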
@@ -34,15 +34,16 @@ from esphome.const import (
    CONF_PORT,
    CONF_SUBSTITUTIONS,
    CONF_TOPIC,
-    ENV_NOGITIGNORE,
+    PLATFORM_BK72XX,
    PLATFORM_ESP32,
    PLATFORM_ESP8266,
    PLATFORM_RP2040,
+    PLATFORM_RTL87XX,
    SECRETS_FILES,
)
from esphome.core import CORE, EsphomeError, coroutine
from esphome.helpers import get_bool_env, indent, is_ip_address
-from esphome.log import AnsiFore, color, setup_log
+from esphome.log import Fore, color, setup_log
from esphome.util import (
    get_serial_ports,
    list_yaml_files,
@@ -82,16 +83,16 @@ def choose_prompt(options, purpose: str = None):
                raise ValueError
            break
        except ValueError:
-            safe_print(color(AnsiFore.RED, f"Invalid option: '{opt}'"))
+            safe_print(color(Fore.RED, f"Invalid option: '{opt}'"))
    return options[opt - 1][1]


def choose_upload_log_host(
    default, check_default, show_ota, show_mqtt, show_api, purpose: str = None
):
-    options = [
-        (f"{port.path} ({port.description})", port.path) for port in get_serial_ports()
-    ]
+    options = []
+    for port in get_serial_ports():
+        options.append((f"{port.path} ({port.description})", port.path))
    if default == "SERIAL":
        return choose_prompt(options, purpose=purpose)
    if (show_ota and "ota" in CORE.config) or (show_api and "api" in CORE.config):
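The `options` change above is purely stylistic: the dev-side list comprehension and the 2025.4.0-side append loop build the same list. A quick sketch with plain tuples standing in for real serial-port objects:

```python
# Hypothetical port data; real code gets objects from get_serial_ports().
ports = [("/dev/ttyUSB0", "CP2102"), ("/dev/ttyACM0", "ESP32-S3")]

# 2025.4.0 style: explicit loop with append.
options_loop = []
for path, description in ports:
    options_loop.append((f"{path} ({description})", path))

# dev style: the same list built as a comprehension.
options_comp = [(f"{path} ({description})", path) for path, description in ports]

assert options_loop == options_comp
```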
@@ -119,7 +120,9 @@ def mqtt_logging_enabled(mqtt_config):
        return False
    if CONF_TOPIC not in log_topic:
        return False
-    return log_topic.get(CONF_LEVEL, None) != "NONE"
+    if log_topic.get(CONF_LEVEL, None) == "NONE":
+        return False
+    return True


def get_port_type(port):
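The final check in `mqtt_logging_enabled` was collapsed on dev from an `if`/`return True` pair into a single boolean expression; the two forms are equivalent for every input. A sketch with a plain dict standing in for the parsed `log_topic` config:

```python
CONF_LEVEL = "level"


def enabled_old(log_topic: dict) -> bool:
    # 2025.4.0 form: explicit branch.
    if log_topic.get(CONF_LEVEL, None) == "NONE":
        return False
    return True


def enabled_new(log_topic: dict) -> bool:
    # dev form: the same predicate as one expression.
    return log_topic.get(CONF_LEVEL, None) != "NONE"


for cfg in ({}, {CONF_LEVEL: "NONE"}, {CONF_LEVEL: "DEBUG"}):
    assert enabled_old(cfg) == enabled_new(cfg)
```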
@@ -131,7 +134,6 @@ def get_port_type(port):


def run_miniterm(config, port, args):
-    from aioesphomeapi import LogParser
    import serial

    from esphome import platformio_api
@@ -156,7 +158,6 @@ def run_miniterm(config, port, args):
    ser.dtr = False
    ser.rts = False

-    parser = LogParser()
    tries = 0
    while tries < 5:
        try:
@@ -173,7 +174,8 @@ def run_miniterm(config, port, args):
                    .decode("utf8", "backslashreplace")
                )
                time_str = datetime.now().time().strftime("[%H:%M:%S]")
-                safe_print(parser.parse_line(line, time_str))
+                message = time_str + line
+                safe_print(message)

                backtrace_state = platformio_api.process_stacktrace(
                    config, line, backtrace_state=backtrace_state
@@ -208,9 +210,6 @@ def wrap_to_code(name, comp):


def write_cpp(config):
-    if not get_bool_env(ENV_NOGITIGNORE):
-        writer.write_gitignore()
-
    generate_cpp_contents(config)
    return write_cpp_file()

@@ -227,13 +226,10 @@ def generate_cpp_contents(config):


def write_cpp_file():
+    writer.write_platformio_project()
+
    code_s = indent(CORE.cpp_main_section)
    writer.write_cpp(code_s)
-
-    from esphome.build_gen import platformio
-
-    platformio.write_project()
-
    return 0

@@ -357,7 +353,7 @@ def upload_program(config, args, host):
    if CORE.target_platform in (PLATFORM_RP2040):
        return upload_using_platformio(config, args.device)

-    if CORE.is_libretiny:
+    if CORE.target_platform in (PLATFORM_BK72XX, PLATFORM_RTL87XX):
        return upload_using_platformio(config, host)

    return 1  # Unknown target platform
@@ -597,38 +593,33 @@ def command_update_all(args):
        middle_text = f" {middle_text} "
        width = len(click.unstyle(middle_text))
        half_line = "=" * ((twidth - width) // 2)
-        safe_print(f"{half_line}{middle_text}{half_line}")
+        click.echo(f"{half_line}{middle_text}{half_line}")

    for f in files:
-        safe_print(f"Updating {color(AnsiFore.CYAN, f)}")
-        safe_print("-" * twidth)
-        safe_print()
-        if CORE.dashboard:
-            rc = run_external_process(
-                "esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
-            )
-        else:
-            rc = run_external_process(
-                "esphome", "run", f, "--no-logs", "--device", "OTA"
-            )
+        print(f"Updating {color(Fore.CYAN, f)}")
+        print("-" * twidth)
+        print()
+        rc = run_external_process(
+            "esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
+        )
        if rc == 0:
-            print_bar(f"[{color(AnsiFore.BOLD_GREEN, 'SUCCESS')}] {f}")
+            print_bar(f"[{color(Fore.BOLD_GREEN, 'SUCCESS')}] {f}")
            success[f] = True
        else:
-            print_bar(f"[{color(AnsiFore.BOLD_RED, 'ERROR')}] {f}")
+            print_bar(f"[{color(Fore.BOLD_RED, 'ERROR')}] {f}")
            success[f] = False

-        safe_print()
-        safe_print()
-        safe_print()
+        print()
+        print()
+        print()

-    print_bar(f"[{color(AnsiFore.BOLD_WHITE, 'SUMMARY')}]")
+    print_bar(f"[{color(Fore.BOLD_WHITE, 'SUMMARY')}]")
    failed = 0
    for f in files:
        if success[f]:
-            safe_print(f" - {f}: {color(AnsiFore.GREEN, 'SUCCESS')}")
+            print(f" - {f}: {color(Fore.GREEN, 'SUCCESS')}")
        else:
-            safe_print(f" - {f}: {color(AnsiFore.BOLD_RED, 'FAILED')}")
+            print(f" - {f}: {color(Fore.BOLD_RED, 'FAILED')}")
            failed += 1
    return failed

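The `print_bar` helper in this hunk centers a label between runs of `=` so the banner fills the terminal width. The arithmetic can be sketched in isolation (the real code also strips ANSI codes with `click.unstyle`, omitted here since the sample text carries none):

```python
def print_bar(middle_text: str, twidth: int = 40) -> str:
    # Pad the label with one space on each side, then split the remaining
    # width evenly into two '=' runs (integer division may drop one column).
    middle_text = f" {middle_text} "
    width = len(middle_text)
    half_line = "=" * ((twidth - width) // 2)
    return f"{half_line}{middle_text}{half_line}"
```

With `twidth=40` and the label `[SUMMARY]` (11 characters once padded), each half-line is `(40 - 11) // 2 = 14` characters, so the banner is 39 columns wide, one short of the terminal width.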
@@ -654,7 +645,7 @@ def command_rename(args, config):
        if c not in ALLOWED_NAME_CHARS:
            print(
                color(
-                    AnsiFore.BOLD_RED,
+                    Fore.BOLD_RED,
                    f"'{c}' is an invalid character for names. Valid characters are: "
                    f"{ALLOWED_NAME_CHARS} (lowercase, no spaces)",
                )
@@ -667,9 +658,7 @@ def command_rename(args, config):
    yaml = yaml_util.load_yaml(CORE.config_path)
    if CONF_ESPHOME not in yaml or CONF_NAME not in yaml[CONF_ESPHOME]:
        print(
-            color(
-                AnsiFore.BOLD_RED, "Complex YAML files cannot be automatically renamed."
-            )
+            color(Fore.BOLD_RED, "Complex YAML files cannot be automatically renamed.")
        )
        return 1
    old_name = yaml[CONF_ESPHOME][CONF_NAME]
@@ -692,7 +681,7 @@ def command_rename(args, config):
        )
        > 1
    ):
-        print(color(AnsiFore.BOLD_RED, "Too many matches in YAML to safely rename"))
+        print(color(Fore.BOLD_RED, "Too many matches in YAML to safely rename"))
        return 1

    new_raw = re.sub(
@@ -704,7 +693,7 @@ def command_rename(args, config):

    new_path = os.path.join(CORE.config_dir, args.name + ".yaml")
    print(
-        f"Updating {color(AnsiFore.CYAN, CORE.config_path)} to {color(AnsiFore.CYAN, new_path)}"
+        f"Updating {color(Fore.CYAN, CORE.config_path)} to {color(Fore.CYAN, new_path)}"
    )
    print()

@@ -713,7 +702,7 @@ def command_rename(args, config):

    rc = run_external_process("esphome", "config", new_path)
    if rc != 0:
-        print(color(AnsiFore.BOLD_RED, "Rename failed. Reverting changes."))
+        print(color(Fore.BOLD_RED, "Rename failed. Reverting changes."))
        os.remove(new_path)
        return 1

@@ -739,7 +728,7 @@ def command_rename(args, config):
    if CORE.config_path != new_path:
        os.remove(CORE.config_path)

-    print(color(AnsiFore.BOLD_GREEN, "SUCCESS"))
+    print(color(Fore.BOLD_GREEN, "SUCCESS"))
    print()
    return 0

@@ -1,102 +0,0 @@
-import os
-
-from esphome.const import __version__
-from esphome.core import CORE
-from esphome.helpers import mkdir_p, read_file, write_file_if_changed
-from esphome.writer import find_begin_end, update_storage_json
-
-INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
-INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"
-
-INI_BASE_FORMAT = (
-    """; Auto generated code by esphome
-
-[common]
-lib_deps =
-build_flags =
-upload_flags =
-
-""",
-    """
-
-""",
-)
-
-
-def format_ini(data: dict[str, str | list[str]]) -> str:
-    content = ""
-    for key, value in sorted(data.items()):
-        if isinstance(value, list):
-            content += f"{key} =\n"
-            for x in value:
-                content += f" {x}\n"
-        else:
-            content += f"{key} = {value}\n"
-    return content
-
-
-def get_ini_content():
-    CORE.add_platformio_option(
-        "lib_deps",
-        [x.as_lib_dep for x in CORE.platformio_libraries.values()]
-        + ["${common.lib_deps}"],
-    )
-    # Sort to avoid changing build flags order
-    CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))
-
-    # Sort to avoid changing build unflags order
-    CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))
-
-    # Add extra script for C++ flags
-    CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])
-
-    content = "[platformio]\n"
-    content += f"description = ESPHome {__version__}\n"
-
-    content += f"[env:{CORE.name}]\n"
-    content += format_ini(CORE.platformio_options)
-
-    return content
-
-
-def write_ini(content):
-    update_storage_json()
-    path = CORE.relative_build_path("platformio.ini")
-
-    if os.path.isfile(path):
-        text = read_file(path)
-        content_format = find_begin_end(
-            text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
-        )
-    else:
-        content_format = INI_BASE_FORMAT
-    full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
-    full_file += INI_AUTO_GENERATE_END + content_format[1]
-    write_file_if_changed(path, full_file)
-
-
-def write_project():
-    mkdir_p(CORE.build_path)
-
-    content = get_ini_content()
-    write_ini(content)
-
-    # Write extra script for C++ specific flags
-    write_cxx_flags_script()
-
-
-CXX_FLAGS_FILE_NAME = "cxx_flags.py"
-CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
-Import("env")
-
-# Add C++ specific flags
-"""
-
-
-def write_cxx_flags_script() -> None:
-    path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
-    contents = CXX_FLAGS_FILE_CONTENTS
-    if not CORE.is_host:
-        contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
-        contents += "\n"
-    write_file_if_changed(path, contents)
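The `write_ini` function above regenerates only the span between the `AUTO GENERATED CODE BEGIN`/`END` markers, preserving whatever the user wrote around them. The mechanism can be sketched standalone; the `find_begin_end` below is a simplified stand-in for the helper imported from `esphome.writer`, not its actual implementation:

```python
BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
END = "; =========== AUTO GENERATED CODE END ============"


def find_begin_end(text: str, begin: str, end: str) -> tuple[str, str]:
    # Split the file into (everything before the begin marker,
    # everything after the end marker); the generated span is dropped.
    head, rest = text.split(begin, 1)
    _, tail = rest.split(end, 1)
    return head, tail


def regenerate(text: str, content: str) -> str:
    # Re-assemble: user head + markers wrapping fresh content + user tail.
    head, tail = find_begin_end(text, BEGIN, END)
    return f"{head}{BEGIN}\n{content}{END}{tail}"


old = f"; user notes\n{BEGIN}\nold body\n{END}\n; more notes\n"
new = regenerate(old, "new body\n")
assert "old body" not in new and "new body" in new
```

This is why hand-written additions to `platformio.ini` outside the markers survive every rebuild, while the marked section is always overwritten.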
@@ -22,7 +22,6 @@ from esphome.cpp_generator import (  # noqa: F401
    TemplateArguments,
    add,
    add_build_flag,
-    add_build_unflag,
    add_define,
    add_global,
    add_library,
@@ -35,7 +34,6 @@ from esphome.cpp_generator import (  # noqa: F401
    process_lambda,
    progmem_array,
    safe_exp,
-    set_cpp_standard,
    statement,
    static_const_array,
    templatable,
@@ -7,6 +7,7 @@ namespace a4988 {
static const char *const TAG = "a4988.stepper";

void A4988::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up A4988...");
  if (this->sleep_pin_ != nullptr) {
    this->sleep_pin_->setup();
    this->sleep_pin_->digital_write(false);
@@ -7,6 +7,8 @@ namespace absolute_humidity {
static const char *const TAG = "absolute_humidity.sensor";

void AbsoluteHumidityComponent::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up absolute humidity '%s'...", this->get_name().c_str());
+
  ESP_LOGD(TAG, "  Added callback for temperature '%s'", this->temperature_sensor_->get_name().c_str());
  this->temperature_sensor_->add_on_state_callback([this](float state) { this->temperature_callback_(state); });
  if (this->temperature_sensor_->has_state()) {
@@ -38,11 +40,9 @@ void AbsoluteHumidityComponent::dump_config() {
      break;
  }

-  ESP_LOGCONFIG(TAG,
-                "Sources\n"
-                "  Temperature: '%s'\n"
-                "  Relative Humidity: '%s'",
-                this->temperature_sensor_->get_name().c_str(), this->humidity_sensor_->get_name().c_str());
+  ESP_LOGCONFIG(TAG, "Sources");
+  ESP_LOGCONFIG(TAG, "  Temperature: '%s'", this->temperature_sensor_->get_name().c_str());
+  ESP_LOGCONFIG(TAG, "  Relative Humidity: '%s'", this->humidity_sensor_->get_name().c_str());
}

float AbsoluteHumidityComponent::get_setup_priority() const { return setup_priority::DATA; }
@@ -4,7 +4,6 @@
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include <cmath>
-#include <numbers>

#ifdef USE_ESP8266
#include <core_esp8266_waveform.h>
@@ -115,14 +114,13 @@ void IRAM_ATTR HOT AcDimmerDataStore::gpio_intr() {
    // fully off, disable output immediately
    this->gate_pin.digital_write(false);
  } else {
-    auto min_us = this->cycle_time_us * this->min_power / 1000;
    if (this->method == DIM_METHOD_TRAILING) {
      this->enable_time_us = 1;  // cannot be 0
-      // calculate time until disable in µs with integer arithmetic and take into account min_power
-      this->disable_time_us = std::max((uint32_t) 10, this->value * (this->cycle_time_us - min_us) / 65535 + min_us);
+      this->disable_time_us = std::max((uint32_t) 10, this->value * this->cycle_time_us / 65535);
    } else {
      // calculate time until enable in µs: (1.0-value)*cycle_time, but with integer arithmetic
      // also take into account min_power
+      auto min_us = this->cycle_time_us * this->min_power / 1000;
      this->enable_time_us = std::max((uint32_t) 1, ((65535 - this->value) * (this->cycle_time_us - min_us)) / 65535);

      if (this->method == DIM_METHOD_LEADING_PULSE) {
@@ -194,17 +192,18 @@ void AcDimmer::setup() {
  setTimer1Callback(&timer_interrupt);
#endif
#ifdef USE_ESP32
-  // timer frequency of 1mhz
-  dimmer_timer = timerBegin(1000000);
-  timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr);
+  // 80 Divider -> 1 count=1µs
+  dimmer_timer = timerBegin(0, 80, true);
+  timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr, true);
  // For ESP32, we can't use dynamic interval calculation because the timerX functions
  // are not callable from ISR (placed in flash storage).
  // Here we just use an interrupt firing every 50 µs.
-  timerAlarm(dimmer_timer, 50, true, 0);
+  timerAlarmWrite(dimmer_timer, 50, true);
+  timerAlarmEnable(dimmer_timer);
#endif
}
void AcDimmer::write_state(float state) {
-  state = std::acos(1 - (2 * state)) / std::numbers::pi;  // RMS power compensation
+  state = std::acos(1 - (2 * state)) / 3.14159;  // RMS power compensation
  auto new_value = static_cast<uint16_t>(roundf(state * 65535));
  if (new_value != 0 && this->store_.value == 0)
    this->store_.init_cycle = this->init_with_half_cycle_;
@@ -214,10 +213,8 @@ void AcDimmer::dump_config() {
  ESP_LOGCONFIG(TAG, "AcDimmer:");
  LOG_PIN("  Output Pin: ", this->gate_pin_);
  LOG_PIN("  Zero-Cross Pin: ", this->zero_cross_pin_);
-  ESP_LOGCONFIG(TAG,
-                "  Min Power: %.1f%%\n"
-                "  Init with half cycle: %s",
-                this->store_.min_power / 10.0f, YESNO(this->init_with_half_cycle_));
+  ESP_LOGCONFIG(TAG, "  Min Power: %.1f%%", this->store_.min_power / 10.0f);
+  ESP_LOGCONFIG(TAG, "  Init with half cycle: %s", YESNO(this->init_with_half_cycle_));
  if (method_ == DIM_METHOD_LEADING_PULSE) {
    ESP_LOGCONFIG(TAG, "  Method: leading pulse");
  } else if (method_ == DIM_METHOD_LEADING) {
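The firing-delay arithmetic in `gpio_intr()` is all integer math so it is safe in an ISR: `min_power` (in tenths of a percent) reserves part of the half-cycle, and the 16-bit `value` scales the remainder. The leading-edge branch can be re-done in Python to show how `min_power` shifts the usable dimming window; the numbers below are illustrative:

```python
def enable_delay_us(value: int, cycle_time_us: int, min_power: int) -> int:
    # value: dimmer level 0..65535; min_power: tenths of a percent (0..1000).
    # min_us is the slice of the half-cycle reserved by min_power.
    min_us = cycle_time_us * min_power // 1000
    # Delay before firing the triac: (1.0 - value) of the remaining window,
    # computed with integer arithmetic; never 0 so the timer can arm.
    return max(1, ((65535 - value) * (cycle_time_us - min_us)) // 65535)
```

For a 10 ms half-cycle (50 Hz mains) with `min_power` of 10%: full power fires after 1 µs (effectively immediately), while zero power waits 9000 µs, i.e. the whole window minus the reserved `min_us = 1000`.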
@ -5,21 +5,13 @@ from esphome.components.esp32.const import (
|
|||||||
VARIANT_ESP32,
|
VARIANT_ESP32,
|
||||||
VARIANT_ESP32C2,
|
VARIANT_ESP32C2,
|
||||||
VARIANT_ESP32C3,
|
VARIANT_ESP32C3,
|
||||||
VARIANT_ESP32C5,
|
|
||||||
VARIANT_ESP32C6,
|
VARIANT_ESP32C6,
|
||||||
VARIANT_ESP32H2,
|
VARIANT_ESP32H2,
|
||||||
VARIANT_ESP32S2,
|
VARIANT_ESP32S2,
|
||||||
VARIANT_ESP32S3,
|
VARIANT_ESP32S3,
|
||||||
)
|
)
|
||||||
from esphome.config_helpers import filter_source_files_from_platform
|
|
||||||
import esphome.config_validation as cv
|
import esphome.config_validation as cv
|
||||||
from esphome.const import (
|
from esphome.const import CONF_ANALOG, CONF_INPUT, CONF_NUMBER, PLATFORM_ESP8266
|
||||||
CONF_ANALOG,
|
|
||||||
CONF_INPUT,
|
|
||||||
CONF_NUMBER,
|
|
||||||
PLATFORM_ESP8266,
|
|
||||||
PlatformFramework,
|
|
||||||
)
|
|
||||||
from esphome.core import CORE
|
from esphome.core import CORE
|
||||||
|
|
||||||
CODEOWNERS = ["@esphome/core"]
|
CODEOWNERS = ["@esphome/core"]
|
||||||
@ -52,152 +44,122 @@ SAMPLING_MODES = {
|
|||||||
"max": sampling_mode.MAX,
|
"max": sampling_mode.MAX,
|
||||||
}
|
}
|
||||||
|
|
||||||
adc_unit_t = cg.global_ns.enum("adc_unit_t", is_class=True)
|
adc1_channel_t = cg.global_ns.enum("adc1_channel_t")
|
||||||
|
adc2_channel_t = cg.global_ns.enum("adc2_channel_t")
|
||||||
adc_channel_t = cg.global_ns.enum("adc_channel_t", is_class=True)
|
|
||||||
|
|
||||||
|
# From https://github.com/espressif/esp-idf/blob/master/components/driver/include/driver/adc_common.h
|
||||||
# pin to adc1 channel mapping
|
# pin to adc1 channel mapping
|
||||||
# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
|
|
||||||
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
|
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32: {
|
VARIANT_ESP32: {
|
||||||
36: adc_channel_t.ADC_CHANNEL_0,
|
36: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
37: adc_channel_t.ADC_CHANNEL_1,
|
37: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
38: adc_channel_t.ADC_CHANNEL_2,
|
38: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
39: adc_channel_t.ADC_CHANNEL_3,
|
39: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
32: adc_channel_t.ADC_CHANNEL_4,
|
32: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
33: adc_channel_t.ADC_CHANNEL_5,
|
33: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
34: adc_channel_t.ADC_CHANNEL_6,
|
34: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
35: adc_channel_t.ADC_CHANNEL_7,
|
35: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
},
|
},
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C2: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C3: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# ESP32-C5 ADC1 pin mapping - based on official ESP-IDF documentation
|
|
||||||
# https://docs.espressif.com/projects/esp-idf/en/latest/esp32c5/api-reference/peripherals/gpio.html
|
|
||||||
VARIANT_ESP32C5: {
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C6: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_5,
|
|
||||||
6: adc_channel_t.ADC_CHANNEL_6,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32H2: {
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32S2: {
|
VARIANT_ESP32S2: {
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
1: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
2: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
3: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
4: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
5: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
6: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
7: adc_channel_t.ADC_CHANNEL_6,
|
7: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
8: adc_channel_t.ADC_CHANNEL_7,
|
8: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
9: adc_channel_t.ADC_CHANNEL_8,
|
9: adc1_channel_t.ADC1_CHANNEL_8,
|
||||||
10: adc_channel_t.ADC_CHANNEL_9,
|
10: adc1_channel_t.ADC1_CHANNEL_9,
|
||||||
},
|
},
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32S3: {
|
VARIANT_ESP32S3: {
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
1: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
2: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
3: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
4: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
5: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
6: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
7: adc_channel_t.ADC_CHANNEL_6,
|
7: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
8: adc_channel_t.ADC_CHANNEL_7,
|
8: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
9: adc_channel_t.ADC_CHANNEL_8,
|
9: adc1_channel_t.ADC1_CHANNEL_8,
|
||||||
10: adc_channel_t.ADC_CHANNEL_9,
|
10: adc1_channel_t.ADC1_CHANNEL_9,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C3: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C2: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C6: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
5: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
|
6: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32H2: {
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
5: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
},
|
},
|
||||||
}
|
}
|
||||||
|
|
||||||
# pin to adc2 channel mapping
|
|
||||||
# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
|
|
||||||
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {
|
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
|
# TODO: add other variants
|
||||||
VARIANT_ESP32: {
|
VARIANT_ESP32: {
|
||||||
4: adc_channel_t.ADC_CHANNEL_0,
|
4: adc2_channel_t.ADC2_CHANNEL_0,
|
||||||
0: adc_channel_t.ADC_CHANNEL_1,
|
0: adc2_channel_t.ADC2_CHANNEL_1,
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
2: adc2_channel_t.ADC2_CHANNEL_2,
|
||||||
15: adc_channel_t.ADC_CHANNEL_3,
|
15: adc2_channel_t.ADC2_CHANNEL_3,
|
||||||
13: adc_channel_t.ADC_CHANNEL_4,
|
13: adc2_channel_t.ADC2_CHANNEL_4,
|
||||||
12: adc_channel_t.ADC_CHANNEL_5,
|
12: adc2_channel_t.ADC2_CHANNEL_5,
|
||||||
14: adc_channel_t.ADC_CHANNEL_6,
|
14: adc2_channel_t.ADC2_CHANNEL_6,
|
||||||
27: adc_channel_t.ADC_CHANNEL_7,
|
27: adc2_channel_t.ADC2_CHANNEL_7,
|
||||||
25: adc_channel_t.ADC_CHANNEL_8,
|
25: adc2_channel_t.ADC2_CHANNEL_8,
|
||||||
26: adc_channel_t.ADC_CHANNEL_9,
|
26: adc2_channel_t.ADC2_CHANNEL_9,
|
||||||
},
|
},
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
|
|
-    VARIANT_ESP32C2: {
-        5: adc_channel_t.ADC_CHANNEL_0,
-    },
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
-    VARIANT_ESP32C3: {
-        5: adc_channel_t.ADC_CHANNEL_0,
-    },
-    # ESP32-C5 has no ADC2 channels
-    VARIANT_ESP32C5: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
-    VARIANT_ESP32C6: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
-    VARIANT_ESP32H2: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
     VARIANT_ESP32S2: {
-        11: adc_channel_t.ADC_CHANNEL_0,
-        12: adc_channel_t.ADC_CHANNEL_1,
-        13: adc_channel_t.ADC_CHANNEL_2,
-        14: adc_channel_t.ADC_CHANNEL_3,
-        15: adc_channel_t.ADC_CHANNEL_4,
-        16: adc_channel_t.ADC_CHANNEL_5,
-        17: adc_channel_t.ADC_CHANNEL_6,
-        18: adc_channel_t.ADC_CHANNEL_7,
-        19: adc_channel_t.ADC_CHANNEL_8,
-        20: adc_channel_t.ADC_CHANNEL_9,
+        11: adc2_channel_t.ADC2_CHANNEL_0,
+        12: adc2_channel_t.ADC2_CHANNEL_1,
+        13: adc2_channel_t.ADC2_CHANNEL_2,
+        14: adc2_channel_t.ADC2_CHANNEL_3,
+        15: adc2_channel_t.ADC2_CHANNEL_4,
+        16: adc2_channel_t.ADC2_CHANNEL_5,
+        17: adc2_channel_t.ADC2_CHANNEL_6,
+        18: adc2_channel_t.ADC2_CHANNEL_7,
+        19: adc2_channel_t.ADC2_CHANNEL_8,
+        20: adc2_channel_t.ADC2_CHANNEL_9,
     },
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
     VARIANT_ESP32S3: {
-        11: adc_channel_t.ADC_CHANNEL_0,
-        12: adc_channel_t.ADC_CHANNEL_1,
-        13: adc_channel_t.ADC_CHANNEL_2,
-        14: adc_channel_t.ADC_CHANNEL_3,
-        15: adc_channel_t.ADC_CHANNEL_4,
-        16: adc_channel_t.ADC_CHANNEL_5,
-        17: adc_channel_t.ADC_CHANNEL_6,
-        18: adc_channel_t.ADC_CHANNEL_7,
-        19: adc_channel_t.ADC_CHANNEL_8,
-        20: adc_channel_t.ADC_CHANNEL_9,
+        11: adc2_channel_t.ADC2_CHANNEL_0,
+        12: adc2_channel_t.ADC2_CHANNEL_1,
+        13: adc2_channel_t.ADC2_CHANNEL_2,
+        14: adc2_channel_t.ADC2_CHANNEL_3,
+        15: adc2_channel_t.ADC2_CHANNEL_4,
+        16: adc2_channel_t.ADC2_CHANNEL_5,
+        17: adc2_channel_t.ADC2_CHANNEL_6,
+        18: adc2_channel_t.ADC2_CHANNEL_7,
+        19: adc2_channel_t.ADC2_CHANNEL_8,
+        20: adc2_channel_t.ADC2_CHANNEL_9,
     },
+    VARIANT_ESP32C3: {
+        5: adc2_channel_t.ADC2_CHANNEL_0,
+    },
+    VARIANT_ESP32C2: {},
+    VARIANT_ESP32C6: {},
+    VARIANT_ESP32H2: {},
 }
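The tables above map a GPIO number to an ADC2 channel constant per ESP32 variant. As a sketch only, the lookup such a variant-keyed dict supports can be rendered in plain Python; the `resolve_adc2_channel` helper and the string channel names here are illustrative, not ESPHome's actual validation code:

```python
# Hypothetical sketch of a variant-keyed GPIO -> ADC2 channel lookup.
# Only a few entries are reproduced; the real tables cover GPIO 11-20 on
# S2/S3 and use enum members rather than strings.
ADC2_PIN_TO_CHANNEL = {
    "ESP32S2": {11: "ADC_CHANNEL_0", 12: "ADC_CHANNEL_1"},
    "ESP32C3": {5: "ADC_CHANNEL_0"},
    "ESP32C6": {},  # no ADC2 on this variant
}


def resolve_adc2_channel(variant: str, gpio: int) -> str:
    """Return the ADC2 channel for a GPIO, or raise if the pin is not ADC2-capable."""
    channels = ADC2_PIN_TO_CHANNEL.get(variant, {})
    if gpio not in channels:
        raise ValueError(f"GPIO{gpio} is not an ADC2 pin on {variant}")
    return channels[gpio]
```

An empty dict for a variant (as for the C5/C6/H2 entries in the diff) makes every pin fail validation, which is how "no ADC2" is expressed.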
@@ -250,20 +212,3 @@ def validate_adc_pin(value):
     )(value)

     raise NotImplementedError
-
-
-FILTER_SOURCE_FILES = filter_source_files_from_platform(
-    {
-        "adc_sensor_esp32.cpp": {
-            PlatformFramework.ESP32_ARDUINO,
-            PlatformFramework.ESP32_IDF,
-        },
-        "adc_sensor_esp8266.cpp": {PlatformFramework.ESP8266_ARDUINO},
-        "adc_sensor_rp2040.cpp": {PlatformFramework.RP2040_ARDUINO},
-        "adc_sensor_libretiny.cpp": {
-            PlatformFramework.BK72XX_ARDUINO,
-            PlatformFramework.RTL87XX_ARDUINO,
-            PlatformFramework.LN882X_ARDUINO,
-        },
-    }
-)
@@ -3,22 +3,20 @@
 #include "esphome/components/sensor/sensor.h"
 #include "esphome/components/voltage_sampler/voltage_sampler.h"
 #include "esphome/core/component.h"
-#include "esphome/core/defines.h"
 #include "esphome/core/hal.h"

 #ifdef USE_ESP32
-#include "esp_adc/adc_cali.h"
-#include "esp_adc/adc_cali_scheme.h"
-#include "esp_adc/adc_oneshot.h"
-#include "hal/adc_types.h"  // This defines ADC_CHANNEL_MAX
+#include <esp_adc_cal.h>
+#include "driver/adc.h"
 #endif  // USE_ESP32

 namespace esphome {
 namespace adc {

 #ifdef USE_ESP32
 // clang-format off
-#if (ESP_IDF_VERSION_MAJOR == 5 && \
+#if (ESP_IDF_VERSION_MAJOR == 4 && ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(4, 4, 7)) || \
+    (ESP_IDF_VERSION_MAJOR == 5 && \
      ((ESP_IDF_VERSION_MINOR == 0 && ESP_IDF_VERSION_PATCH >= 5) || \
       (ESP_IDF_VERSION_MINOR == 1 && ESP_IDF_VERSION_PATCH >= 3) || \
       (ESP_IDF_VERSION_MINOR >= 2)) \
@@ -30,127 +28,79 @@ static const adc_atten_t ADC_ATTEN_DB_12_COMPAT = ADC_ATTEN_DB_11;
 #endif
 #endif  // USE_ESP32

-enum class SamplingMode : uint8_t {
-  AVG = 0,
-  MIN = 1,
-  MAX = 2,
-};
+enum class SamplingMode : uint8_t { AVG = 0, MIN = 1, MAX = 2 };

 const LogString *sampling_mode_to_str(SamplingMode mode);

 class Aggregator {
  public:
-  Aggregator(SamplingMode mode);
   void add_sample(uint32_t value);
   uint32_t aggregate();
+  Aggregator(SamplingMode mode);

  protected:
+  SamplingMode mode_{SamplingMode::AVG};
   uint32_t aggr_{0};
   uint32_t samples_{0};
-  SamplingMode mode_{SamplingMode::AVG};
 };

 class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage_sampler::VoltageSampler {
  public:
-  /// Update the sensor's state by reading the current ADC value.
-  /// This method is called periodically based on the update interval.
-  void update() override;
-
-  /// Set up the ADC sensor by initializing hardware and calibration parameters.
-  /// This method is called once during device initialization.
-  void setup() override;
-
-  /// Output the configuration details of the ADC sensor for debugging purposes.
-  /// This method is called during the ESPHome setup process to log the configuration.
-  void dump_config() override;
-
-  /// Return the setup priority for this component.
-  /// Components with higher priority are initialized earlier during setup.
-  /// @return A float representing the setup priority.
-  float get_setup_priority() const override;
-
-  /// Set the GPIO pin to be used by the ADC sensor.
-  /// @param pin Pointer to an InternalGPIOPin representing the ADC input pin.
-  void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
-
-  /// Enable or disable the output of raw ADC values (unprocessed data).
-  /// @param output_raw Boolean indicating whether to output raw ADC values (true) or processed values (false).
-  void set_output_raw(bool output_raw) { this->output_raw_ = output_raw; }
-
-  /// Set the number of samples to be taken for ADC readings to improve accuracy.
-  /// A higher sample count reduces noise but increases the reading time.
-  /// @param sample_count The number of samples (e.g., 1, 4, 8).
-  void set_sample_count(uint8_t sample_count);
-
-  /// Set the sampling mode for how multiple ADC samples are combined into a single measurement.
-  ///
-  /// When multiple samples are taken (controlled by set_sample_count), they can be combined
-  /// in one of three ways:
-  /// - SamplingMode::AVG: Compute the average (default)
-  /// - SamplingMode::MIN: Use the lowest sample value
-  /// - SamplingMode::MAX: Use the highest sample value
-  /// @param sampling_mode The desired sampling mode to use for aggregating ADC samples.
-  void set_sampling_mode(SamplingMode sampling_mode);
-
-  /// Perform a single ADC sampling operation and return the measured value.
-  /// This function handles raw readings, calibration, and averaging as needed.
-  /// @return The sampled value as a float.
-  float sample() override;
-
 #ifdef USE_ESP32
-  /// Set the ADC attenuation level to adjust the input voltage range.
-  /// This determines how the ADC interprets input voltages, allowing for greater precision
-  /// or the ability to measure higher voltages depending on the chosen attenuation level.
-  /// @param attenuation The desired ADC attenuation level (e.g., ADC_ATTEN_DB_0, ADC_ATTEN_DB_11).
+  /// Set the attenuation for this pin. Only available on the ESP32.
   void set_attenuation(adc_atten_t attenuation) { this->attenuation_ = attenuation; }
-
-  /// Configure the ADC to use a specific channel on a specific ADC unit.
-  /// This sets the channel for single-shot or continuous ADC measurements.
-  /// @param unit The ADC unit to use (ADC_UNIT_1 or ADC_UNIT_2).
-  /// @param channel The ADC channel to configure, such as ADC_CHANNEL_0, ADC_CHANNEL_3, etc.
-  void set_channel(adc_unit_t unit, adc_channel_t channel) {
-    this->adc_unit_ = unit;
-    this->channel_ = channel;
+  void set_channel1(adc1_channel_t channel) {
+    this->channel1_ = channel;
+    this->channel2_ = ADC2_CHANNEL_MAX;
+  }
+  void set_channel2(adc2_channel_t channel) {
+    this->channel2_ = channel;
+    this->channel1_ = ADC1_CHANNEL_MAX;
   }
-
-  /// Set whether autoranging should be enabled for the ADC.
-  /// Autoranging automatically adjusts the attenuation level to handle a wide range of input voltages.
-  /// @param autorange Boolean indicating whether to enable autoranging.
   void set_autorange(bool autorange) { this->autorange_ = autorange; }
 #endif  // USE_ESP32

+  /// Update ADC values
+  void update() override;
+  /// Setup ADC
+  void setup() override;
+  void dump_config() override;
+  /// `HARDWARE_LATE` setup priority
+  float get_setup_priority() const override;
+  void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
+  void set_output_raw(bool output_raw) { this->output_raw_ = output_raw; }
+  void set_sample_count(uint8_t sample_count);
+  void set_sampling_mode(SamplingMode sampling_mode);
+  float sample() override;
+
+#ifdef USE_ESP8266
+  std::string unique_id() override;
+#endif  // USE_ESP8266
+
 #ifdef USE_RP2040
   void set_is_temperature() { this->is_temperature_ = true; }
 #endif  // USE_RP2040

  protected:
-  uint8_t sample_count_{1};
-  bool output_raw_{false};
   InternalGPIOPin *pin_;
+  bool output_raw_{false};
+  uint8_t sample_count_{1};
   SamplingMode sampling_mode_{SamplingMode::AVG};

-#ifdef USE_ESP32
-  float sample_autorange_();
-  float sample_fixed_attenuation_();
-  bool autorange_{false};
-  adc_oneshot_unit_handle_t adc_handle_{nullptr};
-  adc_cali_handle_t calibration_handle_{nullptr};
-  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
-  adc_channel_t channel_;
-  adc_unit_t adc_unit_;
-  struct SetupFlags {
-    uint8_t init_complete : 1;
-    uint8_t config_complete : 1;
-    uint8_t handle_init_complete : 1;
-    uint8_t calibration_complete : 1;
-    uint8_t reserved : 4;
-  } setup_flags_{};
-  static adc_oneshot_unit_handle_t shared_adc_handles[2];
-#endif  // USE_ESP32
-
 #ifdef USE_RP2040
   bool is_temperature_{false};
 #endif  // USE_RP2040

+#ifdef USE_ESP32
+  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
+  adc1_channel_t channel1_{ADC1_CHANNEL_MAX};
+  adc2_channel_t channel2_{ADC2_CHANNEL_MAX};
+  bool autorange_{false};
+#if ESP_IDF_VERSION_MAJOR >= 5
+  esp_adc_cal_characteristics_t cal_characteristics_[SOC_ADC_ATTEN_NUM] = {};
+#else
+  esp_adc_cal_characteristics_t cal_characteristics_[ADC_ATTEN_MAX] = {};
+#endif  // ESP_IDF_VERSION_MAJOR
+#endif  // USE_ESP32
 };

 }  // namespace adc
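The header diff only declares the `Aggregator` interface (`add_sample`, `aggregate`, `SamplingMode` AVG/MIN/MAX). As a sketch of the semantics those names imply, here is an illustrative Python rendering; the update rules below are inferred from the mode names and the doc comments, not copied from the C++ implementation:

```python
from enum import Enum


class SamplingMode(Enum):
    AVG = 0
    MIN = 1
    MAX = 2


class Aggregator:
    """Combine several raw ADC samples into one value, per the chosen mode."""

    def __init__(self, mode: SamplingMode):
        self.mode = mode
        self.aggr = 0     # running sum (AVG) or running extreme (MIN/MAX)
        self.samples = 0  # number of samples seen

    def add_sample(self, value: int) -> None:
        if self.mode is SamplingMode.AVG:
            self.aggr += value
        elif self.mode is SamplingMode.MIN:
            self.aggr = value if self.samples == 0 else min(self.aggr, value)
        else:  # SamplingMode.MAX
            self.aggr = value if self.samples == 0 else max(self.aggr, value)
        self.samples += 1

    def aggregate(self) -> int:
        if self.mode is SamplingMode.AVG and self.samples:
            return self.aggr // self.samples
        return self.aggr
```

Keeping the mode fixed per `Aggregator` instance mirrors how the C++ class is constructed once per `sample()` call with the sensor's configured mode.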
@@ -61,7 +61,7 @@ uint32_t Aggregator::aggregate() {

 void ADCSensor::update() {
   float value_v = this->sample();
-  ESP_LOGV(TAG, "'%s': Voltage=%.4fV", this->get_name().c_str(), value_v);
+  ESP_LOGV(TAG, "'%s': Got voltage=%.4fV", this->get_name().c_str(), value_v);
   this->publish_state(value_v);
 }

@@ -8,314 +8,135 @@ namespace adc {

 static const char *const TAG = "adc.esp32";

-adc_oneshot_unit_handle_t ADCSensor::shared_adc_handles[2] = {nullptr, nullptr};
-
-const LogString *attenuation_to_str(adc_atten_t attenuation) {
-  switch (attenuation) {
-    case ADC_ATTEN_DB_0:
-      return LOG_STR("0 dB");
-    case ADC_ATTEN_DB_2_5:
-      return LOG_STR("2.5 dB");
-    case ADC_ATTEN_DB_6:
-      return LOG_STR("6 dB");
-    case ADC_ATTEN_DB_12_COMPAT:
-      return LOG_STR("12 dB");
-    default:
-      return LOG_STR("Unknown Attenuation");
-  }
-}
-
-const LogString *adc_unit_to_str(adc_unit_t unit) {
-  switch (unit) {
-    case ADC_UNIT_1:
-      return LOG_STR("ADC1");
-    case ADC_UNIT_2:
-      return LOG_STR("ADC2");
-    default:
-      return LOG_STR("Unknown ADC Unit");
-  }
-}
+static const adc_bits_width_t ADC_WIDTH_MAX_SOC_BITS = static_cast<adc_bits_width_t>(ADC_WIDTH_MAX - 1);
+
+#ifndef SOC_ADC_RTC_MAX_BITWIDTH
+#if USE_ESP32_VARIANT_ESP32S2
+static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 13;
+#else
+static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 12;
+#endif  // USE_ESP32_VARIANT_ESP32S2
+#endif  // SOC_ADC_RTC_MAX_BITWIDTH
+
+static const int ADC_MAX = (1 << SOC_ADC_RTC_MAX_BITWIDTH) - 1;
+static const int ADC_HALF = (1 << SOC_ADC_RTC_MAX_BITWIDTH) >> 1;

 void ADCSensor::setup() {
-  // Check if another sensor already initialized this ADC unit
-  if (ADCSensor::shared_adc_handles[this->adc_unit_] == nullptr) {
-    adc_oneshot_unit_init_cfg_t init_config = {};  // Zero initialize
-    init_config.unit_id = this->adc_unit_;
-    init_config.ulp_mode = ADC_ULP_MODE_DISABLE;
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || USE_ESP32_VARIANT_ESP32H2
-    init_config.clk_src = ADC_DIGI_CLK_SRC_DEFAULT;
-#endif  // USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 ||
-        // USE_ESP32_VARIANT_ESP32H2
-    esp_err_t err = adc_oneshot_new_unit(&init_config, &ADCSensor::shared_adc_handles[this->adc_unit_]);
-    if (err != ESP_OK) {
-      ESP_LOGE(TAG, "Error initializing %s: %d", LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)), err);
-      this->mark_failed();
-      return;
+  ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
+
+  if (this->channel1_ != ADC1_CHANNEL_MAX) {
+    adc1_config_width(ADC_WIDTH_MAX_SOC_BITS);
+    if (!this->autorange_) {
+      adc1_config_channel_atten(this->channel1_, this->attenuation_);
+    }
+  } else if (this->channel2_ != ADC2_CHANNEL_MAX) {
+    if (!this->autorange_) {
+      adc2_config_channel_atten(this->channel2_, this->attenuation_);
     }
   }
-  this->adc_handle_ = ADCSensor::shared_adc_handles[this->adc_unit_];
-
-  this->setup_flags_.handle_init_complete = true;
-
-  adc_oneshot_chan_cfg_t config = {
-      .atten = this->attenuation_,
-      .bitwidth = ADC_BITWIDTH_DEFAULT,
-  };
-  esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
-  if (err != ESP_OK) {
-    ESP_LOGE(TAG, "Error configuring channel: %d", err);
-    this->mark_failed();
-    return;
-  }
-  this->setup_flags_.config_complete = true;
-
-  // Initialize ADC calibration
-  if (this->calibration_handle_ == nullptr) {
-    adc_cali_handle_t handle = nullptr;
-    esp_err_t err;
-
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-    // RISC-V variants and S3 use curve fitting calibration
-    adc_cali_curve_fitting_config_t cali_config = {};  // Zero initialize first
-#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
-    cali_config.chan = this->channel_;
-#endif  // ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
-    cali_config.unit_id = this->adc_unit_;
-    cali_config.atten = this->attenuation_;
-    cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
-
-    err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
-    if (err == ESP_OK) {
-      this->calibration_handle_ = handle;
-      this->setup_flags_.calibration_complete = true;
-      ESP_LOGV(TAG, "Using curve fitting calibration");
-    } else {
-      ESP_LOGW(TAG, "Curve fitting calibration failed with error %d, will use uncalibrated readings", err);
-      this->setup_flags_.calibration_complete = false;
-    }
-#else  // Other ESP32 variants use line fitting calibration
-    adc_cali_line_fitting_config_t cali_config = {
-        .unit_id = this->adc_unit_,
-        .atten = this->attenuation_,
-        .bitwidth = ADC_BITWIDTH_DEFAULT,
-#if !defined(USE_ESP32_VARIANT_ESP32S2)
-        .default_vref = 1100,  // Default reference voltage in mV
-#endif  // !defined(USE_ESP32_VARIANT_ESP32S2)
-    };
-    err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
-    if (err == ESP_OK) {
-      this->calibration_handle_ = handle;
-      this->setup_flags_.calibration_complete = true;
-      ESP_LOGV(TAG, "Using line fitting calibration");
-    } else {
-      ESP_LOGW(TAG, "Line fitting calibration failed with error %d, will use uncalibrated readings", err);
-      this->setup_flags_.calibration_complete = false;
-    }
-#endif  // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
+
+  for (int32_t i = 0; i <= ADC_ATTEN_DB_12_COMPAT; i++) {
+    auto adc_unit = this->channel1_ != ADC1_CHANNEL_MAX ? ADC_UNIT_1 : ADC_UNIT_2;
+    auto cal_value = esp_adc_cal_characterize(adc_unit, (adc_atten_t) i, ADC_WIDTH_MAX_SOC_BITS,
+                                              1100,  // default vref
+                                              &this->cal_characteristics_[i]);
+    switch (cal_value) {
+      case ESP_ADC_CAL_VAL_EFUSE_VREF:
+        ESP_LOGV(TAG, "Using eFuse Vref for calibration");
+        break;
+      case ESP_ADC_CAL_VAL_EFUSE_TP:
+        ESP_LOGV(TAG, "Using two-point eFuse Vref for calibration");
+        break;
+      case ESP_ADC_CAL_VAL_DEFAULT_VREF:
+      default:
+        break;
+    }
   }
-
-  this->setup_flags_.init_complete = true;
 }

 void ADCSensor::dump_config() {
   LOG_SENSOR("", "ADC Sensor", this);
   LOG_PIN("  Pin: ", this->pin_);
-  ESP_LOGCONFIG(TAG,
-                "  Channel: %d\n"
-                "  Unit: %s\n"
-                "  Attenuation: %s\n"
-                "  Samples: %i\n"
-                "  Sampling mode: %s",
-                this->channel_, LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)),
-                this->autorange_ ? "Auto" : LOG_STR_ARG(attenuation_to_str(this->attenuation_)), this->sample_count_,
-                LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
-
-  ESP_LOGCONFIG(
-      TAG,
-      "  Setup Status:\n"
-      "    Handle Init: %s\n"
-      "    Config: %s\n"
-      "    Calibration: %s\n"
-      "    Overall Init: %s",
-      this->setup_flags_.handle_init_complete ? "OK" : "FAILED", this->setup_flags_.config_complete ? "OK" : "FAILED",
-      this->setup_flags_.calibration_complete ? "OK" : "FAILED", this->setup_flags_.init_complete ? "OK" : "FAILED");
+  if (this->autorange_) {
+    ESP_LOGCONFIG(TAG, "  Attenuation: auto");
+  } else {
+    switch (this->attenuation_) {
+      case ADC_ATTEN_DB_0:
+        ESP_LOGCONFIG(TAG, "  Attenuation: 0db");
+        break;
+      case ADC_ATTEN_DB_2_5:
+        ESP_LOGCONFIG(TAG, "  Attenuation: 2.5db");
+        break;
+      case ADC_ATTEN_DB_6:
+        ESP_LOGCONFIG(TAG, "  Attenuation: 6db");
+        break;
+      case ADC_ATTEN_DB_12_COMPAT:
+        ESP_LOGCONFIG(TAG, "  Attenuation: 12db");
+        break;
+      default:  // This is to satisfy the unused ADC_ATTEN_MAX
+        break;
+    }
+  }
+  ESP_LOGCONFIG(TAG, "  Samples: %i", this->sample_count_);
+  ESP_LOGCONFIG(TAG, "  Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
   LOG_UPDATE_INTERVAL(this);
 }

 float ADCSensor::sample() {
-  if (this->autorange_) {
-    return this->sample_autorange_();
-  } else {
-    return this->sample_fixed_attenuation_();
-  }
-}
-
-float ADCSensor::sample_fixed_attenuation_() {
-  auto aggr = Aggregator(this->sampling_mode_);
-
-  for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
-    int raw;
-    esp_err_t err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
-
-    if (err != ESP_OK) {
-      ESP_LOGW(TAG, "ADC read failed with error %d", err);
-      continue;
-    }
-
-    if (raw == -1) {
-      ESP_LOGW(TAG, "Invalid ADC reading");
-      continue;
-    }
-
-    aggr.add_sample(raw);
-  }
-
-  uint32_t final_value = aggr.aggregate();
-
-  if (this->output_raw_) {
-    return final_value;
-  }
-
-  if (this->calibration_handle_ != nullptr) {
-    int voltage_mv;
-    esp_err_t err = adc_cali_raw_to_voltage(this->calibration_handle_, final_value, &voltage_mv);
-    if (err == ESP_OK) {
-      return voltage_mv / 1000.0f;
-    } else {
-      ESP_LOGW(TAG, "ADC calibration conversion failed with error %d, disabling calibration", err);
-      if (this->calibration_handle_ != nullptr) {
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-        adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
-#else  // Other ESP32 variants use line fitting calibration
-        adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
-#endif  // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
-        this->calibration_handle_ = nullptr;
-      }
-    }
-  }
-
-  return final_value * 3.3f / 4095.0f;
-}
-
-float ADCSensor::sample_autorange_() {
-  // Auto-range mode
-  auto read_atten = [this](adc_atten_t atten) -> std::pair<int, float> {
-    // First reconfigure the attenuation for this reading
-    adc_oneshot_chan_cfg_t config = {
-        .atten = atten,
-        .bitwidth = ADC_BITWIDTH_DEFAULT,
-    };
-    esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
-    if (err != ESP_OK) {
-      ESP_LOGW(TAG, "Error configuring ADC channel for autorange: %d", err);
-      return {-1, 0.0f};
-    }
-
-    // Need to recalibrate for the new attenuation
-    if (this->calibration_handle_ != nullptr) {
-      // Delete old calibration handle
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-      adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
-#else
-      adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
-#endif
-      this->calibration_handle_ = nullptr;
-    }
-
-    // Create new calibration handle for this attenuation
-    adc_cali_handle_t handle = nullptr;
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-    adc_cali_curve_fitting_config_t cali_config = {};
-#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
-    cali_config.chan = this->channel_;
-#endif
-    cali_config.unit_id = this->adc_unit_;
-    cali_config.atten = atten;
-    cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
-
-    err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
-#else
-    adc_cali_line_fitting_config_t cali_config = {
-        .unit_id = this->adc_unit_,
-        .atten = atten,
-        .bitwidth = ADC_BITWIDTH_DEFAULT,
-#if !defined(USE_ESP32_VARIANT_ESP32S2)
-        .default_vref = 1100,
-#endif
-    };
-    err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
-#endif
-
-    int raw;
-    err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
-
-    if (err != ESP_OK) {
-      ESP_LOGW(TAG, "ADC read failed in autorange with error %d", err);
-      if (handle != nullptr) {
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-        adc_cali_delete_scheme_curve_fitting(handle);
-#else
-        adc_cali_delete_scheme_line_fitting(handle);
-#endif
-      }
-      return {-1, 0.0f};
-    }
-
-    float voltage = 0.0f;
-    if (handle != nullptr) {
-      int voltage_mv;
-      err = adc_cali_raw_to_voltage(handle, raw, &voltage_mv);
-      if (err == ESP_OK) {
-        voltage = voltage_mv / 1000.0f;
-      } else {
-        voltage = raw * 3.3f / 4095.0f;
-      }
-      // Clean up calibration handle
-#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
-    USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
-      adc_cali_delete_scheme_curve_fitting(handle);
-#else
-      adc_cali_delete_scheme_line_fitting(handle);
-#endif
-    } else {
-      voltage = raw * 3.3f / 4095.0f;
-    }
-
-    return {raw, voltage};
-  };
-
-  auto [raw12, mv12] = read_atten(ADC_ATTEN_DB_12);
-  if (raw12 == -1) {
-    ESP_LOGE(TAG, "Failed to read ADC in autorange mode");
-    return NAN;
-  }
-
-  int raw6 = 4095, raw2 = 4095, raw0 = 4095;
-  float mv6 = 0, mv2 = 0, mv0 = 0;
-
-  if (raw12 < 4095) {
-    auto [raw6_val, mv6_val] = read_atten(ADC_ATTEN_DB_6);
-    raw6 = raw6_val;
-    mv6 = mv6_val;
-
-    if (raw6 < 4095 && raw6 != -1) {
-      auto [raw2_val, mv2_val] = read_atten(ADC_ATTEN_DB_2_5);
-      raw2 = raw2_val;
-      mv2 = mv2_val;
-
-      if (raw2 < 4095 && raw2 != -1) {
-        auto [raw0_val, mv0_val] = read_atten(ADC_ATTEN_DB_0);
-        raw0 = raw0_val;
-        mv0 = mv0_val;
-      }
-    }
-  }
+  if (!this->autorange_) {
+    auto aggr = Aggregator(this->sampling_mode_);
+
+    for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
+      int raw = -1;
+      if (this->channel1_ != ADC1_CHANNEL_MAX) {
+        raw = adc1_get_raw(this->channel1_);
+      } else if (this->channel2_ != ADC2_CHANNEL_MAX) {
+        adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw);
+      }
+      if (raw == -1) {
+        return NAN;
+      }
+
+      aggr.add_sample(raw);
+    }
+    if (this->output_raw_) {
+      return aggr.aggregate();
+    }
+    uint32_t mv =
+        esp_adc_cal_raw_to_voltage(aggr.aggregate(), &this->cal_characteristics_[(int32_t) this->attenuation_]);
+    return mv / 1000.0f;
+  }
+
+  int raw12 = ADC_MAX, raw6 = ADC_MAX, raw2 = ADC_MAX, raw0 = ADC_MAX;
+
+  if (this->channel1_ != ADC1_CHANNEL_MAX) {
+    adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_12_COMPAT);
+    raw12 = adc1_get_raw(this->channel1_);
+    if (raw12 < ADC_MAX) {
+      adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_6);
+      raw6 = adc1_get_raw(this->channel1_);
+      if (raw6 < ADC_MAX) {
+        adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_2_5);
+        raw2 = adc1_get_raw(this->channel1_);
+        if (raw2 < ADC_MAX) {
+          adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_0);
+          raw0 = adc1_get_raw(this->channel1_);
+        }
+      }
+    }
+  } else if (this->channel2_ != ADC2_CHANNEL_MAX) {
+    adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_12_COMPAT);
+    adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw12);
+    if (raw12 < ADC_MAX) {
+      adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_6);
+      adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw6);
+      if (raw6 < ADC_MAX) {
+        adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_2_5);
+        adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw2);
+        if (raw2 < ADC_MAX) {
+          adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_0);
+          adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw0);
+        }
+      }
+    }
+  }
@@ -324,19 +145,19 @@ float ADCSensor::sample_autorange_() {
     return NAN;
   }

-  const int adc_half = 2048;
-  uint32_t c12 = std::min(raw12, adc_half);
-  uint32_t c6 = adc_half - std::abs(raw6 - adc_half);
-  uint32_t c2 = adc_half - std::abs(raw2 - adc_half);
-  uint32_t c0 = std::min(4095 - raw0, adc_half);
+  uint32_t mv12 = esp_adc_cal_raw_to_voltage(raw12, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_12_COMPAT]);
+  uint32_t mv6 = esp_adc_cal_raw_to_voltage(raw6, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_6]);
+  uint32_t mv2 = esp_adc_cal_raw_to_voltage(raw2, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_2_5]);
+  uint32_t mv0 = esp_adc_cal_raw_to_voltage(raw0, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_0]);
+
+  uint32_t c12 = std::min(raw12, ADC_HALF);
+  uint32_t c6 = ADC_HALF - std::abs(raw6 - ADC_HALF);
+  uint32_t c2 = ADC_HALF - std::abs(raw2 - ADC_HALF);
+  uint32_t c0 = std::min(ADC_MAX - raw0, ADC_HALF);
   uint32_t csum = c12 + c6 + c2 + c0;

-  if (csum == 0) {
-    ESP_LOGE(TAG, "Invalid weight sum in autorange calculation");
-    return NAN;
-  }
-
-  return (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum;
+  uint32_t mv_scaled = (mv12 * c12) + (mv6 * c6) + (mv2 * c2) + (mv0 * c0);
+  return mv_scaled / (float) (csum * 1000U);
 }

 }  // namespace adc
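Both sides of the autorange code blend one reading per attenuation level with triangular confidence weights: mid-scale readings count most, saturated or near-zero readings count least. That arithmetic can be sketched in Python for illustration; `autorange_blend` and its `readings` dict are hypothetical names, and the constants mirror the 12-bit case (`adc_max` 4095, `adc_half` 2048):

```python
def autorange_blend(readings, adc_max=4095, adc_half=2048):
    """Blend per-attenuation voltages with triangular confidence weights.

    readings maps an attenuation label to (raw_count, volts), as produced by
    one read at each of the four attenuation levels.
    """
    raw12, v12 = readings["12dB"]
    raw6, v6 = readings["6dB"]
    raw2, v2 = readings["2.5dB"]
    raw0, v0 = readings["0dB"]
    # Highest attenuation is trusted at the low end, lowest at the high end,
    # and the middle two peak at mid-scale.
    c12 = min(raw12, adc_half)
    c6 = adc_half - abs(raw6 - adc_half)
    c2 = adc_half - abs(raw2 - adc_half)
    c0 = min(adc_max - raw0, adc_half)
    csum = c12 + c6 + c2 + c0
    if csum == 0:
        return float("nan")
    return (v12 * c12 + v6 * c6 + v2 * c2 + v0 * c0) / csum
```

A saturated 0 dB reading (raw at full scale) gets weight zero, so it drops out of the blend entirely, which is why the nested `if (rawN < ADC_MAX)` reads in the diff can skip lower attenuations once one saturates.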
@@ -17,6 +17,7 @@ namespace adc {
 static const char *const TAG = "adc.esp8266";
 
 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
 #ifndef USE_ADC_SENSOR_VCC
   this->pin_->setup();
 #endif
@@ -29,10 +30,8 @@ void ADCSensor::dump_config() {
 #else
   LOG_PIN("  Pin: ", this->pin_);
 #endif  // USE_ADC_SENSOR_VCC
-  ESP_LOGCONFIG(TAG,
-                "  Samples: %i\n"
-                "  Sampling mode: %s",
-                this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
+  ESP_LOGCONFIG(TAG, "  Samples: %i", this->sample_count_);
+  ESP_LOGCONFIG(TAG, "  Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
   LOG_UPDATE_INTERVAL(this);
 }
 
@@ -55,6 +54,8 @@ float ADCSensor::sample() {
   return aggr.aggregate() / 1024.0f;
 }
 
+std::string ADCSensor::unique_id() { return get_mac_address() + "-adc"; }
+
 }  // namespace adc
 }  // namespace esphome
 
@@ -9,6 +9,7 @@ namespace adc {
 static const char *const TAG = "adc.libretiny";
 
 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
 #ifndef USE_ADC_SENSOR_VCC
   this->pin_->setup();
 #endif  // !USE_ADC_SENSOR_VCC
@@ -21,10 +22,8 @@ void ADCSensor::dump_config() {
 #else  // USE_ADC_SENSOR_VCC
   LOG_PIN("  Pin: ", this->pin_);
 #endif  // USE_ADC_SENSOR_VCC
-  ESP_LOGCONFIG(TAG,
-                "  Samples: %i\n"
-                "  Sampling mode: %s",
-                this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
+  ESP_LOGCONFIG(TAG, "  Samples: %i", this->sample_count_);
+  ESP_LOGCONFIG(TAG, "  Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
   LOG_UPDATE_INTERVAL(this);
 }
 
@@ -14,6 +14,7 @@ namespace adc {
 static const char *const TAG = "adc.rp2040";
 
 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
   static bool initialized = false;
   if (!initialized) {
     adc_init();
@@ -32,10 +33,8 @@ void ADCSensor::dump_config() {
   LOG_PIN("  Pin: ", this->pin_);
 #endif  // USE_ADC_SENSOR_VCC
   }
-  ESP_LOGCONFIG(TAG,
-                "  Samples: %i\n"
-                "  Sampling mode: %s",
-                this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
+  ESP_LOGCONFIG(TAG, "  Samples: %i", this->sample_count_);
+  ESP_LOGCONFIG(TAG, "  Sampling mode: %s", LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
   LOG_UPDATE_INTERVAL(this);
 }
 
@@ -10,11 +10,13 @@ from esphome.const import (
     CONF_NUMBER,
     CONF_PIN,
     CONF_RAW,
+    CONF_WIFI,
     DEVICE_CLASS_VOLTAGE,
     STATE_CLASS_MEASUREMENT,
     UNIT_VOLT,
 )
 from esphome.core import CORE
+import esphome.final_validate as fv
 
 from . import (
     ATTENUATION_MODES,
@@ -22,7 +24,6 @@ from . import (
     ESP32_VARIANT_ADC2_PIN_TO_CHANNEL,
     SAMPLING_MODES,
     adc_ns,
-    adc_unit_t,
     validate_adc_pin,
 )
 
@@ -56,6 +57,21 @@ def validate_config(config):
     return config
 
 
+def final_validate_config(config):
+    if CORE.is_esp32:
+        variant = get_esp32_variant()
+        if (
+            CONF_WIFI in fv.full_config.get()
+            and config[CONF_PIN][CONF_NUMBER]
+            in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
+        ):
+            raise cv.Invalid(
+                f"{variant} doesn't support ADC on this pin when Wi-Fi is configured"
+            )
+
+    return config
+
+
 ADCSensor = adc_ns.class_(
     "ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
 )
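The 2025.4.0-side validator rejects ADC2 pins whenever Wi-Fi is configured, because on classic ESP32 variants ADC2 is shared with the Wi-Fi driver. A trimmed, self-contained sketch of that rule (the table below is an illustrative subset, not the component's real `ESP32_VARIANT_ADC2_PIN_TO_CHANNEL` data):

```python
# ADC2-capable GPIOs on the original ESP32 (illustrative subset).
ADC2_PINS = {"ESP32": {0, 2, 4, 12, 13, 14, 15, 25, 26, 27}}

def check_adc2_wifi_conflict(variant, pin, wifi_configured):
    """Raise if the pin is an ADC2 pin and Wi-Fi is also configured."""
    if wifi_configured and pin in ADC2_PINS.get(variant, set()):
        raise ValueError(
            f"{variant} doesn't support ADC on this pin when Wi-Fi is configured"
        )
```

The dev branch drops this check entirely, which is consistent with its reworked channel handling in the rest of the diff.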
@@ -83,6 +99,8 @@ CONFIG_SCHEMA = cv.All(
     validate_config,
 )
 
+FINAL_VALIDATE_SCHEMA = final_validate_config
+
 
 async def to_code(config):
     var = cg.new_Pvariable(config[CONF_ID])
@@ -101,13 +119,13 @@ async def to_code(config):
     cg.add(var.set_sample_count(config[CONF_SAMPLES]))
     cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
 
-    if CORE.is_esp32:
-        if attenuation := config.get(CONF_ATTENUATION):
-            if attenuation == "auto":
-                cg.add(var.set_autorange(cg.global_ns.true))
-            else:
-                cg.add(var.set_attenuation(attenuation))
+    if attenuation := config.get(CONF_ATTENUATION):
+        if attenuation == "auto":
+            cg.add(var.set_autorange(cg.global_ns.true))
+        else:
+            cg.add(var.set_attenuation(attenuation))
 
+    if CORE.is_esp32:
         variant = get_esp32_variant()
         pin_num = config[CONF_PIN][CONF_NUMBER]
         if (
@@ -115,10 +133,10 @@ async def to_code(config):
             and pin_num in ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant]
         ):
             chan = ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin_num]
-            cg.add(var.set_channel(adc_unit_t.ADC_UNIT_1, chan))
+            cg.add(var.set_channel1(chan))
         elif (
             variant in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL
             and pin_num in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
         ):
             chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
-            cg.add(var.set_channel(adc_unit_t.ADC_UNIT_2, chan))
+            cg.add(var.set_channel2(chan))
 
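Both sides of the hunk above resolve a GPIO number to an (ADC unit, channel) pair by looking the pin up first in the ADC1 table, then the ADC2 table. A hedged sketch of that dispatch, with a hand-filled subset of the classic-ESP32 mapping (the real tables live in the adc component's `__init__.py`; entries here are assumptions for illustration):

```python
# Illustrative subset of the pin-to-channel tables for the original ESP32.
ADC1_PIN_TO_CHANNEL = {"ESP32": {36: 0, 37: 1, 38: 2, 39: 3, 32: 4, 33: 5, 34: 6, 35: 7}}
ADC2_PIN_TO_CHANNEL = {"ESP32": {4: 0, 0: 1, 2: 2, 15: 3, 13: 4, 12: 5, 14: 6, 27: 7, 25: 8, 26: 9}}

def resolve_channel(variant, pin):
    """Return ("adc1"|"adc2", channel) for a pin, mirroring the to_code dispatch."""
    adc1 = ADC1_PIN_TO_CHANNEL.get(variant, {})
    adc2 = ADC2_PIN_TO_CHANNEL.get(variant, {})
    if pin in adc1:
        return "adc1", adc1[pin]
    if pin in adc2:
        return "adc2", adc2[pin]
    raise ValueError(f"pin {pin} is not an ADC pin on {variant}")
```

The only change in the diff is the generated call: dev emits a single `set_channel(unit, chan)`, while 2025.4.0 emits unit-specific `set_channel1`/`set_channel2` setters.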
@@ -8,7 +8,10 @@ static const char *const TAG = "adc128s102";
 
 float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
 
-void ADC128S102::setup() { this->spi_setup(); }
+void ADC128S102::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up adc128s102");
+  this->spi_setup();
+}
 
 void ADC128S102::dump_config() {
   ESP_LOGCONFIG(TAG, "ADC128S102:");
@@ -177,14 +177,11 @@ void ADE7880::dump_config() {
     LOG_SENSOR("  ", "Power Factor", this->channel_a_->power_factor);
     LOG_SENSOR("  ", "Forward Active Energy", this->channel_a_->forward_active_energy);
     LOG_SENSOR("  ", "Reverse Active Energy", this->channel_a_->reverse_active_energy);
-    ESP_LOGCONFIG(TAG,
-                  "  Calibration:\n"
-                  "    Current: %" PRId32 "\n"
-                  "    Voltage: %" PRId32 "\n"
-                  "    Power: %" PRId32 "\n"
-                  "    Phase Angle: %u",
-                  this->channel_a_->current_gain_calibration, this->channel_a_->voltage_gain_calibration,
-                  this->channel_a_->power_gain_calibration, this->channel_a_->phase_angle_calibration);
+    ESP_LOGCONFIG(TAG, "  Calibration:");
+    ESP_LOGCONFIG(TAG, "    Current: %" PRId32, this->channel_a_->current_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Voltage: %" PRId32, this->channel_a_->voltage_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Power: %" PRId32, this->channel_a_->power_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Phase Angle: %u", this->channel_a_->phase_angle_calibration);
   }
 
   if (this->channel_b_ != nullptr) {
@@ -196,14 +193,11 @@ void ADE7880::dump_config() {
     LOG_SENSOR("  ", "Power Factor", this->channel_b_->power_factor);
     LOG_SENSOR("  ", "Forward Active Energy", this->channel_b_->forward_active_energy);
     LOG_SENSOR("  ", "Reverse Active Energy", this->channel_b_->reverse_active_energy);
-    ESP_LOGCONFIG(TAG,
-                  "  Calibration:\n"
-                  "    Current: %" PRId32 "\n"
-                  "    Voltage: %" PRId32 "\n"
-                  "    Power: %" PRId32 "\n"
-                  "    Phase Angle: %u",
-                  this->channel_b_->current_gain_calibration, this->channel_b_->voltage_gain_calibration,
-                  this->channel_b_->power_gain_calibration, this->channel_b_->phase_angle_calibration);
+    ESP_LOGCONFIG(TAG, "  Calibration:");
+    ESP_LOGCONFIG(TAG, "    Current: %" PRId32, this->channel_b_->current_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Voltage: %" PRId32, this->channel_b_->voltage_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Power: %" PRId32, this->channel_b_->power_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Phase Angle: %u", this->channel_b_->phase_angle_calibration);
   }
 
   if (this->channel_c_ != nullptr) {
@@ -215,23 +209,18 @@ void ADE7880::dump_config() {
     LOG_SENSOR("  ", "Power Factor", this->channel_c_->power_factor);
     LOG_SENSOR("  ", "Forward Active Energy", this->channel_c_->forward_active_energy);
     LOG_SENSOR("  ", "Reverse Active Energy", this->channel_c_->reverse_active_energy);
-    ESP_LOGCONFIG(TAG,
-                  "  Calibration:\n"
-                  "    Current: %" PRId32 "\n"
-                  "    Voltage: %" PRId32 "\n"
-                  "    Power: %" PRId32 "\n"
-                  "    Phase Angle: %u",
-                  this->channel_c_->current_gain_calibration, this->channel_c_->voltage_gain_calibration,
-                  this->channel_c_->power_gain_calibration, this->channel_c_->phase_angle_calibration);
+    ESP_LOGCONFIG(TAG, "  Calibration:");
+    ESP_LOGCONFIG(TAG, "    Current: %" PRId32, this->channel_c_->current_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Voltage: %" PRId32, this->channel_c_->voltage_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Power: %" PRId32, this->channel_c_->power_gain_calibration);
+    ESP_LOGCONFIG(TAG, "    Phase Angle: %u", this->channel_c_->phase_angle_calibration);
   }
 
   if (this->channel_n_ != nullptr) {
     ESP_LOGCONFIG(TAG, "  Neutral:");
     LOG_SENSOR("  ", "Current", this->channel_n_->current);
-    ESP_LOGCONFIG(TAG,
-                  "  Calibration:\n"
-                  "    Current: %" PRId32,
-                  this->channel_n_->current_gain_calibration);
+    ESP_LOGCONFIG(TAG, "  Calibration:");
+    ESP_LOGCONFIG(TAG, "    Current: %" PRId32, this->channel_n_->current_gain_calibration);
   }
 
   LOG_I2C_DEVICE(this);
@@ -85,6 +85,8 @@ class ADE7880 : public i2c::I2CDevice, public PollingComponent {
 
   void dump_config() override;
 
+  float get_setup_priority() const override { return setup_priority::DATA; }
+
  protected:
   ADE7880Store store_{};
   InternalGPIOPin *irq0_pin_{nullptr};
@@ -58,18 +58,15 @@ void ADE7953::dump_config() {
   LOG_SENSOR("  ", "Active Power B Sensor", this->active_power_b_sensor_);
   LOG_SENSOR("  ", "Rective Power A Sensor", this->reactive_power_a_sensor_);
   LOG_SENSOR("  ", "Reactive Power B Sensor", this->reactive_power_b_sensor_);
-  ESP_LOGCONFIG(TAG,
-                "  USE_ACC_ENERGY_REGS: %d\n"
-                "  PGA_V_8: 0x%X\n"
-                "  PGA_IA_8: 0x%X\n"
-                "  PGA_IB_8: 0x%X\n"
-                "  VGAIN_32: 0x%08jX\n"
-                "  AIGAIN_32: 0x%08jX\n"
-                "  BIGAIN_32: 0x%08jX\n"
-                "  AWGAIN_32: 0x%08jX\n"
-                "  BWGAIN_32: 0x%08jX",
-                this->use_acc_energy_regs_, pga_v_, pga_ia_, pga_ib_, (uintmax_t) vgain_, (uintmax_t) aigain_,
-                (uintmax_t) bigain_, (uintmax_t) awgain_, (uintmax_t) bwgain_);
+  ESP_LOGCONFIG(TAG, "  USE_ACC_ENERGY_REGS: %d", this->use_acc_energy_regs_);
+  ESP_LOGCONFIG(TAG, "  PGA_V_8: 0x%X", pga_v_);
+  ESP_LOGCONFIG(TAG, "  PGA_IA_8: 0x%X", pga_ia_);
+  ESP_LOGCONFIG(TAG, "  PGA_IB_8: 0x%X", pga_ib_);
+  ESP_LOGCONFIG(TAG, "  VGAIN_32: 0x%08jX", (uintmax_t) vgain_);
+  ESP_LOGCONFIG(TAG, "  AIGAIN_32: 0x%08jX", (uintmax_t) aigain_);
+  ESP_LOGCONFIG(TAG, "  BIGAIN_32: 0x%08jX", (uintmax_t) bigain_);
+  ESP_LOGCONFIG(TAG, "  AWGAIN_32: 0x%08jX", (uintmax_t) awgain_);
+  ESP_LOGCONFIG(TAG, "  BWGAIN_32: 0x%08jX", (uintmax_t) bwgain_);
 }
 
 #define ADE_PUBLISH_(name, val, factor) \
@@ -1,6 +1,6 @@
 #include "ade7953_i2c.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/helpers.h"
 
 namespace esphome {
 namespace ade7953_i2c {
@@ -1,6 +1,6 @@
 #include "ade7953_spi.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/helpers.h"
 
 namespace esphome {
 namespace ade7953_spi {
@@ -10,12 +10,15 @@ static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
 static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
 
 void ADS1115Component::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
   uint16_t value;
   if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
     this->mark_failed();
     return;
   }
 
+  ESP_LOGCONFIG(TAG, "Configuring ADS1115...");
+
   uint16_t config = 0;
   // Clear single-shot bit
   // 0b0xxxxxxxxxxxxxxx
@@ -65,10 +68,10 @@ void ADS1115Component::setup() {
   this->prev_config_ = config;
 }
 void ADS1115Component::dump_config() {
-  ESP_LOGCONFIG(TAG, "ADS1115:");
+  ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with ADS1115 failed!");
   }
 }
 float ADS1115Component::request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain,
@@ -49,6 +49,7 @@ class ADS1115Component : public Component, public i2c::I2CDevice {
   void setup() override;
   void dump_config() override;
   /// HARDWARE_LATE setup priority
+  float get_setup_priority() const override { return setup_priority::DATA; }
   void set_continuous_mode(bool continuous_mode) { continuous_mode_ = continuous_mode; }
 
   /// Helper method to request a measurement from a sensor.
@@ -1,5 +1,4 @@
 #include "ads1118.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
 
 namespace esphome {
@@ -9,6 +8,7 @@ static const char *const TAG = "ads1118";
 static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
 
 void ADS1118::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ads1118");
   this->spi_setup();
 
   this->config_ = 0;
@@ -34,6 +34,7 @@ class ADS1118 : public Component,
   ADS1118() = default;
   void setup() override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   /// Helper method to request a measurement from a sensor.
   float request_measurement(ADS1118Multiplexer multiplexer, ADS1118Gain gain, bool temperature_mode);
 
@@ -1,5 +1,4 @@
 #include "ags10.h"
-#include "esphome/core/helpers.h"
 
 #include <cinttypes>
 
@@ -24,6 +23,8 @@ static const uint16_t ZP_CURRENT = 0x0000;
 static const uint16_t ZP_DEFAULT = 0xFFFF;
 
 void AGS10Component::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up ags10...");
+
   auto version = this->read_version_();
   if (version) {
     ESP_LOGD(TAG, "AGS10 Sensor Version: 0x%02X", *version);
@@ -43,6 +44,8 @@ void AGS10Component::setup() {
   } else {
     ESP_LOGE(TAG, "AGS10 Sensor Resistance: unknown");
   }
+
+  ESP_LOGD(TAG, "Sensor initialized");
 }
 
 void AGS10Component::update() {
@@ -62,7 +65,7 @@ void AGS10Component::dump_config() {
     case NONE:
       break;
     case COMMUNICATION_FAILED:
-      ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+      ESP_LOGE(TAG, "Communication with AGS10 failed!");
      break;
    case CRC_CHECK_FAILED:
      ESP_LOGE(TAG, "The crc check failed");
@@ -31,6 +31,8 @@ class AGS10Component : public PollingComponent, public i2c::I2CDevice {
 
   void dump_config() override;
 
+  float get_setup_priority() const override { return setup_priority::DATA; }
+
   /**
    * Modifies target address of AGS10.
    *
@@ -13,9 +13,8 @@
 // results making successive requests; the current implementation makes 3 attempts with a delay of 30ms each time.
 
 #include "aht10.h"
-#include "esphome/core/hal.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/hal.h"
 
 namespace esphome {
 namespace aht10 {
@@ -35,55 +34,57 @@ static const uint8_t AHT10_INIT_ATTEMPTS = 10;
 
 static const uint8_t AHT10_STATUS_BUSY = 0x80;
 
-static const float AHT10_DIVISOR = 1048576.0f;  // 2^20, used for temperature and humidity calculations
-
 void AHT10Component::setup() {
   if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
-    ESP_LOGE(TAG, "Reset failed");
+    ESP_LOGE(TAG, "Reset AHT10 failed!");
   }
   delay(AHT10_SOFTRESET_DELAY);
 
   i2c::ErrorCode error_code = i2c::ERROR_INVALID_ARGUMENT;
   switch (this->variant_) {
     case AHT10Variant::AHT20:
+      ESP_LOGCONFIG(TAG, "Setting up AHT20");
       error_code = this->write(AHT20_INITIALIZE_CMD, sizeof(AHT20_INITIALIZE_CMD));
       break;
     case AHT10Variant::AHT10:
+      ESP_LOGCONFIG(TAG, "Setting up AHT10");
       error_code = this->write(AHT10_INITIALIZE_CMD, sizeof(AHT10_INITIALIZE_CMD));
       break;
   }
   if (error_code != i2c::ERROR_OK) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AHT10 failed!");
     this->mark_failed();
     return;
   }
-  uint8_t cal_attempts = 0;
   uint8_t data = AHT10_STATUS_BUSY;
+  int cal_attempts = 0;
   while (data & AHT10_STATUS_BUSY) {
     delay(AHT10_DEFAULT_DELAY);
     if (this->read(&data, 1) != i2c::ERROR_OK) {
-      ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+      ESP_LOGE(TAG, "Communication with AHT10 failed!");
       this->mark_failed();
       return;
     }
     ++cal_attempts;
     if (cal_attempts > AHT10_INIT_ATTEMPTS) {
-      ESP_LOGE(TAG, "Initialization timed out");
+      ESP_LOGE(TAG, "AHT10 initialization timed out!");
       this->mark_failed();
       return;
     }
   }
   if ((data & 0x68) != 0x08) {  // Bit[6:5] = 0b00, NORMAL mode and Bit[3] = 0b1, CALIBRATED
-    ESP_LOGE(TAG, "Initialization failed");
+    ESP_LOGE(TAG, "AHT10 initialization failed!");
     this->mark_failed();
     return;
   }
+
+  ESP_LOGV(TAG, "AHT10 initialization");
 }
 
 void AHT10Component::restart_read_() {
   if (this->read_count_ == AHT10_ATTEMPTS) {
     this->read_count_ = 0;
-    this->status_set_error("Reading timed out");
+    this->status_set_error("Measurements reading timed-out!");
     return;
   }
   this->read_count_++;
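The setup loop above polls the AHT10 status byte until the busy flag clears, then verifies `(data & 0x68) == 0x08`: bits 6:5 must be `00` (NORMAL mode) and bit 3 must be `1` (CALIBRATED). A small Python sketch of those bit tests (helper names are illustrative, not ESPHome API):

```python
AHT10_STATUS_BUSY = 0x80  # bit 7: measurement/initialization in progress

def is_busy(status):
    """True while bit 7 of the status byte is set."""
    return bool(status & AHT10_STATUS_BUSY)

def status_ok(status):
    """Mirror the (data & 0x68) != 0x08 check: NORMAL mode and CALIBRATED."""
    return (status & 0x68) == 0x08
```

Masking with `0x68` (binary `0110 1000`) selects exactly the mode bits and the calibration bit, so any other status bits (including busy) do not affect the comparison.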
@@ -96,24 +97,24 @@ void AHT10Component::read_data_() {
     ESP_LOGD(TAG, "Read attempt %d at %ums", this->read_count_, (unsigned) (millis() - this->start_time_));
   }
   if (this->read(data, 6) != i2c::ERROR_OK) {
-    this->status_set_warning("Read failed, will retry");
+    this->status_set_warning("AHT10 read failed, retrying soon");
     this->restart_read_();
     return;
   }
 
   if ((data[0] & 0x80) == 0x80) {  // Bit[7] = 0b1, device is busy
-    ESP_LOGD(TAG, "Device busy, will retry");
+    ESP_LOGD(TAG, "AHT10 is busy, waiting...");
     this->restart_read_();
     return;
   }
   if (data[1] == 0x0 && data[2] == 0x0 && (data[3] >> 4) == 0x0) {
-    // Invalid humidity (0x0)
+    // Unrealistic humidity (0x0)
     if (this->humidity_sensor_ == nullptr) {
-      ESP_LOGV(TAG, "Invalid humidity (reading not required)");
+      ESP_LOGV(TAG, "ATH10 Unrealistic humidity (0x0), but humidity is not required");
     } else {
-      ESP_LOGD(TAG, "Invalid humidity, retrying");
+      ESP_LOGD(TAG, "ATH10 Unrealistic humidity (0x0), retrying...");
       if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
-        this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
+        this->status_set_warning("Communication with AHT10 failed!");
       }
       this->restart_read_();
       return;
@@ -122,17 +123,22 @@ void AHT10Component::read_data_() {
   if (this->read_count_ > 1) {
     ESP_LOGD(TAG, "Success at %ums", (unsigned) (millis() - this->start_time_));
   }
-  uint32_t raw_temperature = encode_uint24(data[3] & 0xF, data[4], data[5]);
-  uint32_t raw_humidity = encode_uint24(data[1], data[2], data[3]) >> 4;
+  uint32_t raw_temperature = ((data[3] & 0x0F) << 16) | (data[4] << 8) | data[5];
+  uint32_t raw_humidity = ((data[1] << 16) | (data[2] << 8) | data[3]) >> 4;
 
   if (this->temperature_sensor_ != nullptr) {
-    float temperature = ((200.0f * static_cast<float>(raw_temperature)) / AHT10_DIVISOR) - 50.0f;
+    float temperature = ((200.0f * (float) raw_temperature) / 1048576.0f) - 50.0f;
     this->temperature_sensor_->publish_state(temperature);
   }
   if (this->humidity_sensor_ != nullptr) {
-    float humidity = raw_humidity == 0 ? NAN : static_cast<float>(raw_humidity) * 100.0f / AHT10_DIVISOR;
+    float humidity;
+    if (raw_humidity == 0) {  // unrealistic value
+      humidity = NAN;
+    } else {
+      humidity = (float) raw_humidity * 100.0f / 1048576.0f;
+    }
     if (std::isnan(humidity)) {
-      ESP_LOGW(TAG, "Invalid humidity reading (0%%), ");
+      ESP_LOGW(TAG, "Invalid humidity! Sensor reported 0%% Hum");
     }
     this->humidity_sensor_->publish_state(humidity);
   }
@@ -144,7 +150,7 @@ void AHT10Component::update() {
     return;
   this->start_time_ = millis();
   if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
-    this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
+    this->status_set_warning("Communication with AHT10 failed!");
    return;
  }
  this->restart_read_();
@@ -156,7 +162,7 @@ void AHT10Component::dump_config() {
   ESP_LOGCONFIG(TAG, "AHT10:");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AHT10 failed!");
   }
   LOG_SENSOR("  ", "Temperature", this->temperature_sensor_);
   LOG_SENSOR("  ", "Humidity", this->humidity_sensor_);
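Both sides of the AHT10 hunk decode the same 6-byte frame: the 20-bit humidity occupies bytes 1-2 and the high nibble of byte 3, the 20-bit temperature occupies the low nibble of byte 3 and bytes 4-5, and both are scaled by 2^20 (1048576). A self-contained Python sketch of that decoding (the function name is illustrative; the bit layout and formulas follow the diff):

```python
AHT10_DIVISOR = 1 << 20  # 2^20, the full-scale value of the 20-bit fields

def decode_aht10(frame):
    """Decode a 6-byte AHT10/AHT20 measurement frame into (temp_C, rh_percent).

    frame[0] is the status byte; humidity is the top 20 bits of bytes 1..3,
    temperature the bottom 20 bits of bytes 3..5.
    """
    raw_humidity = ((frame[1] << 16) | (frame[2] << 8) | frame[3]) >> 4
    raw_temperature = ((frame[3] & 0x0F) << 16) | (frame[4] << 8) | frame[5]
    temperature = (200.0 * raw_temperature) / AHT10_DIVISOR - 50.0
    # A raw humidity of exactly 0 is treated as unrealistic, as in the diff.
    humidity = float("nan") if raw_humidity == 0 else raw_humidity * 100.0 / AHT10_DIVISOR
    return temperature, humidity
```

A frame whose humidity field holds 2^19 (half scale) and whose temperature field also holds 2^19 decodes to 50 %RH and 50 °C, which is a quick way to sanity-check the bit slicing.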
@@ -17,6 +17,8 @@ static const char *const TAG = "aic3204";
   }
 
 void AIC3204::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up AIC3204...");
+
   // Set register page to 0
   ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
   // Initiate SW reset (PLL is powered off as part of reset)
@@ -111,7 +113,7 @@ void AIC3204::dump_config() {
   LOG_I2C_DEVICE(this);
 
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AIC3204 failed");
   }
 }
 
@@ -66,6 +66,7 @@ class AIC3204 : public audio_dac::AudioDac, public Component, public i2c::I2CDev
  public:
   void setup() override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
 
   bool set_mute_off() override;
   bool set_mute_on() override;
@@ -34,7 +34,7 @@ AirthingsWaveBase = airthings_wave_base_ns.class_(
 
 
 BASE_SCHEMA = (
-    cv.Schema(
+    sensor.SENSOR_SCHEMA.extend(
         {
             cv.Optional(CONF_HUMIDITY): sensor.sensor_schema(
                 unit_of_measurement=UNIT_PERCENT,
@@ -1 +1 @@
-CODEOWNERS = ["@jeromelaban", "@precurse"]
+CODEOWNERS = ["@jeromelaban"]
@@ -73,29 +73,11 @@ void AirthingsWavePlus::dump_config() {
   LOG_SENSOR("  ", "Illuminance", this->illuminance_sensor_);
 }
 
-void AirthingsWavePlus::setup() {
-  const char *service_uuid;
-  const char *characteristic_uuid;
-  const char *access_control_point_characteristic_uuid;
-
-  // Change UUIDs for Wave Radon Gen2
-  switch (this->wave_device_type_) {
-    case WaveDeviceType::WAVE_GEN2:
-      service_uuid = SERVICE_UUID_WAVE_RADON_GEN2;
-      characteristic_uuid = CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
-      access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
-      break;
-    default:
-      // Wave Plus
-      service_uuid = SERVICE_UUID;
-      characteristic_uuid = CHARACTERISTIC_UUID;
-      access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID;
-  }
-
-  this->service_uuid_ = espbt::ESPBTUUID::from_raw(service_uuid);
-  this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(characteristic_uuid);
+AirthingsWavePlus::AirthingsWavePlus() {
+  this->service_uuid_ = espbt::ESPBTUUID::from_raw(SERVICE_UUID);
+  this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(CHARACTERISTIC_UUID);
   this->access_control_point_characteristic_uuid_ =
-      espbt::ESPBTUUID::from_raw(access_control_point_characteristic_uuid);
+      espbt::ESPBTUUID::from_raw(ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID);
 }
 
 }  // namespace airthings_wave_plus
@@ -9,20 +9,13 @@ namespace airthings_wave_plus {
 
 namespace espbt = esphome::esp32_ble_tracker;
 
-enum WaveDeviceType : uint8_t { WAVE_PLUS = 0, WAVE_GEN2 = 1 };
-
 static const char *const SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba";
 static const char *const CHARACTERISTIC_UUID = "b42e2a68-ade7-11e4-89d3-123b93f75cba";
 static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID = "b42e2d06-ade7-11e4-89d3-123b93f75cba";
 
-static const char *const SERVICE_UUID_WAVE_RADON_GEN2 = "b42e4a8e-ade7-11e4-89d3-123b93f75cba";
-static const char *const CHARACTERISTIC_UUID_WAVE_RADON_GEN2 = "b42e4dcc-ade7-11e4-89d3-123b93f75cba";
-static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2 =
-    "b42e50d8-ade7-11e4-89d3-123b93f75cba";
-
 class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
  public:
-  void setup() override;
+  AirthingsWavePlus();
 
   void dump_config() override;
 
@@ -30,14 +23,12 @@ class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
   void set_radon_long_term(sensor::Sensor *radon_long_term) { radon_long_term_sensor_ = radon_long_term; }
   void set_co2(sensor::Sensor *co2) { co2_sensor_ = co2; }
   void set_illuminance(sensor::Sensor *illuminance) { illuminance_sensor_ = illuminance; }
-  void set_device_type(WaveDeviceType wave_device_type) { wave_device_type_ = wave_device_type; }
 
  protected:
   bool is_valid_radon_value_(uint16_t radon);
   bool is_valid_co2_value_(uint16_t co2);
 
   void read_sensors(uint8_t *raw_value, uint16_t value_len) override;
-  WaveDeviceType wave_device_type_{WaveDeviceType::WAVE_PLUS};
 
   sensor::Sensor *radon_sensor_{nullptr};
   sensor::Sensor *radon_long_term_sensor_{nullptr};
@@ -7,7 +7,6 @@ from esphome.const import (
     CONF_ILLUMINANCE,
     CONF_RADON,
     CONF_RADON_LONG_TERM,
-    CONF_TVOC,
     DEVICE_CLASS_CARBON_DIOXIDE,
     DEVICE_CLASS_ILLUMINANCE,
     ICON_RADIOACTIVE,
@@ -16,7 +15,6 @@ from esphome.const import (
    UNIT_LUX,
    UNIT_PARTS_PER_MILLION,
 )
-from esphome.types import ConfigType
 
 DEPENDENCIES = airthings_wave_base.DEPENDENCIES
 
@@ -27,59 +25,35 @@ AirthingsWavePlus = airthings_wave_plus_ns.class_(
     "AirthingsWavePlus", airthings_wave_base.AirthingsWaveBase
 )
 
-CONF_DEVICE_TYPE = "device_type"
-WaveDeviceType = airthings_wave_plus_ns.enum("WaveDeviceType")
-DEVICE_TYPES = {
-    "WAVE_PLUS": WaveDeviceType.WAVE_PLUS,
-    "WAVE_GEN2": WaveDeviceType.WAVE_GEN2,
-}
-
-
-def validate_wave_gen2_config(config: ConfigType) -> ConfigType:
-    """Validate that Wave Gen2 devices don't have CO2 or TVOC sensors."""
-    if config[CONF_DEVICE_TYPE] == "WAVE_GEN2":
-        if CONF_CO2 in config:
-            raise cv.Invalid("Wave Gen2 devices do not support CO2 sensor")
-        # Check for TVOC in the base schema config
-        if CONF_TVOC in config:
-            raise cv.Invalid("Wave Gen2 devices do not support TVOC sensor")
-    return config
-
-
-CONFIG_SCHEMA = cv.All(
-    airthings_wave_base.BASE_SCHEMA.extend(
-        {
-            cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
-            cv.Optional(CONF_RADON): sensor.sensor_schema(
-                unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
-                icon=ICON_RADIOACTIVE,
-                accuracy_decimals=0,
-                state_class=STATE_CLASS_MEASUREMENT,
-            ),
-            cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
-                unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
-                icon=ICON_RADIOACTIVE,
-                accuracy_decimals=0,
-                state_class=STATE_CLASS_MEASUREMENT,
-            ),
-            cv.Optional(CONF_CO2): sensor.sensor_schema(
-                unit_of_measurement=UNIT_PARTS_PER_MILLION,
-                accuracy_decimals=0,
-                device_class=DEVICE_CLASS_CARBON_DIOXIDE,
-                state_class=STATE_CLASS_MEASUREMENT,
-            ),
-            cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
-                unit_of_measurement=UNIT_LUX,
-                accuracy_decimals=0,
-                device_class=DEVICE_CLASS_ILLUMINANCE,
-                state_class=STATE_CLASS_MEASUREMENT,
-            ),
-            cv.Optional(CONF_DEVICE_TYPE, default="WAVE_PLUS"): cv.enum(
-                DEVICE_TYPES, upper=True
-            ),
-        }
-    ),
-    validate_wave_gen2_config,
+CONFIG_SCHEMA = airthings_wave_base.BASE_SCHEMA.extend(
+    {
+        cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
+        cv.Optional(CONF_RADON): sensor.sensor_schema(
+            unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
+            icon=ICON_RADIOACTIVE,
+            accuracy_decimals=0,
+            state_class=STATE_CLASS_MEASUREMENT,
+        ),
+        cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
+            unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
+            icon=ICON_RADIOACTIVE,
+            accuracy_decimals=0,
+            state_class=STATE_CLASS_MEASUREMENT,
+        ),
+        cv.Optional(CONF_CO2): sensor.sensor_schema(
+            unit_of_measurement=UNIT_PARTS_PER_MILLION,
+            accuracy_decimals=0,
+            device_class=DEVICE_CLASS_CARBON_DIOXIDE,
+            state_class=STATE_CLASS_MEASUREMENT,
+        ),
+        cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
+            unit_of_measurement=UNIT_LUX,
+            accuracy_decimals=0,
+            device_class=DEVICE_CLASS_ILLUMINANCE,
+            state_class=STATE_CLASS_MEASUREMENT,
+        ),
+    }
 )
 
 
@@ -99,4 +73,3 @@ async def to_code(config):
     if config_illuminance := config.get(CONF_ILLUMINANCE):
         sens = await sensor.new_sensor(config_illuminance)
         cg.add(var.set_illuminance(sens))
-    cg.add(var.set_device_type(config[CONF_DEVICE_TYPE]))
@@ -5,8 +5,6 @@ from esphome.components import mqtt, web_server
 import esphome.config_validation as cv
 from esphome.const import (
     CONF_CODE,
-    CONF_ENTITY_CATEGORY,
-    CONF_ICON,
     CONF_ID,
     CONF_MQTT_ID,
     CONF_ON_STATE,
@@ -14,8 +12,7 @@ from esphome.const import (
     CONF_WEB_SERVER,
 )
 from esphome.core import CORE, coroutine_with_priority
-from esphome.core.entity_helpers import entity_duplicate_validator, setup_entity
-from esphome.cpp_generator import MockObjClass
+from esphome.cpp_helpers import setup_entity
 
 CODEOWNERS = ["@grahambrown11", "@hwstar"]
 IS_PLATFORM_COMPONENT = True
@@ -81,11 +78,12 @@ AlarmControlPanelCondition = alarm_control_panel_ns.class_(
     "AlarmControlPanelCondition", automation.Condition
 )
 
-_ALARM_CONTROL_PANEL_SCHEMA = (
+ALARM_CONTROL_PANEL_SCHEMA = (
     cv.ENTITY_BASE_SCHEMA.extend(web_server.WEBSERVER_SORTING_SCHEMA)
     .extend(cv.MQTT_COMMAND_COMPONENT_SCHEMA)
     .extend(
         {
+            cv.GenerateID(): cv.declare_id(AlarmControlPanel),
             cv.OnlyWith(CONF_MQTT_ID, "mqtt"): cv.declare_id(
                 mqtt.MQTTAlarmControlPanelComponent
             ),
@@ -148,36 +146,6 @@ _ALARM_CONTROL_PANEL_SCHEMA = (
     )
 )
 
-
-_ALARM_CONTROL_PANEL_SCHEMA.add_extra(entity_duplicate_validator("alarm_control_panel"))
-
-
-def alarm_control_panel_schema(
-    class_: MockObjClass,
-    *,
-    entity_category: str = cv.UNDEFINED,
-    icon: str = cv.UNDEFINED,
-) -> cv.Schema:
-    schema = {
-        cv.GenerateID(): cv.declare_id(class_),
-    }
-
-    for key, default, validator in [
-        (CONF_ENTITY_CATEGORY, entity_category, cv.entity_category),
-        (CONF_ICON, icon, cv.icon),
-    ]:
-        if default is not cv.UNDEFINED:
-            schema[cv.Optional(key, default=default)] = validator
-
-    return _ALARM_CONTROL_PANEL_SCHEMA.extend(schema)
-
-
-# Remove before 2025.11.0
-ALARM_CONTROL_PANEL_SCHEMA = alarm_control_panel_schema(AlarmControlPanel)
-ALARM_CONTROL_PANEL_SCHEMA.add_extra(
-    cv.deprecated_schema_constant("alarm_control_panel")
-)
-
 ALARM_CONTROL_PANEL_ACTION_SCHEMA = maybe_simple_id(
     {
         cv.GenerateID(): cv.use_id(AlarmControlPanel),
@@ -193,7 +161,7 @@ ALARM_CONTROL_PANEL_CONDITION_SCHEMA = maybe_simple_id(
 
 
 async def setup_alarm_control_panel_core_(var, config):
-    await setup_entity(var, config, "alarm_control_panel")
+    await setup_entity(var, config)
     for conf in config.get(CONF_ON_STATE, []):
         trigger = cg.new_Pvariable(conf[CONF_TRIGGER_ID], var)
         await automation.build_automation(trigger, [], conf)
@@ -238,16 +206,9 @@ async def register_alarm_control_panel(var, config):
     if not CORE.has_id(config[CONF_ID]):
         var = cg.Pvariable(config[CONF_ID], var)
     cg.add(cg.App.register_alarm_control_panel(var))
-    CORE.register_platform_component("alarm_control_panel", var)
     await setup_alarm_control_panel_core_(var, config)
 
 
-async def new_alarm_control_panel(config, *args):
-    var = cg.new_Pvariable(config[CONF_ID], *args)
-    await register_alarm_control_panel(var, config)
-    return var
-
-
 @automation.register_action(
     "alarm_control_panel.arm_away", ArmAwayAction, ALARM_CONTROL_PANEL_ACTION_SCHEMA
 )
@@ -41,6 +41,7 @@ class Alpha3 : public esphome::ble_client::BLEClientNode, public PollingComponen
   void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
                            esp_ble_gattc_cb_param_t *param) override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   void set_flow_sensor(sensor::Sensor *sensor) { this->flow_sensor_ = sensor; }
   void set_head_sensor(sensor::Sensor *sensor) { this->head_sensor_ = sensor; }
   void set_power_sensor(sensor::Sensor *sensor) { this->power_sensor_ = sensor; }
@@ -90,6 +90,8 @@ bool AM2315C::convert_(uint8_t *data, float &humidity, float &temperature) {
 }
 
 void AM2315C::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up AM2315C...");
+
   // get status
   uint8_t status = 0;
   if (this->read(&status, 1) != i2c::ERROR_OK) {
@@ -186,7 +188,7 @@ void AM2315C::dump_config() {
   ESP_LOGCONFIG(TAG, "AM2315C:");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AM2315C failed!");
   }
   LOG_SENSOR("  ", "Temperature", this->temperature_sensor_);
   LOG_SENSOR("  ", "Humidity", this->humidity_sensor_);
@@ -34,6 +34,7 @@ void AM2320Component::update() {
   this->status_clear_warning();
 }
 void AM2320Component::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up AM2320...");
   uint8_t data[8];
   data[0] = 0;
   data[1] = 4;
@@ -46,7 +47,7 @@ void AM2320Component::dump_config() {
   ESP_LOGD(TAG, "AM2320:");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AM2320 failed!");
   }
   LOG_SENSOR("  ", "Temperature", this->temperature_sensor_);
   LOG_SENSOR("  ", "Humidity", this->humidity_sensor_);
@@ -1,7 +1,7 @@
 #pragma once
 
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/helpers.h"
 
 namespace esphome {
 namespace am43 {
@@ -1,7 +1,7 @@
 import esphome.codegen as cg
 from esphome.components import ble_client, cover
 import esphome.config_validation as cv
-from esphome.const import CONF_PIN
+from esphome.const import CONF_ID, CONF_PIN
 
 CODEOWNERS = ["@buxtronix"]
 DEPENDENCIES = ["ble_client"]
@@ -15,9 +15,9 @@ Am43Component = am43_ns.class_(
 )
 
 CONFIG_SCHEMA = (
-    cover.cover_schema(Am43Component)
-    .extend(
+    cover.COVER_SCHEMA.extend(
         {
+            cv.GenerateID(): cv.declare_id(Am43Component),
             cv.Optional(CONF_PIN, default=8888): cv.int_range(min=0, max=0xFFFF),
             cv.Optional(CONF_INVERT_POSITION, default=False): cv.boolean,
         }
@@ -28,8 +28,9 @@ CONFIG_SCHEMA = (
 
 
 async def to_code(config):
-    var = await cover.new_cover(config)
+    var = cg.new_Pvariable(config[CONF_ID])
     cg.add(var.set_pin(config[CONF_PIN]))
     cg.add(var.set_invert_position(config[CONF_INVERT_POSITION]))
     await cg.register_component(var, config)
+    await cover.register_cover(var, config)
     await ble_client.register_ble_node(var, config)
@@ -12,10 +12,8 @@ using namespace esphome::cover;
 
 void Am43Component::dump_config() {
   LOG_COVER("", "AM43 Cover", this);
-  ESP_LOGCONFIG(TAG,
-                "  Device Pin: %d\n"
-                "  Invert Position: %d",
-                this->pin_, (int) this->invert_position_);
+  ESP_LOGCONFIG(TAG, "  Device Pin: %d", this->pin_);
+  ESP_LOGCONFIG(TAG, "  Invert Position: %d", (int) this->invert_position_);
 }
 
 void Am43Component::setup() {
@@ -22,6 +22,7 @@ class Am43Component : public cover::Cover, public esphome::ble_client::BLEClient
   void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
                            esp_ble_gattc_cb_param_t *param) override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   cover::CoverTraits get_traits() override;
   void set_pin(uint16_t pin) { this->pin_ = pin; }
   void set_invert_position(bool invert_position) { this->invert_position_ = invert_position; }
@@ -22,6 +22,7 @@ class Am43 : public esphome::ble_client::BLEClientNode, public PollingComponent
   void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
                            esp_ble_gattc_cb_param_t *param) override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   void set_battery(sensor::Sensor *battery) { battery_ = battery; }
   void set_illuminance(sensor::Sensor *illuminance) { illuminance_ = illuminance; }
 
@@ -14,8 +14,7 @@ void AnalogThresholdBinarySensor::setup() {
   if (std::isnan(sensor_value)) {
     this->publish_initial_state(false);
   } else {
-    this->publish_initial_state(sensor_value >=
-                                (this->lower_threshold_.value() + this->upper_threshold_.value()) / 2.0f);
+    this->publish_initial_state(sensor_value >= (this->lower_threshold_ + this->upper_threshold_) / 2.0f);
   }
 }
 
@@ -25,8 +24,7 @@ void AnalogThresholdBinarySensor::set_sensor(sensor::Sensor *analog_sensor) {
   this->sensor_->add_on_state_callback([this](float sensor_value) {
     // if there is an invalid sensor reading, ignore the change and keep the current state
     if (!std::isnan(sensor_value)) {
-      this->publish_state(sensor_value >=
-                          (this->state ? this->lower_threshold_.value() : this->upper_threshold_.value()));
+      this->publish_state(sensor_value >= (this->state ? this->lower_threshold_ : this->upper_threshold_));
     }
   });
 }
@@ -34,10 +32,8 @@ void AnalogThresholdBinarySensor::set_sensor(sensor::Sensor *analog_sensor) {
 void AnalogThresholdBinarySensor::dump_config() {
   LOG_BINARY_SENSOR("", "Analog Threshold Binary Sensor", this);
   LOG_SENSOR("  ", "Sensor", this->sensor_);
-  ESP_LOGCONFIG(TAG,
-                "  Upper threshold: %.11f\n"
-                "  Lower threshold: %.11f",
-                this->upper_threshold_.value(), this->lower_threshold_.value());
+  ESP_LOGCONFIG(TAG, "  Upper threshold: %.11f", this->upper_threshold_);
+  ESP_LOGCONFIG(TAG, "  Lower threshold: %.11f", this->lower_threshold_);
 }
 
 }  // namespace analog_threshold
@@ -12,14 +12,17 @@ class AnalogThresholdBinarySensor : public Component, public binary_sensor::Bina
   void dump_config() override;
   void setup() override;
 
+  float get_setup_priority() const override { return setup_priority::DATA; }
+
   void set_sensor(sensor::Sensor *analog_sensor);
-  template<typename T> void set_upper_threshold(T upper_threshold) { this->upper_threshold_ = upper_threshold; }
-  template<typename T> void set_lower_threshold(T lower_threshold) { this->lower_threshold_ = lower_threshold; }
+  void set_upper_threshold(float threshold) { this->upper_threshold_ = threshold; }
+  void set_lower_threshold(float threshold) { this->lower_threshold_ = threshold; }
 
  protected:
   sensor::Sensor *sensor_{nullptr};
-  TemplatableValue<float> upper_threshold_{};
-  TemplatableValue<float> lower_threshold_{};
+  float upper_threshold_;
+  float lower_threshold_;
 };
 
 }  // namespace analog_threshold
@@ -18,11 +18,11 @@ CONFIG_SCHEMA = (
         {
             cv.Required(CONF_SENSOR_ID): cv.use_id(sensor.Sensor),
             cv.Required(CONF_THRESHOLD): cv.Any(
-                cv.templatable(cv.float_),
+                cv.float_,
                 cv.Schema(
                     {
-                        cv.Required(CONF_UPPER): cv.templatable(cv.float_),
-                        cv.Required(CONF_LOWER): cv.templatable(cv.float_),
+                        cv.Required(CONF_UPPER): cv.float_,
+                        cv.Required(CONF_LOWER): cv.float_,
                     }
                 ),
             ),
@@ -39,11 +39,9 @@ async def to_code(config):
     sens = await cg.get_variable(config[CONF_SENSOR_ID])
     cg.add(var.set_sensor(sens))
 
-    if isinstance(config[CONF_THRESHOLD], dict):
-        lower = await cg.templatable(config[CONF_THRESHOLD][CONF_LOWER], [], float)
-        upper = await cg.templatable(config[CONF_THRESHOLD][CONF_UPPER], [], float)
+    if isinstance(config[CONF_THRESHOLD], float):
+        cg.add(var.set_upper_threshold(config[CONF_THRESHOLD]))
+        cg.add(var.set_lower_threshold(config[CONF_THRESHOLD]))
     else:
-        lower = await cg.templatable(config[CONF_THRESHOLD], [], float)
-        upper = lower
-    cg.add(var.set_upper_threshold(upper))
-    cg.add(var.set_lower_threshold(lower))
+        cg.add(var.set_upper_threshold(config[CONF_THRESHOLD][CONF_UPPER]))
+        cg.add(var.set_lower_threshold(config[CONF_THRESHOLD][CONF_LOWER]))
@@ -17,11 +17,7 @@ void Anova::setup() {
   this->current_request_ = 0;
 }
 
-void Anova::loop() {
-  // Parent BLEClientNode has a loop() method, but this component uses
-  // polling via update() and BLE callbacks so loop isn't needed
-  this->disable_loop();
-}
+void Anova::loop() {}
 
 void Anova::control(const ClimateCall &call) {
   if (call.get_mode().has_value()) {
@@ -26,6 +26,7 @@ class Anova : public climate::Climate, public esphome::ble_client::BLEClientNode
   void gattc_event_handler(esp_gattc_cb_event_t event, esp_gatt_if_t gattc_if,
                            esp_ble_gattc_cb_param_t *param) override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   climate::ClimateTraits traits() override {
     auto traits = climate::ClimateTraits();
     traits.set_supports_current_temperature(true);
@@ -1,7 +1,7 @@
 import esphome.codegen as cg
 from esphome.components import ble_client, climate
 import esphome.config_validation as cv
-from esphome.const import CONF_UNIT_OF_MEASUREMENT
+from esphome.const import CONF_ID, CONF_UNIT_OF_MEASUREMENT
 
 UNITS = {
     "f": "f",
@@ -17,9 +17,9 @@ Anova = anova_ns.class_(
 )
 
 CONFIG_SCHEMA = (
-    climate.climate_schema(Anova)
-    .extend(
+    climate.CLIMATE_SCHEMA.extend(
         {
+            cv.GenerateID(): cv.declare_id(Anova),
             cv.Required(CONF_UNIT_OF_MEASUREMENT): cv.enum(UNITS),
         }
     )
@@ -29,7 +29,8 @@ CONFIG_SCHEMA = (
 
 
 async def to_code(config):
-    var = await climate.new_climate(config)
+    var = cg.new_Pvariable(config[CONF_ID])
     await cg.register_component(var, config)
+    await climate.register_climate(var, config)
     await ble_client.register_ble_node(var, config)
     cg.add(var.set_unit_of_measurement(config[CONF_UNIT_OF_MEASUREMENT]))
@@ -54,6 +54,8 @@ enum {  // APDS9306 registers
 }
 
 void APDS9306::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up APDS9306...");
+
   uint8_t id;
   if (!this->read_byte(APDS9306_PART_ID, &id)) {  // Part ID register
     this->error_code_ = COMMUNICATION_FAILED;
@@ -84,6 +86,8 @@ void APDS9306::setup() {
 
   // Set to active mode
   APDS9306_WRITE_BYTE(APDS9306_MAIN_CTRL, 0x02);
+
+  ESP_LOGCONFIG(TAG, "APDS9306 setup complete");
 }
 
 void APDS9306::dump_config() {
@@ -93,7 +97,7 @@ void APDS9306::dump_config() {
   if (this->is_failed()) {
     switch (this->error_code_) {
       case COMMUNICATION_FAILED:
-        ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+        ESP_LOGE(TAG, "Communication with APDS9306 failed!");
         break;
       case WRONG_ID:
         ESP_LOGE(TAG, "APDS9306 has invalid id!");
@@ -104,12 +108,9 @@ void APDS9306::dump_config() {
     }
   }
 
-  ESP_LOGCONFIG(TAG,
-                "  Gain: %u\n"
-                "  Measurement rate: %u\n"
-                "  Measurement Resolution/Bit width: %d",
-                AMBIENT_LIGHT_GAIN_VALUES[this->gain_], MEASUREMENT_RATE_VALUES[this->measurement_rate_],
-                MEASUREMENT_BIT_WIDTH_VALUES[this->bit_width_]);
+  ESP_LOGCONFIG(TAG, "  Gain: %u", AMBIENT_LIGHT_GAIN_VALUES[this->gain_]);
+  ESP_LOGCONFIG(TAG, "  Measurement rate: %u", MEASUREMENT_RATE_VALUES[this->measurement_rate_]);
+  ESP_LOGCONFIG(TAG, "  Measurement Resolution/Bit width: %d", MEASUREMENT_BIT_WIDTH_VALUES[this->bit_width_]);
 
   LOG_UPDATE_INTERVAL(this);
 }
@@ -15,6 +15,7 @@ static const char *const TAG = "apds9960";
 #define APDS9960_WRITE_BYTE(reg, value) APDS9960_ERROR_CHECK(this->write_byte(reg, value));
 
 void APDS9960::setup() {
+  ESP_LOGCONFIG(TAG, "Setting up APDS9960...");
   uint8_t id;
   if (!this->read_byte(0x92, &id)) {  // ID register
     this->error_code_ = COMMUNICATION_FAILED;
@@ -22,7 +23,7 @@ void APDS9960::setup() {
     return;
   }
 
-  if (id != 0xAB && id != 0x9C && id != 0xA8 && id != 0x9E) {  // APDS9960 all should have one of these IDs
+  if (id != 0xAB && id != 0x9C && id != 0xA8) {  // APDS9960 all should have one of these IDs
     this->error_code_ = WRONG_ID;
     this->mark_failed();
     return;
@@ -140,7 +141,7 @@ void APDS9960::dump_config() {
   if (this->is_failed()) {
     switch (this->error_code_) {
       case COMMUNICATION_FAILED:
-        ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+        ESP_LOGE(TAG, "Communication with APDS9960 failed!");
         break;
      case WRONG_ID:
        ESP_LOGE(TAG, "APDS9960 has invalid id!");
@@ -3,7 +3,6 @@ import base64
 from esphome import automation
 from esphome.automation import Condition
 import esphome.codegen as cg
-from esphome.config_helpers import get_logger_level
 import esphome.config_validation as cv
 from esphome.const import (
     CONF_ACTION,
@@ -24,9 +23,8 @@ from esphome.const import (
     CONF_TRIGGER_ID,
     CONF_VARIABLES,
 )
-from esphome.core import CORE, coroutine_with_priority
+from esphome.core import coroutine_with_priority
 
-DOMAIN = "api"
 DEPENDENCIES = ["network"]
 AUTO_LOAD = ["socket"]
 CODEOWNERS = ["@OttoWinter"]
@@ -51,8 +49,6 @@ SERVICE_ARG_NATIVE_TYPES = {
     "string[]": cg.std_vector.template(cg.std_string),
 }
 CONF_ENCRYPTION = "encryption"
-CONF_BATCH_DELAY = "batch_delay"
-CONF_CUSTOM_SERVICES = "custom_services"
 
 
 def validate_encryption_key(value):
@@ -86,19 +82,6 @@ ACTIONS_SCHEMA = automation.validate_automation(
     ),
 )
 
-ENCRYPTION_SCHEMA = cv.Schema(
-    {
-        cv.Optional(CONF_KEY): validate_encryption_key,
-    }
-)
-
-
-def _encryption_schema(config):
-    if config is None:
-        config = {}
-    return ENCRYPTION_SCHEMA(config)
-
-
 CONFIG_SCHEMA = cv.All(
     cv.Schema(
         {
@@ -112,12 +95,11 @@ CONFIG_SCHEMA = cv.All(
                 CONF_SERVICES, group_of_exclusion=CONF_ACTIONS
             ): ACTIONS_SCHEMA,
             cv.Exclusive(CONF_ACTIONS, group_of_exclusion=CONF_ACTIONS): ACTIONS_SCHEMA,
-            cv.Optional(CONF_ENCRYPTION): _encryption_schema,
-            cv.Optional(CONF_BATCH_DELAY, default="100ms"): cv.All(
-                cv.positive_time_period_milliseconds,
-                cv.Range(max=cv.TimePeriod(milliseconds=65535)),
+            cv.Optional(CONF_ENCRYPTION): cv.Schema(
+                {
+                    cv.Required(CONF_KEY): validate_encryption_key,
+                }
             ),
-            cv.Optional(CONF_CUSTOM_SERVICES, default=False): cv.boolean,
             cv.Optional(CONF_ON_CLIENT_CONNECTED): automation.validate_automation(
                 single=True
             ),
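The schema change above is why dev replaces `cv.Schema(...)` with the `_encryption_schema` callable: with `CONF_KEY` now optional, a bare `encryption:` block (which YAML parses as `None`) must normalize to an empty dict before validation. A standalone sketch of that idea, with hypothetical helper names and simplified voluptuous-style semantics (not ESPHome code):

```python
# Sketch: a callable schema can normalize `None` (a bare `encryption:` key
# in YAML) into {} before validating, which a plain dict schema would reject.

def validate_key(value):
    # Stand-in for validate_encryption_key: a 44-char base64 string.
    if not isinstance(value, str) or len(value) != 44:
        raise ValueError("key must be a 44-character base64 string")
    return value

ENCRYPTION_SCHEMA = {"key": validate_key}  # key is optional here

def encryption_schema(config):
    if config is None:  # bare `encryption:` in YAML parses to None
        config = {}
    out = {}
    for k, v in config.items():
        if k not in ENCRYPTION_SCHEMA:
            raise ValueError(f"unknown option {k}")
        out[k] = ENCRYPTION_SCHEMA[k](v)
    return out

print(encryption_schema(None))  # bare `encryption:` is now accepted -> {}
```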
@@ -136,35 +118,26 @@ async def to_code(config):
     await cg.register_component(var, config)
 
     cg.add(var.set_port(config[CONF_PORT]))
-    if config[CONF_PASSWORD]:
-        cg.add_define("USE_API_PASSWORD")
-        cg.add(var.set_password(config[CONF_PASSWORD]))
+    cg.add(var.set_password(config[CONF_PASSWORD]))
     cg.add(var.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
-    cg.add(var.set_batch_delay(config[CONF_BATCH_DELAY]))
 
-    # Set USE_API_SERVICES if any services are enabled
-    if config.get(CONF_ACTIONS) or config[CONF_CUSTOM_SERVICES]:
-        cg.add_define("USE_API_SERVICES")
-
-    if actions := config.get(CONF_ACTIONS, []):
-        for conf in actions:
-            template_args = []
-            func_args = []
-            service_arg_names = []
-            for name, var_ in conf[CONF_VARIABLES].items():
-                native = SERVICE_ARG_NATIVE_TYPES[var_]
-                template_args.append(native)
-                func_args.append((native, name))
-                service_arg_names.append(name)
-            templ = cg.TemplateArguments(*template_args)
-            trigger = cg.new_Pvariable(
-                conf[CONF_TRIGGER_ID], templ, conf[CONF_ACTION], service_arg_names
-            )
-            cg.add(var.register_user_service(trigger))
-            await automation.build_automation(trigger, func_args, conf)
+    for conf in config.get(CONF_ACTIONS, []):
+        template_args = []
+        func_args = []
+        service_arg_names = []
+        for name, var_ in conf[CONF_VARIABLES].items():
+            native = SERVICE_ARG_NATIVE_TYPES[var_]
+            template_args.append(native)
+            func_args.append((native, name))
+            service_arg_names.append(name)
+        templ = cg.TemplateArguments(*template_args)
+        trigger = cg.new_Pvariable(
+            conf[CONF_TRIGGER_ID], templ, conf[CONF_ACTION], service_arg_names
+        )
+        cg.add(var.register_user_service(trigger))
+        await automation.build_automation(trigger, func_args, conf)
 
     if CONF_ON_CLIENT_CONNECTED in config:
-        cg.add_define("USE_API_CLIENT_CONNECTED_TRIGGER")
         await automation.build_automation(
             var.get_client_connected_trigger(),
             [(cg.std_string, "client_info"), (cg.std_string, "client_address")],
@@ -172,26 +145,17 @@ async def to_code(config):
         )
 
     if CONF_ON_CLIENT_DISCONNECTED in config:
-        cg.add_define("USE_API_CLIENT_DISCONNECTED_TRIGGER")
         await automation.build_automation(
             var.get_client_disconnected_trigger(),
             [(cg.std_string, "client_info"), (cg.std_string, "client_address")],
             config[CONF_ON_CLIENT_DISCONNECTED],
         )
 
-    if (encryption_config := config.get(CONF_ENCRYPTION, None)) is not None:
-        if key := encryption_config.get(CONF_KEY):
-            decoded = base64.b64decode(key)
-            cg.add(var.set_noise_psk(list(decoded)))
-        else:
-            # No key provided, but encryption desired
-            # This will allow a plaintext client to provide a noise key,
-            # send it to the device, and then switch to noise.
-            # The key will be saved in flash and used for future connections
-            # and plaintext disabled. Only a factory reset can remove it.
-            cg.add_define("USE_API_PLAINTEXT")
+    if encryption_config := config.get(CONF_ENCRYPTION):
+        decoded = base64.b64decode(encryption_config[CONF_KEY])
+        cg.add(var.set_noise_psk(list(decoded)))
         cg.add_define("USE_API_NOISE")
-        cg.add_library("esphome/noise-c", "0.1.10")
+        cg.add_library("esphome/noise-c", "0.1.6")
     else:
         cg.add_define("USE_API_PLAINTEXT")
 
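The hunk above also changes the lookup from a truthiness walrus test to an explicit `is not None` check. Once the key became optional, a configured-but-empty `encryption: {}` validates to an empty dict, which is falsy, so the old truthiness form would silently skip it. A standalone illustration (not ESPHome code):

```python
# Why `(cfg := config.get(...)) is not None` instead of `if cfg := config.get(...)`:
# an empty dict is falsy, so the truthiness form ignores an empty but
# deliberately configured `encryption:` block.
config_with_empty_encryption = {"encryption": {}}

taken_truthy = bool(config_with_empty_encryption.get("encryption"))
taken_not_none = config_with_empty_encryption.get("encryption") is not None

print(taken_truthy)    # False: the empty block would be ignored
print(taken_not_none)  # True: the empty block is still handled
```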
@@ -320,38 +284,3 @@ async def homeassistant_tag_scanned_to_code(config, action_id, template_arg, arg
 @automation.register_condition("api.connected", APIConnectedCondition, {})
 async def api_connected_to_code(config, condition_id, template_arg, args):
     return cg.new_Pvariable(condition_id, template_arg)
-
-
-def FILTER_SOURCE_FILES() -> list[str]:
-    """Filter out api_pb2_dump.cpp when proto message dumping is not enabled,
-    user_services.cpp when no services are defined, and protocol-specific
-    implementations based on encryption configuration."""
-    files_to_filter: list[str] = []
-
-    # api_pb2_dump.cpp is only needed when HAS_PROTO_MESSAGE_DUMP is defined
-    # This is a particularly large file that still needs to be opened and read
-    # all the way to the end even when ifdef'd out
-    #
-    # HAS_PROTO_MESSAGE_DUMP is defined when ESPHOME_LOG_HAS_VERY_VERBOSE is set,
-    # which happens when the logger level is VERY_VERBOSE
-    if get_logger_level() != "VERY_VERBOSE":
-        files_to_filter.append("api_pb2_dump.cpp")
-
-    # user_services.cpp is only needed when services are defined
-    config = CORE.config.get(DOMAIN, {})
-    if config and not config.get(CONF_ACTIONS) and not config[CONF_CUSTOM_SERVICES]:
-        files_to_filter.append("user_services.cpp")
-
-    # Filter protocol-specific implementations based on encryption configuration
-    encryption_config = config.get(CONF_ENCRYPTION) if config else None
-
-    # If encryption is not configured at all, we only need plaintext
-    if encryption_config is None:
-        files_to_filter.append("api_frame_helper_noise.cpp")
-    # If encryption is configured with a key, we only need noise
-    elif encryption_config.get(CONF_KEY):
-        files_to_filter.append("api_frame_helper_plaintext.cpp")
-    # If encryption is configured but no key is provided, we need both
-    # (this allows a plaintext client to provide a noise key)
-
-    return files_to_filter
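The encryption branch of dev's `FILTER_SOURCE_FILES` above can be restated as a small standalone decision function (hypothetical name, same three-way logic as the diff):

```python
# Sketch of the dev-branch frame-helper filtering decision: which protocol
# implementation files can be excluded from the build, based on the
# `encryption:` configuration.

def frame_helper_files_to_skip(encryption_config):
    if encryption_config is None:
        # No `encryption:` block at all: plaintext only.
        return ["api_frame_helper_noise.cpp"]
    if encryption_config.get("key"):
        # A key is configured: noise only.
        return ["api_frame_helper_plaintext.cpp"]
    # `encryption:` present but keyless: keep both, so a plaintext
    # client can still provide a noise key later.
    return []

print(frame_helper_files_to_skip(None))          # ['api_frame_helper_noise.cpp']
print(frame_helper_files_to_skip({"key": "x"}))  # ['api_frame_helper_plaintext.cpp']
print(frame_helper_files_to_skip({}))            # []
```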
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -8,46 +8,54 @@
 #include "api_server.h"
 #include "esphome/core/application.h"
 #include "esphome/core/component.h"
-#include "esphome/core/entity_base.h"
 
 #include <vector>
-#include <functional>
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
-// Client information structure
-struct ClientInfo {
-  std::string name;      // Client name from Hello message
-  std::string peername;  // IP:port from socket
-
-  std::string get_combined_info() const {
-    if (name == peername) {
-      // Before Hello message, both are the same
-      return name;
-    }
-    return name + " (" + peername + ")";
-  }
-};
-
-// Keepalive timeout in milliseconds
-static constexpr uint32_t KEEPALIVE_TIMEOUT_MS = 60000;
-// Maximum number of entities to process in a single batch during initial state/info sending
-// This was increased from 20 to 24 after removing the unique_id field from entity info messages,
-// which reduced message sizes allowing more entities per batch without exceeding packet limits
-static constexpr size_t MAX_INITIAL_PER_BATCH = 24;
-// Maximum number of packets to process in a single batch (platform-dependent)
-// This limit exists to prevent stack overflow from the PacketInfo array in process_batch_
-// Each PacketInfo is 8 bytes, so 64 * 8 = 512 bytes, 32 * 8 = 256 bytes
-#if defined(USE_ESP32) || defined(USE_HOST)
-static constexpr size_t MAX_PACKETS_PER_BATCH = 64;  // ESP32 has 8KB+ stack, HOST has plenty
-#else
-static constexpr size_t MAX_PACKETS_PER_BATCH = 32;  // ESP8266/RP2040/etc have smaller stacks
-#endif
+using send_message_t = bool(APIConnection *, void *);
+
+/*
+  This class holds a pointer to the source component that wants to publish a message, and a pointer to a function that
+  will lazily publish that message. The two pointers allow dedup in the deferred queue if multiple publishes for the
+  same component are backed up, and take up only 8 bytes of memory. The entry in the deferred queue (a std::vector) is
+  the DeferredMessage instance itself (not a pointer to one elsewhere in heap) so still only 8 bytes per entry. Even
+  100 backed up messages (you'd have to have at least 100 sensors publishing because of dedup) would take up only 0.8
+  kB.
+*/
+class DeferredMessageQueue {
+  struct DeferredMessage {
+    friend class DeferredMessageQueue;
+
+   protected:
+    void *source_;
+    send_message_t *send_message_;
+
+   public:
+    DeferredMessage(void *source, send_message_t *send_message) : source_(source), send_message_(send_message) {}
+    bool operator==(const DeferredMessage &test) const {
+      return (source_ == test.source_ && send_message_ == test.send_message_);
+    }
+  } __attribute__((packed));
+
+ protected:
+  // vector is used very specifically for its zero memory overhead even though items are popped from the front (memory
+  // footprint is more important than speed here)
+  std::vector<DeferredMessage> deferred_queue_;
+  APIConnection *api_connection_;
+
+  // helper for allowing only unique entries in the queue
+  void dmq_push_back_with_dedup_(void *source, send_message_t *send_message);
+
+ public:
+  DeferredMessageQueue(APIConnection *api_connection) : api_connection_(api_connection) {}
+  void process_queue();
+  void defer(void *source, send_message_t *send_message);
+};
 
 class APIConnection : public APIServerConnection {
  public:
-  friend class APIServer;
-  friend class ListEntitiesIterator;
   APIConnection(std::unique_ptr<socket::Socket> socket, APIServer *parent);
   virtual ~APIConnection();
 
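The 2025.4.0 `DeferredMessageQueue` above deduplicates on the (source, callback) pair, so a flooding sensor occupies only one queue slot. A hypothetical Python sketch of just that dedup behavior (illustration only; the real C++ also re-defers entries whose send fails):

```python
# Sketch: each queue entry is a (source, send_callback) pair; a pair already
# in the queue is not enqueued again.

class DeferredMessageQueue:
    def __init__(self):
        self._queue = []  # list kept for its minimal per-entry overhead

    def defer(self, source, send_message):
        entry = (source, send_message)
        if entry not in self._queue:  # dedup on the (source, callback) pair
            self._queue.append(entry)

    def process_queue(self):
        while self._queue:
            source, send_message = self._queue.pop(0)
            send_message(source)

q = DeferredMessageQueue()
sent = []
q.defer("sensor1", sent.append)
q.defer("sensor1", sent.append)  # duplicate publish is coalesced
q.defer("sensor2", sent.append)
q.process_queue()
print(sent)  # ['sensor1', 'sensor2']
```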
@@ -55,90 +63,154 @@ class APIConnection : public APIServerConnection {
   void loop();
 
   bool send_list_info_done() {
-    return this->schedule_message_(nullptr, &APIConnection::try_send_list_info_done,
-                                   ListEntitiesDoneResponse::MESSAGE_TYPE, ListEntitiesDoneResponse::ESTIMATED_SIZE);
+    ListEntitiesDoneResponse resp;
+    return this->send_list_entities_done_response(resp);
   }
 #ifdef USE_BINARY_SENSOR
-  bool send_binary_sensor_state(binary_sensor::BinarySensor *binary_sensor);
+  bool send_binary_sensor_state(binary_sensor::BinarySensor *binary_sensor, bool state);
+  void send_binary_sensor_info(binary_sensor::BinarySensor *binary_sensor);
+  static bool try_send_binary_sensor_state(APIConnection *api, void *v_binary_sensor);
+  static bool try_send_binary_sensor_state(APIConnection *api, binary_sensor::BinarySensor *binary_sensor, bool state);
+  static bool try_send_binary_sensor_info(APIConnection *api, void *v_binary_sensor);
 #endif
 #ifdef USE_COVER
   bool send_cover_state(cover::Cover *cover);
+  void send_cover_info(cover::Cover *cover);
+  static bool try_send_cover_state(APIConnection *api, void *v_cover);
+  static bool try_send_cover_info(APIConnection *api, void *v_cover);
   void cover_command(const CoverCommandRequest &msg) override;
 #endif
 #ifdef USE_FAN
   bool send_fan_state(fan::Fan *fan);
+  void send_fan_info(fan::Fan *fan);
+  static bool try_send_fan_state(APIConnection *api, void *v_fan);
+  static bool try_send_fan_info(APIConnection *api, void *v_fan);
   void fan_command(const FanCommandRequest &msg) override;
 #endif
 #ifdef USE_LIGHT
   bool send_light_state(light::LightState *light);
+  void send_light_info(light::LightState *light);
+  static bool try_send_light_state(APIConnection *api, void *v_light);
+  static bool try_send_light_info(APIConnection *api, void *v_light);
   void light_command(const LightCommandRequest &msg) override;
 #endif
 #ifdef USE_SENSOR
-  bool send_sensor_state(sensor::Sensor *sensor);
+  bool send_sensor_state(sensor::Sensor *sensor, float state);
+  void send_sensor_info(sensor::Sensor *sensor);
+  static bool try_send_sensor_state(APIConnection *api, void *v_sensor);
+  static bool try_send_sensor_state(APIConnection *api, sensor::Sensor *sensor, float state);
+  static bool try_send_sensor_info(APIConnection *api, void *v_sensor);
 #endif
 #ifdef USE_SWITCH
-  bool send_switch_state(switch_::Switch *a_switch);
+  bool send_switch_state(switch_::Switch *a_switch, bool state);
+  void send_switch_info(switch_::Switch *a_switch);
+  static bool try_send_switch_state(APIConnection *api, void *v_a_switch);
+  static bool try_send_switch_state(APIConnection *api, switch_::Switch *a_switch, bool state);
+  static bool try_send_switch_info(APIConnection *api, void *v_a_switch);
   void switch_command(const SwitchCommandRequest &msg) override;
 #endif
 #ifdef USE_TEXT_SENSOR
-  bool send_text_sensor_state(text_sensor::TextSensor *text_sensor);
+  bool send_text_sensor_state(text_sensor::TextSensor *text_sensor, std::string state);
+  void send_text_sensor_info(text_sensor::TextSensor *text_sensor);
+  static bool try_send_text_sensor_state(APIConnection *api, void *v_text_sensor);
+  static bool try_send_text_sensor_state(APIConnection *api, text_sensor::TextSensor *text_sensor, std::string state);
+  static bool try_send_text_sensor_info(APIConnection *api, void *v_text_sensor);
 #endif
-#ifdef USE_CAMERA
-  void set_camera_state(std::shared_ptr<camera::CameraImage> image);
+#ifdef USE_ESP32_CAMERA
+  void set_camera_state(std::shared_ptr<esp32_camera::CameraImage> image);
+  void send_camera_info(esp32_camera::ESP32Camera *camera);
+  static bool try_send_camera_info(APIConnection *api, void *v_camera);
   void camera_image(const CameraImageRequest &msg) override;
 #endif
 #ifdef USE_CLIMATE
   bool send_climate_state(climate::Climate *climate);
+  void send_climate_info(climate::Climate *climate);
+  static bool try_send_climate_state(APIConnection *api, void *v_climate);
+  static bool try_send_climate_info(APIConnection *api, void *v_climate);
   void climate_command(const ClimateCommandRequest &msg) override;
 #endif
 #ifdef USE_NUMBER
-  bool send_number_state(number::Number *number);
+  bool send_number_state(number::Number *number, float state);
+  void send_number_info(number::Number *number);
+  static bool try_send_number_state(APIConnection *api, void *v_number);
+  static bool try_send_number_state(APIConnection *api, number::Number *number, float state);
+  static bool try_send_number_info(APIConnection *api, void *v_number);
   void number_command(const NumberCommandRequest &msg) override;
 #endif
 #ifdef USE_DATETIME_DATE
   bool send_date_state(datetime::DateEntity *date);
+  void send_date_info(datetime::DateEntity *date);
+  static bool try_send_date_state(APIConnection *api, void *v_date);
+  static bool try_send_date_info(APIConnection *api, void *v_date);
   void date_command(const DateCommandRequest &msg) override;
 #endif
 #ifdef USE_DATETIME_TIME
   bool send_time_state(datetime::TimeEntity *time);
+  void send_time_info(datetime::TimeEntity *time);
+  static bool try_send_time_state(APIConnection *api, void *v_time);
+  static bool try_send_time_info(APIConnection *api, void *v_time);
   void time_command(const TimeCommandRequest &msg) override;
 #endif
 #ifdef USE_DATETIME_DATETIME
   bool send_datetime_state(datetime::DateTimeEntity *datetime);
+  void send_datetime_info(datetime::DateTimeEntity *datetime);
+  static bool try_send_datetime_state(APIConnection *api, void *v_datetime);
+  static bool try_send_datetime_info(APIConnection *api, void *v_datetime);
   void datetime_command(const DateTimeCommandRequest &msg) override;
 #endif
 #ifdef USE_TEXT
-  bool send_text_state(text::Text *text);
+  bool send_text_state(text::Text *text, std::string state);
+  void send_text_info(text::Text *text);
+  static bool try_send_text_state(APIConnection *api, void *v_text);
+  static bool try_send_text_state(APIConnection *api, text::Text *text, std::string state);
+  static bool try_send_text_info(APIConnection *api, void *v_text);
   void text_command(const TextCommandRequest &msg) override;
 #endif
 #ifdef USE_SELECT
-  bool send_select_state(select::Select *select);
+  bool send_select_state(select::Select *select, std::string state);
+  void send_select_info(select::Select *select);
+  static bool try_send_select_state(APIConnection *api, void *v_select);
+  static bool try_send_select_state(APIConnection *api, select::Select *select, std::string state);
+  static bool try_send_select_info(APIConnection *api, void *v_select);
   void select_command(const SelectCommandRequest &msg) override;
 #endif
 #ifdef USE_BUTTON
+  void send_button_info(button::Button *button);
+  static bool try_send_button_info(APIConnection *api, void *v_button);
   void button_command(const ButtonCommandRequest &msg) override;
 #endif
 #ifdef USE_LOCK
-  bool send_lock_state(lock::Lock *a_lock);
+  bool send_lock_state(lock::Lock *a_lock, lock::LockState state);
+  void send_lock_info(lock::Lock *a_lock);
+  static bool try_send_lock_state(APIConnection *api, void *v_a_lock);
+  static bool try_send_lock_state(APIConnection *api, lock::Lock *a_lock, lock::LockState state);
+  static bool try_send_lock_info(APIConnection *api, void *v_a_lock);
   void lock_command(const LockCommandRequest &msg) override;
 #endif
 #ifdef USE_VALVE
   bool send_valve_state(valve::Valve *valve);
+  void send_valve_info(valve::Valve *valve);
+  static bool try_send_valve_state(APIConnection *api, void *v_valve);
+  static bool try_send_valve_info(APIConnection *api, void *v_valve);
   void valve_command(const ValveCommandRequest &msg) override;
 #endif
 #ifdef USE_MEDIA_PLAYER
   bool send_media_player_state(media_player::MediaPlayer *media_player);
+  void send_media_player_info(media_player::MediaPlayer *media_player);
+  static bool try_send_media_player_state(APIConnection *api, void *v_media_player);
+  static bool try_send_media_player_info(APIConnection *api, void *v_media_player);
   void media_player_command(const MediaPlayerCommandRequest &msg) override;
 #endif
-  bool try_send_log_message(int level, const char *tag, const char *line, size_t message_len);
+  bool try_send_log_message(int level, const char *tag, const char *line);
   void send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
-    if (!this->flags_.service_call_subscription)
+    if (!this->service_call_subscription_)
       return;
-    this->send_message(call, HomeassistantServiceResponse::MESSAGE_TYPE);
+    this->send_homeassistant_service_response(call);
   }
 #ifdef USE_BLUETOOTH_PROXY
   void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
   void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
+  bool send_bluetooth_le_advertisement(const BluetoothLEAdvertisementResponse &msg);
 
   void bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
   void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) override;
@@ -147,14 +219,14 @@ class APIConnection : public APIServerConnection {
   void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) override;
   void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) override;
   void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) override;
-  bool send_subscribe_bluetooth_connections_free_response(const SubscribeBluetoothConnectionsFreeRequest &msg) override;
-  void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) override;
+  BluetoothConnectionsFreeResponse subscribe_bluetooth_connections_free(
+      const SubscribeBluetoothConnectionsFreeRequest &msg) override;
 
 #endif
 #ifdef USE_HOMEASSISTANT_TIME
   void send_time_request() {
     GetTimeRequest req;
-    this->send_message(req, GetTimeRequest::MESSAGE_TYPE);
+    this->send_get_time_request(req);
   }
 #endif
 
@@ -165,562 +237,128 @@ class APIConnection : public APIServerConnection {
   void on_voice_assistant_audio(const VoiceAssistantAudio &msg) override;
   void on_voice_assistant_timer_event_response(const VoiceAssistantTimerEventResponse &msg) override;
   void on_voice_assistant_announce_request(const VoiceAssistantAnnounceRequest &msg) override;
-  bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) override;
+  VoiceAssistantConfigurationResponse voice_assistant_get_configuration(
+      const VoiceAssistantConfigurationRequest &msg) override;
   void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
 #endif
 
 #ifdef USE_ALARM_CONTROL_PANEL
   bool send_alarm_control_panel_state(alarm_control_panel::AlarmControlPanel *a_alarm_control_panel);
+  void send_alarm_control_panel_info(alarm_control_panel::AlarmControlPanel *a_alarm_control_panel);
+  static bool try_send_alarm_control_panel_state(APIConnection *api, void *v_a_alarm_control_panel);
+  static bool try_send_alarm_control_panel_info(APIConnection *api, void *v_a_alarm_control_panel);
   void alarm_control_panel_command(const AlarmControlPanelCommandRequest &msg) override;
 #endif
 
 #ifdef USE_EVENT
-  void send_event(event::Event *event, const std::string &event_type);
+  void send_event(event::Event *event, std::string event_type);
+  void send_event_info(event::Event *event);
+  static bool try_send_event(APIConnection *api, void *v_event);
+  static bool try_send_event(APIConnection *api, event::Event *event, std::string event_type);
+  static bool try_send_event_info(APIConnection *api, void *v_event);
 #endif
 
 #ifdef USE_UPDATE
   bool send_update_state(update::UpdateEntity *update);
+  void send_update_info(update::UpdateEntity *update);
+  static bool try_send_update_state(APIConnection *api, void *v_update);
+  static bool try_send_update_info(APIConnection *api, void *v_update);
   void update_command(const UpdateCommandRequest &msg) override;
 #endif
 
   void on_disconnect_response(const DisconnectResponse &value) override;
   void on_ping_response(const PingResponse &value) override {
     // we initiated ping
-    this->flags_.sent_ping = false;
+    this->ping_retries_ = 0;
+    this->sent_ping_ = false;
   }
   void on_home_assistant_state_response(const HomeAssistantStateResponse &msg) override;
|
||||||
#ifdef USE_HOMEASSISTANT_TIME
|
#ifdef USE_HOMEASSISTANT_TIME
|
||||||
void on_get_time_response(const GetTimeResponse &value) override;
|
void on_get_time_response(const GetTimeResponse &value) override;
|
||||||
#endif
|
#endif
|
||||||
bool send_hello_response(const HelloRequest &msg) override;
|
HelloResponse hello(const HelloRequest &msg) override;
|
||||||
bool send_connect_response(const ConnectRequest &msg) override;
|
ConnectResponse connect(const ConnectRequest &msg) override;
|
||||||
bool send_disconnect_response(const DisconnectRequest &msg) override;
|
DisconnectResponse disconnect(const DisconnectRequest &msg) override;
|
||||||
bool send_ping_response(const PingRequest &msg) override;
|
PingResponse ping(const PingRequest &msg) override { return {}; }
|
||||||
bool send_device_info_response(const DeviceInfoRequest &msg) override;
|
DeviceInfoResponse device_info(const DeviceInfoRequest &msg) override;
|
||||||
void list_entities(const ListEntitiesRequest &msg) override { this->list_entities_iterator_.begin(); }
|
void list_entities(const ListEntitiesRequest &msg) override { this->list_entities_iterator_.begin(); }
|
||||||
void subscribe_states(const SubscribeStatesRequest &msg) override {
|
void subscribe_states(const SubscribeStatesRequest &msg) override {
|
||||||
this->flags_.state_subscription = true;
|
this->state_subscription_ = true;
|
||||||
this->initial_state_iterator_.begin();
|
this->initial_state_iterator_.begin();
|
||||||
}
|
}
|
||||||
void subscribe_logs(const SubscribeLogsRequest &msg) override {
|
void subscribe_logs(const SubscribeLogsRequest &msg) override {
|
||||||
this->flags_.log_subscription = msg.level;
|
this->log_subscription_ = msg.level;
|
||||||
if (msg.dump_config)
|
if (msg.dump_config)
|
||||||
App.schedule_dump_config();
|
App.schedule_dump_config();
|
||||||
}
|
}
|
||||||
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
|
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
|
||||||
this->flags_.service_call_subscription = true;
|
this->service_call_subscription_ = true;
|
||||||
}
|
}
|
||||||
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
|
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
|
||||||
bool send_get_time_response(const GetTimeRequest &msg) override;
|
GetTimeResponse get_time(const GetTimeRequest &msg) override {
|
||||||
#ifdef USE_API_SERVICES
|
// TODO
|
||||||
|
return {};
|
||||||
|
}
|
||||||
void execute_service(const ExecuteServiceRequest &msg) override;
|
void execute_service(const ExecuteServiceRequest &msg) override;
|
||||||
#endif
|
|
||||||
#ifdef USE_API_NOISE
|
|
||||||
bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) override;
|
|
||||||
#endif
|
|
||||||
|
|
-  bool is_authenticated() override {
+  bool is_authenticated() override { return this->connection_state_ == ConnectionState::AUTHENTICATED; }
-    return static_cast<ConnectionState>(this->flags_.connection_state) == ConnectionState::AUTHENTICATED;
-  }
   bool is_connection_setup() override {
-    return static_cast<ConnectionState>(this->flags_.connection_state) == ConnectionState::CONNECTED ||
+    return this->connection_state_ == ConnectionState::CONNECTED || this->is_authenticated();
-           this->is_authenticated();
   }
-  uint8_t get_log_subscription_level() const { return this->flags_.log_subscription; }
   void on_fatal_error() override;
   void on_unauthenticated_access() override;
   void on_no_setup_connection() override;
-  ProtoWriteBuffer create_buffer(uint32_t reserve_size) override {
+  ProtoWriteBuffer create_buffer() override {
     // FIXME: ensure no recursive writes can happen
+    this->proto_write_buffer_.clear();
+    return {&this->proto_write_buffer_};
-    // Get header padding size - used for both reserve and insert
-    uint8_t header_padding = this->helper_->frame_header_padding();
-
-    // Get shared buffer from parent server
-    std::vector<uint8_t> &shared_buf = this->parent_->get_shared_buffer_ref();
-    shared_buf.clear();
-    // Reserve space for header padding + message + footer
-    // - Header padding: space for protocol headers (7 bytes for Noise, 6 for Plaintext)
-    // - Footer: space for MAC (16 bytes for Noise, 0 for Plaintext)
-    shared_buf.reserve(reserve_size + header_padding + this->helper_->frame_footer_size());
-    // Resize to add header padding so message encoding starts at the correct position
-    shared_buf.resize(header_padding);
-    return {&shared_buf};
   }
+  bool send_buffer(ProtoWriteBuffer buffer, uint32_t message_type) override;
 
-  // Prepare buffer for next message in batch
+  std::string get_client_combined_info() const { return this->client_combined_info_; }
-  ProtoWriteBuffer prepare_message_buffer(uint16_t message_size, bool is_first_message) {
-    // Get reference to shared buffer (it maintains state between batch messages)
-    std::vector<uint8_t> &shared_buf = this->parent_->get_shared_buffer_ref();
-
-    if (is_first_message) {
-      shared_buf.clear();
-    }
-
-    size_t current_size = shared_buf.size();
-
-    // Calculate padding to add:
-    // - First message: just header padding
-    // - Subsequent messages: footer for previous message + header padding for this message
-    size_t padding_to_add = is_first_message
-                                ? this->helper_->frame_header_padding()
-                                : this->helper_->frame_header_padding() + this->helper_->frame_footer_size();
-
-    // Reserve space for padding + message
-    shared_buf.reserve(current_size + padding_to_add + message_size);
-
-    // Resize to add the padding bytes
-    shared_buf.resize(current_size + padding_to_add);
-
-    return {&shared_buf};
-  }
-
-  bool try_to_clear_buffer(bool log_out_of_space);
-  bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) override;
-
-  std::string get_client_combined_info() const { return this->client_info_.get_combined_info(); }
-
-  // Buffer allocator methods for batch processing
-  ProtoWriteBuffer allocate_single_message_buffer(uint16_t size);
-  ProtoWriteBuffer allocate_batch_message_buffer(uint16_t size);
-
 protected:
-  // Helper function to handle authentication completion
+  friend APIServer;
-  void complete_authentication_();
-
-  // Non-template helper to encode any ProtoMessage
+  bool send_(const void *buf, size_t len, bool force);
-  static uint16_t encode_message_to_buffer(ProtoMessage &msg, uint8_t message_type, APIConnection *conn,
-                                           uint32_t remaining_size, bool is_single);
-
-  // Helper to fill entity state base and encode message
+  enum class ConnectionState {
-  static uint16_t fill_and_encode_entity_state(EntityBase *entity, StateResponseProtoMessage &msg, uint8_t message_type,
+    WAITING_FOR_HELLO,
-                                               APIConnection *conn, uint32_t remaining_size, bool is_single) {
+    CONNECTED,
-    msg.key = entity->get_object_id_hash();
+    AUTHENTICATED,
-#ifdef USE_DEVICES
+  } connection_state_{ConnectionState::WAITING_FOR_HELLO};
-    msg.device_id = entity->get_device_id();
-#endif
-    return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
-  }
-
-  // Helper to fill entity info base and encode message
+  bool remove_{false};
-  static uint16_t fill_and_encode_entity_info(EntityBase *entity, InfoResponseProtoMessage &msg, uint8_t message_type,
-                                              APIConnection *conn, uint32_t remaining_size, bool is_single) {
-    // Set common fields that are shared by all entity types
-    msg.key = entity->get_object_id_hash();
-    // IMPORTANT: get_object_id() may return a temporary std::string
-    std::string object_id = entity->get_object_id();
-    msg.set_object_id(StringRef(object_id));
-
-    if (entity->has_own_name()) {
+  // Buffer used to encode proto messages
-      msg.set_name(entity->get_name());
+  // Re-use to prevent allocations
-    }
+  std::vector<uint8_t> proto_write_buffer_;
-
-    // Set common EntityBase properties
-#ifdef USE_ENTITY_ICON
-    msg.set_icon(entity->get_icon_ref());
-#endif
-    msg.disabled_by_default = entity->is_disabled_by_default();
-    msg.entity_category = static_cast<enums::EntityCategory>(entity->get_entity_category());
-#ifdef USE_DEVICES
-    msg.device_id = entity->get_device_id();
-#endif
-    return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
-  }
-
-#ifdef USE_VOICE_ASSISTANT
-  // Helper to check voice assistant validity and connection ownership
-  inline bool check_voice_assistant_api_connection_() const;
-#endif
-
-  // Helper method to process multiple entities from an iterator in a batch
-  template<typename Iterator> void process_iterator_batch_(Iterator &iterator) {
-    size_t initial_size = this->deferred_batch_.size();
-    while (!iterator.completed() && (this->deferred_batch_.size() - initial_size) < MAX_INITIAL_PER_BATCH) {
-      iterator.advance();
-    }
-
-    // If the batch is full, process it immediately
-    // Note: iterator.advance() already calls schedule_batch_() via schedule_message_()
-    if (this->deferred_batch_.size() >= MAX_INITIAL_PER_BATCH) {
-      this->process_batch_();
-    }
-  }
-
-#ifdef USE_BINARY_SENSOR
-  static uint16_t try_send_binary_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                               bool is_single);
-  static uint16_t try_send_binary_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                              bool is_single);
-#endif
-#ifdef USE_COVER
-  static uint16_t try_send_cover_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-  static uint16_t try_send_cover_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_FAN
-  static uint16_t try_send_fan_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_fan_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_LIGHT
-  static uint16_t try_send_light_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-  static uint16_t try_send_light_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_SENSOR
-  static uint16_t try_send_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-  static uint16_t try_send_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_SWITCH
-  static uint16_t try_send_switch_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-  static uint16_t try_send_switch_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_TEXT_SENSOR
-  static uint16_t try_send_text_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                             bool is_single);
-  static uint16_t try_send_text_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                            bool is_single);
-#endif
-#ifdef USE_CLIMATE
-  static uint16_t try_send_climate_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                         bool is_single);
-  static uint16_t try_send_climate_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-#endif
-#ifdef USE_NUMBER
-  static uint16_t try_send_number_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-  static uint16_t try_send_number_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_DATETIME_DATE
-  static uint16_t try_send_date_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_date_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_DATETIME_TIME
-  static uint16_t try_send_time_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_time_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_DATETIME_DATETIME
-  static uint16_t try_send_datetime_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                          bool is_single);
-  static uint16_t try_send_datetime_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                         bool is_single);
-#endif
-#ifdef USE_TEXT
-  static uint16_t try_send_text_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_text_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_SELECT
-  static uint16_t try_send_select_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-  static uint16_t try_send_select_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_BUTTON
-  static uint16_t try_send_button_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_LOCK
-  static uint16_t try_send_lock_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_lock_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_VALVE
-  static uint16_t try_send_valve_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-  static uint16_t try_send_valve_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_MEDIA_PLAYER
-  static uint16_t try_send_media_player_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                              bool is_single);
-  static uint16_t try_send_media_player_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                             bool is_single);
-#endif
-#ifdef USE_ALARM_CONTROL_PANEL
-  static uint16_t try_send_alarm_control_panel_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                                     bool is_single);
-  static uint16_t try_send_alarm_control_panel_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                                    bool is_single);
-#endif
-#ifdef USE_EVENT
-  static uint16_t try_send_event_response(event::Event *event, const std::string &event_type, APIConnection *conn,
-                                          uint32_t remaining_size, bool is_single);
-  static uint16_t try_send_event_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
-#endif
-#ifdef USE_UPDATE
-  static uint16_t try_send_update_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-  static uint16_t try_send_update_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-#ifdef USE_CAMERA
-  static uint16_t try_send_camera_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                       bool is_single);
-#endif
-
-  // Method for ListEntitiesDone batching
-  static uint16_t try_send_list_info_done(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                          bool is_single);
-
-  // Method for DisconnectRequest batching
-  static uint16_t try_send_disconnect_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                              bool is_single);
-
-  // Batch message method for ping requests
-  static uint16_t try_send_ping_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
-                                        bool is_single);
-
-  // === Optimal member ordering for 32-bit systems ===
-
-  // Group 1: Pointers (4 bytes each on 32-bit)
   std::unique_ptr<APIFrameHelper> helper_;
-  APIServer *parent_;
-
-  // Group 2: Larger objects (must be 4-byte aligned)
+  std::string client_info_;
-  // These contain vectors/pointers internally, so putting them early ensures good alignment
+  std::string client_peername_;
+  std::string client_combined_info_;
+  uint32_t client_api_version_major_{0};
+  uint32_t client_api_version_minor_{0};
+#ifdef USE_ESP32_CAMERA
+  esp32_camera::CameraImageReader image_reader_;
+#endif
 
+  bool state_subscription_{false};
+  int log_subscription_{ESPHOME_LOG_LEVEL_NONE};
+  uint32_t last_traffic_;
+  uint32_t next_ping_retry_{0};
+  uint8_t ping_retries_{0};
+  bool sent_ping_{false};
+  bool service_call_subscription_{false};
+  bool next_close_ = false;
+  APIServer *parent_;
+  DeferredMessageQueue deferred_message_queue_;
   InitialStateIterator initial_state_iterator_;
   ListEntitiesIterator list_entities_iterator_;
-#ifdef USE_CAMERA
-  std::unique_ptr<camera::CameraImageReader> image_reader_;
-#endif
-
-  // Group 3: Client info struct (24 bytes on 32-bit: 2 strings × 12 bytes each)
-  ClientInfo client_info_;
-
-  // Group 4: 4-byte types
-  uint32_t last_traffic_;
   int state_subs_at_ = -1;
 
-  // Function pointer type for message encoding
-  using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size, bool is_single);
-
-  class MessageCreator {
-   public:
-    // Constructor for function pointer
-    MessageCreator(MessageCreatorPtr ptr) { data_.function_ptr = ptr; }
-
-    // Constructor for string state capture
-    explicit MessageCreator(const std::string &str_value) { data_.string_ptr = new std::string(str_value); }
-
-    // No destructor - cleanup must be called explicitly with message_type
-
-    // Delete copy operations - MessageCreator should only be moved
-    MessageCreator(const MessageCreator &other) = delete;
-    MessageCreator &operator=(const MessageCreator &other) = delete;
-
-    // Move constructor
-    MessageCreator(MessageCreator &&other) noexcept : data_(other.data_) { other.data_.function_ptr = nullptr; }
-
-    // Move assignment
-    MessageCreator &operator=(MessageCreator &&other) noexcept {
-      if (this != &other) {
-        // IMPORTANT: Caller must ensure cleanup() was called if this contains a string!
-        // In our usage, this happens in add_item() deduplication and vector::erase()
-        data_ = other.data_;
-        other.data_.function_ptr = nullptr;
-      }
-      return *this;
-    }
-
-    // Call operator - uses message_type to determine union type
-    uint16_t operator()(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single,
-                        uint8_t message_type) const;
-
-    // Manual cleanup method - must be called before destruction for string types
-    void cleanup(uint8_t message_type) {
-#ifdef USE_EVENT
-      if (message_type == EventResponse::MESSAGE_TYPE && data_.string_ptr != nullptr) {
-        delete data_.string_ptr;
-        data_.string_ptr = nullptr;
-      }
-#endif
-    }
-
-   private:
-    union Data {
-      MessageCreatorPtr function_ptr;
-      std::string *string_ptr;
-    } data_;  // 4 bytes on 32-bit, 8 bytes on 64-bit - same as before
-  };
-
-  // Generic batching mechanism for both state updates and entity info
-  struct DeferredBatch {
-    struct BatchItem {
-      EntityBase *entity;      // Entity pointer
-      MessageCreator creator;  // Function that creates the message when needed
-      uint8_t message_type;    // Message type for overhead calculation (max 255)
-      uint8_t estimated_size;  // Estimated message size (max 255 bytes)
-
-      // Constructor for creating BatchItem
-      BatchItem(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size)
-          : entity(entity), creator(std::move(creator)), message_type(message_type), estimated_size(estimated_size) {}
-    };
-
-    std::vector<BatchItem> items;
-    uint32_t batch_start_time{0};
-
-   private:
-    // Helper to cleanup items from the beginning
-    void cleanup_items_(size_t count) {
-      for (size_t i = 0; i < count; i++) {
-        items[i].creator.cleanup(items[i].message_type);
-      }
-    }
-
-   public:
-    DeferredBatch() {
-      // Pre-allocate capacity for typical batch sizes to avoid reallocation
-      items.reserve(8);
-    }
-
-    ~DeferredBatch() {
-      // Ensure cleanup of any remaining items
-      clear();
-    }
-
-    // Add item to the batch
-    void add_item(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size);
-    // Add item to the front of the batch (for high priority messages like ping)
-    void add_item_front(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size);
-
-    // Clear all items with proper cleanup
-    void clear() {
-      cleanup_items_(items.size());
-      items.clear();
-      batch_start_time = 0;
-    }
-
-    // Remove processed items from the front with proper cleanup
-    void remove_front(size_t count) {
-      cleanup_items_(count);
-      items.erase(items.begin(), items.begin() + count);
-    }
-
-    bool empty() const { return items.empty(); }
-    size_t size() const { return items.size(); }
-    const BatchItem &operator[](size_t index) const { return items[index]; }
-  };
-
-  // DeferredBatch here (16 bytes, 4-byte aligned)
-  DeferredBatch deferred_batch_;
-
-  // ConnectionState enum for type safety
-  enum class ConnectionState : uint8_t {
-    WAITING_FOR_HELLO = 0,
-    CONNECTED = 1,
-    AUTHENTICATED = 2,
-  };
-
-  // Group 5: Pack all small members together to minimize padding
-  // This group starts at a 4-byte boundary after DeferredBatch
-  struct APIFlags {
-    // Connection state only needs 2 bits (3 states)
-    uint8_t connection_state : 2;
-    // Log subscription needs 3 bits (log levels 0-7)
-    uint8_t log_subscription : 3;
-    // Boolean flags (1 bit each)
-    uint8_t remove : 1;
-    uint8_t state_subscription : 1;
-    uint8_t sent_ping : 1;
-
-    uint8_t service_call_subscription : 1;
-    uint8_t next_close : 1;
-    uint8_t batch_scheduled : 1;
-    uint8_t batch_first_message : 1;          // For batch buffer allocation
-    uint8_t should_try_send_immediately : 1;  // True after initial states are sent
-#ifdef HAS_PROTO_MESSAGE_DUMP
-    uint8_t log_only_mode : 1;
-#endif
-  } flags_{};  // 2 bytes total
-
-  // 2-byte types immediately after flags_ (no padding between them)
-  uint16_t client_api_version_major_{0};
-  uint16_t client_api_version_minor_{0};
-  // Total: 2 (flags) + 2 + 2 = 6 bytes, then 2 bytes padding to next 4-byte boundary
-
-  uint32_t get_batch_delay_ms_() const;
-  // Message will use 8 more bytes than the minimum size, and typical
-  // MTU is 1500. Sometimes users will see as low as 1460 MTU.
-  // If its IPv6 the header is 40 bytes, and if its IPv4
-  // the header is 20 bytes. So we have 1460 - 40 = 1420 bytes
-  // available for the payload. But we also need to add the size of
-  // the protobuf overhead, which is 8 bytes.
-  //
-  // To be safe we pick 1390 bytes as the maximum size
-  // to send in one go. This is the maximum size of a single packet
-  // that can be sent over the network.
-  // This is to avoid fragmentation of the packet.
-  static constexpr size_t MAX_BATCH_PACKET_SIZE = 1390;  // MTU
-
-  bool schedule_batch_();
-  void process_batch_();
-  void clear_batch_() {
-    this->deferred_batch_.clear();
-    this->flags_.batch_scheduled = false;
-  }
-
-#ifdef HAS_PROTO_MESSAGE_DUMP
-  // Helper to log a proto message from a MessageCreator object
-  void log_proto_message_(EntityBase *entity, const MessageCreator &creator, uint8_t message_type) {
-    this->flags_.log_only_mode = true;
-    creator(entity, this, MAX_BATCH_PACKET_SIZE, true, message_type);
-    this->flags_.log_only_mode = false;
-  }
-
-  void log_batch_item_(const DeferredBatch::BatchItem &item) {
-    // Use the helper to log the message
-    this->log_proto_message_(item.entity, item.creator, item.message_type);
-  }
-#endif
-
-  // Helper method to send a message either immediately or via batching
-  bool send_message_smart_(EntityBase *entity, MessageCreatorPtr creator, uint8_t message_type,
-                           uint8_t estimated_size) {
-    // Try to send immediately if:
-    // 1. We should try to send immediately (should_try_send_immediately = true)
-    // 2. Batch delay is 0 (user has opted in to immediate sending)
-    // 3. Buffer has space available
-    if (this->flags_.should_try_send_immediately && this->get_batch_delay_ms_() == 0 &&
-        this->helper_->can_write_without_blocking()) {
-      // Now actually encode and send
-      if (creator(entity, this, MAX_BATCH_PACKET_SIZE, true) &&
-          this->send_buffer(ProtoWriteBuffer{&this->parent_->get_shared_buffer_ref()}, message_type)) {
-#ifdef HAS_PROTO_MESSAGE_DUMP
-        // Log the message in verbose mode
-        this->log_proto_message_(entity, MessageCreator(creator), message_type);
-#endif
-        return true;
-      }
-
-      // If immediate send failed, fall through to batching
-    }
-
-    // Fall back to scheduled batching
-    return this->schedule_message_(entity, creator, message_type, estimated_size);
-  }
-
-  // Helper function to schedule a deferred message with known message type
-  bool schedule_message_(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size) {
-    this->deferred_batch_.add_item(entity, std::move(creator), message_type, estimated_size);
-    return this->schedule_batch_();
-  }
-
-  // Overload for function pointers (for info messages and current state reads)
-  bool schedule_message_(EntityBase *entity, MessageCreatorPtr function_ptr, uint8_t message_type,
-                         uint8_t estimated_size) {
-    return schedule_message_(entity, MessageCreator(function_ptr), message_type, estimated_size);
-  }
-
-  // Helper function to schedule a high priority message at the front of the batch
-  bool schedule_message_front_(EntityBase *entity, MessageCreatorPtr function_ptr, uint8_t message_type,
-                               uint8_t estimated_size) {
-    this->deferred_batch_.add_item_front(entity, MessageCreator(function_ptr), message_type, estimated_size);
-    return this->schedule_batch_();
-  }
 };
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
File diff suppressed because it is too large
Load Diff
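The dev-side `send_message_smart_` helper above chooses between an immediate write and deferred batching: send now only when immediate sending is allowed, the batch delay is zero, and the transmit buffer has room; otherwise (or when the immediate attempt fails) enqueue for the batch. A minimal standalone sketch of that decision flow, with hypothetical stand-in names (`SmartSender`, `buffer_free`, `batch`) for the real helper and batch machinery:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <vector>

// Hypothetical stand-ins for the real helper/batch machinery.
struct SmartSender {
  bool immediate_ok = true;    // mirrors flags_.should_try_send_immediately
  uint32_t batch_delay_ms = 0; // mirrors get_batch_delay_ms_()
  bool buffer_free = true;     // mirrors helper_->can_write_without_blocking()
  std::vector<uint8_t> batch;  // deferred message types

  // Mirrors send_message_smart_: try the fast path, else fall back to batching.
  bool send_smart(uint8_t message_type, const std::function<bool()> &encode_and_send) {
    if (immediate_ok && batch_delay_ms == 0 && buffer_free) {
      if (encode_and_send())
        return true;
      // Immediate send failed: fall through to batching.
    }
    batch.push_back(message_type);
    return true;  // schedule_batch_() reported success
  }
};
```

The fall-through on a failed immediate send is the key design choice: a full socket never drops a message, it only defers it.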
@@ -1,46 +1,39 @@
 #pragma once

 #include <cstdint>
 #include <deque>
-#include <limits>
-#include <span>
 #include <utility>
 #include <vector>

 #include "esphome/core/defines.h"
 #ifdef USE_API
+#ifdef USE_API_NOISE
+#include "noise/protocol.h"
+#endif
+
+#include "api_noise_context.h"
 #include "esphome/components/socket/socket.h"
-#include "esphome/core/application.h"
-#include "esphome/core/log.h"

-namespace esphome::api {
-
-// uncomment to log raw packets
-//#define HELPER_LOG_PACKETS
-
-// Forward declaration
-struct ClientInfo;
-
-class ProtoWriteBuffer;
+namespace esphome {
+namespace api {

 struct ReadPacketBuffer {
   std::vector<uint8_t> container;
   uint16_t type;
-  uint16_t data_offset;
-  uint16_t data_len;
+  size_t data_offset;
+  size_t data_len;
 };

-// Packed packet info structure to minimize memory usage
-struct PacketInfo {
-  uint16_t offset;        // Offset in buffer where message starts
-  uint16_t payload_size;  // Size of the message payload
-  uint8_t message_type;   // Message type (0-255)
-
-  PacketInfo(uint8_t type, uint16_t off, uint16_t size) : offset(off), payload_size(size), message_type(type) {}
-};
+struct PacketBuffer {
+  const std::vector<uint8_t> container;
+  uint16_t type;
+  uint8_t data_offset;
+  uint8_t data_len;
+};

-enum class APIError : uint16_t {
+enum class APIError : int {
   OK = 0,
   WOULD_BLOCK = 1001,
+  BAD_HANDSHAKE_PACKET_LEN = 1002,
   BAD_INDICATOR = 1003,
   BAD_DATA_PACKET = 1004,
   TCP_NODELAY_FAILED = 1005,
@@ -51,138 +44,150 @@ enum class APIError : uint16_t {
   BAD_ARG = 1010,
   SOCKET_READ_FAILED = 1011,
   SOCKET_WRITE_FAILED = 1012,
-  OUT_OF_MEMORY = 1018,
-  CONNECTION_CLOSED = 1022,
-#ifdef USE_API_NOISE
-  BAD_HANDSHAKE_PACKET_LEN = 1002,
   HANDSHAKESTATE_READ_FAILED = 1013,
   HANDSHAKESTATE_WRITE_FAILED = 1014,
   HANDSHAKESTATE_BAD_STATE = 1015,
   CIPHERSTATE_DECRYPT_FAILED = 1016,
   CIPHERSTATE_ENCRYPT_FAILED = 1017,
+  OUT_OF_MEMORY = 1018,
   HANDSHAKESTATE_SETUP_FAILED = 1019,
   HANDSHAKESTATE_SPLIT_FAILED = 1020,
   BAD_HANDSHAKE_ERROR_BYTE = 1021,
-#endif
+  CONNECTION_CLOSED = 1022,
 };

 const char *api_error_to_str(APIError err);

 class APIFrameHelper {
  public:
-  APIFrameHelper() = default;
-  explicit APIFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
-      : socket_owned_(std::move(socket)), client_info_(client_info) {
-    socket_ = socket_owned_.get();
-  }
   virtual ~APIFrameHelper() = default;
   virtual APIError init() = 0;
-  virtual APIError loop();
+  virtual APIError loop() = 0;
   virtual APIError read_packet(ReadPacketBuffer *buffer) = 0;
-  bool can_write_without_blocking() { return state_ == State::DATA && tx_buf_.empty(); }
-  std::string getpeername() { return socket_->getpeername(); }
-  int getpeername(struct sockaddr *addr, socklen_t *addrlen) { return socket_->getpeername(addr, addrlen); }
-  APIError close() {
-    state_ = State::CLOSED;
-    int err = this->socket_->close();
-    if (err == -1)
-      return APIError::CLOSE_FAILED;
-    return APIError::OK;
-  }
-  APIError shutdown(int how) {
-    int err = this->socket_->shutdown(how);
-    if (err == -1)
-      return APIError::SHUTDOWN_FAILED;
-    if (how == SHUT_RDWR) {
-      state_ = State::CLOSED;
-    }
-    return APIError::OK;
-  }
-  virtual APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) = 0;
-  // Write multiple protobuf packets in a single operation
-  // packets contains (message_type, offset, length) for each message in the buffer
-  // The buffer contains all messages with appropriate padding before each
-  virtual APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) = 0;
-  // Get the frame header padding required by this protocol
-  virtual uint8_t frame_header_padding() = 0;
-  // Get the frame footer size required by this protocol
-  virtual uint8_t frame_footer_size() = 0;
-  // Check if socket has data ready to read
-  bool is_socket_ready() const { return socket_ != nullptr && socket_->ready(); }
+  virtual bool can_write_without_blocking() = 0;
+  virtual APIError write_packet(uint16_t type, const uint8_t *data, size_t len) = 0;
+  virtual std::string getpeername() = 0;
+  virtual int getpeername(struct sockaddr *addr, socklen_t *addrlen) = 0;
+  virtual APIError close() = 0;
+  virtual APIError shutdown(int how) = 0;
+  // Give this helper a name for logging
+  virtual void set_log_info(std::string info) = 0;
+};

- protected:
-  // Buffer containing data to be sent
-  struct SendBuffer {
-    std::unique_ptr<uint8_t[]> data;
-    uint16_t size{0};    // Total size of the buffer
-    uint16_t offset{0};  // Current offset within the buffer
-
-    // Using uint16_t reduces memory usage since ESPHome API messages are limited to UINT16_MAX (65535) bytes
-    uint16_t remaining() const { return size - offset; }
-    const uint8_t *current_data() const { return data.get() + offset; }
-  };
-
-  // Common implementation for writing raw data to socket
-  APIError write_raw_(const struct iovec *iov, int iovcnt, uint16_t total_write_len);
-  // Try to send data from the tx buffer
-  APIError try_send_tx_buf_();
-
-  // Helper method to buffer data from IOVs
-  void buffer_data_from_iov_(const struct iovec *iov, int iovcnt, uint16_t total_write_len, uint16_t offset);
-
-  // Common socket write error handling
-  APIError handle_socket_write_error_();
-  template<typename StateEnum>
-  APIError write_raw_(const struct iovec *iov, int iovcnt, socket::Socket *socket, std::vector<uint8_t> &tx_buf,
-                      const std::string &info, StateEnum &state, StateEnum failed_state);
-
-  // Pointers first (4 bytes each)
-  socket::Socket *socket_{nullptr};
-  std::unique_ptr<socket::Socket> socket_owned_;
-
-  // Common state enum for all frame helpers
-  // Note: Not all states are used by all implementations
-  // - INITIALIZE: Used by both Noise and Plaintext
-  // - CLIENT_HELLO, SERVER_HELLO, HANDSHAKE: Only used by Noise protocol
-  // - DATA: Used by both Noise and Plaintext
-  // - CLOSED: Used by both Noise and Plaintext
-  // - FAILED: Used by both Noise and Plaintext
-  // - EXPLICIT_REJECT: Only used by Noise protocol
-  enum class State : uint8_t {
-    INITIALIZE = 1,
-    CLIENT_HELLO = 2,  // Noise only
-    SERVER_HELLO = 3,  // Noise only
-    HANDSHAKE = 4,     // Noise only
-    DATA = 5,
-    CLOSED = 6,
-    FAILED = 7,
-    EXPLICIT_REJECT = 8,  // Noise only
-  };
-
-  // Containers (size varies, but typically 12+ bytes on 32-bit)
-  std::deque<SendBuffer> tx_buf_;
-  std::vector<struct iovec> reusable_iovs_;
-  std::vector<uint8_t> rx_buf_;
-
-  // Pointer to client info (4 bytes on 32-bit)
-  // Note: The pointed-to ClientInfo object must outlive this APIFrameHelper instance.
-  const ClientInfo *client_info_{nullptr};
-
-  // Group smaller types together
-  uint16_t rx_buf_len_ = 0;
-  State state_{State::INITIALIZE};
-  uint8_t frame_header_padding_{0};
-  uint8_t frame_footer_size_{0};
-  // 5 bytes total, 3 bytes padding
-
-  // Common initialization for both plaintext and noise protocols
-  APIError init_common_();
-
-  // Helper method to handle socket read results
-  APIError handle_socket_read_result_(ssize_t received);
-};
+#ifdef USE_API_NOISE
+class APINoiseFrameHelper : public APIFrameHelper {
+ public:
+  APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx)
+      : socket_(std::move(socket)), ctx_(std::move(std::move(ctx))) {}
+  ~APINoiseFrameHelper() override;
+  APIError init() override;
+  APIError loop() override;
+  APIError read_packet(ReadPacketBuffer *buffer) override;
+  bool can_write_without_blocking() override;
+  APIError write_packet(uint16_t type, const uint8_t *payload, size_t len) override;
+  std::string getpeername() override { return this->socket_->getpeername(); }
+  int getpeername(struct sockaddr *addr, socklen_t *addrlen) override {
+    return this->socket_->getpeername(addr, addrlen);
+  }
+  APIError close() override;
+  APIError shutdown(int how) override;
+  // Give this helper a name for logging
+  void set_log_info(std::string info) override { info_ = std::move(info); }
+
+ protected:
+  struct ParsedFrame {
+    std::vector<uint8_t> msg;
+  };
+
+  APIError state_action_();
+  APIError try_read_frame_(ParsedFrame *frame);
+  APIError try_send_tx_buf_();
+  APIError write_frame_(const uint8_t *data, size_t len);
+  APIError write_raw_(const struct iovec *iov, int iovcnt);
+  APIError init_handshake_();
+  APIError check_handshake_finished_();
+  void send_explicit_handshake_reject_(const std::string &reason);
+
+  std::unique_ptr<socket::Socket> socket_;
+
+  std::string info_;
+  uint8_t rx_header_buf_[3];
+  size_t rx_header_buf_len_ = 0;
+  std::vector<uint8_t> rx_buf_;
+  size_t rx_buf_len_ = 0;
+  std::vector<uint8_t> tx_buf_;
+  std::vector<uint8_t> prologue_;
+
+  std::shared_ptr<APINoiseContext> ctx_;
+  NoiseHandshakeState *handshake_{nullptr};
+  NoiseCipherState *send_cipher_{nullptr};
+  NoiseCipherState *recv_cipher_{nullptr};
+  NoiseProtocolId nid_;
+
+  enum class State {
+    INITIALIZE = 1,
+    CLIENT_HELLO = 2,
+    SERVER_HELLO = 3,
+    HANDSHAKE = 4,
+    DATA = 5,
+    CLOSED = 6,
+    FAILED = 7,
+    EXPLICIT_REJECT = 8,
+  } state_ = State::INITIALIZE;
+};
+#endif  // USE_API_NOISE
+
+#ifdef USE_API_PLAINTEXT
+class APIPlaintextFrameHelper : public APIFrameHelper {
+ public:
+  APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket) : socket_(std::move(socket)) {}
+  ~APIPlaintextFrameHelper() override = default;
+  APIError init() override;
+  APIError loop() override;
+  APIError read_packet(ReadPacketBuffer *buffer) override;
+  bool can_write_without_blocking() override;
+  APIError write_packet(uint16_t type, const uint8_t *payload, size_t len) override;
+  std::string getpeername() override { return this->socket_->getpeername(); }
+  int getpeername(struct sockaddr *addr, socklen_t *addrlen) override {
+    return this->socket_->getpeername(addr, addrlen);
+  }
+  APIError close() override;
+  APIError shutdown(int how) override;
+  // Give this helper a name for logging
+  void set_log_info(std::string info) override { info_ = std::move(info); }
+
+ protected:
+  struct ParsedFrame {
+    std::vector<uint8_t> msg;
+  };
+
+  APIError try_read_frame_(ParsedFrame *frame);
+  APIError try_send_tx_buf_();
+  APIError write_raw_(const struct iovec *iov, int iovcnt);
+
+  std::unique_ptr<socket::Socket> socket_;
+
+  std::string info_;
+  std::vector<uint8_t> rx_header_buf_;
+  bool rx_header_parsed_ = false;
+  uint32_t rx_header_parsed_type_ = 0;
+  uint32_t rx_header_parsed_len_ = 0;
+
+  std::vector<uint8_t> rx_buf_;
+  size_t rx_buf_len_ = 0;
+  std::vector<uint8_t> tx_buf_;
+
+  enum class State {
+    INITIALIZE = 1,
+    DATA = 2,
+    CLOSED = 3,
+    FAILED = 4,
+  } state_ = State::INITIALIZE;
+};
+#endif

-}  // namespace esphome::api
-#endif  // USE_API
+}  // namespace api
+}  // namespace esphome
+#endif
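Both frame helpers above prefix each noise message with a 3-byte header: a `0x01` indicator byte followed by a 16-bit big-endian payload size, exactly the layout the `rx_header_buf_[3]` field holds while a header is being accumulated. A hedged standalone illustration of that encoding (not the project's code):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Sketch of the noise frame header used by these helpers:
// [0x01 indicator][size high byte][size low byte][payload...]
std::vector<uint8_t> frame_header(uint16_t payload_size) {
  return {0x01, static_cast<uint8_t>(payload_size >> 8), static_cast<uint8_t>(payload_size & 0xFF)};
}

uint16_t parse_frame_size(const uint8_t header[3]) {
  // Mirrors: msg_size = (rx_header_buf_[1] << 8) | rx_header_buf_[2]
  return static_cast<uint16_t>((static_cast<uint16_t>(header[1]) << 8) | header[2]);
}
```

Big-endian (network) byte order keeps the framing compatible across the ESP targets and the desktop clients that speak the same protocol.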
@@ -1,583 +0,0 @@
-#include "api_frame_helper_noise.h"
-#ifdef USE_API
-#ifdef USE_API_NOISE
-#include "api_connection.h"  // For ClientInfo struct
-#include "esphome/core/application.h"
-#include "esphome/core/hal.h"
-#include "esphome/core/helpers.h"
-#include "esphome/core/log.h"
-#include "proto.h"
-#include <cstring>
-#include <cinttypes>
-
-namespace esphome::api {
-
-static const char *const TAG = "api.noise";
-static const char *const PROLOGUE_INIT = "NoiseAPIInit";
-static constexpr size_t PROLOGUE_INIT_LEN = 12;  // strlen("NoiseAPIInit")
-
-#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s: " msg, this->client_info_->get_combined_info().c_str(), ##__VA_ARGS__)
-
-#ifdef HELPER_LOG_PACKETS
-#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
-#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
-#else
-#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
-#define LOG_PACKET_SENDING(data, len) ((void) 0)
-#endif
-/// Convert a noise error code to a readable error
-std::string noise_err_to_str(int err) {
-  if (err == NOISE_ERROR_NO_MEMORY)
-    return "NO_MEMORY";
-  if (err == NOISE_ERROR_UNKNOWN_ID)
-    return "UNKNOWN_ID";
-  if (err == NOISE_ERROR_UNKNOWN_NAME)
-    return "UNKNOWN_NAME";
-  if (err == NOISE_ERROR_MAC_FAILURE)
-    return "MAC_FAILURE";
-  if (err == NOISE_ERROR_NOT_APPLICABLE)
-    return "NOT_APPLICABLE";
-  if (err == NOISE_ERROR_SYSTEM)
-    return "SYSTEM";
-  if (err == NOISE_ERROR_REMOTE_KEY_REQUIRED)
-    return "REMOTE_KEY_REQUIRED";
-  if (err == NOISE_ERROR_LOCAL_KEY_REQUIRED)
-    return "LOCAL_KEY_REQUIRED";
-  if (err == NOISE_ERROR_PSK_REQUIRED)
-    return "PSK_REQUIRED";
-  if (err == NOISE_ERROR_INVALID_LENGTH)
-    return "INVALID_LENGTH";
-  if (err == NOISE_ERROR_INVALID_PARAM)
-    return "INVALID_PARAM";
-  if (err == NOISE_ERROR_INVALID_STATE)
-    return "INVALID_STATE";
-  if (err == NOISE_ERROR_INVALID_NONCE)
-    return "INVALID_NONCE";
-  if (err == NOISE_ERROR_INVALID_PRIVATE_KEY)
-    return "INVALID_PRIVATE_KEY";
-  if (err == NOISE_ERROR_INVALID_PUBLIC_KEY)
-    return "INVALID_PUBLIC_KEY";
-  if (err == NOISE_ERROR_INVALID_FORMAT)
-    return "INVALID_FORMAT";
-  if (err == NOISE_ERROR_INVALID_SIGNATURE)
-    return "INVALID_SIGNATURE";
-  return to_string(err);
-}
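The long `if` chain in `noise_err_to_str` above is a straight code-to-name mapping with a numeric fallback. As a design note only (a hedged sketch with hypothetical `ERR_*` constants, not the project's code), the same mapping can be table-driven, which keeps each new error to one line:

```cpp
#include <cassert>
#include <string>

// Hypothetical error codes standing in for the NOISE_ERROR_* constants.
enum { ERR_NO_MEMORY = 1, ERR_UNKNOWN_ID = 2, ERR_MAC_FAILURE = 3 };

std::string err_to_str(int err) {
  struct Entry {
    int code;
    const char *name;
  };
  static const Entry TABLE[] = {
      {ERR_NO_MEMORY, "NO_MEMORY"},
      {ERR_UNKNOWN_ID, "UNKNOWN_ID"},
      {ERR_MAC_FAILURE, "MAC_FAILURE"},
  };
  for (const auto &e : TABLE) {
    if (e.code == err)
      return e.name;
  }
  return std::to_string(err);  // fall back to the numeric value, like the original
}
```

On a microcontroller the if-chain and the table compile to similar code; the choice is mostly about readability and flash layout.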
-/// Initialize the frame helper, returns OK if successful.
-APIError APINoiseFrameHelper::init() {
-  APIError err = init_common_();
-  if (err != APIError::OK) {
-    return err;
-  }
-
-  // init prologue
-  size_t old_size = prologue_.size();
-  prologue_.resize(old_size + PROLOGUE_INIT_LEN);
-  std::memcpy(prologue_.data() + old_size, PROLOGUE_INIT, PROLOGUE_INIT_LEN);
-
-  state_ = State::CLIENT_HELLO;
-  return APIError::OK;
-}
-// Helper for handling handshake frame errors
-APIError APINoiseFrameHelper::handle_handshake_frame_error_(APIError aerr) {
-  if (aerr == APIError::BAD_INDICATOR) {
-    send_explicit_handshake_reject_("Bad indicator byte");
-  } else if (aerr == APIError::BAD_HANDSHAKE_PACKET_LEN) {
-    send_explicit_handshake_reject_("Bad handshake packet len");
-  }
-  return aerr;
-}
-
-// Helper for handling noise library errors
-APIError APINoiseFrameHelper::handle_noise_error_(int err, const char *func_name, APIError api_err) {
-  if (err != 0) {
-    state_ = State::FAILED;
-    HELPER_LOG("%s failed: %s", func_name, noise_err_to_str(err).c_str());
-    return api_err;
-  }
-  return APIError::OK;
-}
-/// Run through handshake messages (if in that phase)
-APIError APINoiseFrameHelper::loop() {
-  // During handshake phase, process as many actions as possible until we can't progress
-  // socket_->ready() stays true until next main loop, but state_action() will return
-  // WOULD_BLOCK when no more data is available to read
-  while (state_ != State::DATA && this->socket_->ready()) {
-    APIError err = state_action_();
-    if (err == APIError::WOULD_BLOCK) {
-      break;
-    }
-    if (err != APIError::OK) {
-      return err;
-    }
-  }
-
-  // Use base class implementation for buffer sending
-  return APIFrameHelper::loop();
-}
-/** Read a packet into the rx_buf_. If successful, stores frame data in the frame parameter
- *
- * @param frame: The struct to hold the frame information in.
- *   msg_start: points to the start of the payload - this pointer is only valid until the next
- *   try_receive_raw_ call
- *
- * @return 0 if a full packet is in rx_buf_
- * @return -1 if error, check errno.
- *
- * errno EWOULDBLOCK: Packet could not be read without blocking. Try again later.
- * errno ENOMEM: Not enough memory for reading packet.
- * errno API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
- * errno API_ERROR_HANDSHAKE_PACKET_LEN: Packet too big for this phase.
- */
-APIError APINoiseFrameHelper::try_read_frame_(std::vector<uint8_t> *frame) {
-  if (frame == nullptr) {
-    HELPER_LOG("Bad argument for try_read_frame_");
-    return APIError::BAD_ARG;
-  }
-
-  // read header
-  if (rx_header_buf_len_ < 3) {
-    // no header information yet
-    uint8_t to_read = 3 - rx_header_buf_len_;
-    ssize_t received = this->socket_->read(&rx_header_buf_[rx_header_buf_len_], to_read);
-    APIError err = handle_socket_read_result_(received);
-    if (err != APIError::OK) {
-      return err;
-    }
-    rx_header_buf_len_ += static_cast<uint8_t>(received);
-    if (static_cast<uint8_t>(received) != to_read) {
-      // not a full read
-      return APIError::WOULD_BLOCK;
-    }
-
-    if (rx_header_buf_[0] != 0x01) {
-      state_ = State::FAILED;
-      HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
-      return APIError::BAD_INDICATOR;
-    }
-    // header reading done
-  }
-
-  // read body
-  uint16_t msg_size = (((uint16_t) rx_header_buf_[1]) << 8) | rx_header_buf_[2];
-
-  if (state_ != State::DATA && msg_size > 128) {
-    // for handshake message only permit up to 128 bytes
-    state_ = State::FAILED;
-    HELPER_LOG("Bad packet len for handshake: %d", msg_size);
-    return APIError::BAD_HANDSHAKE_PACKET_LEN;
-  }
-
-  // reserve space for body
-  if (rx_buf_.size() != msg_size) {
-    rx_buf_.resize(msg_size);
-  }
-
-  if (rx_buf_len_ < msg_size) {
-    // more data to read
-    uint16_t to_read = msg_size - rx_buf_len_;
-    ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
-    APIError err = handle_socket_read_result_(received);
-    if (err != APIError::OK) {
-      return err;
-    }
-    rx_buf_len_ += static_cast<uint16_t>(received);
-    if (static_cast<uint16_t>(received) != to_read) {
-      // not all read
-      return APIError::WOULD_BLOCK;
-    }
-  }
-
-  LOG_PACKET_RECEIVED(rx_buf_);
-  *frame = std::move(rx_buf_);
-  // consume msg
-  rx_buf_ = {};
-  rx_buf_len_ = 0;
-  rx_header_buf_len_ = 0;
-  return APIError::OK;
-}
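`try_read_frame_` above accumulates one frame across possibly-partial non-blocking socket reads, returning `WOULD_BLOCK` until the 3-byte header and then the body are complete, and resetting its counters once a frame is handed off. A minimal sketch of that accumulation pattern against a hypothetical byte-stream source (`FakeSocket` and `FrameReader` are illustrative names, not the project's API):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical non-blocking source: hands out at most `chunk` bytes per call.
struct FakeSocket {
  std::vector<uint8_t> data;
  size_t pos = 0, chunk = 2;
  size_t read(uint8_t *dst, size_t want) {
    size_t n = std::min(std::min(want, chunk), data.size() - pos);
    std::memcpy(dst, data.data() + pos, n);
    pos += n;
    return n;
  }
};

// Accumulate a 3-byte header (0x01 + big-endian size) and then the body.
// Returns true once a full frame is in `out`; false means "would block".
struct FrameReader {
  uint8_t header[3];
  size_t header_len = 0;
  std::vector<uint8_t> body;
  size_t body_len = 0;

  bool poll(FakeSocket &sock, std::vector<uint8_t> &out) {
    if (header_len < 3) {
      header_len += sock.read(header + header_len, 3 - header_len);
      if (header_len < 3)
        return false;  // WOULD_BLOCK: header incomplete
    }
    size_t msg_size = (static_cast<size_t>(header[1]) << 8) | header[2];
    body.resize(msg_size);
    if (body_len < msg_size) {
      body_len += sock.read(body.data() + body_len, msg_size - body_len);
      if (body_len < msg_size)
        return false;  // WOULD_BLOCK: body incomplete
    }
    out = std::move(body);
    body_len = header_len = 0;  // consume frame, ready for the next one
    return true;
  }
};
```

Keeping the partial-read counters (`header_len`, `body_len`) as member state is what lets a single-threaded event loop resume exactly where the last short read stopped.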
-/** To be called from read/write methods.
- *
- * This method runs through the internal handshake methods, if in that state.
- *
- * If the handshake is still active when this method returns and a read/write can't take place at
- * the moment, returns WOULD_BLOCK.
- * If an error occurred, returns that error. Only returns OK if the transport is ready for data
- * traffic.
- */
-APIError APINoiseFrameHelper::state_action_() {
-  int err;
-  APIError aerr;
-  if (state_ == State::INITIALIZE) {
-    HELPER_LOG("Bad state for method: %d", (int) state_);
-    return APIError::BAD_STATE;
-  }
-  if (state_ == State::CLIENT_HELLO) {
-    // waiting for client hello
-    std::vector<uint8_t> frame;
-    aerr = try_read_frame_(&frame);
-    if (aerr != APIError::OK) {
-      return handle_handshake_frame_error_(aerr);
-    }
-    // ignore contents, may be used in future for flags
-    // Resize for: existing prologue + 2 size bytes + frame data
-    size_t old_size = prologue_.size();
-    prologue_.resize(old_size + 2 + frame.size());
-    prologue_[old_size] = (uint8_t) (frame.size() >> 8);
-    prologue_[old_size + 1] = (uint8_t) frame.size();
-    std::memcpy(prologue_.data() + old_size + 2, frame.data(), frame.size());
-
-    state_ = State::SERVER_HELLO;
-  }
-  if (state_ == State::SERVER_HELLO) {
-    // send server hello
-    const std::string &name = App.get_name();
-    const std::string &mac = get_mac_address();
-
-    std::vector<uint8_t> msg;
-    // Calculate positions and sizes
-    size_t name_len = name.size() + 1;  // including null terminator
-    size_t mac_len = mac.size() + 1;    // including null terminator
-    size_t name_offset = 1;
-    size_t mac_offset = name_offset + name_len;
-    size_t total_size = 1 + name_len + mac_len;
-
-    msg.resize(total_size);
-
-    // chosen proto
-    msg[0] = 0x01;
-
-    // node name, terminated by null byte
-    std::memcpy(msg.data() + name_offset, name.c_str(), name_len);
-    // node mac, terminated by null byte
-    std::memcpy(msg.data() + mac_offset, mac.c_str(), mac_len);
-
-    aerr = write_frame_(msg.data(), msg.size());
-    if (aerr != APIError::OK)
-      return aerr;
-
-    // start handshake
-    aerr = init_handshake_();
-    if (aerr != APIError::OK)
-      return aerr;
-
-    state_ = State::HANDSHAKE;
-  }
-  if (state_ == State::HANDSHAKE) {
-    int action = noise_handshakestate_get_action(handshake_);
-    if (action == NOISE_ACTION_READ_MESSAGE) {
-      // waiting for handshake msg
-      std::vector<uint8_t> frame;
-      aerr = try_read_frame_(&frame);
-      if (aerr != APIError::OK) {
-        return handle_handshake_frame_error_(aerr);
-      }
-
-      if (frame.empty()) {
-        send_explicit_handshake_reject_("Empty handshake message");
-        return APIError::BAD_HANDSHAKE_ERROR_BYTE;
-      } else if (frame[0] != 0x00) {
-        HELPER_LOG("Bad handshake error byte: %u", frame[0]);
-        send_explicit_handshake_reject_("Bad handshake error byte");
-        return APIError::BAD_HANDSHAKE_ERROR_BYTE;
-      }
-
-      NoiseBuffer mbuf;
-      noise_buffer_init(mbuf);
-      noise_buffer_set_input(mbuf, frame.data() + 1, frame.size() - 1);
-      err = noise_handshakestate_read_message(handshake_, &mbuf, nullptr);
-      if (err != 0) {
-        // Special handling for MAC failure
-        send_explicit_handshake_reject_(err == NOISE_ERROR_MAC_FAILURE ? "Handshake MAC failure" : "Handshake error");
-        return handle_noise_error_(err, "noise_handshakestate_read_message", APIError::HANDSHAKESTATE_READ_FAILED);
-      }
-
-      aerr = check_handshake_finished_();
-      if (aerr != APIError::OK)
-        return aerr;
-    } else if (action == NOISE_ACTION_WRITE_MESSAGE) {
-      uint8_t buffer[65];
-      NoiseBuffer mbuf;
-      noise_buffer_init(mbuf);
-      noise_buffer_set_output(mbuf, buffer + 1, sizeof(buffer) - 1);
-
-      err = noise_handshakestate_write_message(handshake_, &mbuf, nullptr);
-      APIError aerr_write =
-          handle_noise_error_(err, "noise_handshakestate_write_message", APIError::HANDSHAKESTATE_WRITE_FAILED);
-      if (aerr_write != APIError::OK)
-        return aerr_write;
-      buffer[0] = 0x00;  // success
-
-      aerr = write_frame_(buffer, mbuf.size + 1);
-      if (aerr != APIError::OK)
-        return aerr;
-      aerr = check_handshake_finished_();
-      if (aerr != APIError::OK)
-        return aerr;
-    } else {
-      // bad state for action
-      state_ = State::FAILED;
-      HELPER_LOG("Bad action for handshake: %d", action);
-      return APIError::HANDSHAKESTATE_BAD_STATE;
-    }
-  }
-  if (state_ == State::CLOSED || state_ == State::FAILED) {
-    return APIError::BAD_STATE;
-  }
-  return APIError::OK;
-}
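The `SERVER_HELLO` branch above packs three fields into one frame: a chosen-protocol byte (`0x01`), then the node name and its MAC address, each as a NUL-terminated string. A hedged standalone sketch of that payload layout (illustrative only; the real code also writes the frame header and feeds the client hello into the handshake prologue):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Sketch of the server-hello payload: [0x01][name bytes + NUL][mac bytes + NUL]
std::vector<uint8_t> build_server_hello(const std::string &name, const std::string &mac) {
  std::vector<uint8_t> msg(1 + name.size() + 1 + mac.size() + 1);
  msg[0] = 0x01;  // chosen proto
  // c_str() includes the terminating NUL, so copy size() + 1 bytes for each field.
  std::memcpy(msg.data() + 1, name.c_str(), name.size() + 1);
  std::memcpy(msg.data() + 1 + name.size() + 1, mac.c_str(), mac.size() + 1);
  return msg;
}
```

NUL terminators let the client split the two variable-length fields without any explicit length prefixes.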
-void APINoiseFrameHelper::send_explicit_handshake_reject_(const std::string &reason) {
-  std::vector<uint8_t> data;
-  data.resize(reason.length() + 1);
-  data[0] = 0x01;  // failure
-
-  // Copy error message in bulk
-  if (!reason.empty()) {
-    std::memcpy(data.data() + 1, reason.c_str(), reason.length());
-  }
-
-  // temporarily remove failed state
-  auto orig_state = state_;
-  state_ = State::EXPLICIT_REJECT;
-  write_frame_(data.data(), data.size());
-  state_ = orig_state;
-}
-APIError APINoiseFrameHelper::read_packet(ReadPacketBuffer *buffer) {
-  int err;
-  APIError aerr;
-  aerr = state_action_();
-  if (aerr != APIError::OK) {
-    return aerr;
-  }
-
-  if (state_ != State::DATA) {
-    return APIError::WOULD_BLOCK;
-  }
-
-  std::vector<uint8_t> frame;
-  aerr = try_read_frame_(&frame);
-  if (aerr != APIError::OK)
-    return aerr;
-
-  NoiseBuffer mbuf;
-  noise_buffer_init(mbuf);
-  noise_buffer_set_inout(mbuf, frame.data(), frame.size(), frame.size());
-  err = noise_cipherstate_decrypt(recv_cipher_, &mbuf);
-  APIError decrypt_err = handle_noise_error_(err, "noise_cipherstate_decrypt", APIError::CIPHERSTATE_DECRYPT_FAILED);
-  if (decrypt_err != APIError::OK)
-    return decrypt_err;
-
-  uint16_t msg_size = mbuf.size;
-  uint8_t *msg_data = frame.data();
-  if (msg_size < 4) {
-    state_ = State::FAILED;
-    HELPER_LOG("Bad data packet: size %d too short", msg_size);
-    return APIError::BAD_DATA_PACKET;
-  }
-
-  uint16_t type = (((uint16_t) msg_data[0]) << 8) | msg_data[1];
-  uint16_t data_len = (((uint16_t) msg_data[2]) << 8) | msg_data[3];
-  if (data_len > msg_size - 4) {
-    state_ = State::FAILED;
-    HELPER_LOG("Bad data packet: data_len %u greater than msg_size %u", data_len, msg_size);
-    return APIError::BAD_DATA_PACKET;
-  }
-
-  buffer->container = std::move(frame);
-  buffer->data_offset = 4;
-  buffer->data_len = data_len;
-  buffer->type = type;
-  return APIError::OK;
-}
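`read_packet` above validates a 4-byte plaintext header inside every decrypted frame: a 16-bit big-endian message type followed by a 16-bit big-endian payload length, which must fit inside the remaining decrypted bytes. A minimal sketch of that check (illustrative names, not the project's API):

```cpp
#include <cassert>
#include <cstdint>

// Parse the 4-byte header of a decrypted noise message:
// [type hi][type lo][data_len hi][data_len lo][payload...]
// Returns false when the header is short or data_len overruns the message.
bool parse_msg_header(const uint8_t *msg, uint16_t msg_size, uint16_t &type, uint16_t &data_len) {
  if (msg_size < 4)
    return false;  // mirrors the "size too short" BAD_DATA_PACKET path
  type = static_cast<uint16_t>((msg[0] << 8) | msg[1]);
  data_len = static_cast<uint16_t>((msg[2] << 8) | msg[3]);
  return data_len <= msg_size - 4;  // mirrors the data_len bounds check
}
```

Checking `msg_size < 4` before computing `msg_size - 4` matters: both operands are unsigned, so the subtraction would otherwise wrap around on short packets.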
-APIError APINoiseFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
-  // Resize to include MAC space (required for Noise encryption)
-  buffer.get_buffer()->resize(buffer.get_buffer()->size() + frame_footer_size_);
-  PacketInfo packet{type, 0,
-                    static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_ - frame_footer_size_)};
-  return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
-}
-
-APIError APINoiseFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
-  APIError aerr = state_action_();
-  if (aerr != APIError::OK) {
-    return aerr;
-  }
-
-  if (state_ != State::DATA) {
-    return APIError::WOULD_BLOCK;
-  }
-
-  if (packets.empty()) {
-    return APIError::OK;
-  }
-
-  std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
-  uint8_t *buffer_data = raw_buffer->data();  // Cache buffer pointer
-
-  this->reusable_iovs_.clear();
-  this->reusable_iovs_.reserve(packets.size());
-  uint16_t total_write_len = 0;
-
-  // We need to encrypt each packet in place
-  for (const auto &packet : packets) {
-    // The buffer already has padding at offset
-    uint8_t *buf_start = buffer_data + packet.offset;
-
-    // Write noise header
-    buf_start[0] = 0x01;  // indicator
-    // buf_start[1], buf_start[2] to be set after encryption
-
-    // Write message header (to be encrypted)
-    const uint8_t msg_offset = 3;
-    buf_start[msg_offset] = static_cast<uint8_t>(packet.message_type >> 8);       // type high byte
-    buf_start[msg_offset + 1] = static_cast<uint8_t>(packet.message_type);        // type low byte
-    buf_start[msg_offset + 2] = static_cast<uint8_t>(packet.payload_size >> 8);   // data_len high byte
-    buf_start[msg_offset + 3] = static_cast<uint8_t>(packet.payload_size);        // data_len low byte
-    // payload data is already in the buffer starting at offset + 7
-
-    // Make sure we have space for MAC
-    // The buffer should already have been sized appropriately
-
-    // Encrypt the message in place
-    NoiseBuffer mbuf;
-    noise_buffer_init(mbuf);
-    noise_buffer_set_inout(mbuf, buf_start + msg_offset, 4 + packet.payload_size,
-                           4 + packet.payload_size + frame_footer_size_);
-
-    int err = noise_cipherstate_encrypt(send_cipher_, &mbuf);
-    APIError aerr = handle_noise_error_(err, "noise_cipherstate_encrypt", APIError::CIPHERSTATE_ENCRYPT_FAILED);
-    if (aerr != APIError::OK)
-      return aerr;
-
-    // Fill in the encrypted size
-    buf_start[1] = static_cast<uint8_t>(mbuf.size >> 8);
-    buf_start[2] = static_cast<uint8_t>(mbuf.size);
-
-    // Add iovec for this encrypted packet
-    size_t packet_len = static_cast<size_t>(3 + mbuf.size);  // indicator + size + encrypted data
|
|
||||||
this->reusable_iovs_.push_back({buf_start, packet_len});
|
|
||||||
total_write_len += packet_len;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Send all encrypted packets in one writev call
|
|
||||||
return this->write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
|
|
||||||
}
|
|
||||||
|
|
||||||
APIError APINoiseFrameHelper::write_frame_(const uint8_t *data, uint16_t len) {
|
|
||||||
uint8_t header[3];
|
|
||||||
header[0] = 0x01; // indicator
|
|
||||||
header[1] = (uint8_t) (len >> 8);
|
|
||||||
header[2] = (uint8_t) len;
|
|
||||||
|
|
||||||
struct iovec iov[2];
|
|
||||||
iov[0].iov_base = header;
|
|
||||||
iov[0].iov_len = 3;
|
|
||||||
if (len == 0) {
|
|
||||||
return this->write_raw_(iov, 1, 3); // Just header
|
|
||||||
}
|
|
||||||
iov[1].iov_base = const_cast<uint8_t *>(data);
|
|
||||||
iov[1].iov_len = len;
|
|
||||||
|
|
||||||
return this->write_raw_(iov, 2, 3 + len); // Header + data
|
|
||||||
}
|
|
||||||
|
|
||||||
/** Initiate the data structures for the handshake.
|
|
||||||
*
|
|
||||||
* @return 0 on success, -1 on error (check errno)
|
|
||||||
*/
|
|
||||||
APIError APINoiseFrameHelper::init_handshake_() {
|
|
||||||
int err;
|
|
||||||
memset(&nid_, 0, sizeof(nid_));
|
|
||||||
// const char *proto = "Noise_NNpsk0_25519_ChaChaPoly_SHA256";
|
|
||||||
// err = noise_protocol_name_to_id(&nid_, proto, strlen(proto));
|
|
||||||
nid_.pattern_id = NOISE_PATTERN_NN;
|
|
||||||
nid_.cipher_id = NOISE_CIPHER_CHACHAPOLY;
|
|
||||||
nid_.dh_id = NOISE_DH_CURVE25519;
|
|
||||||
nid_.prefix_id = NOISE_PREFIX_STANDARD;
|
|
||||||
nid_.hybrid_id = NOISE_DH_NONE;
|
|
||||||
nid_.hash_id = NOISE_HASH_SHA256;
|
|
||||||
nid_.modifier_ids[0] = NOISE_MODIFIER_PSK0;
|
|
||||||
|
|
||||||
err = noise_handshakestate_new_by_id(&handshake_, &nid_, NOISE_ROLE_RESPONDER);
|
|
||||||
APIError aerr = handle_noise_error_(err, "noise_handshakestate_new_by_id", APIError::HANDSHAKESTATE_SETUP_FAILED);
|
|
||||||
if (aerr != APIError::OK)
|
|
||||||
return aerr;
|
|
||||||
|
|
||||||
const auto &psk = ctx_->get_psk();
|
|
||||||
err = noise_handshakestate_set_pre_shared_key(handshake_, psk.data(), psk.size());
|
|
||||||
aerr = handle_noise_error_(err, "noise_handshakestate_set_pre_shared_key", APIError::HANDSHAKESTATE_SETUP_FAILED);
|
|
||||||
if (aerr != APIError::OK)
|
|
||||||
return aerr;
|
|
||||||
|
|
||||||
err = noise_handshakestate_set_prologue(handshake_, prologue_.data(), prologue_.size());
|
|
||||||
aerr = handle_noise_error_(err, "noise_handshakestate_set_prologue", APIError::HANDSHAKESTATE_SETUP_FAILED);
|
|
||||||
if (aerr != APIError::OK)
|
|
||||||
return aerr;
|
|
||||||
// set_prologue copies it into handshakestate, so we can get rid of it now
|
|
||||||
prologue_ = {};
|
|
||||||
|
|
||||||
err = noise_handshakestate_start(handshake_);
|
|
||||||
aerr = handle_noise_error_(err, "noise_handshakestate_start", APIError::HANDSHAKESTATE_SETUP_FAILED);
|
|
||||||
if (aerr != APIError::OK)
|
|
||||||
return aerr;
|
|
||||||
return APIError::OK;
|
|
||||||
}
|
|
||||||
|
|
||||||
APIError APINoiseFrameHelper::check_handshake_finished_() {
|
|
||||||
assert(state_ == State::HANDSHAKE);
|
|
||||||
|
|
||||||
int action = noise_handshakestate_get_action(handshake_);
|
|
||||||
if (action == NOISE_ACTION_READ_MESSAGE || action == NOISE_ACTION_WRITE_MESSAGE)
|
|
||||||
return APIError::OK;
|
|
||||||
if (action != NOISE_ACTION_SPLIT) {
|
|
||||||
state_ = State::FAILED;
|
|
||||||
HELPER_LOG("Bad action for handshake: %d", action);
|
|
||||||
return APIError::HANDSHAKESTATE_BAD_STATE;
|
|
||||||
}
|
|
||||||
int err = noise_handshakestate_split(handshake_, &send_cipher_, &recv_cipher_);
|
|
||||||
APIError aerr = handle_noise_error_(err, "noise_handshakestate_split", APIError::HANDSHAKESTATE_SPLIT_FAILED);
|
|
||||||
if (aerr != APIError::OK)
|
|
||||||
return aerr;
|
|
||||||
|
|
||||||
frame_footer_size_ = noise_cipherstate_get_mac_length(send_cipher_);
|
|
||||||
|
|
||||||
HELPER_LOG("Handshake complete!");
|
|
||||||
noise_handshakestate_free(handshake_);
|
|
||||||
handshake_ = nullptr;
|
|
||||||
state_ = State::DATA;
|
|
||||||
return APIError::OK;
|
|
||||||
}
|
|
||||||
|
|
||||||
APINoiseFrameHelper::~APINoiseFrameHelper() {
|
|
||||||
if (handshake_ != nullptr) {
|
|
||||||
noise_handshakestate_free(handshake_);
|
|
||||||
handshake_ = nullptr;
|
|
||||||
}
|
|
||||||
if (send_cipher_ != nullptr) {
|
|
||||||
noise_cipherstate_free(send_cipher_);
|
|
||||||
send_cipher_ = nullptr;
|
|
||||||
}
|
|
||||||
if (recv_cipher_ != nullptr) {
|
|
||||||
noise_cipherstate_free(recv_cipher_);
|
|
||||||
recv_cipher_ = nullptr;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
extern "C" {
|
|
||||||
// declare how noise generates random bytes (here with a good HWRNG based on the RF system)
|
|
||||||
void noise_rand_bytes(void *output, size_t len) {
|
|
||||||
if (!esphome::random_bytes(reinterpret_cast<uint8_t *>(output), len)) {
|
|
||||||
ESP_LOGE(TAG, "Acquiring random bytes failed; rebooting");
|
|
||||||
arch_restart();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace esphome::api
|
|
||||||
#endif // USE_API_NOISE
|
|
||||||
#endif // USE_API
|
|
@ -1,68 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_NOISE
#include "noise/protocol.h"
#include "api_noise_context.h"

namespace esphome::api {

class APINoiseFrameHelper : public APIFrameHelper {
 public:
  APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx,
                      const ClientInfo *client_info)
      : APIFrameHelper(std::move(socket), client_info), ctx_(std::move(ctx)) {
    // Noise header structure:
    // Pos 0: indicator (0x01)
    // Pos 1-2: encrypted payload size (16-bit big-endian)
    // Pos 3-6: encrypted type (16-bit) + data_len (16-bit)
    // Pos 7+: actual payload data
    frame_header_padding_ = 7;
  }
  ~APINoiseFrameHelper() override;
  APIError init() override;
  APIError loop() override;
  APIError read_packet(ReadPacketBuffer *buffer) override;
  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
  // Get the frame header padding required by this protocol
  uint8_t frame_header_padding() override { return frame_header_padding_; }
  // Get the frame footer size required by this protocol
  uint8_t frame_footer_size() override { return frame_footer_size_; }

 protected:
  APIError state_action_();
  APIError try_read_frame_(std::vector<uint8_t> *frame);
  APIError write_frame_(const uint8_t *data, uint16_t len);
  APIError init_handshake_();
  APIError check_handshake_finished_();
  void send_explicit_handshake_reject_(const std::string &reason);
  APIError handle_handshake_frame_error_(APIError aerr);
  APIError handle_noise_error_(int err, const char *func_name, APIError api_err);

  // Pointers first (4 bytes each)
  NoiseHandshakeState *handshake_{nullptr};
  NoiseCipherState *send_cipher_{nullptr};
  NoiseCipherState *recv_cipher_{nullptr};

  // Shared pointer (8 bytes on 32-bit = 4 bytes control block pointer + 4 bytes object pointer)
  std::shared_ptr<APINoiseContext> ctx_;

  // Vector (12 bytes on 32-bit)
  std::vector<uint8_t> prologue_;

  // NoiseProtocolId (size depends on implementation)
  NoiseProtocolId nid_;

  // Group small types together
  // Fixed-size header buffer for noise protocol:
  // 1 byte for indicator + 2 bytes for message size (16-bit value, not varint)
  // Note: Maximum message size is UINT16_MAX (65535), with a limit of 128 bytes during handshake phase
  uint8_t rx_header_buf_[3];
  uint8_t rx_header_buf_len_ = 0;
  // 4 bytes total, no padding
};

}  // namespace esphome::api
#endif  // USE_API_NOISE
#endif  // USE_API
@ -1,290 +0,0 @@
#include "api_frame_helper_plaintext.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT
#include "api_connection.h"  // For ClientInfo struct
#include "esphome/core/application.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "proto.h"
#include <cstring>
#include <cinttypes>

namespace esphome::api {

static const char *const TAG = "api.plaintext";

#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s: " msg, this->client_info_->get_combined_info().c_str(), ##__VA_ARGS__)

#ifdef HELPER_LOG_PACKETS
#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
#else
#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
#define LOG_PACKET_SENDING(data, len) ((void) 0)
#endif

/// Initialize the frame helper; returns OK if successful.
APIError APIPlaintextFrameHelper::init() {
  APIError err = init_common_();
  if (err != APIError::OK) {
    return err;
  }

  state_ = State::DATA;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::loop() {
  if (state_ != State::DATA) {
    return APIError::BAD_STATE;
  }
  // Use base class implementation for buffer sending
  return APIFrameHelper::loop();
}

/** Read a packet into the rx_buf_. If successful, stores frame data in the frame parameter.
 *
 * @param frame The struct to hold the parsed frame information in.
 *
 * @return See APIError
 *
 * error API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
 */
APIError APIPlaintextFrameHelper::try_read_frame_(std::vector<uint8_t> *frame) {
  if (frame == nullptr) {
    HELPER_LOG("Bad argument for try_read_frame_");
    return APIError::BAD_ARG;
  }

  // read header
  while (!rx_header_parsed_) {
    // Now that we know when the socket is ready, we can read up to 3 bytes
    // into the rx_header_buf_ before we have to switch back to reading
    // one byte at a time to ensure we don't read past the message and
    // into the next one.

    // Read directly into rx_header_buf_ at the current position
    // Try to get to at least 3 bytes total (indicator + 2 varint bytes), then read one byte at a time
    ssize_t received =
        this->socket_->read(&rx_header_buf_[rx_header_buf_pos_], rx_header_buf_pos_ < 3 ? 3 - rx_header_buf_pos_ : 1);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }

    // If this was the first read, validate the indicator byte
    if (rx_header_buf_pos_ == 0 && received > 0) {
      if (rx_header_buf_[0] != 0x00) {
        state_ = State::FAILED;
        HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
        return APIError::BAD_INDICATOR;
      }
    }

    rx_header_buf_pos_ += received;

    // Check for buffer overflow
    if (rx_header_buf_pos_ >= sizeof(rx_header_buf_)) {
      state_ = State::FAILED;
      HELPER_LOG("Header buffer overflow");
      return APIError::BAD_DATA_PACKET;
    }

    // Need at least 3 bytes total (indicator + 2 varint bytes) before trying to parse
    if (rx_header_buf_pos_ < 3) {
      continue;
    }

    // At this point, we have at least 3 bytes total:
    //   - Validated indicator byte (0x00) stored at position 0
    //   - At least 2 bytes in the buffer for the varints
    // Buffer layout:
    //   [0]: indicator byte (0x00)
    //   [1-3]: Message size varint (variable length)
    //     - 2 bytes would only allow up to 16383, which is less than noise's UINT16_MAX (65535)
    //     - 3 bytes allows up to 2097151, ensuring we support at least as much as noise
    //   [2-5]: Message type varint (variable length)
    // We now attempt to parse both varints. If either is incomplete,
    // we'll continue reading more bytes.

    // Skip indicator byte at position 0
    uint8_t varint_pos = 1;
    uint32_t consumed = 0;

    auto msg_size_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
    if (!msg_size_varint.has_value()) {
      // not enough data there yet
      continue;
    }

    if (msg_size_varint->as_uint32() > std::numeric_limits<uint16_t>::max()) {
      state_ = State::FAILED;
      HELPER_LOG("Bad packet: message size %" PRIu32 " exceeds maximum %u", msg_size_varint->as_uint32(),
                 std::numeric_limits<uint16_t>::max());
      return APIError::BAD_DATA_PACKET;
    }
    rx_header_parsed_len_ = msg_size_varint->as_uint16();

    // Move to next varint position
    varint_pos += consumed;

    auto msg_type_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
    if (!msg_type_varint.has_value()) {
      // not enough data there yet
      continue;
    }
    if (msg_type_varint->as_uint32() > std::numeric_limits<uint16_t>::max()) {
      state_ = State::FAILED;
      HELPER_LOG("Bad packet: message type %" PRIu32 " exceeds maximum %u", msg_type_varint->as_uint32(),
                 std::numeric_limits<uint16_t>::max());
      return APIError::BAD_DATA_PACKET;
    }
    rx_header_parsed_type_ = msg_type_varint->as_uint16();
    rx_header_parsed_ = true;
  }
  // header reading done

  // reserve space for body
  if (rx_buf_.size() != rx_header_parsed_len_) {
    rx_buf_.resize(rx_header_parsed_len_);
  }

  if (rx_buf_len_ < rx_header_parsed_len_) {
    // more data to read
    uint16_t to_read = rx_header_parsed_len_ - rx_buf_len_;
    ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }
    rx_buf_len_ += static_cast<uint16_t>(received);
    if (static_cast<uint16_t>(received) != to_read) {
      // not all read
      return APIError::WOULD_BLOCK;
    }
  }

  LOG_PACKET_RECEIVED(rx_buf_);
  *frame = std::move(rx_buf_);
  // consume msg
  rx_buf_ = {};
  rx_buf_len_ = 0;
  rx_header_buf_pos_ = 0;
  rx_header_parsed_ = false;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::read_packet(ReadPacketBuffer *buffer) {
  APIError aerr;

  if (state_ != State::DATA) {
    return APIError::WOULD_BLOCK;
  }

  std::vector<uint8_t> frame;
  aerr = try_read_frame_(&frame);
  if (aerr != APIError::OK) {
    if (aerr == APIError::BAD_INDICATOR) {
      // Make sure to tell the remote that we don't
      // understand the indicator byte so it knows
      // we do not support it.
      struct iovec iov[1];
      // The \x00 first byte is the marker for plaintext.
      //
      // The remote will know how to handle the indicator byte,
      // but it likely won't understand the rest of the message.
      //
      // We must send at least 3 bytes to be read, so we add
      // a message after the indicator byte to ensure it's long
      // enough and can aid in debugging.
      const char msg[] = "\x00"
                         "Bad indicator byte";
      iov[0].iov_base = (void *) msg;
      iov[0].iov_len = 19;
      this->write_raw_(iov, 1, 19);
    }
    return aerr;
  }

  buffer->container = std::move(frame);
  buffer->data_offset = 0;
  buffer->data_len = rx_header_parsed_len_;
  buffer->type = rx_header_parsed_type_;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
  PacketInfo packet{type, 0, static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_)};
  return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
}

APIError APIPlaintextFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
  if (state_ != State::DATA) {
    return APIError::BAD_STATE;
  }

  if (packets.empty()) {
    return APIError::OK;
  }

  std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
  uint8_t *buffer_data = raw_buffer->data();  // Cache buffer pointer

  this->reusable_iovs_.clear();
  this->reusable_iovs_.reserve(packets.size());
  uint16_t total_write_len = 0;

  for (const auto &packet : packets) {
    // Calculate varint sizes for header layout
    uint8_t size_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.payload_size));
    uint8_t type_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.message_type));
    uint8_t total_header_len = 1 + size_varint_len + type_varint_len;

    // Calculate where to start writing the header
    // The header starts at the latest possible position to minimize unused padding
    //
    // Example 1 (small values): total_header_len = 3, header_offset = 6 - 3 = 3
    //   [0-2]  - Unused padding
    //   [3]    - 0x00 indicator byte
    //   [4]    - Payload size varint (1 byte, for sizes 0-127)
    //   [5]    - Message type varint (1 byte, for types 0-127)
    //   [6...] - Actual payload data
    //
    // Example 2 (medium values): total_header_len = 4, header_offset = 6 - 4 = 2
    //   [0-1]  - Unused padding
    //   [2]    - 0x00 indicator byte
    //   [3-4]  - Payload size varint (2 bytes, for sizes 128-16383)
    //   [5]    - Message type varint (1 byte, for types 0-127)
    //   [6...] - Actual payload data
    //
    // Example 3 (large values): total_header_len = 6, header_offset = 6 - 6 = 0
    //   [0]    - 0x00 indicator byte
    //   [1-3]  - Payload size varint (3 bytes, for sizes 16384-2097151)
    //   [4-5]  - Message type varint (2 bytes, for types 128-32767)
    //   [6...] - Actual payload data
    //
    // The message starts at offset + frame_header_padding_
    // So we write the header starting at offset + frame_header_padding_ - total_header_len
    uint8_t *buf_start = buffer_data + packet.offset;
    uint32_t header_offset = frame_header_padding_ - total_header_len;

    // Write the plaintext header
    buf_start[header_offset] = 0x00;  // indicator

    // Encode varints directly into buffer
    ProtoVarInt(packet.payload_size).encode_to_buffer_unchecked(buf_start + header_offset + 1, size_varint_len);
    ProtoVarInt(packet.message_type)
        .encode_to_buffer_unchecked(buf_start + header_offset + 1 + size_varint_len, type_varint_len);

    // Add iovec for this packet (header + payload)
    size_t packet_len = static_cast<size_t>(total_header_len + packet.payload_size);
    this->reusable_iovs_.push_back({buf_start + header_offset, packet_len});
    total_write_len += packet_len;
  }

  // Send all packets in one writev call
  return write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
}

}  // namespace esphome::api
#endif  // USE_API_PLAINTEXT
#endif  // USE_API
@ -1,53 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT

namespace esphome::api {

class APIPlaintextFrameHelper : public APIFrameHelper {
 public:
  APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
      : APIFrameHelper(std::move(socket), client_info) {
    // Plaintext header structure (worst case):
    // Pos 0: indicator (0x00)
    // Pos 1-3: payload size varint (up to 3 bytes)
    // Pos 4-5: message type varint (up to 2 bytes)
    // Pos 6+: actual payload data
    frame_header_padding_ = 6;
  }
  ~APIPlaintextFrameHelper() override = default;
  APIError init() override;
  APIError loop() override;
  APIError read_packet(ReadPacketBuffer *buffer) override;
  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
  // Get the frame header padding required by this protocol
  uint8_t frame_header_padding() override { return frame_header_padding_; }
  // Get the frame footer size required by this protocol
  uint8_t frame_footer_size() override { return frame_footer_size_; }

 protected:
  APIError try_read_frame_(std::vector<uint8_t> *frame);

  // Group 2-byte aligned types
  uint16_t rx_header_parsed_type_ = 0;
  uint16_t rx_header_parsed_len_ = 0;

  // Group 1-byte types together
  // Fixed-size header buffer for plaintext protocol:
  // We now store the indicator byte + the two varints.
  // To match noise protocol's maximum message size (UINT16_MAX = 65535), we need:
  // 1 byte for indicator + 3 bytes for message size varint (supports up to 2097151) + 2 bytes for message type varint
  //
  // While varints could theoretically be up to 10 bytes each for 64-bit values,
  // attempting to process messages with headers that large would likely crash the
  // ESP32 due to memory constraints.
  uint8_t rx_header_buf_[6];  // 1 byte indicator + 5 bytes for varints (3 for size + 2 for type)
  uint8_t rx_header_buf_pos_ = 0;
  bool rx_header_parsed_ = false;
  // 8 bytes total, no padding needed
};

}  // namespace esphome::api
#endif  // USE_API_PLAINTEXT
#endif  // USE_API
@ -1,30 +1,23 @@
 #pragma once
-#include <array>
 #include <cstdint>
+#include <array>
 #include "esphome/core/defines.h"

-namespace esphome::api {
+namespace esphome {
+namespace api {

 #ifdef USE_API_NOISE
 using psk_t = std::array<uint8_t, 32>;

 class APINoiseContext {
  public:
-  void set_psk(psk_t psk) {
-    this->psk_ = psk;
-    bool has_psk = false;
-    for (auto i : psk) {
-      has_psk |= i;
-    }
-    this->has_psk_ = has_psk;
-  }
-  const psk_t &get_psk() const { return this->psk_; }
-  bool has_psk() const { return this->has_psk_; }
+  void set_psk(psk_t psk) { psk_ = psk; }
+  const psk_t &get_psk() const { return psk_; }

  protected:
-  psk_t psk_{};
-  bool has_psk_{false};
+  psk_t psk_;
 };
 #endif  // USE_API_NOISE

-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
@ -21,10 +21,4 @@ extend google.protobuf.MessageOptions {
   optional string ifdef = 1038;
   optional bool log = 1039 [default=true];
   optional bool no_delay = 1040 [default=false];
-  optional string base_class = 1041;
-}
-
-extend google.protobuf.FieldOptions {
-  optional string field_ifdef = 1042;
-  optional uint32 fixed_array_size = 50007;
 }
File diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.