Mirror of https://github.com/esphome/esphome.git, synced 2025-08-02 16:37:46 +00:00
Compare commits
No commits in common. "dev" and "2025.7.2" have entirely different histories.
@ -1,222 +0,0 @@
# ESPHome AI Collaboration Guide

This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.

## 1. Project Overview & Purpose

* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.

## 2. Core Technologies & Stack

* **Languages:** Python (>=3.10), C++ (gnu++20)
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
  * **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
  * **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Manager(s):** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for native API), MQTT, HTTP.

## 3. Architectural Patterns

* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.
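
The shape of that pipeline can be illustrated with a small, self-contained sketch. This is not ESPHome's real implementation; `GPIOSwitch`, the option names, and the generated statement are placeholders, and only the `voluptuous` and `PyYAML` libraries listed above are assumed:

```python
# Conceptual sketch of the YAML -> validated dict -> generated C++ flow.
# Not ESPHome's real API: GPIOSwitch and the option names are made up.
import voluptuous as vol
import yaml

YAML_SOURCE = """
pin: 5
inverted: true
"""

SCHEMA = vol.Schema(
    {
        vol.Required("pin"): int,
        vol.Optional("inverted", default=False): bool,
    }
)

config = SCHEMA(yaml.safe_load(YAML_SOURCE))

# Emit a C++ statement from the validated configuration.
cpp_statement = f"new GPIOSwitch({config['pin']}, {str(config['inverted']).lower()});"
print(cpp_statement)  # -> new GPIOSwitch(5, true);
```

ESPHome's real implementation of these stages lives in the configuration and code-generation modules described below.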

* **Directory Structure Philosophy:**
  * `/esphome`: Contains the core Python source code for the ESPHome application.
  * `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
  * `/tests`: Contains all unit and integration tests for the Python code.
  * `/docker`: Contains Docker-related files for building and running ESPHome in a container.
  * `/script`: Contains helper scripts for development and maintenance.

* **Core Architectural Components:**
  1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
  2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python to C++ code generation, template processing, and build flag management.
  3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
  4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
  5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.

* **Platform Support:**
  1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (S2, S3, C3, etc.) and both IDF and Arduino frameworks.
  2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
  3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
  4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.

## 4. Coding Conventions & Style Guide

* **Formatting:**
  * **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
  * **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.

* **Naming Conventions:**
  * **Python:** Follows PEP 8. Use clear, descriptive names in snake_case.
  * **C++:** Follows the Google C++ Style Guide.

* **Component Structure:**
  * **Standard Files:**

    ```
    components/[component_name]/
    ├── __init__.py          # Component configuration schema and code generation
    ├── [component].h        # C++ header file (if needed)
    ├── [component].cpp      # C++ implementation (if needed)
    └── [platform]/          # Platform-specific implementations
        ├── __init__.py      # Platform-specific configuration
        ├── [platform].h     # Platform C++ header
        └── [platform].cpp   # Platform C++ implementation
    ```

  * **Component Metadata** (a minimal sketch with illustrative values follows this list):
    - `DEPENDENCIES`: List of required components
    - `AUTO_LOAD`: Components to automatically load
    - `CONFLICTS_WITH`: Incompatible components
    - `CODEOWNERS`: GitHub usernames responsible for maintenance
    - `MULTI_CONF`: Whether multiple instances are allowed
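
As a minimal sketch, a component's `__init__.py` declares this metadata as module-level constants. The values below are illustrative placeholders, not taken from a real component:

```python
# Illustrative metadata constants for a hypothetical component's __init__.py.
CODEOWNERS = ["@your_github_handle"]  # maintainers notified about issues/PRs
DEPENDENCIES = ["i2c"]                # components that must also be configured
AUTO_LOAD = ["sensor"]                # components loaded automatically
CONFLICTS_WITH = ["other_component"]  # components that cannot be used alongside this one
MULTI_CONF = True                     # allow multiple instances in one config
```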

* **Code Generation & Common Patterns:**
  * **Configuration Schema Pattern:**

    ```python
    import esphome.codegen as cg
    import esphome.config_validation as cv
    from esphome.const import CONF_KEY, CONF_ID

    CONF_PARAM = "param"  # A constant that does not yet exist in esphome/const.py

    my_component_ns = cg.esphome_ns.namespace("my_component")
    MyComponent = my_component_ns.class_("MyComponent", cg.Component)

    CONFIG_SCHEMA = cv.Schema({
        cv.GenerateID(): cv.declare_id(MyComponent),
        cv.Required(CONF_KEY): cv.string,
        cv.Optional(CONF_PARAM, default=42): cv.int_,
    }).extend(cv.COMPONENT_SCHEMA)

    async def to_code(config):
        var = cg.new_Pvariable(config[CONF_ID])
        await cg.register_component(var, config)
        cg.add(var.set_key(config[CONF_KEY]))
        cg.add(var.set_param(config[CONF_PARAM]))
    ```

  * **C++ Class Pattern:**

    ```cpp
    namespace esphome {
    namespace my_component {

    class MyComponent : public Component {
     public:
      void setup() override;
      void loop() override;
      void dump_config() override;

      void set_key(const std::string &key) { this->key_ = key; }
      void set_param(int param) { this->param_ = param; }

     protected:
      std::string key_;
      int param_{0};
    };

    }  // namespace my_component
    }  // namespace esphome
    ```

  * **Common Component Examples:**
    - **Sensor:**

      ```python
      from esphome.components import sensor

      CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(cv.polling_component_schema("60s"))

      async def to_code(config):
          var = await sensor.new_sensor(config)
          await cg.register_component(var, config)
      ```

    - **Binary Sensor:**

      ```python
      from esphome.components import binary_sensor

      CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })

      async def to_code(config):
          var = await binary_sensor.new_binary_sensor(config)
      ```

    - **Switch:**

      ```python
      from esphome.components import switch

      CONFIG_SCHEMA = switch.switch_schema().extend({ ... })

      async def to_code(config):
          var = await switch.new_switch(config)
      ```

  * **Configuration Validation:**
    * **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
    * **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
    * **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `cv.only_with_arduino`.
    * **Schema Extensions:**

      ```python
      CONFIG_SCHEMA = (
          cv.Schema({ ... })
          .extend(cv.COMPONENT_SCHEMA)
          .extend(uart.UART_DEVICE_SCHEMA)
          .extend(i2c.i2c_device_schema(0x48))
          .extend(spi.spi_device_schema(cs_pin_required=True))
      )
      ```
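
Combining several of these validators, a component schema might look like the following minimal sketch (the keys and defaults are made up for illustration; the validators themselves are the ones listed above):

```python
import esphome.config_validation as cv

# Hypothetical keys demonstrating the validators listed above.
CONFIG_SCHEMA = cv.All(
    cv.Schema(
        {
            cv.Required("name"): cv.All(cv.string, cv.Length(min=1, max=50)),
            cv.Optional("level", default=50): cv.int_range(min=0, max=100),
            cv.Optional("mode", default=0): cv.Any(cv.int_, cv.string),
        }
    ).extend(cv.COMPONENT_SCHEMA),
    cv.only_on(["esp32", "esp8266"]),  # reject configs for other target platforms
)
```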

## 5. Key Files & Entrypoints

* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
  * `pyproject.toml`: Defines the Python project metadata and dependencies.
  * `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
  * `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.

## 6. Development & Testing Workflow

* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
  * **Python:** Run unit tests with `pytest`.
  * **C++:** Use `clang-tidy` for static analysis.
  * **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:

    ```
    tests/
    ├── test_build_components/    # Base test configurations
    └── components/[component]/   # Component-specific tests
    ```

    Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
* **Debugging and Troubleshooting:**
  * **Debug Tools:**
    - `esphome config <file>.yaml` to validate configuration.
    - `esphome compile <file>.yaml` to compile without uploading.
    - Check the Dashboard for real-time logs.
    - Use component-specific debug logging.
  * **Common Issues:**
    - **Import Errors**: Check component dependencies and `PYTHONPATH`.
    - **Validation Errors**: Review configuration schema definitions.
    - **Build Errors**: Check platform compatibility and library versions.
    - **Runtime Errors**: Review generated C++ code and component logic.

## 7. Specific Instructions for AI Collaboration

* **Contribution Workflow (Pull Request Process):**
  1. **Fork & Branch:** Create a new branch in your fork.
  2. **Make Changes:** Adhere to all coding conventions and patterns.
  3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
  4. **Lint:** Run `pre-commit` to ensure code is compliant.
  5. **Commit:** Commit your changes. There is no strict format for commit messages.
  6. **Pull Request:** Submit a PR against the `dev` branch. The pull request title should be prefixed with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation and examples, and add `CODEOWNERS` entries as needed. Pull requests should always be made with the `PULL_REQUEST_TEMPLATE.md` template filled out correctly.

* **Documentation Contributions:**
  * Documentation is hosted in the separate `esphome/esphome-docs` repository.
  * The contribution workflow is the same as for the codebase.

* **Best Practices:**
  * **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
  * **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
  * **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.

* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.

* **Dependencies & Build System Integration** (a minimal sketch follows this list):
  * **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and `pyproject.toml`.
  * **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
  * **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags.
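
A minimal sketch of how this looks inside a component's `to_code` function follows; the library name/version and the define are placeholders, while `cg.add_library` and `cg.add_build_flag` are the calls referred to above:

```python
import esphome.codegen as cg


async def to_code(config):
    # Declare a PlatformIO library dependency (placeholder name and version).
    cg.add_library("bblanchon/ArduinoJson", "6.18.5")
    # Pass an extra compile-time define to the generated C++ build.
    cg.add_build_flag("-DMY_COMPONENT_USE_EXTRA_FEATURE")
```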

@ -1 +0,0 @@
6af8b429b94191fe8e239fcb3b73f7982d0266cb5b05ffbc81edaeac1bc8c273

@ -1,4 +1,4 @@
[run]
omit =
    esphome/components/*
    tests/integration/*

92 .github/ISSUE_TEMPLATE/bug_report.yml vendored
@ -1,92 +0,0 @@
name: Report an issue with ESPHome
description: Report an issue with ESPHome.
body:
  - type: markdown
    attributes:
      value: |
        This issue form is for reporting bugs only!

        If you have a feature request or enhancement, please [request them here instead][fr].

        [fr]: https://github.com/orgs/esphome/discussions
  - type: textarea
    validations:
      required: true
    id: problem
    attributes:
      label: The problem
      description: >-
        Describe the issue you are experiencing here to communicate to the
        maintainers. Tell us what you were trying to do and what happened.

        Provide a clear and concise description of what the problem is.

  - type: markdown
    attributes:
      value: |
        ## Environment
  - type: input
    id: version
    validations:
      required: true
    attributes:
      label: Which version of ESPHome has the issue?
      description: >
        ESPHome version like 1.19, 2025.6.0 or 2025.XX.X-dev.
  - type: dropdown
    validations:
      required: true
    id: installation
    attributes:
      label: What type of installation are you using?
      options:
        - Home Assistant Add-on
        - Docker
        - pip
  - type: dropdown
    validations:
      required: true
    id: platform
    attributes:
      label: What platform are you using?
      options:
        - ESP8266
        - ESP32
        - RP2040
        - BK72XX
        - RTL87XX
        - LN882X
        - Host
        - Other
  - type: input
    id: component_name
    attributes:
      label: Component causing the issue
      description: >
        The name of the component or platform. For example, api/i2c or ultrasonic.

  - type: markdown
    attributes:
      value: |
        # Details
  - type: textarea
    id: config
    attributes:
      label: YAML Config
      description: |
        Include a complete YAML configuration file demonstrating the problem here. Preferably post the *entire* file - don't make assumptions about what is unimportant. However, if it's a large or complicated config then you will need to reduce it to the smallest possible file *that still demonstrates the problem*. If you don't provide enough information to *easily* reproduce the problem, it's unlikely your bug report will get any attention. Logs do not belong here, attach them below.
      render: yaml
  - type: textarea
    id: logs
    attributes:
      label: Anything in the logs that might be useful for us?
      description: For example, error message, or stack traces. Serial or USB logs are much more useful than WiFi logs.
      render: txt
  - type: textarea
    id: additional
    attributes:
      label: Additional information
      description: >
        If you have any additional information for us, use the field below.
        Please note, you can attach screenshots or screen recordings here, by
        dragging and dropping files in the field below.

26 .github/ISSUE_TEMPLATE/config.yml vendored
@ -1,21 +1,15 @@
 ---
 blank_issues_enabled: false
 contact_links:
-  - name: Report an issue with the ESPHome documentation
-    url: https://github.com/esphome/esphome-docs/issues/new/choose
-    about: Report an issue with the ESPHome documentation.
-  - name: Report an issue with the ESPHome web server
-    url: https://github.com/esphome/esphome-webserver/issues/new/choose
-    about: Report an issue with the ESPHome web server.
-  - name: Report an issue with the ESPHome Builder / Dashboard
-    url: https://github.com/esphome/dashboard/issues/new/choose
-    about: Report an issue with the ESPHome Builder / Dashboard.
-  - name: Report an issue with the ESPHome API client
-    url: https://github.com/esphome/aioesphomeapi/issues/new/choose
-    about: Report an issue with the ESPHome API client.
-  - name: Make a Feature Request
-    url: https://github.com/orgs/esphome/discussions
-    about: Please create feature requests in the dedicated feature request tracker.
+  - name: Issue Tracker
+    url: https://github.com/esphome/issues
+    about: Please create bug reports in the dedicated issue tracker.
+  - name: Feature Request Tracker
+    url: https://github.com/esphome/feature-requests
+    about: |
+      Please create feature requests in the dedicated feature request tracker.
   - name: Frequently Asked Question
     url: https://esphome.io/guides/faq.html
-    about: Please view the FAQ for common questions and what to include in a bug report.
+    about: |
+      Please view the FAQ for common questions and what
+      to include in a bug report.

1 .github/PULL_REQUEST_TEMPLATE.md vendored
@ -26,7 +26,6 @@
 - [ ] RP2040
 - [ ] BK72xx
 - [ ] RTL87xx
-- [ ] nRF52840

 ## Example entry for `config.yaml`:


2 .github/actions/restore-python/action.yml vendored
@ -41,7 +41,7 @@ runs:
       shell: bash
       run: |
         python -m venv venv
-        source ./venv/Scripts/activate
+        ./venv/Scripts/activate
         python --version
         pip install -r requirements.txt -r requirements_test.txt
         pip install -e .

1 .github/copilot-instructions.md vendored
@ -1 +0,0 @@
../.ai/instructions.md

9 .github/dependabot.yml vendored
@ -9,9 +9,6 @@ updates:
         # Hypotehsis is only used for testing and is updated quite often
         - dependency-name: hypothesis
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/"
     schedule:
       interval: daily
@ -23,17 +20,11 @@
         - "docker/login-action"
         - "docker/setup-buildx-action"
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/build-image"
     schedule:
       interval: daily
     open-pull-requests-limit: 10
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/restore-python"
     schedule:
       interval: daily

652 .github/workflows/auto-label-pr.yml vendored
@ -1,652 +0,0 @@
|
|||||||
name: Auto Label PR
|
|
||||||
|
|
||||||
on:
|
|
||||||
# Runs only on pull_request_target due to having access to a App token.
|
|
||||||
# This means PRs from forks will not be able to alter this workflow to get the tokens
|
|
||||||
pull_request_target:
|
|
||||||
types: [labeled, opened, reopened, synchronize, edited]
|
|
||||||
|
|
||||||
permissions:
|
|
||||||
pull-requests: write
|
|
||||||
contents: read
|
|
||||||
|
|
||||||
env:
|
|
||||||
SMALL_PR_THRESHOLD: 30
|
|
||||||
MAX_LABELS: 15
|
|
||||||
TOO_BIG_THRESHOLD: 1000
|
|
||||||
COMPONENT_LABEL_THRESHOLD: 10
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
label:
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
|
|
||||||
steps:
|
|
||||||
- name: Checkout
|
|
||||||
uses: actions/checkout@v4.2.2
|
|
||||||
|
|
||||||
- name: Generate a token
|
|
||||||
id: generate-token
|
|
||||||
uses: actions/create-github-app-token@v2
|
|
||||||
with:
|
|
||||||
app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
|
|
||||||
private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}
|
|
||||||
|
|
||||||
- name: Auto Label PR
|
|
||||||
uses: actions/github-script@v7.0.1
|
|
||||||
with:
|
|
||||||
github-token: ${{ steps.generate-token.outputs.token }}
|
|
||||||
script: |
|
|
||||||
const fs = require('fs');
|
|
||||||
|
|
||||||
// Constants
|
|
||||||
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
|
|
||||||
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
|
|
||||||
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
|
|
||||||
const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
|
|
||||||
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
|
|
||||||
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
|
|
||||||
const TOO_BIG_MARKER = '<!-- too-big-request -->';
|
|
||||||
|
|
||||||
const MANAGED_LABELS = [
|
|
||||||
'new-component',
|
|
||||||
'new-platform',
|
|
||||||
'new-target-platform',
|
|
||||||
'merging-to-release',
|
|
||||||
'merging-to-beta',
|
|
||||||
'core',
|
|
||||||
'small-pr',
|
|
||||||
'dashboard',
|
|
||||||
'github-actions',
|
|
||||||
'by-code-owner',
|
|
||||||
'has-tests',
|
|
||||||
'needs-tests',
|
|
||||||
'needs-docs',
|
|
||||||
'needs-codeowners',
|
|
||||||
'too-big',
|
|
||||||
'labeller-recheck',
|
|
||||||
'bugfix',
|
|
||||||
'new-feature',
|
|
||||||
'breaking-change',
|
|
||||||
'code-quality'
|
|
||||||
];
|
|
||||||
|
|
||||||
const DOCS_PR_PATTERNS = [
|
|
||||||
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
|
|
||||||
/esphome\/esphome-docs#\d+/
|
|
||||||
];
|
|
||||||
|
|
||||||
// Global state
|
|
||||||
const { owner, repo } = context.repo;
|
|
||||||
const pr_number = context.issue.number;
|
|
||||||
|
|
||||||
// Get current labels and PR data
|
|
||||||
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number
|
|
||||||
});
|
|
||||||
const currentLabels = currentLabelsData.map(label => label.name);
|
|
||||||
const managedLabels = currentLabels.filter(label =>
|
|
||||||
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
|
|
||||||
);
|
|
||||||
|
|
||||||
// Check for mega-PR early - if present, skip most automatic labeling
|
|
||||||
const isMegaPR = currentLabels.includes('mega-pr');
|
|
||||||
|
|
||||||
// Get all PR files with automatic pagination
|
|
||||||
const prFiles = await github.paginate(
|
|
||||||
github.rest.pulls.listFiles,
|
|
||||||
{
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number
|
|
||||||
}
|
|
||||||
);
|
|
||||||
|
|
||||||
// Calculate data from PR files
|
|
||||||
const changedFiles = prFiles.map(file => file.filename);
|
|
||||||
const totalChanges = prFiles.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
|
|
||||||
console.log('Current labels:', currentLabels.join(', '));
|
|
||||||
console.log('Changed files:', changedFiles.length);
|
|
||||||
console.log('Total changes:', totalChanges);
|
|
||||||
if (isMegaPR) {
|
|
||||||
console.log('Mega-PR detected - applying limited labeling logic');
|
|
||||||
}
|
|
||||||
|
|
||||||
// Fetch API data
|
|
||||||
async function fetchApiData() {
|
|
||||||
try {
|
|
||||||
const response = await fetch('https://data.esphome.io/components.json');
|
|
||||||
const componentsData = await response.json();
|
|
||||||
return {
|
|
||||||
targetPlatforms: componentsData.target_platforms || [],
|
|
||||||
platformComponents: componentsData.platform_components || []
|
|
||||||
};
|
|
||||||
} catch (error) {
|
|
||||||
console.log('Failed to fetch components data from API:', error.message);
|
|
||||||
return { targetPlatforms: [], platformComponents: [] };
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Merge branch detection
|
|
||||||
async function detectMergeBranch() {
|
|
||||||
const labels = new Set();
|
|
||||||
const baseRef = context.payload.pull_request.base.ref;
|
|
||||||
|
|
||||||
if (baseRef === 'release') {
|
|
||||||
labels.add('merging-to-release');
|
|
||||||
} else if (baseRef === 'beta') {
|
|
||||||
labels.add('merging-to-beta');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Component and platform labeling
|
|
||||||
async function detectComponentPlatforms(apiData) {
|
|
||||||
const labels = new Set();
|
|
||||||
const componentRegex = /^esphome\/components\/([^\/]+)\//;
|
|
||||||
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
|
|
||||||
|
|
||||||
for (const file of changedFiles) {
|
|
||||||
const componentMatch = file.match(componentRegex);
|
|
||||||
if (componentMatch) {
|
|
||||||
labels.add(`component: ${componentMatch[1]}`);
|
|
||||||
}
|
|
||||||
|
|
||||||
const platformMatch = file.match(targetPlatformRegex);
|
|
||||||
if (platformMatch) {
|
|
||||||
labels.add(`platform: ${platformMatch[1]}`);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: New component detection
|
|
||||||
async function detectNewComponents() {
|
|
||||||
const labels = new Set();
|
|
||||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
|
||||||
|
|
||||||
for (const file of addedFiles) {
|
|
||||||
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
|
|
||||||
if (componentMatch) {
|
|
||||||
try {
|
|
||||||
const content = fs.readFileSync(file, 'utf8');
|
|
||||||
if (content.includes('IS_TARGET_PLATFORM = True')) {
|
|
||||||
labels.add('new-target-platform');
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.log(`Failed to read content of ${file}:`, error.message);
|
|
||||||
}
|
|
||||||
labels.add('new-component');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: New platform detection
|
|
||||||
async function detectNewPlatforms(apiData) {
|
|
||||||
const labels = new Set();
|
|
||||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
|
||||||
|
|
||||||
for (const file of addedFiles) {
|
|
||||||
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
|
|
||||||
if (platformFileMatch) {
|
|
||||||
const [, component, platform] = platformFileMatch;
|
|
||||||
if (apiData.platformComponents.includes(platform)) {
|
|
||||||
labels.add('new-platform');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
|
|
||||||
if (platformDirMatch) {
|
|
||||||
const [, component, platform] = platformDirMatch;
|
|
||||||
if (apiData.platformComponents.includes(platform)) {
|
|
||||||
labels.add('new-platform');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Core files detection
|
|
||||||
async function detectCoreChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const coreFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('esphome/core/') ||
|
|
||||||
(file.startsWith('esphome/') && file.split('/').length === 2)
|
|
||||||
);
|
|
||||||
|
|
||||||
if (coreFiles.length > 0) {
|
|
||||||
labels.add('core');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: PR size detection
|
|
||||||
async function detectPRSize() {
|
|
||||||
const labels = new Set();
|
|
||||||
const testChanges = prFiles
|
|
||||||
.filter(file => file.filename.startsWith('tests/'))
|
|
||||||
.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
|
|
||||||
const nonTestChanges = totalChanges - testChanges;
|
|
||||||
|
|
||||||
if (totalChanges <= SMALL_PR_THRESHOLD) {
|
|
||||||
labels.add('small-pr');
|
|
||||||
}
|
|
||||||
|
|
||||||
// Don't add too-big if mega-pr label is already present
|
|
||||||
if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
|
|
||||||
labels.add('too-big');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Dashboard changes
|
|
||||||
async function detectDashboardChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const dashboardFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('esphome/dashboard/') ||
|
|
||||||
file.startsWith('esphome/components/dashboard_import/')
|
|
||||||
);
|
|
||||||
|
|
||||||
if (dashboardFiles.length > 0) {
|
|
||||||
labels.add('dashboard');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: GitHub Actions changes
|
|
||||||
async function detectGitHubActionsChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const githubActionsFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('.github/workflows/')
|
|
||||||
);
|
|
||||||
|
|
||||||
if (githubActionsFiles.length > 0) {
|
|
||||||
labels.add('github-actions');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Code owner detection
|
|
||||||
async function detectCodeOwner() {
|
|
||||||
const labels = new Set();
|
|
||||||
|
|
||||||
try {
|
|
||||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
path: 'CODEOWNERS',
|
|
||||||
});
|
|
||||||
|
|
||||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
|
||||||
const prAuthor = context.payload.pull_request.user.login;
|
|
||||||
|
|
||||||
const codeownersLines = codeownersContent.split('\n')
|
|
||||||
.map(line => line.trim())
|
|
||||||
.filter(line => line && !line.startsWith('#'));
|
|
||||||
|
|
||||||
const codeownersRegexes = codeownersLines.map(line => {
|
|
||||||
const parts = line.split(/\s+/);
|
|
||||||
const pattern = parts[0];
|
|
||||||
const owners = parts.slice(1);
|
|
||||||
|
|
||||||
let regex;
|
|
||||||
if (pattern.endsWith('*')) {
|
|
||||||
const dir = pattern.slice(0, -1);
|
|
||||||
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
|
|
||||||
} else if (pattern.includes('*')) {
|
|
||||||
// First escape all regex special chars except *, then replace * with .*
|
|
||||||
const regexPattern = pattern
|
|
||||||
.replace(/[.+?^${}()|[\]\\]/g, '\\$&')
|
|
||||||
.replace(/\*/g, '.*');
|
|
||||||
regex = new RegExp(`^${regexPattern}$`);
|
|
||||||
} else {
|
|
||||||
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
|
|
||||||
}
|
|
||||||
|
|
||||||
return { regex, owners };
|
|
||||||
});
|
|
||||||
|
|
||||||
for (const file of changedFiles) {
|
|
||||||
for (const { regex, owners } of codeownersRegexes) {
|
|
||||||
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
|
|
||||||
labels.add('by-code-owner');
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.log('Failed to read or parse CODEOWNERS file:', error.message);
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Test detection
|
|
||||||
async function detectTests() {
|
|
||||||
const labels = new Set();
|
|
||||||
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
|
|
||||||
|
|
||||||
if (testFiles.length > 0) {
|
|
||||||
labels.add('has-tests');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: PR Template Checkbox detection
|
|
||||||
async function detectPRTemplateCheckboxes() {
|
|
||||||
const labels = new Set();
|
|
||||||
const prBody = context.payload.pull_request.body || '';
|
|
||||||
|
|
||||||
console.log('Checking PR template checkboxes...');
|
|
||||||
|
|
||||||
// Check for checked checkboxes in the "Types of changes" section
|
|
||||||
const checkboxPatterns = [
|
|
||||||
{ pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
|
|
||||||
{ pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
|
|
||||||
{ pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
|
|
||||||
{ pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
|
|
||||||
];
|
|
||||||
|
|
||||||
for (const { pattern, label } of checkboxPatterns) {
|
|
||||||
if (pattern.test(prBody)) {
|
|
||||||
console.log(`Found checked checkbox for: ${label}`);
|
|
||||||
labels.add(label);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Requirements detection
|
|
||||||
async function detectRequirements(allLabels) {
|
|
||||||
const labels = new Set();
|
|
||||||
|
|
||||||
// Check for missing tests
|
|
||||||
if ((allLabels.has('new-component') || allLabels.has('new-platform')) && !allLabels.has('has-tests')) {
|
|
||||||
labels.add('needs-tests');
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check for missing docs
|
|
||||||
if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
|
|
||||||
const prBody = context.payload.pull_request.body || '';
|
|
||||||
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
|
|
||||||
|
|
||||||
if (!hasDocsLink) {
|
|
||||||
labels.add('needs-docs');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check for missing CODEOWNERS
|
|
||||||
if (allLabels.has('new-component')) {
|
|
||||||
const codeownersModified = prFiles.some(file =>
|
|
||||||
file.filename === 'CODEOWNERS' &&
|
|
||||||
(file.status === 'modified' || file.status === 'added') &&
|
|
||||||
(file.additions || 0) > 0
|
|
||||||
);
|
|
||||||
|
|
||||||
if (!codeownersModified) {
|
|
||||||
labels.add('needs-codeowners');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Generate review messages
|
|
||||||
function generateReviewMessages(finalLabels) {
|
|
||||||
const messages = [];
|
|
||||||
const prAuthor = context.payload.pull_request.user.login;
|
|
||||||
|
|
||||||
// Too big message
|
|
||||||
if (finalLabels.includes('too-big')) {
|
|
||||||
const testChanges = prFiles
|
|
||||||
.filter(file => file.filename.startsWith('tests/'))
|
|
||||||
.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
const nonTestChanges = totalChanges - testChanges;
|
|
||||||
|
|
||||||
const tooManyLabels = finalLabels.length > MAX_LABELS;
|
|
||||||
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
|
|
||||||
|
|
||||||
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
|
|
||||||
|
|
||||||
if (tooManyLabels && tooManyChanges) {
|
|
||||||
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
|
|
||||||
} else if (tooManyLabels) {
|
|
||||||
message += `This PR affects ${finalLabels.length} different components/areas.`;
|
|
||||||
} else {
|
|
||||||
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
|
|
||||||
}
|
|
||||||
|
|
||||||
message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
|
|
||||||
message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;
|
|
||||||
|
|
||||||
messages.push(message);
|
|
||||||
}
|
|
||||||
|
|
||||||
// CODEOWNERS message
|
|
||||||
if (finalLabels.includes('needs-codeowners')) {
|
|
||||||
const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
|
|
||||||
`Hey there @${prAuthor},\n` +
|
|
||||||
`Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
|
|
||||||
`This way we can notify you if a bug report for this integration is reported.\n\n` +
|
|
||||||
`In \`__init__.py\` of the integration, please add:\n\n` +
|
|
||||||
`\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
|
|
||||||
`And run \`script/build_codeowners.py\``;
|
|
||||||
|
|
||||||
messages.push(message);
|
|
||||||
}
|
|
||||||
|
|
||||||
return messages;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Handle reviews
|
|
||||||
async function handleReviews(finalLabels) {
|
|
||||||
const reviewMessages = generateReviewMessages(finalLabels);
|
|
||||||
const hasReviewableLabels = finalLabels.some(label =>
|
|
||||||
['too-big', 'needs-codeowners'].includes(label)
|
|
||||||
);
|
|
||||||
|
|
||||||
const { data: reviews } = await github.rest.pulls.listReviews({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number
|
|
||||||
});
|
|
||||||
|
|
||||||
const botReviews = reviews.filter(review =>
|
|
||||||
review.user.type === 'Bot' &&
|
|
||||||
review.state === 'CHANGES_REQUESTED' &&
|
|
||||||
review.body && review.body.includes(BOT_COMMENT_MARKER)
|
|
||||||
);
|
|
||||||
|
|
||||||
if (hasReviewableLabels) {
|
|
||||||
const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;
|
|
||||||
|
|
||||||
if (botReviews.length > 0) {
|
|
||||||
// Update existing review
|
|
||||||
await github.rest.pulls.updateReview({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number,
|
|
||||||
review_id: botReviews[0].id,
|
|
||||||
body: reviewBody
|
|
||||||
});
|
|
||||||
console.log('Updated existing bot review');
|
|
||||||
} else {
|
|
||||||
// Create new review
|
|
||||||
await github.rest.pulls.createReview({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number,
|
|
||||||
body: reviewBody,
|
|
||||||
event: 'REQUEST_CHANGES'
|
|
||||||
});
|
|
||||||
console.log('Created new bot review');
|
|
||||||
}
|
|
||||||
} else if (botReviews.length > 0) {
|
|
||||||
// Dismiss existing reviews
|
|
||||||
for (const review of botReviews) {
|
|
||||||
try {
|
|
||||||
await github.rest.pulls.dismissReview({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number,
|
|
||||||
review_id: review.id,
|
|
||||||
message: 'Review dismissed: All requirements have been met'
|
|
||||||
});
|
|
||||||
console.log(`Dismissed bot review ${review.id}`);
|
|
||||||
} catch (error) {
|
|
||||||
console.log(`Failed to dismiss review ${review.id}:`, error.message);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Main execution
|
|
||||||
const apiData = await fetchApiData();
|
|
||||||
const baseRef = context.payload.pull_request.base.ref;
|
|
||||||
|
|
||||||
// Early exit for non-dev branches
|
|
||||||
if (baseRef !== 'dev') {
|
|
||||||
const branchLabels = await detectMergeBranch();
|
|
||||||
const finalLabels = Array.from(branchLabels);
|
|
||||||
|
|
||||||
console.log('Computed labels (merge branch only):', finalLabels.join(', '));
|
|
||||||
|
|
||||||
// Apply labels
|
|
||||||
if (finalLabels.length > 0) {
|
|
||||||
await github.rest.issues.addLabels({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number,
|
|
||||||
labels: finalLabels
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
// Remove old managed labels
|
|
||||||
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
|
|
||||||
for (const label of labelsToRemove) {
|
|
||||||
try {
|
|
||||||
await github.rest.issues.removeLabel({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number,
|
|
||||||
name: label
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
console.log(`Failed to remove label ${label}:`, error.message);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Run all strategies
|
|
||||||
const [
|
|
||||||
branchLabels,
|
|
||||||
componentLabels,
|
|
||||||
newComponentLabels,
|
|
||||||
newPlatformLabels,
|
|
||||||
coreLabels,
|
|
||||||
sizeLabels,
|
|
||||||
dashboardLabels,
|
|
||||||
actionsLabels,
|
|
||||||
codeOwnerLabels,
|
|
||||||
testLabels,
|
|
||||||
checkboxLabels
|
|
||||||
] = await Promise.all([
|
|
||||||
detectMergeBranch(),
|
|
||||||
detectComponentPlatforms(apiData),
|
|
||||||
detectNewComponents(),
|
|
||||||
detectNewPlatforms(apiData),
|
|
||||||
detectCoreChanges(),
|
|
||||||
detectPRSize(),
|
|
||||||
detectDashboardChanges(),
|
|
||||||
detectGitHubActionsChanges(),
|
|
||||||
detectCodeOwner(),
|
|
||||||
detectTests(),
|
|
||||||
detectPRTemplateCheckboxes()
|
|
||||||
]);
|
|
||||||
|
|
||||||
// Combine all labels
|
|
||||||
const allLabels = new Set([
|
|
||||||
...branchLabels,
|
|
||||||
...componentLabels,
|
|
||||||
...newComponentLabels,
|
|
||||||
...newPlatformLabels,
|
|
||||||
...coreLabels,
|
|
||||||
...sizeLabels,
|
|
||||||
...dashboardLabels,
|
|
||||||
...actionsLabels,
|
|
||||||
...codeOwnerLabels,
|
|
||||||
...testLabels,
|
|
||||||
...checkboxLabels
|
|
||||||
]);
|
|
||||||
|
|
||||||
// Detect requirements based on all other labels
|
|
||||||
const requirementLabels = await detectRequirements(allLabels);
|
|
||||||
for (const label of requirementLabels) {
|
|
||||||
allLabels.add(label);
|
|
||||||
}
|
|
||||||
|
|
||||||
let finalLabels = Array.from(allLabels);
|
|
||||||
|
|
||||||
// For mega-PRs, exclude component labels if there are too many
|
|
||||||
if (isMegaPR) {
|
|
||||||
const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
|
|
||||||
if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
|
|
||||||
finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
|
|
||||||
console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Handle too many labels (only for non-mega PRs)
|
|
||||||
const tooManyLabels = finalLabels.length > MAX_LABELS;
|
|
||||||
|
|
||||||
if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
|
|
||||||
finalLabels = ['too-big'];
|
|
||||||
}
|
|
||||||
|
|
||||||
console.log('Computed labels:', finalLabels.join(', '));
|
|
||||||
|
|
||||||
// Handle reviews
|
|
||||||
await handleReviews(finalLabels);
|
|
||||||
|
|
||||||
// Apply labels
|
|
||||||
if (finalLabels.length > 0) {
|
|
||||||
console.log(`Adding labels: ${finalLabels.join(', ')}`);
|
|
||||||
await github.rest.issues.addLabels({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number,
|
|
||||||
labels: finalLabels
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
// Remove old managed labels
|
|
||||||
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
|
|
||||||
for (const label of labelsToRemove) {
|
|
||||||
console.log(`Removing label: ${label}`);
|
|
||||||
try {
|
|
||||||
await github.rest.issues.removeLabel({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number,
|
|
||||||
name: label
|
|
||||||
});
|
|
||||||
} catch (error) {
|
|
||||||
console.log(`Failed to remove label ${label}:`, error.message);
|
|
||||||
}
|
|
||||||
}
|
|

75 .github/workflows/ci-clang-tidy-hash.yml vendored
@ -1,75 +0,0 @@
|
|||||||
name: Clang-tidy Hash CI
|
|
||||||
|
|
||||||
on:
|
|
||||||
pull_request:
|
|
||||||
paths:
|
|
||||||
- ".clang-tidy"
|
|
||||||
- "platformio.ini"
|
|
||||||
- "requirements_dev.txt"
|
|
||||||
- ".clang-tidy.hash"
|
|
||||||
- "script/clang_tidy_hash.py"
|
|
||||||
- ".github/workflows/ci-clang-tidy-hash.yml"
|
|
||||||
|
|
||||||
permissions:
|
|
||||||
contents: read
|
|
||||||
pull-requests: write
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
verify-hash:
|
|
||||||
name: Verify clang-tidy hash
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
steps:
|
|
||||||
- name: Checkout
|
|
||||||
uses: actions/checkout@v4.2.2
|
|
||||||
|
|
||||||
- name: Set up Python
|
|
||||||
uses: actions/setup-python@v5.6.0
|
|
||||||
with:
|
|
||||||
python-version: "3.11"
|
|
||||||
|
|
||||||
- name: Verify hash
|
|
||||||
run: |
|
|
||||||
python script/clang_tidy_hash.py --verify
|
|
||||||
|
|
||||||
- if: failure()
|
|
||||||
name: Show hash details
|
|
||||||
run: |
|
|
||||||
python script/clang_tidy_hash.py
|
|
||||||
echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
|
|
||||||
echo "You have modified clang-tidy configuration but have not updated the hash." | tee -a $GITHUB_STEP_SUMMARY
|
|
||||||
echo "Please run 'script/clang_tidy_hash.py --update' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY
|
|
||||||
|
|
||||||
- if: failure()
|
|
||||||
name: Request changes
|
|
||||||
uses: actions/github-script@v7.0.1
|
|
||||||
with:
|
|
||||||
script: |
|
|
||||||
await github.rest.pulls.createReview({
|
|
||||||
pull_number: context.issue.number,
|
|
||||||
owner: context.repo.owner,
|
|
||||||
repo: context.repo.repo,
|
|
||||||
event: 'REQUEST_CHANGES',
|
|
||||||
body: 'You have modified clang-tidy configuration but have not updated the hash.\nPlease run `script/clang_tidy_hash.py --update` and commit the changes.'
|
|
||||||
})
|
|
||||||
|
|
||||||
- if: success()
|
|
||||||
name: Dismiss review
|
|
||||||
uses: actions/github-script@v7.0.1
|
|
||||||
with:
|
|
||||||
script: |
|
|
||||||
let reviews = await github.rest.pulls.listReviews({
|
|
||||||
pull_number: context.issue.number,
|
|
||||||
owner: context.repo.owner,
|
|
||||||
repo: context.repo.repo
|
|
||||||
});
|
|
||||||
for (let review of reviews.data) {
|
|
||||||
if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
|
|
||||||
await github.rest.pulls.dismissReview({
|
|
||||||
pull_number: context.issue.number,
|
|
||||||
owner: context.repo.owner,
|
|
||||||
repo: context.repo.repo,
|
|
||||||
review_id: review.id,
|
|
||||||
message: 'Clang-tidy hash now matches configuration.'
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
|

2 .github/workflows/ci-docker.yml vendored
@ -47,7 +47,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5.6.0
         with:
-          python-version: "3.11"
+          python-version: "3.10"
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3.11.1


279 .github/workflows/ci.yml vendored
@ -20,8 +20,8 @@ permissions:
|
|||||||
contents: read
|
contents: read
|
||||||
|
|
||||||
env:
|
env:
|
||||||
DEFAULT_PYTHON: "3.11"
|
DEFAULT_PYTHON: "3.10"
|
||||||
PYUPGRADE_TARGET: "--py311-plus"
|
PYUPGRADE_TARGET: "--py310-plus"
|
||||||
|
|
||||||
concurrency:
|
concurrency:
|
||||||
# yamllint disable-line rule:line-length
|
# yamllint disable-line rule:line-length
|
||||||
@ -39,7 +39,7 @@ jobs:
|
|||||||
uses: actions/checkout@v4.2.2
|
uses: actions/checkout@v4.2.2
|
||||||
- name: Generate cache-key
|
- name: Generate cache-key
|
||||||
id: cache-key
|
id: cache-key
|
||||||
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
|
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt') }}" >> $GITHUB_OUTPUT
|
||||||
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
|
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
|
||||||
id: python
|
id: python
|
||||||
uses: actions/setup-python@v5.6.0
|
uses: actions/setup-python@v5.6.0
|
||||||
@ -58,16 +58,56 @@ jobs:
|
|||||||
python -m venv venv
|
python -m venv venv
|
||||||
. venv/bin/activate
|
. venv/bin/activate
|
||||||
python --version
|
python --version
|
||||||
pip install -r requirements.txt -r requirements_test.txt pre-commit
|
pip install -r requirements.txt -r requirements_test.txt
|
||||||
pip install -e .
|
pip install -e .
|
||||||
|
|
||||||
|
ruff:
|
||||||
|
name: Check ruff
|
||||||
|
runs-on: ubuntu-24.04
|
||||||
|
needs:
|
||||||
|
- common
|
||||||
|
steps:
|
||||||
|
- name: Check out code from GitHub
|
||||||
|
uses: actions/checkout@v4.2.2
|
||||||
|
- name: Restore Python
|
||||||
|
uses: ./.github/actions/restore-python
|
||||||
|
with:
|
||||||
|
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||||
|
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||||
|
- name: Run Ruff
|
||||||
|
run: |
|
||||||
|
. venv/bin/activate
|
||||||
|
ruff format esphome tests
|
||||||
|
- name: Suggested changes
|
||||||
|
run: script/ci-suggest-changes
|
||||||
|
if: always()
|
||||||
|
|
||||||
|
flake8:
|
||||||
|
name: Check flake8
|
||||||
|
runs-on: ubuntu-24.04
|
||||||
|
needs:
|
||||||
|
- common
|
||||||
|
steps:
|
||||||
|
- name: Check out code from GitHub
|
||||||
|
uses: actions/checkout@v4.2.2
|
||||||
|
- name: Restore Python
|
||||||
|
uses: ./.github/actions/restore-python
|
||||||
|
with:
|
||||||
|
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||||
|
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||||
|
- name: Run flake8
|
||||||
|
run: |
|
||||||
|
. venv/bin/activate
|
||||||
|
flake8 esphome
|
||||||
|
- name: Suggested changes
|
||||||
|
run: script/ci-suggest-changes
|
||||||
|
if: always()
|
||||||
|
|
||||||
pylint:
|
pylint:
|
||||||
name: Check pylint
|
name: Check pylint
|
||||||
runs-on: ubuntu-24.04
|
runs-on: ubuntu-24.04
|
||||||
needs:
|
needs:
|
||||||
- common
|
- common
|
||||||
- determine-jobs
|
|
||||||
if: needs.determine-jobs.outputs.python-linters == 'true'
|
|
||||||
steps:
|
steps:
|
||||||
- name: Check out code from GitHub
|
- name: Check out code from GitHub
|
||||||
uses: actions/checkout@v4.2.2
|
uses: actions/checkout@v4.2.2
|
||||||
@ -84,6 +124,27 @@ jobs:
|
|||||||
run: script/ci-suggest-changes
|
run: script/ci-suggest-changes
|
||||||
if: always()
|
if: always()
|
||||||
|
|
||||||
|
pyupgrade:
|
||||||
|
name: Check pyupgrade
|
||||||
|
runs-on: ubuntu-24.04
|
||||||
|
needs:
|
||||||
|
- common
|
||||||
|
steps:
|
||||||
|
- name: Check out code from GitHub
|
+        uses: actions/checkout@v4.2.2
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run pyupgrade
+        run: |
+          . venv/bin/activate
+          pyupgrade ${{ env.PYUPGRADE_TARGET }} `find esphome -name "*.py" -type f`
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
   ci-custom:
     name: Run script/ci-custom
     runs-on: ubuntu-24.04

@ -112,6 +173,7 @@ jobs:
       fail-fast: false
       matrix:
         python-version:
+          - "3.10"
           - "3.11"
           - "3.12"
           - "3.13"

@ -127,10 +189,14 @@ jobs:
             os: windows-latest
           - python-version: "3.12"
             os: windows-latest
+          - python-version: "3.10"
+            os: windows-latest
           - python-version: "3.13"
             os: macOS-latest
           - python-version: "3.12"
             os: macOS-latest
+          - python-version: "3.10"
+            os: macOS-latest
     runs-on: ${{ matrix.os }}
     needs:
       - common

@ -138,7 +204,6 @@ jobs:
       - name: Check out code from GitHub
         uses: actions/checkout@v4.2.2
       - name: Restore Python
-        id: restore-python
         uses: ./.github/actions/restore-python
         with:
           python-version: ${{ matrix.python-version }}

@ -148,7 +213,7 @@ jobs:
       - name: Run pytest
         if: matrix.os == 'windows-latest'
         run: |
-          . ./venv/Scripts/activate.ps1
+          ./venv/Scripts/activate
           pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
       - name: Run pytest
         if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'

@ -159,59 +224,12 @@ jobs:
         uses: codecov/codecov-action@v5.4.3
         with:
           token: ${{ secrets.CODECOV_TOKEN }}
-      - name: Save Python virtual environment cache
-        if: github.ref == 'refs/heads/dev'
-        uses: actions/cache/save@v4.2.3
-        with:
-          path: venv
-          key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
-
-  determine-jobs:
-    name: Determine which jobs to run
-    runs-on: ubuntu-24.04
-    needs:
-      - common
-    outputs:
-      integration-tests: ${{ steps.determine.outputs.integration-tests }}
-      clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
-      python-linters: ${{ steps.determine.outputs.python-linters }}
-      changed-components: ${{ steps.determine.outputs.changed-components }}
-      component-test-count: ${{ steps.determine.outputs.component-test-count }}
-    steps:
-      - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-        with:
-          # Fetch enough history to find the merge base
-          fetch-depth: 2
-      - name: Restore Python
-        uses: ./.github/actions/restore-python
-        with:
-          python-version: ${{ env.DEFAULT_PYTHON }}
-          cache-key: ${{ needs.common.outputs.cache-key }}
-      - name: Determine which tests to run
-        id: determine
-        env:
-          GH_TOKEN: ${{ github.token }}
-        run: |
-          . venv/bin/activate
-          output=$(python script/determine-jobs.py)
-          echo "Test determination output:"
-          echo "$output" | jq
-
-          # Extract individual fields
-          echo "integration-tests=$(echo "$output" | jq -r '.integration_tests')" >> $GITHUB_OUTPUT
-          echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
-          echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
-          echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
-          echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
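
Aside: the `determine-jobs` job above expects `script/determine-jobs.py` to print a single JSON object whose keys match the `jq` extraction in the step. A minimal Python sketch of that contract follows; the output keys come from the workflow, while the change-detection logic here (a plain `git diff` against the previous commit) is only an assumption for illustration, not the real script.

```python
"""Hypothetical sketch of the JSON contract consumed by the determine-jobs step."""
import json
import subprocess


def changed_files() -> list[str]:
    # Assumption: fetch-depth 2 above makes HEAD~1 available for a simple diff.
    out = subprocess.run(
        ["git", "diff", "--name-only", "HEAD~1"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def main() -> None:
    files = changed_files()
    components = sorted(
        {f.split("/")[2] for f in files if f.startswith("esphome/components/")}
    )
    result = {
        "integration_tests": any(f.startswith("tests/integration/") for f in files),
        "clang_tidy": any(f.endswith((".h", ".cpp")) for f in files),
        "python_linters": any(f.endswith(".py") for f in files),
        "changed_components": components,
        "component_test_count": len(components),
    }
    print(json.dumps(result))


if __name__ == "__main__":
    main()
```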
   integration-tests:
     name: Run integration tests
     runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.integration-tests == 'true'
     steps:
       - name: Check out code from GitHub
         uses: actions/checkout@v4.2.2

@ -241,15 +259,44 @@ jobs:
           . venv/bin/activate
           pytest -vv --no-cov --tb=native -n auto tests/integration/

+  clang-format:
+    name: Check clang-format
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.2.2
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Install clang-format
+        run: |
+          . venv/bin/activate
+          pip install clang-format -c requirements_dev.txt
+      - name: Run clang-format
+        run: |
+          . venv/bin/activate
+          script/clang-format -i
+          git diff-index --quiet HEAD --
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
   clang-tidy:
     name: ${{ matrix.name }}
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.clang-tidy == 'true'
-    env:
-      GH_TOKEN: ${{ github.token }}
+      - ruff
+      - ci-custom
+      - clang-format
+      - flake8
+      - pylint
+      - pytest
+      - pyupgrade
     strategy:
       fail-fast: false
       max-parallel: 2

@ -281,17 +328,13 @@ jobs:
             pio_cache_key: tidyesp32-idf
           - id: clang-tidy
             name: Run script/clang-tidy for ZEPHYR
-            options: --environment nrf52-tidy --grep USE_ZEPHYR --grep USE_NRF52
+            options: --environment nrf52-tidy --grep USE_ZEPHYR
             pio_cache_key: tidy-zephyr
             ignore_errors: false

     steps:
       - name: Check out code from GitHub
         uses: actions/checkout@v4.2.2
-        with:
-          # Need history for HEAD~1 to work for checking changed files
-          fetch-depth: 2
-
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@ -303,14 +346,14 @@ jobs:
         uses: actions/cache@v4.2.3
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}
       - name: Cache platformio
         if: github.ref != 'refs/heads/dev'
         uses: actions/cache/restore@v4.2.3
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}
       - name: Register problem matchers
         run: |

@ -324,28 +367,10 @@ jobs:
           mkdir -p .temp
           pio run --list-targets -e esp32-idf-tidy
-      - name: Check if full clang-tidy scan needed
-        id: check_full_scan
-        run: |
-          . venv/bin/activate
-          if python script/clang_tidy_hash.py --check; then
-            echo "full_scan=true" >> $GITHUB_OUTPUT
-            echo "reason=hash_changed" >> $GITHUB_OUTPUT
-          else
-            echo "full_scan=false" >> $GITHUB_OUTPUT
-            echo "reason=normal" >> $GITHUB_OUTPUT
-          fi
-
       - name: Run clang-tidy
         run: |
           . venv/bin/activate
-          if [ "${{ steps.check_full_scan.outputs.full_scan }}" = "true" ]; then
-            echo "Running FULL clang-tidy scan (hash changed)"
-            script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          else
-            echo "Running clang-tidy on changed files only"
-            script/clang-tidy --all-headers --fix --changed ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          fi
+          script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
         env:
           # Also cache libdeps, store them in a ~/.platformio subfolder
           PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps
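
Aside: the removed `Check if full clang-tidy scan needed` step gates a full scan on `script/clang_tidy_hash.py --check`. A hedged sketch of that idea, assuming the hash covers `.clang-tidy`, `platformio.ini`, and `requirements_dev.txt` (the files watched by the `clang-tidy-hash` pre-commit hook later in this diff) and assuming a hypothetical `.clang-tidy.hash` store:

```python
"""Sketch of hash-gated clang-tidy scans; the stored-hash path is an assumption."""
import hashlib
import sys
from pathlib import Path

WATCHED = [".clang-tidy", "platformio.ini", "requirements_dev.txt"]
HASH_FILE = Path(".clang-tidy.hash")  # hypothetical location, not the real one


def current_hash() -> str:
    digest = hashlib.sha256()
    for name in WATCHED:
        digest.update(Path(name).read_bytes())
    return digest.hexdigest()


def check() -> int:
    """Exit 0 (full scan needed) when the hash changed, mirroring the workflow's if/else."""
    stored = HASH_FILE.read_text().strip() if HASH_FILE.exists() else ""
    return 0 if current_hash() != stored else 1


if __name__ == "__main__":
    sys.exit(check())
```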
@ -355,18 +380,59 @@ jobs:
           # yamllint disable-line rule:line-length
           if: always()

+  list-components:
+    runs-on: ubuntu-24.04
+    needs:
+      - common
+    if: github.event_name == 'pull_request'
+    outputs:
+      components: ${{ steps.list-components.outputs.components }}
+      count: ${{ steps.list-components.outputs.count }}
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.2.2
+        with:
+          # Fetch enough history so `git merge-base refs/remotes/origin/dev HEAD` works.
+          fetch-depth: 500
+      - name: Get target branch
+        id: target-branch
+        run: |
+          echo "branch=${{ github.event.pull_request.base.ref }}" >> $GITHUB_OUTPUT
+      - name: Fetch ${{ steps.target-branch.outputs.branch }} branch
+        run: |
+          git -c protocol.version=2 fetch --no-tags --prune --no-recurse-submodules --depth=1 origin +refs/heads/${{ steps.target-branch.outputs.branch }}:refs/remotes/origin/${{ steps.target-branch.outputs.branch }}
+          git merge-base refs/remotes/origin/${{ steps.target-branch.outputs.branch }} HEAD
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Find changed components
+        id: list-components
+        run: |
+          . venv/bin/activate
+          components=$(script/list-components.py --changed --branch ${{ steps.target-branch.outputs.branch }})
+          output_components=$(echo "$components" | jq -R -s -c 'split("\n")[:-1] | map(select(length > 0))')
+          count=$(echo "$output_components" | jq length)
+
+          echo "components=$output_components" >> $GITHUB_OUTPUT
+          echo "count=$count" >> $GITHUB_OUTPUT
+
+          echo "$count Components:"
+          echo "$output_components" | jq
+
   test-build-components:
     name: Component test ${{ matrix.file }}
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0 && fromJSON(needs.determine-jobs.outputs.component-test-count) < 100
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) > 0 && fromJSON(needs.list-components.outputs.count) < 100
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        file: ${{ fromJson(needs.determine-jobs.outputs.changed-components) }}
+        file: ${{ fromJson(needs.list-components.outputs.components) }}
     steps:
       - name: Install dependencies
         run: |

@ -394,8 +460,8 @@ jobs:
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     outputs:
       matrix: ${{ steps.split.outputs.components }}
     steps:

@ -404,7 +470,7 @@ jobs:
       - name: Split components into 20 groups
         id: split
         run: |
-          components=$(echo '${{ needs.determine-jobs.outputs.changed-components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
+          components=$(echo '${{ needs.list-components.outputs.components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
           echo "components=$components" >> $GITHUB_OUTPUT

   test-build-components-split:

@ -412,9 +478,9 @@ jobs:
     runs-on: ubuntu-24.04
     needs:
       - common
-      - determine-jobs
+      - list-components
       - test-build-components-splitter
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     strategy:
       fail-fast: false
       max-parallel: 4
|
|||||||
./script/test_build_components -e compile -c $component
|
./script/test_build_components -e compile -c $component
|
||||||
done
|
done
|
||||||
|
|
||||||
pre-commit-ci-lite:
|
|
||||||
name: pre-commit.ci lite
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
needs:
|
|
||||||
- common
|
|
||||||
if: github.event_name == 'pull_request' && github.base_ref != 'beta' && github.base_ref != 'release'
|
|
||||||
steps:
|
|
||||||
- name: Check out code from GitHub
|
|
||||||
uses: actions/checkout@v4.2.2
|
|
||||||
- name: Restore Python
|
|
||||||
uses: ./.github/actions/restore-python
|
|
||||||
with:
|
|
||||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
|
||||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
|
||||||
- uses: pre-commit/action@v3.0.1
|
|
||||||
env:
|
|
||||||
SKIP: pylint,clang-tidy-hash
|
|
||||||
- uses: pre-commit-ci/lite-action@v1.1.0
|
|
||||||
if: always()
|
|
||||||
|
|
||||||
ci-status:
|
ci-status:
|
||||||
name: CI Status
|
name: CI Status
|
||||||
runs-on: ubuntu-24.04
|
runs-on: ubuntu-24.04
|
||||||
needs:
|
needs:
|
||||||
- common
|
- common
|
||||||
|
- ruff
|
||||||
- ci-custom
|
- ci-custom
|
||||||
|
- clang-format
|
||||||
|
- flake8
|
||||||
- pylint
|
- pylint
|
||||||
- pytest
|
- pytest
|
||||||
- integration-tests
|
- integration-tests
|
||||||
|
- pyupgrade
|
||||||
- clang-tidy
|
- clang-tidy
|
||||||
- determine-jobs
|
- list-components
|
||||||
- test-build-components
|
- test-build-components
|
||||||
- test-build-components-splitter
|
- test-build-components-splitter
|
||||||
- test-build-components-split
|
- test-build-components-split
|
||||||
- pre-commit-ci-lite
|
|
||||||
if: always()
|
if: always()
|
||||||
steps:
|
steps:
|
||||||
- name: Success
|
- name: Success
|
||||||
|
.github/workflows/codeowner-review-request.yml  (vendored, 324 lines)
@ -1,324 +0,0 @@
-# This workflow automatically requests reviews from codeowners when:
-# 1. A PR is opened, reopened, or synchronized (updated)
-# 2. A PR is marked as ready for review
-#
-# It reads the CODEOWNERS file and matches all changed files in the PR against
-# the codeowner patterns, then requests reviews from the appropriate owners
-# while avoiding duplicate requests for users who have already been requested
-# or have already reviewed the PR.
-
-name: Request Codeowner Reviews
-
-on:
-  # Needs to be pull_request_target to get write permissions
-  pull_request_target:
-    types: [opened, reopened, synchronize, ready_for_review]
-
-permissions:
-  pull-requests: write
-  contents: read
-
-jobs:
-  request-codeowner-reviews:
-    name: Run
-    if: ${{ !github.event.pull_request.draft }}
-    runs-on: ubuntu-latest
-    steps:
-      - name: Request reviews from component codeowners
-        uses: actions/github-script@v7.0.1
-        with:
-          script: |
-            [... the rest of the file is an inline github-script. It lists the PR's
-            changed files via pulls.listFiles, fetches CODEOWNERS at the PR base ref,
-            converts each pattern to a regex with globToRegex() (escape regex
-            specials, "**" -> ".*", "*" -> "[^/]*", "?" -> "."), and collects the
-            matching users and teams. It then drops the PR author, users who already
-            reviewed, and anyone already mentioned in an earlier bot comment carrying
-            the <!-- codeowner-review-request-bot --> marker, calls
-            pulls.requestReviewers for the remainder, and posts a comment built by
-            createCommentBody(); a 422 from the API falls back to posting the
-            comment only. ...]
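
Aside: the heart of the removed workflow is converting CODEOWNERS glob patterns to regexes and matching them against the PR's changed files. A compact Python rendering of that matching step (it mirrors the `globToRegex` rules summarized above and is an illustration, not the canonical implementation):

```python
"""Sketch of CODEOWNERS glob matching, mirroring the globToRegex rules above."""
import re


def glob_to_regex(pattern: str) -> re.Pattern[str]:
    # Escape regex specials, then translate the glob wildcards.
    escaped = re.sub(r"([.+^=!:${}()|\[\]\\])", r"\\\1", pattern)
    escaped = escaped.replace("**", ".*").replace("*", "[^/]*").replace("?", ".")
    return re.compile(f"^{escaped}$")


def owners_for(changed_files: list[str], codeowners_text: str) -> set[str]:
    owners: set[str] = set()
    for line in codeowners_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        pattern, *entry_owners = line.split()
        regex = glob_to_regex(pattern)
        if any(regex.match(path) for path in changed_files):
            owners.update(o.lstrip("@") for o in entry_owners)
    return owners


if __name__ == "__main__":
    sample = "esphome/components/api/* @OttoWinter"
    print(owners_for(["esphome/components/api/api_connection.cpp"], sample))
```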
.github/workflows/external-component-bot.yml  (vendored, 157 lines)
@ -1,157 +0,0 @@
-name: Add External Component Comment
-
-on:
-  pull_request_target:
-    types: [opened, synchronize]
-
-permissions:
-  contents: read # Needed to fetch PR details
-  issues: write # Needed to create and update comments (PR comments are managed via the issues REST API)
-  pull-requests: write # also needed?
-
-jobs:
-  external-comment:
-    name: External component comment
-    runs-on: ubuntu-latest
-    steps:
-      - name: Add external component comment
-        uses: actions/github-script@v7.0.1
-        with:
-          github-token: ${{ secrets.GITHUB_TOKEN }}
-          script: |
-            [... the rest of the file is an inline github-script. When a PR touches
-            exactly one subtree under esphome/ (ignoring esphome/core/defines.h), it
-            posts external-component usage instructions of the form:
-
-              external_components:
-                - source: github://pr#<PR number>   (or github://<owner>/<repo>@pull/<PR>/head for forks)
-                  components: [<changed components>]
-                  refresh: 1h
-
-            otherwise it posts clone/checkout instructions (git clone, git fetch
-            origin pull/<PR>/head:<branch>, script/setup, source venv/bin/activate).
-            Component names are derived from changed paths matching
-            esphome/components/<name>/, and the comment is created or updated in
-            place, identified by the
-            <!-- This comment was generated automatically by the external-component-bot workflow. -->
-            marker (a legacy marker is also recognized). ...]
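
Aside: the bot derives component names from changed file paths with an `esphome/components/<name>/` match. A hedged Python equivalent of that extraction:

```python
"""Sketch of the changed-file to component-name extraction used by the bot above."""
import re

COMPONENT_RE = re.compile(r"^esphome/components/([^/]+)/")


def changed_components(changed_files: list[str]) -> list[str]:
    found: list[str] = []
    for path in changed_files:
        match = COMPONENT_RE.match(path)
        if match and match.group(1) not in found:
            found.append(match.group(1))
    return found


if __name__ == "__main__":
    print(changed_components([
        "esphome/components/api/api.proto",
        "esphome/components/api/client.py",
        "esphome/core/defines.h",
    ]))  # -> ['api']
```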
.github/workflows/issue-codeowner-notify.yml  (vendored, 163 lines)
@ -1,163 +0,0 @@
-# This workflow automatically notifies codeowners when an issue is labeled with component labels.
-# It reads the CODEOWNERS file to find the maintainers for the labeled components
-# and posts a comment mentioning them to ensure they're aware of the issue.
-
-name: Notify Issue Codeowners
-
-on:
-  issues:
-    types: [labeled]
-
-permissions:
-  issues: write
-  contents: read
-
-jobs:
-  notify-codeowners:
-    name: Run
-    if: ${{ startsWith(github.event.label.name, format('component{0} ', ':')) }}
-    runs-on: ubuntu-latest
-    steps:
-      - name: Notify codeowners for component issues
-        uses: actions/github-script@v7.0.1
-        with:
-          script: |
-            [... the rest of the file is an inline github-script. It strips the
-            "component: " prefix from the label, looks up the matching
-            esphome/components/<component>/* line in CODEOWNERS, splits the owners
-            into users and teams, removes the issue author and anyone already pinged
-            by a previous bot comment carrying the
-            <!-- issue-codeowner-notify-bot --> marker for the same component, and
-            posts a single comment mentioning the remaining codeowners. ...]
.github/workflows/release.yml  (vendored, 2 changed lines)
@ -96,7 +96,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5.6.0
         with:
-          python-version: "3.11"
+          python-version: "3.10"

       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3.11.1
.github/workflows/yaml-lint.yml  (vendored, new file, 25 lines)
@ -0,0 +1,25 @@
+---
+name: YAML lint
+
+on:
+  push:
+    branches: [dev, beta, release]
+    paths:
+      - "**.yaml"
+      - "**.yml"
+  pull_request:
+    paths:
+      - "**.yaml"
+      - "**.yml"
+
+jobs:
+  yamllint:
+    name: yamllint
+    runs-on: ubuntu-latest
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.2.2
+      - name: Run yamllint
+        uses: frenck/action-yamllint@v1.5.0
+        with:
+          strict: true
@ -4,14 +4,15 @@

 ci:
   autoupdate_commit_msg: 'pre-commit: autoupdate'
-  autoupdate_schedule: off # Disabled until ruff versions are synced between deps and pre-commit
+  autoupdate_schedule: weekly
+  autofix_prs: false
   # Skip hooks that have issues in pre-commit CI environment
-  skip: [pylint, clang-tidy-hash]
+  skip: [pylint, yamllint]

 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: v0.12.7
+    rev: v0.12.2
     hooks:
       # Run the linter.
       - id: ruff

@ -27,25 +28,22 @@ repos:
           - pydocstyle==5.1.1
         files: ^(esphome|tests)/.+\.py$
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v5.0.0
+    rev: v3.4.0
     hooks:
       - id: no-commit-to-branch
        args:
          - --branch=dev
          - --branch=release
          - --branch=beta
-      - id: end-of-file-fixer
-      - id: trailing-whitespace
   - repo: https://github.com/asottile/pyupgrade
     rev: v3.20.0
     hooks:
       - id: pyupgrade
-        args: [--py311-plus]
+        args: [--py310-plus]
   - repo: https://github.com/adrienverge/yamllint.git
     rev: v1.37.1
     hooks:
       - id: yamllint
-        exclude: ^(\.clang-format|\.clang-tidy)$
   - repo: https://github.com/pre-commit/mirrors-clang-format
     rev: v13.0.1
     hooks:

@ -58,10 +56,3 @@ repos:
         entry: python3 script/run-in-env.py pylint
         language: system
         types: [python]
-      - id: clang-tidy-hash
-        name: Update clang-tidy hash
-        entry: python script/clang_tidy_hash.py --update-if-changed
-        language: python
-        files: ^(\.clang-tidy|platformio\.ini|requirements_dev\.txt)$
-        pass_filenames: false
-        additional_dependencies: []
CODEOWNERS  (11 changed lines)
@ -9,7 +9,6 @@
 pyproject.toml @esphome/core
 esphome/*.py @esphome/core
 esphome/core/* @esphome/core
-.github/** @esphome/core

 # Integrations
 esphome/components/a01nyub/* @MrSuicideParrot

@ -29,7 +28,7 @@ esphome/components/aic3204/* @kbx81
 esphome/components/airthings_ble/* @jeromelaban
 esphome/components/airthings_wave_base/* @jeromelaban @kpfleming @ncareau
 esphome/components/airthings_wave_mini/* @ncareau
-esphome/components/airthings_wave_plus/* @jeromelaban @precurse
+esphome/components/airthings_wave_plus/* @jeromelaban
 esphome/components/alarm_control_panel/* @grahambrown11 @hwstar
 esphome/components/alpha3/* @jan-hofmeier
 esphome/components/am2315c/* @swoboda1337

@ -155,7 +154,6 @@ esphome/components/esp32_rmt/* @jesserockz
 esphome/components/esp32_rmt_led_strip/* @jesserockz
 esphome/components/esp8266/* @esphome/core
 esphome/components/esp_ldo/* @clydebarrow
-esphome/components/espnow/* @jesserockz
 esphome/components/ethernet_info/* @gtjadsonsantos
 esphome/components/event/* @nohat
 esphome/components/event_emitter/* @Rapsssito

@ -247,7 +245,6 @@ esphome/components/lcd_menu/* @numo68
 esphome/components/ld2410/* @regevbr @sebcaps
 esphome/components/ld2420/* @descipher
 esphome/components/ld2450/* @hareeshmu
-esphome/components/ld24xx/* @kbx81
 esphome/components/ledc/* @OttoWinter
 esphome/components/libretiny/* @kuba2k2
 esphome/components/libretiny_pwm/* @kuba2k2

@ -295,7 +292,6 @@ esphome/components/microphone/* @jesserockz @kahrendt
 esphome/components/mics_4514/* @jesserockz
 esphome/components/midea/* @dudanov
 esphome/components/midea_ir/* @dudanov
-esphome/components/mipi_dsi/* @clydebarrow
 esphome/components/mipi_spi/* @clydebarrow
 esphome/components/mitsubishi/* @RubyBailey
 esphome/components/mixer/speaker/* @kahrendt

@ -328,7 +324,6 @@ esphome/components/nextion/text_sensor/* @senexcrenshaw
 esphome/components/nfc/* @jesserockz @kbx81
 esphome/components/noblex/* @AGalfra
 esphome/components/npi19/* @bakerkj
-esphome/components/nrf52/* @tomaszduda23
 esphome/components/number/* @esphome/core
 esphome/components/one_wire/* @ssieb
 esphome/components/online_image/* @clydebarrow @guillempages

@ -383,7 +378,6 @@ esphome/components/rp2040_pwm/* @jesserockz
 esphome/components/rpi_dpi_rgb/* @clydebarrow
 esphome/components/rtl87xx/* @kuba2k2
 esphome/components/rtttl/* @glmnet
-esphome/components/runtime_stats/* @bdraco
 esphome/components/safe_mode/* @jsuanet @kbx81 @paulmonigatti
 esphome/components/scd4x/* @martgras @sjtrny
 esphome/components/script/* @esphome/core

@ -473,7 +467,7 @@ esphome/components/tlc5971/* @IJIJI
 esphome/components/tm1621/* @Philippe12
 esphome/components/tm1637/* @glmnet
 esphome/components/tm1638/* @skykingjwc
-esphome/components/tm1651/* @mrtoy-me
+esphome/components/tm1651/* @freekode
 esphome/components/tmp102/* @timsavage
 esphome/components/tmp1075/* @sybrenstuvel
 esphome/components/tmp117/* @Azimath

@ -541,6 +535,5 @@ esphome/components/xiaomi_xmwsdj04mmc/* @medusalix
 esphome/components/xl9535/* @mreditor97
 esphome/components/xpt2046/touchscreen/* @nielsnl68 @numo68
 esphome/components/xxtea/* @clydebarrow
-esphome/components/zephyr/* @tomaszduda23
 esphome/components/zhlt01/* @cfeenstra1024
 esphome/components/zio_ultrasonic/* @kahrendt
@ -7,7 +7,7 @@ project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).

 **See also:**

-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)

 ---
Doxyfile  (2 changed lines)
@ -48,7 +48,7 @@ PROJECT_NAME = ESPHome
 # could be handy for archiving the generated documentation or if some version
 # control system is used.

-PROJECT_NUMBER = 2025.8.0-dev
+PROJECT_NUMBER = 2025.7.2

 # Using the PROJECT_BRIEF tag one can provide an optional one line description
 # for a project that appears at the top of each page and should give viewer a
@ -9,7 +9,7 @@

 ---

-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)

 ---
@ -2,7 +2,6 @@
 import argparse
 from datetime import datetime
 import functools
-import getpass
 import importlib
 import logging
 import os

@ -35,7 +34,6 @@ from esphome.const import (
     CONF_PORT,
     CONF_SUBSTITUTIONS,
     CONF_TOPIC,
-    ENV_NOGITIGNORE,
     PLATFORM_ESP32,
     PLATFORM_ESP8266,
     PLATFORM_RP2040,

@ -90,9 +88,9 @@ def choose_prompt(options, purpose: str = None):
 def choose_upload_log_host(
     default, check_default, show_ota, show_mqtt, show_api, purpose: str = None
 ):
-    options = [
-        (f"{port.path} ({port.description})", port.path) for port in get_serial_ports()
-    ]
+    options = []
+    for port in get_serial_ports():
+        options.append((f"{port.path} ({port.description})", port.path))
     if default == "SERIAL":
         return choose_prompt(options, purpose=purpose)
     if (show_ota and "ota" in CORE.config) or (show_api and "api" in CORE.config):

@ -120,7 +118,9 @@ def mqtt_logging_enabled(mqtt_config):
         return False
     if CONF_TOPIC not in log_topic:
         return False
-    return log_topic.get(CONF_LEVEL, None) != "NONE"
+    if log_topic.get(CONF_LEVEL, None) == "NONE":
+        return False
+    return True


 def get_port_type(port):

@ -209,9 +209,6 @@ def wrap_to_code(name, comp):


 def write_cpp(config):
-    if not get_bool_env(ENV_NOGITIGNORE):
-        writer.write_gitignore()
-
     generate_cpp_contents(config)
     return write_cpp_file()

@ -228,13 +225,10 @@ def generate_cpp_contents(config):


 def write_cpp_file():
+    writer.write_platformio_project()
+
     code_s = indent(CORE.cpp_main_section)
     writer.write_cpp(code_s)

-    from esphome.build_gen import platformio
-
-    platformio.write_project()
-
     return 0

@ -336,7 +330,7 @@ def check_permissions(port):
         raise EsphomeError(
             "You do not have read or write permission on the selected serial port. "
             "To resolve this issue, you can add your user to the dialout group "
-            f"by running the following command: sudo usermod -a -G dialout {getpass.getuser()}. "
+            f"by running the following command: sudo usermod -a -G dialout {os.getlogin()}. "
             "You will need to log out & back in or reboot to activate the new group access."
         )
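
Aside: both sides of the `mqtt_logging_enabled` hunk above encode the same rule; only the shape of the final return differs. A small, self-contained illustration (the helper name is hypothetical):

```python
"""Illustration that the two mqtt_logging_enabled return styles agree (hypothetical helper)."""
CONF_LEVEL = "level"


def level_allows_logging(log_topic: dict) -> bool:
    # dev form: single expression
    dev_result = log_topic.get(CONF_LEVEL, None) != "NONE"
    # 2025.7.2 form: explicit branches
    if log_topic.get(CONF_LEVEL, None) == "NONE":
        release_result = False
    else:
        release_result = True
    assert dev_result == release_result
    return dev_result


if __name__ == "__main__":
    print(level_allows_logging({"level": "DEBUG"}))  # True
    print(level_allows_logging({"level": "NONE"}))   # False
```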
@@ -391,7 +391,8 @@ async def build_action(full_config, template_arg, args):
     )
     action_id = full_config[CONF_TYPE_ID]
     builder = registry_entry.coroutine_fun
-    return await builder(config, action_id, template_arg, args)
+    ret = await builder(config, action_id, template_arg, args)
+    return ret


 async def build_action_list(config, templ, arg_type):
@@ -408,7 +409,8 @@ async def build_condition(full_config, template_arg, args):
     )
     action_id = full_config[CONF_TYPE_ID]
     builder = registry_entry.coroutine_fun
-    return await builder(config, action_id, template_arg, args)
+    ret = await builder(config, action_id, template_arg, args)
+    return ret


 async def build_condition_list(config, templ, args):
@@ -1,102 +0,0 @@
-import os
-
-from esphome.const import __version__
-from esphome.core import CORE
-from esphome.helpers import mkdir_p, read_file, write_file_if_changed
-from esphome.writer import find_begin_end, update_storage_json
-
-INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
-INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"
-
-INI_BASE_FORMAT = (
-    """; Auto generated code by esphome
-
-[common]
-lib_deps =
-build_flags =
-upload_flags =
-
-""",
-    """
-
-""",
-)
-
-
-def format_ini(data: dict[str, str | list[str]]) -> str:
-    content = ""
-    for key, value in sorted(data.items()):
-        if isinstance(value, list):
-            content += f"{key} =\n"
-            for x in value:
-                content += f" {x}\n"
-        else:
-            content += f"{key} = {value}\n"
-    return content
-
-
-def get_ini_content():
-    CORE.add_platformio_option(
-        "lib_deps",
-        [x.as_lib_dep for x in CORE.platformio_libraries.values()]
-        + ["${common.lib_deps}"],
-    )
-    # Sort to avoid changing build flags order
-    CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))
-
-    # Sort to avoid changing build unflags order
-    CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))
-
-    # Add extra script for C++ flags
-    CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])
-
-    content = "[platformio]\n"
-    content += f"description = ESPHome {__version__}\n"
-
-    content += f"[env:{CORE.name}]\n"
-    content += format_ini(CORE.platformio_options)
-
-    return content
-
-
-def write_ini(content):
-    update_storage_json()
-    path = CORE.relative_build_path("platformio.ini")
-
-    if os.path.isfile(path):
-        text = read_file(path)
-        content_format = find_begin_end(
-            text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
-        )
-    else:
-        content_format = INI_BASE_FORMAT
-    full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
-    full_file += INI_AUTO_GENERATE_END + content_format[1]
-    write_file_if_changed(path, full_file)
-
-
-def write_project():
-    mkdir_p(CORE.build_path)
-
-    content = get_ini_content()
-    write_ini(content)
-
-    # Write extra script for C++ specific flags
-    write_cxx_flags_script()
-
-
-CXX_FLAGS_FILE_NAME = "cxx_flags.py"
-CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
-Import("env")
-
-# Add C++ specific flags
-"""
-
-
-def write_cxx_flags_script() -> None:
-    path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
-    contents = CXX_FLAGS_FILE_CONTENTS
-    if not CORE.is_host:
-        contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
-    contents += "\n"
-    write_file_if_changed(path, contents)
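
The removed module's core job is turning the accumulated PlatformIO options into an INI fragment placed between the auto-generate markers. A self-contained sketch of what a `format_ini`-style helper produces for a sample options dict; the dict contents are invented for illustration and the real function writes into `platformio.ini` via the ESPHome writer helpers shown above:

```python
def format_ini(data: dict[str, str | list[str]]) -> str:
    # Same shape as the removed helper: list values become indented multi-line entries.
    content = ""
    for key, value in sorted(data.items()):
        if isinstance(value, list):
            content += f"{key} =\n"
            for x in value:
                content += f" {x}\n"
        else:
            content += f"{key} = {value}\n"
    return content


sample_options = {
    "platform": "espressif32",
    "board": "esp32dev",
    "lib_deps": ["ArduinoJson", "${common.lib_deps}"],
    "build_flags": ["-DUSE_ESP32"],
}
print(format_ini(sample_options))
# board = esp32dev
# build_flags =
#  -DUSE_ESP32
# lib_deps =
#  ArduinoJson
#  ${common.lib_deps}
# platform = espressif32
```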
@@ -7,6 +7,7 @@ namespace a4988 {
 static const char *const TAG = "a4988.stepper";

 void A4988::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   if (this->sleep_pin_ != nullptr) {
     this->sleep_pin_->setup();
     this->sleep_pin_->digital_write(false);
@@ -7,6 +7,8 @@ namespace absolute_humidity {
 static const char *const TAG = "absolute_humidity.sensor";

 void AbsoluteHumidityComponent::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
+
   ESP_LOGD(TAG, "  Added callback for temperature '%s'", this->temperature_sensor_->get_name().c_str());
   this->temperature_sensor_->add_on_state_callback([this](float state) { this->temperature_callback_(state); });
   if (this->temperature_sensor_->has_state()) {
@@ -1,11 +1,10 @@
 from esphome import pins
 import esphome.codegen as cg
-from esphome.components.esp32 import VARIANT_ESP32P4, get_esp32_variant
+from esphome.components.esp32 import get_esp32_variant
 from esphome.components.esp32.const import (
     VARIANT_ESP32,
     VARIANT_ESP32C2,
     VARIANT_ESP32C3,
-    VARIANT_ESP32C5,
     VARIANT_ESP32C6,
     VARIANT_ESP32H2,
     VARIANT_ESP32S2,
@@ -52,103 +51,82 @@ SAMPLING_MODES = {
     "max": sampling_mode.MAX,
 }

-adc_unit_t = cg.global_ns.enum("adc_unit_t", is_class=True)
-adc_channel_t = cg.global_ns.enum("adc_channel_t", is_class=True)
+adc1_channel_t = cg.global_ns.enum("adc1_channel_t")
+adc2_channel_t = cg.global_ns.enum("adc2_channel_t")

 # pin to adc1 channel mapping
 # https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
 ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {

(The body of this table is a mechanical rename: every entry that reads `adc1_channel_t.ADC1_CHANNEL_n` in 2025.7.2 reads `adc_channel_t.ADC_CHANNEL_n` in dev, with identical GPIO-to-channel numbers per variant: ESP32 GPIO 36-39 and 32-35 map to channels 0-7; ESP32-C2 and ESP32-C3 GPIO 0-4 map to channels 0-4; ESP32-C6 GPIO 0-6 map to channels 0-6; ESP32-H2 GPIO 1-5 map to channels 0-4; ESP32-S2 and ESP32-S3 GPIO 1-10 map to channels 0-9. The dev branch additionally defines a VARIANT_ESP32C5 block, GPIO 1-6 to channels 0-5, referencing the ESP32-C5 GPIO documentation, and a VARIANT_ESP32P4 block, GPIO 16-23 to channels 0-7; both blocks are absent from 2025.7.2. Each variant block keeps its esp-idf soc/adc_channel.h reference comment.)

 }
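
The mapping tables above are consumed at configuration-validation time to translate a configured GPIO into the variant-specific ADC channel constant. A hedged Python sketch of that lookup; the trimmed data and the `pin_to_adc1_channel` helper name are illustrative only, and the real validator also handles pseudo-pins such as VCC and temperature:

```python
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
    # Trimmed example data mirroring the table above.
    "ESP32": {36: "ADC_CHANNEL_0", 39: "ADC_CHANNEL_3", 32: "ADC_CHANNEL_4"},
    "ESP32C3": {0: "ADC_CHANNEL_0", 4: "ADC_CHANNEL_4"},
}


def pin_to_adc1_channel(variant: str, pin: int) -> str:
    """Look up the ADC1 channel for a GPIO on a given variant, or raise."""
    try:
        return ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin]
    except KeyError as err:
        raise ValueError(f"GPIO{pin} is not an ADC1 pin on {variant}") from err


print(pin_to_adc1_channel("ESP32", 36))   # ADC_CHANNEL_0
print(pin_to_adc1_channel("ESP32C3", 4))  # ADC_CHANNEL_4
```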
@@ -157,64 +135,54 @@ ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
 ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {

(Same mechanical rename as the ADC1 table: 2025.7.2 uses `adc2_channel_t.ADC2_CHANNEL_n` where dev uses `adc_channel_t.ADC_CHANNEL_n`, with identical pins per variant: ESP32 GPIO 4, 0, 2, 15, 13, 12, 14, 27, 25, 26 map to channels 0-9; ESP32-C2 and ESP32-C3 map GPIO 5 to channel 0; ESP32-C6 and ESP32-H2 are empty dicts marked "no ADC2"; ESP32-S2 and ESP32-S3 GPIO 11-20 map to channels 0-9. The dev branch additionally declares `VARIANT_ESP32C5: {}` with an "ESP32-C5 has no ADC2 channels" comment and a VARIANT_ESP32P4 block mapping GPIO 49-54 to channels 0-5.)

 }

@@ -267,11 +235,6 @@ def validate_adc_pin(value):
             {CONF_ANALOG: True, CONF_INPUT: True}, internal=True
         )(value)

-    if CORE.is_nrf52:
-        return pins.gpio_pin_schema(
-            {CONF_ANALOG: True, CONF_INPUT: True}, internal=True
-        )(value)
-
     raise NotImplementedError


@@ -288,6 +251,5 @@ FILTER_SOURCE_FILES = filter_source_files_from_platform(
             PlatformFramework.RTL87XX_ARDUINO,
             PlatformFramework.LN882X_ARDUINO,
         },
-        "adc_sensor_zephyr.cpp": {PlatformFramework.NRF52_ZEPHYR},
     }
 )
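
Because ADC2 is shared with the Wi-Fi radio on the classic ESP32, the validator distinguishes ADC1 pins from ADC2 pins. A hedged Python sketch of that decision using the tables above; `classify_adc_pin` and its trimmed pin sets are illustrative, not ESPHome's actual validator:

```python
ADC1_PINS = {"ESP32": {36, 37, 38, 39, 32, 33, 34, 35}}
ADC2_PINS = {"ESP32": {4, 0, 2, 15, 13, 12, 14, 27, 25, 26}}


def classify_adc_pin(variant: str, pin: int) -> str:
    """Return which ADC unit a GPIO belongs to, mirroring the mapping tables above."""
    if pin in ADC1_PINS.get(variant, set()):
        return "ADC1"
    if pin in ADC2_PINS.get(variant, set()):
        return "ADC2"  # usable, but conflicts with Wi-Fi on the classic ESP32
    raise ValueError(f"GPIO{pin} is not an ADC pin on {variant}")


print(classify_adc_pin("ESP32", 34))  # ADC1
print(classify_adc_pin("ESP32", 25))  # ADC2
```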
@@ -3,19 +3,12 @@
 #include "esphome/components/sensor/sensor.h"
 #include "esphome/components/voltage_sampler/voltage_sampler.h"
 #include "esphome/core/component.h"
-#include "esphome/core/defines.h"
 #include "esphome/core/hal.h"

 #ifdef USE_ESP32
-#include "esp_adc/adc_cali.h"
-#include "esp_adc/adc_cali_scheme.h"
-#include "esp_adc/adc_oneshot.h"
-#include "hal/adc_types.h"  // This defines ADC_CHANNEL_MAX
-#endif  // USE_ESP32
-
-#ifdef USE_ZEPHYR
-#include <zephyr/drivers/adc.h>
-#endif
+#include <esp_adc_cal.h>
+#include "driver/adc.h"
+#endif  // USE_ESP32

 namespace esphome {
 namespace adc {
@@ -42,92 +35,51 @@ enum class SamplingMode : uint8_t {

 const LogString *sampling_mode_to_str(SamplingMode mode);

-template<typename T> class Aggregator {
+class Aggregator {
  public:
   Aggregator(SamplingMode mode);
-  void add_sample(T value);
-  T aggregate();
+  void add_sample(uint32_t value);
+  uint32_t aggregate();

  protected:
-  T aggr_{0};
-  uint8_t samples_{0};
+  uint32_t aggr_{0};
+  uint32_t samples_{0};
   SamplingMode mode_{SamplingMode::AVG};
 };

 class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage_sampler::VoltageSampler {
  public:

(The rest of this hunk reworks the public API. dev adds documentation comments to update(), setup(), dump_config(), get_setup_priority(), set_pin(), set_output_raw(), set_sample_count(), set_sampling_mode() and sample(), adds a Zephyr-only set_adc_channel(const adc_dt_spec *channel), and replaces the ESP32 channel setters with a single set_channel(adc_unit_t unit, adc_channel_t channel) alongside set_attenuation() and set_autorange(). 2025.7.2 keeps the undocumented declarations after the ESP32 block, exposes set_channel1(adc1_channel_t) / set_channel2(adc2_channel_t), which reset the other unit to ADC1_CHANNEL_MAX / ADC2_CHANNEL_MAX, and declares an ESP8266-only unique_id() override.)

@@ -138,32 +90,17 @@ class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage
   InternalGPIOPin *pin_;
   SamplingMode sampling_mode_{SamplingMode::AVG};

-#ifdef USE_ESP32
-  float sample_autorange_();
-  float sample_fixed_attenuation_();
-  bool autorange_{false};
-  adc_oneshot_unit_handle_t adc_handle_{nullptr};
-  adc_cali_handle_t calibration_handle_{nullptr};
-  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
-  adc_channel_t channel_{};
-  adc_unit_t adc_unit_{};
-  struct SetupFlags {
-    uint8_t init_complete : 1;
-    uint8_t config_complete : 1;
-    uint8_t handle_init_complete : 1;
-    uint8_t calibration_complete : 1;
-    uint8_t reserved : 4;
-  } setup_flags_{};
-  static adc_oneshot_unit_handle_t shared_adc_handles[2];
-#endif  // USE_ESP32
-
 #ifdef USE_RP2040
   bool is_temperature_{false};
 #endif  // USE_RP2040

-#ifdef USE_ZEPHYR
-  const struct adc_dt_spec *channel_ = nullptr;
-#endif
+#ifdef USE_ESP32
+  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
+  adc1_channel_t channel1_{ADC1_CHANNEL_MAX};
+  adc2_channel_t channel2_{ADC2_CHANNEL_MAX};
+  bool autorange_{false};
+  esp_adc_cal_characteristics_t cal_characteristics_[SOC_ADC_ATTEN_NUM] = {};
+#endif  // USE_ESP32
 };

 }  // namespace adc
@@ -18,15 +18,15 @@ const LogString *sampling_mode_to_str(SamplingMode mode) {
   return LOG_STR("unknown");
 }

-template<typename T> Aggregator<T>::Aggregator(SamplingMode mode) {
+Aggregator::Aggregator(SamplingMode mode) {
   this->mode_ = mode;
   // set to max uint if mode is "min"
   if (mode == SamplingMode::MIN) {
-    this->aggr_ = std::numeric_limits<T>::max();
+    this->aggr_ = UINT32_MAX;
   }
 }

-template<typename T> void Aggregator<T>::add_sample(T value) {
+void Aggregator::add_sample(uint32_t value) {
   this->samples_ += 1;

   switch (this->mode_) {
@@ -47,7 +47,7 @@ template<typename T> void Aggregator<T>::add_sample(T value) {
   }
 }

-template<typename T> T Aggregator<T>::aggregate() {
+uint32_t Aggregator::aggregate() {
   if (this->mode_ == SamplingMode::AVG) {
     if (this->samples_ == 0) {
       return this->aggr_;
@@ -59,12 +59,6 @@ template<typename T> T Aggregator<T>::aggregate() {
     return this->aggr_;
   }

-#ifdef USE_ZEPHYR
-template class Aggregator<int32_t>;
-#else
-template class Aggregator<uint32_t>;
-#endif
-
 void ADCSensor::update() {
   float value_v = this->sample();
   ESP_LOGV(TAG, "'%s': Voltage=%.4fV", this->get_name().c_str(), value_v);
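
The Aggregator collapses N raw samples into one value according to the configured sampling mode; the dev branch merely makes it generic over the sample type so Zephyr can feed signed readings. A Python sketch of the same min/max/avg semantics (the C++ average uses integer arithmetic; this sketch returns a float):

```python
from enum import Enum


class SamplingMode(Enum):
    AVG = 0
    MIN = 1
    MAX = 2


class Aggregator:
    """Python sketch of the C++ Aggregator shown above."""

    def __init__(self, mode: SamplingMode):
        self.mode = mode
        self.samples = 0
        # MIN mode starts from the maximum representable value, as in the C++ code.
        self.aggr = float("inf") if mode == SamplingMode.MIN else 0

    def add_sample(self, value):
        self.samples += 1
        if self.mode == SamplingMode.AVG:
            self.aggr += value
        elif self.mode == SamplingMode.MIN:
            self.aggr = min(self.aggr, value)
        else:
            self.aggr = max(self.aggr, value)

    def aggregate(self):
        if self.mode == SamplingMode.AVG:
            return self.aggr / self.samples if self.samples else self.aggr
        return self.aggr


agg = Aggregator(SamplingMode.AVG)
for raw in (1000, 1020, 980):
    agg.add_sample(raw)
print(agg.aggregate())  # 1000.0
```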
@@ -8,313 +8,145 @@ namespace adc {
 static const char *const TAG = "adc.esp32";

(This hunk replaces the entire ESP32 implementation of the ADC sensor; the two sides share almost no lines, so the change is summarized per function rather than row by row.)

dev side (removed lines, new esp_adc oneshot driver):
- Declares `adc_oneshot_unit_handle_t ADCSensor::shared_adc_handles[2]` so each ADC unit is initialized once and shared between sensors, plus `attenuation_to_str()` and `adc_unit_to_str()` logging helpers.
- `setup()` creates the oneshot unit with `adc_oneshot_new_unit()` (selecting `ADC_DIGI_CLK_SRC_DEFAULT` on C3/C5/C6/H2), configures the channel with `adc_oneshot_config_channel()` at `ADC_BITWIDTH_DEFAULT`, then creates a calibration scheme: `adc_cali_create_scheme_curve_fitting()` on C3/C5/C6/S3/H2/P4 (passing the channel on ESP-IDF >= 5.3), `adc_cali_create_scheme_line_fitting()` with a 1100 mV default vref elsewhere. Progress is tracked in `setup_flags_`, and errors log, call `mark_failed()` and return.
- `dump_config()` logs channel, unit, attenuation (or "Auto"), sample count, sampling mode, and the per-step setup status flags.
- `sample()` dispatches to `sample_fixed_attenuation_()` or `sample_autorange_()`. The fixed path aggregates `adc_oneshot_read()` results in an `Aggregator<uint32_t>`, returns the raw aggregate when `output_raw_` is set, converts with `adc_cali_raw_to_voltage()` when a calibration handle exists (dropping the handle on conversion failure), and otherwise falls back to `raw * 3.3f / 4095.0f`.
- `sample_autorange_()` uses a `read_atten` lambda that reconfigures the channel and rebuilds the calibration handle for each attenuation, reads 12 dB, then 6 dB, 2.5 dB and 0 dB while the previous reading is below full scale, and blends the voltages with weights `c12 = min(raw12, 2048)`, `c6 = 2048 - |raw6 - 2048|`, `c2 = 2048 - |raw2 - 2048|`, `c0 = min(4095 - raw0, 2048)`, returning NAN if the weight sum is zero.

2025.7.2 side (added lines, legacy driver/adc.h + esp_adc_cal API):
- Defines `ADC_WIDTH_MAX_SOC_BITS`, a `SOC_ADC_RTC_MAX_BITWIDTH` fallback (13 bits on the S2, otherwise 12), and `ADC_MAX` / `ADC_HALF` derived from it.
- `setup()` logs "Running setup", calls `adc1_config_width()` / `adc1_config_channel_atten()` or `adc2_config_channel_atten()` depending on which channel is set, then characterizes every attenuation with `esp_adc_cal_characterize()` (default vref 1100 mV), logging whether eFuse Vref or two-point eFuse calibration was found.
- `dump_config()` prints the attenuation as "auto" / "0 db" / "2.5 db" / "6 db" / "12 db" together with sample count and sampling mode.
- `sample()` in fixed-attenuation mode aggregates `adc1_get_raw()` / `adc2_get_raw()` readings (returning NAN on a failed read, or the raw aggregate when `output_raw_` is set) and converts with `esp_adc_cal_raw_to_voltage()`; in autorange mode it steps the attenuation down 12 dB -> 6 dB -> 2.5 dB -> 0 dB while the raw value is saturated at ADC_MAX.

@@ -323,19 +155,19 @@ float ADCSensor::sample_autorange_()

(The second hunk is the weighted-blend epilogue of the autorange path: dev uses a local `adc_half = 2048`, guards `csum == 0` with an error and NAN, and returns `(mv12*c12 + mv6*c6 + mv2*c2 + mv0*c0) / csum`; 2025.7.2 converts the four raw readings with `esp_adc_cal_raw_to_voltage()` against the per-attenuation characteristics and returns `mv_scaled / (csum * 1000)` using the same c12/c6/c2/c0 weights built from ADC_HALF and ADC_MAX.)

 }  // namespace adc
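
In both versions the autorange path reads the channel at all four attenuations and blends the calibrated millivolt values, weighting each attenuation by how far its raw reading sits from the rails. A small numeric Python sketch of that weighting; the 12-bit constants match the code above, while the raw/mV sample values are invented for illustration:

```python
ADC_MAX = 4095   # 12-bit full scale
ADC_HALF = 2048


def blend_autorange(readings):
    """readings: list of (raw, millivolts) at 12 dB, 6 dB, 2.5 dB, 0 dB attenuation."""
    (raw12, mv12), (raw6, mv6), (raw2, mv2), (raw0, mv0) = readings
    # Weight each attenuation by its distance from saturation (high end) or zero (low end).
    c12 = min(raw12, ADC_HALF)
    c6 = ADC_HALF - abs(raw6 - ADC_HALF)
    c2 = ADC_HALF - abs(raw2 - ADC_HALF)
    c0 = min(ADC_MAX - raw0, ADC_HALF)
    csum = c12 + c6 + c2 + c0
    if csum == 0:
        return float("nan")
    # Weighted average of the calibrated voltages, returned in volts.
    return (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum / 1000.0


# Example: a mid-scale signal where the 6 dB reading carries the largest weight.
print(blend_autorange([(620, 950), (1900, 960), (3800, 955), (4095, 940)]))  # ~0.957
```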
@@ -17,6 +17,7 @@ namespace adc {
 static const char *const TAG = "adc.esp8266";

 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
 #ifndef USE_ADC_SENSOR_VCC
   this->pin_->setup();
 #endif
@@ -37,7 +38,7 @@ void ADCSensor::dump_config() {
 }

 float ADCSensor::sample() {
-  auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
+  auto aggr = Aggregator(this->sampling_mode_);

   for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
     uint32_t raw = 0;
@@ -55,6 +56,8 @@ float ADCSensor::sample() {
   return aggr.aggregate() / 1024.0f;
 }

+std::string ADCSensor::unique_id() { return get_mac_address() + "-adc"; }
+
 }  // namespace adc
 }  // namespace esphome
@@ -9,6 +9,7 @@ namespace adc {
 static const char *const TAG = "adc.libretiny";

 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
 #ifndef USE_ADC_SENSOR_VCC
   this->pin_->setup();
 #endif  // !USE_ADC_SENSOR_VCC
@@ -30,7 +31,7 @@ void ADCSensor::dump_config() {

 float ADCSensor::sample() {
   uint32_t raw = 0;
-  auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
+  auto aggr = Aggregator(this->sampling_mode_);

   if (this->output_raw_) {
     for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
@@ -14,6 +14,7 @@ namespace adc {
 static const char *const TAG = "adc.rp2040";

 void ADCSensor::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
   static bool initialized = false;
   if (!initialized) {
     adc_init();
@@ -41,7 +42,7 @@ void ADCSensor::dump_config() {

 float ADCSensor::sample() {
   uint32_t raw = 0;
-  auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
+  auto aggr = Aggregator(this->sampling_mode_);

   if (this->is_temperature_) {
     adc_set_temp_sensor_enabled(true);
@@ -1,207 +0,0 @@
-#include "adc_sensor.h"
-#ifdef USE_ZEPHYR
-#include "esphome/core/log.h"
-
-#include "hal/nrf_saadc.h"
-
-namespace esphome {
-namespace adc {
-
-static const char *const TAG = "adc.zephyr";
-
-void ADCSensor::setup() {
-  if (!adc_is_ready_dt(this->channel_)) {
-    ESP_LOGE(TAG, "ADC controller device %s not ready", this->channel_->dev->name);
-    return;
-  }
-
-  auto err = adc_channel_setup_dt(this->channel_);
-  if (err < 0) {
-    ESP_LOGE(TAG, "Could not setup channel %s (%d)", this->channel_->dev->name, err);
-    return;
-  }
-}
-
-#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE

(Three verbose-only helpers follow in the removed file: gain_to_str() maps every adc_gain value from ADC_GAIN_1_6 through ADC_GAIN_128 to a log string, reference_to_str() maps adc_reference values (VDD, VDD/2, VDD/3, VDD/4, INTERNAL, External input 0/1), and input_to_str() maps NRF_SAADC_INPUT_AIN0-AIN7, VDD and VDDHDIV5; each returns an "undefined ..." string for unknown values.)

-#endif  // ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
-
-void ADCSensor::dump_config() {
-  LOG_SENSOR("", "ADC Sensor", this);
-  LOG_PIN("  Pin: ", this->pin_);
-#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
-  ESP_LOGV(TAG,
-           "  Name: %s\n"
-           "  Channel: %d\n"
-           "  vref_mv: %d\n"
-           "  Resolution %d\n"
-           "  Oversampling %d",
-           this->channel_->dev->name, this->channel_->channel_id, this->channel_->vref_mv, this->channel_->resolution,
-           this->channel_->oversampling);
-
-  ESP_LOGV(TAG,
-           "  Gain: %s\n"
-           "  reference: %s\n"
-           "  acquisition_time: %d\n"
-           "  differential %s",
-           LOG_STR_ARG(gain_to_str(this->channel_->channel_cfg.gain)),
-           LOG_STR_ARG(reference_to_str(this->channel_->channel_cfg.reference)),
-           this->channel_->channel_cfg.acquisition_time, YESNO(this->channel_->channel_cfg.differential));
-  if (this->channel_->channel_cfg.differential) {
-    ESP_LOGV(TAG,
-             "  Positive: %s\n"
-             "  Negative: %s",
-             LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)),
-             LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_negative)));
-  } else {
-    ESP_LOGV(TAG, "  Positive: %s", LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)));
-  }
-#endif
-
-  LOG_UPDATE_INTERVAL(this);
-}
-
-float ADCSensor::sample() {
-  auto aggr = Aggregator<int32_t>(this->sampling_mode_);
-  int err;
-  for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
-    int16_t buf = 0;
-    struct adc_sequence sequence = {
-        .buffer = &buf,
-        /* buffer size in bytes, not number of samples */
-        .buffer_size = sizeof(buf),
-    };
-    int32_t val_raw;
-
-    err = adc_sequence_init_dt(this->channel_, &sequence);
-    if (err < 0) {
-      ESP_LOGE(TAG, "Could sequence init %s (%d)", this->channel_->dev->name, err);
-      return 0.0;
-    }
-
-    err = adc_read(this->channel_->dev, &sequence);
-    if (err < 0) {
-      ESP_LOGE(TAG, "Could not read %s (%d)", this->channel_->dev->name, err);
-      return 0.0;
-    }
-
-    val_raw = (int32_t) buf;
-    if (!this->channel_->channel_cfg.differential) {
// https://github.com/adafruit/Adafruit_nRF52_Arduino/blob/0ed4d9ffc674ae407be7cacf5696a02f5e789861/cores/nRF5/wiring_analog_nRF52.c#L222
|
|
||||||
if (val_raw < 0) {
|
|
||||||
val_raw = 0;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
aggr.add_sample(val_raw);
|
|
||||||
}
|
|
||||||
|
|
||||||
int32_t val_mv = aggr.aggregate();
|
|
||||||
|
|
||||||
if (this->output_raw_) {
|
|
||||||
return val_mv;
|
|
||||||
}
|
|
||||||
|
|
||||||
err = adc_raw_to_millivolts_dt(this->channel_, &val_mv);
|
|
||||||
/* conversion to mV may not be supported, skip if not */
|
|
||||||
if (err < 0) {
|
|
||||||
ESP_LOGE(TAG, "Value in mV not available %s (%d)", this->channel_->dev->name, err);
|
|
||||||
return 0.0;
|
|
||||||
}
|
|
||||||
|
|
||||||
return val_mv / 1000.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
||||||
#endif
|
|
@ -3,12 +3,6 @@ import logging
|
|||||||
import esphome.codegen as cg
|
import esphome.codegen as cg
|
||||||
from esphome.components import sensor, voltage_sampler
|
from esphome.components import sensor, voltage_sampler
|
||||||
from esphome.components.esp32 import get_esp32_variant
|
from esphome.components.esp32 import get_esp32_variant
|
||||||
from esphome.components.nrf52.const import AIN_TO_GPIO, EXTRA_ADC
|
|
||||||
from esphome.components.zephyr import (
|
|
||||||
zephyr_add_overlay,
|
|
||||||
zephyr_add_prj_conf,
|
|
||||||
zephyr_add_user,
|
|
||||||
)
|
|
||||||
import esphome.config_validation as cv
|
import esphome.config_validation as cv
|
||||||
from esphome.const import (
|
from esphome.const import (
|
||||||
CONF_ATTENUATION,
|
CONF_ATTENUATION,
|
||||||
@ -16,12 +10,13 @@ from esphome.const import (
|
|||||||
CONF_NUMBER,
|
CONF_NUMBER,
|
||||||
CONF_PIN,
|
CONF_PIN,
|
||||||
CONF_RAW,
|
CONF_RAW,
|
||||||
|
CONF_WIFI,
|
||||||
DEVICE_CLASS_VOLTAGE,
|
DEVICE_CLASS_VOLTAGE,
|
||||||
PLATFORM_NRF52,
|
|
||||||
STATE_CLASS_MEASUREMENT,
|
STATE_CLASS_MEASUREMENT,
|
||||||
UNIT_VOLT,
|
UNIT_VOLT,
|
||||||
)
|
)
|
||||||
from esphome.core import CORE
|
from esphome.core import CORE
|
||||||
|
import esphome.final_validate as fv
|
||||||
|
|
||||||
from . import (
|
from . import (
|
||||||
ATTENUATION_MODES,
|
ATTENUATION_MODES,
|
||||||
@ -29,7 +24,6 @@ from . import (
|
|||||||
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL,
|
ESP32_VARIANT_ADC2_PIN_TO_CHANNEL,
|
||||||
SAMPLING_MODES,
|
SAMPLING_MODES,
|
||||||
adc_ns,
|
adc_ns,
|
||||||
adc_unit_t,
|
|
||||||
validate_adc_pin,
|
validate_adc_pin,
|
||||||
)
|
)
|
||||||
|
|
||||||
@ -63,14 +57,25 @@ def validate_config(config):
|
|||||||
return config
|
return config
|
||||||
|
|
||||||
|
|
||||||
|
def final_validate_config(config):
|
||||||
|
if CORE.is_esp32:
|
||||||
|
variant = get_esp32_variant()
|
||||||
|
if (
|
||||||
|
CONF_WIFI in fv.full_config.get()
|
||||||
|
and config[CONF_PIN][CONF_NUMBER]
|
||||||
|
in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
|
||||||
|
):
|
||||||
|
raise cv.Invalid(
|
||||||
|
f"{variant} doesn't support ADC on this pin when Wi-Fi is configured"
|
||||||
|
)
|
||||||
|
|
||||||
|
return config
|
||||||
|
|
||||||
|
|
||||||
ADCSensor = adc_ns.class_(
|
ADCSensor = adc_ns.class_(
|
||||||
"ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
|
"ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
|
||||||
)
|
)
|
||||||
|
|
||||||
CONF_NRF_SAADC = "nrf_saadc"
|
|
||||||
|
|
||||||
adc_dt_spec = cg.global_ns.class_("adc_dt_spec")
|
|
||||||
|
|
||||||
CONFIG_SCHEMA = cv.All(
|
CONFIG_SCHEMA = cv.All(
|
||||||
sensor.sensor_schema(
|
sensor.sensor_schema(
|
||||||
ADCSensor,
|
ADCSensor,
|
||||||
@ -86,7 +91,6 @@ CONFIG_SCHEMA = cv.All(
|
|||||||
cv.SplitDefault(CONF_ATTENUATION, esp32="0db"): cv.All(
|
cv.SplitDefault(CONF_ATTENUATION, esp32="0db"): cv.All(
|
||||||
cv.only_on_esp32, _attenuation
|
cv.only_on_esp32, _attenuation
|
||||||
),
|
),
|
||||||
cv.OnlyWith(CONF_NRF_SAADC, PLATFORM_NRF52): cv.declare_id(adc_dt_spec),
|
|
||||||
cv.Optional(CONF_SAMPLES, default=1): cv.int_range(min=1, max=255),
|
cv.Optional(CONF_SAMPLES, default=1): cv.int_range(min=1, max=255),
|
||||||
cv.Optional(CONF_SAMPLING_MODE, default="avg"): _sampling_mode,
|
cv.Optional(CONF_SAMPLING_MODE, default="avg"): _sampling_mode,
|
||||||
}
|
}
|
||||||
@ -95,7 +99,7 @@ CONFIG_SCHEMA = cv.All(
|
|||||||
validate_config,
|
validate_config,
|
||||||
)
|
)
|
||||||
|
|
||||||
CONF_ADC_CHANNEL_ID = "adc_channel_id"
|
FINAL_VALIDATE_SCHEMA = final_validate_config
|
||||||
|
|
||||||
|
|
||||||
async def to_code(config):
|
async def to_code(config):
|
||||||
@ -107,7 +111,7 @@ async def to_code(config):
|
|||||||
cg.add_define("USE_ADC_SENSOR_VCC")
|
cg.add_define("USE_ADC_SENSOR_VCC")
|
||||||
elif config[CONF_PIN] == "TEMPERATURE":
|
elif config[CONF_PIN] == "TEMPERATURE":
|
||||||
cg.add(var.set_is_temperature())
|
cg.add(var.set_is_temperature())
|
||||||
elif not CORE.is_nrf52 or config[CONF_PIN][CONF_NUMBER] not in EXTRA_ADC:
|
else:
|
||||||
pin = await cg.gpio_pin_expression(config[CONF_PIN])
|
pin = await cg.gpio_pin_expression(config[CONF_PIN])
|
||||||
cg.add(var.set_pin(pin))
|
cg.add(var.set_pin(pin))
|
||||||
|
|
||||||
@ -115,13 +119,13 @@ async def to_code(config):
|
|||||||
cg.add(var.set_sample_count(config[CONF_SAMPLES]))
|
cg.add(var.set_sample_count(config[CONF_SAMPLES]))
|
||||||
cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
|
cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
|
||||||
|
|
||||||
if CORE.is_esp32:
|
if attenuation := config.get(CONF_ATTENUATION):
|
||||||
if attenuation := config.get(CONF_ATTENUATION):
|
if attenuation == "auto":
|
||||||
if attenuation == "auto":
|
cg.add(var.set_autorange(cg.global_ns.true))
|
||||||
cg.add(var.set_autorange(cg.global_ns.true))
|
else:
|
||||||
else:
|
cg.add(var.set_attenuation(attenuation))
|
||||||
cg.add(var.set_attenuation(attenuation))
|
|
||||||
|
|
||||||
|
if CORE.is_esp32:
|
||||||
variant = get_esp32_variant()
|
variant = get_esp32_variant()
|
||||||
pin_num = config[CONF_PIN][CONF_NUMBER]
|
pin_num = config[CONF_PIN][CONF_NUMBER]
|
||||||
if (
|
if (
|
||||||
@ -129,48 +133,10 @@ async def to_code(config):
|
|||||||
and pin_num in ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant]
|
and pin_num in ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant]
|
||||||
):
|
):
|
||||||
chan = ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin_num]
|
chan = ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin_num]
|
||||||
cg.add(var.set_channel(adc_unit_t.ADC_UNIT_1, chan))
|
cg.add(var.set_channel1(chan))
|
||||||
elif (
|
elif (
|
||||||
variant in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL
|
variant in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL
|
||||||
and pin_num in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
|
and pin_num in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
|
||||||
):
|
):
|
||||||
chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
|
chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
|
||||||
cg.add(var.set_channel(adc_unit_t.ADC_UNIT_2, chan))
|
cg.add(var.set_channel2(chan))
|
||||||
|
|
||||||
elif CORE.is_nrf52:
|
|
||||||
CORE.data.setdefault(CONF_ADC_CHANNEL_ID, 0)
|
|
||||||
channel_id = CORE.data[CONF_ADC_CHANNEL_ID]
|
|
||||||
CORE.data[CONF_ADC_CHANNEL_ID] = channel_id + 1
|
|
||||||
zephyr_add_prj_conf("ADC", True)
|
|
||||||
nrf_saadc = config[CONF_NRF_SAADC]
|
|
||||||
rhs = cg.RawExpression(
|
|
||||||
f"ADC_DT_SPEC_GET_BY_IDX(DT_PATH(zephyr_user), {channel_id})"
|
|
||||||
)
|
|
||||||
adc = cg.new_Pvariable(nrf_saadc, rhs)
|
|
||||||
cg.add(var.set_adc_channel(adc))
|
|
||||||
gain = "ADC_GAIN_1_6"
|
|
||||||
pin_number = config[CONF_PIN][CONF_NUMBER]
|
|
||||||
if pin_number == "VDDHDIV5":
|
|
||||||
gain = "ADC_GAIN_1_2"
|
|
||||||
if isinstance(pin_number, int):
|
|
||||||
GPIO_TO_AIN = {v: k for k, v in AIN_TO_GPIO.items()}
|
|
||||||
pin_number = GPIO_TO_AIN[pin_number]
|
|
||||||
zephyr_add_user("io-channels", f"<&adc {channel_id}>")
|
|
||||||
zephyr_add_overlay(
|
|
||||||
f"""
|
|
||||||
&adc {{
|
|
||||||
#address-cells = <1>;
|
|
||||||
#size-cells = <0>;
|
|
||||||
|
|
||||||
channel@{channel_id} {{
|
|
||||||
reg = <{channel_id}>;
|
|
||||||
zephyr,gain = "{gain}";
|
|
||||||
zephyr,reference = "ADC_REF_INTERNAL";
|
|
||||||
zephyr,acquisition-time = <ADC_ACQ_TIME_DEFAULT>;
|
|
||||||
zephyr,input-positive = <NRF_SAADC_{pin_number}>;
|
|
||||||
zephyr,resolution = <14>;
|
|
||||||
zephyr,oversampling = <8>;
|
|
||||||
}};
|
|
||||||
}};
|
|
||||||
"""
|
|
||||||
)
|
|
||||||
|
@ -8,7 +8,10 @@ static const char *const TAG = "adc128s102";
|
|||||||
|
|
||||||
float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
|
float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
|
||||||
|
|
||||||
void ADC128S102::setup() { this->spi_setup(); }
|
void ADC128S102::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
this->spi_setup();
|
||||||
|
}
|
||||||
|
|
||||||
void ADC128S102::dump_config() {
|
void ADC128S102::dump_config() {
|
||||||
ESP_LOGCONFIG(TAG, "ADC128S102:");
|
ESP_LOGCONFIG(TAG, "ADC128S102:");
|
||||||
|
@ -10,6 +10,7 @@ static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
|
|||||||
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
|
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
|
||||||
|
|
||||||
void ADS1115Component::setup() {
|
void ADS1115Component::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
uint16_t value;
|
uint16_t value;
|
||||||
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
|
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
|
||||||
this->mark_failed();
|
this->mark_failed();
|
||||||
|
@ -9,6 +9,7 @@ static const char *const TAG = "ads1118";
|
|||||||
static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
|
static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
|
||||||
|
|
||||||
void ADS1118::setup() {
|
void ADS1118::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
this->spi_setup();
|
this->spi_setup();
|
||||||
|
|
||||||
this->config_ = 0;
|
this->config_ = 0;
|
||||||
|
@ -24,6 +24,8 @@ static const uint16_t ZP_CURRENT = 0x0000;
|
|||||||
static const uint16_t ZP_DEFAULT = 0xFFFF;
|
static const uint16_t ZP_DEFAULT = 0xFFFF;
|
||||||
|
|
||||||
void AGS10Component::setup() {
|
void AGS10Component::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
|
||||||
auto version = this->read_version_();
|
auto version = this->read_version_();
|
||||||
if (version) {
|
if (version) {
|
||||||
ESP_LOGD(TAG, "AGS10 Sensor Version: 0x%02X", *version);
|
ESP_LOGD(TAG, "AGS10 Sensor Version: 0x%02X", *version);
|
||||||
@ -43,6 +45,8 @@ void AGS10Component::setup() {
|
|||||||
} else {
|
} else {
|
||||||
ESP_LOGE(TAG, "AGS10 Sensor Resistance: unknown");
|
ESP_LOGE(TAG, "AGS10 Sensor Resistance: unknown");
|
||||||
}
|
}
|
||||||
|
|
||||||
|
ESP_LOGD(TAG, "Sensor initialized");
|
||||||
}
|
}
|
||||||
|
|
||||||
void AGS10Component::update() {
|
void AGS10Component::update() {
|
||||||
|
@ -38,6 +38,8 @@ static const uint8_t AHT10_STATUS_BUSY = 0x80;
|
|||||||
static const float AHT10_DIVISOR = 1048576.0f; // 2^20, used for temperature and humidity calculations
|
static const float AHT10_DIVISOR = 1048576.0f; // 2^20, used for temperature and humidity calculations
|
||||||
|
|
||||||
void AHT10Component::setup() {
|
void AHT10Component::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
|
||||||
if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
|
if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
|
||||||
ESP_LOGE(TAG, "Reset failed");
|
ESP_LOGE(TAG, "Reset failed");
|
||||||
}
|
}
|
||||||
@ -78,6 +80,8 @@ void AHT10Component::setup() {
|
|||||||
this->mark_failed();
|
this->mark_failed();
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
ESP_LOGV(TAG, "Initialization complete");
|
||||||
}
|
}
|
||||||
|
|
||||||
void AHT10Component::restart_read_() {
|
void AHT10Component::restart_read_() {
|
||||||
|
@ -17,6 +17,8 @@ static const char *const TAG = "aic3204";
|
|||||||
}
|
}
|
||||||
|
|
||||||
void AIC3204::setup() {
|
void AIC3204::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
|
||||||
// Set register page to 0
|
// Set register page to 0
|
||||||
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
|
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
|
||||||
// Initiate SW reset (PLL is powered off as part of reset)
|
// Initiate SW reset (PLL is powered off as part of reset)
|
||||||
|
@ -1 +1 @@
|
|||||||
CODEOWNERS = ["@jeromelaban", "@precurse"]
|
CODEOWNERS = ["@jeromelaban"]
|
||||||
|
@ -73,29 +73,11 @@ void AirthingsWavePlus::dump_config() {
|
|||||||
LOG_SENSOR(" ", "Illuminance", this->illuminance_sensor_);
|
LOG_SENSOR(" ", "Illuminance", this->illuminance_sensor_);
|
||||||
}
|
}
|
||||||
|
|
||||||
void AirthingsWavePlus::setup() {
|
AirthingsWavePlus::AirthingsWavePlus() {
|
||||||
const char *service_uuid;
|
this->service_uuid_ = espbt::ESPBTUUID::from_raw(SERVICE_UUID);
|
||||||
const char *characteristic_uuid;
|
this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(CHARACTERISTIC_UUID);
|
||||||
const char *access_control_point_characteristic_uuid;
|
|
||||||
|
|
||||||
// Change UUIDs for Wave Radon Gen2
|
|
||||||
switch (this->wave_device_type_) {
|
|
||||||
case WaveDeviceType::WAVE_GEN2:
|
|
||||||
service_uuid = SERVICE_UUID_WAVE_RADON_GEN2;
|
|
||||||
characteristic_uuid = CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
|
|
||||||
access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
|
|
||||||
break;
|
|
||||||
default:
|
|
||||||
// Wave Plus
|
|
||||||
service_uuid = SERVICE_UUID;
|
|
||||||
characteristic_uuid = CHARACTERISTIC_UUID;
|
|
||||||
access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID;
|
|
||||||
}
|
|
||||||
|
|
||||||
this->service_uuid_ = espbt::ESPBTUUID::from_raw(service_uuid);
|
|
||||||
this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(characteristic_uuid);
|
|
||||||
this->access_control_point_characteristic_uuid_ =
|
this->access_control_point_characteristic_uuid_ =
|
||||||
espbt::ESPBTUUID::from_raw(access_control_point_characteristic_uuid);
|
espbt::ESPBTUUID::from_raw(ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID);
|
||||||
}
|
}
|
||||||
|
|
||||||
} // namespace airthings_wave_plus
|
} // namespace airthings_wave_plus
|
||||||
|
@ -9,20 +9,13 @@ namespace airthings_wave_plus {
|
|||||||
|
|
||||||
namespace espbt = esphome::esp32_ble_tracker;
|
namespace espbt = esphome::esp32_ble_tracker;
|
||||||
|
|
||||||
enum WaveDeviceType : uint8_t { WAVE_PLUS = 0, WAVE_GEN2 = 1 };
|
|
||||||
|
|
||||||
static const char *const SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba";
|
static const char *const SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba";
|
||||||
static const char *const CHARACTERISTIC_UUID = "b42e2a68-ade7-11e4-89d3-123b93f75cba";
|
static const char *const CHARACTERISTIC_UUID = "b42e2a68-ade7-11e4-89d3-123b93f75cba";
|
||||||
static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID = "b42e2d06-ade7-11e4-89d3-123b93f75cba";
|
static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID = "b42e2d06-ade7-11e4-89d3-123b93f75cba";
|
||||||
|
|
||||||
static const char *const SERVICE_UUID_WAVE_RADON_GEN2 = "b42e4a8e-ade7-11e4-89d3-123b93f75cba";
|
|
||||||
static const char *const CHARACTERISTIC_UUID_WAVE_RADON_GEN2 = "b42e4dcc-ade7-11e4-89d3-123b93f75cba";
|
|
||||||
static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2 =
|
|
||||||
"b42e50d8-ade7-11e4-89d3-123b93f75cba";
|
|
||||||
|
|
||||||
class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
|
class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
|
||||||
public:
|
public:
|
||||||
void setup() override;
|
AirthingsWavePlus();
|
||||||
|
|
||||||
void dump_config() override;
|
void dump_config() override;
|
||||||
|
|
||||||
@ -30,14 +23,12 @@ class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
|
|||||||
void set_radon_long_term(sensor::Sensor *radon_long_term) { radon_long_term_sensor_ = radon_long_term; }
|
void set_radon_long_term(sensor::Sensor *radon_long_term) { radon_long_term_sensor_ = radon_long_term; }
|
||||||
void set_co2(sensor::Sensor *co2) { co2_sensor_ = co2; }
|
void set_co2(sensor::Sensor *co2) { co2_sensor_ = co2; }
|
||||||
void set_illuminance(sensor::Sensor *illuminance) { illuminance_sensor_ = illuminance; }
|
void set_illuminance(sensor::Sensor *illuminance) { illuminance_sensor_ = illuminance; }
|
||||||
void set_device_type(WaveDeviceType wave_device_type) { wave_device_type_ = wave_device_type; }
|
|
||||||
|
|
||||||
protected:
|
protected:
|
||||||
bool is_valid_radon_value_(uint16_t radon);
|
bool is_valid_radon_value_(uint16_t radon);
|
||||||
bool is_valid_co2_value_(uint16_t co2);
|
bool is_valid_co2_value_(uint16_t co2);
|
||||||
|
|
||||||
void read_sensors(uint8_t *raw_value, uint16_t value_len) override;
|
void read_sensors(uint8_t *raw_value, uint16_t value_len) override;
|
||||||
WaveDeviceType wave_device_type_{WaveDeviceType::WAVE_PLUS};
|
|
||||||
|
|
||||||
sensor::Sensor *radon_sensor_{nullptr};
|
sensor::Sensor *radon_sensor_{nullptr};
|
||||||
sensor::Sensor *radon_long_term_sensor_{nullptr};
|
sensor::Sensor *radon_long_term_sensor_{nullptr};
|
||||||
|
@ -7,7 +7,6 @@ from esphome.const import (
|
|||||||
CONF_ILLUMINANCE,
|
CONF_ILLUMINANCE,
|
||||||
CONF_RADON,
|
CONF_RADON,
|
||||||
CONF_RADON_LONG_TERM,
|
CONF_RADON_LONG_TERM,
|
||||||
CONF_TVOC,
|
|
||||||
DEVICE_CLASS_CARBON_DIOXIDE,
|
DEVICE_CLASS_CARBON_DIOXIDE,
|
||||||
DEVICE_CLASS_ILLUMINANCE,
|
DEVICE_CLASS_ILLUMINANCE,
|
||||||
ICON_RADIOACTIVE,
|
ICON_RADIOACTIVE,
|
||||||
@ -16,7 +15,6 @@ from esphome.const import (
|
|||||||
UNIT_LUX,
|
UNIT_LUX,
|
||||||
UNIT_PARTS_PER_MILLION,
|
UNIT_PARTS_PER_MILLION,
|
||||||
)
|
)
|
||||||
from esphome.types import ConfigType
|
|
||||||
|
|
||||||
DEPENDENCIES = airthings_wave_base.DEPENDENCIES
|
DEPENDENCIES = airthings_wave_base.DEPENDENCIES
|
||||||
|
|
||||||
@ -27,59 +25,35 @@ AirthingsWavePlus = airthings_wave_plus_ns.class_(
|
|||||||
"AirthingsWavePlus", airthings_wave_base.AirthingsWaveBase
|
"AirthingsWavePlus", airthings_wave_base.AirthingsWaveBase
|
||||||
)
|
)
|
||||||
|
|
||||||
CONF_DEVICE_TYPE = "device_type"
|
|
||||||
WaveDeviceType = airthings_wave_plus_ns.enum("WaveDeviceType")
|
|
||||||
DEVICE_TYPES = {
|
|
||||||
"WAVE_PLUS": WaveDeviceType.WAVE_PLUS,
|
|
||||||
"WAVE_GEN2": WaveDeviceType.WAVE_GEN2,
|
|
||||||
}
|
|
||||||
|
|
||||||
|
CONFIG_SCHEMA = airthings_wave_base.BASE_SCHEMA.extend(
|
||||||
def validate_wave_gen2_config(config: ConfigType) -> ConfigType:
|
{
|
||||||
"""Validate that Wave Gen2 devices don't have CO2 or TVOC sensors."""
|
cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
|
||||||
if config[CONF_DEVICE_TYPE] == "WAVE_GEN2":
|
cv.Optional(CONF_RADON): sensor.sensor_schema(
|
||||||
if CONF_CO2 in config:
|
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
|
||||||
raise cv.Invalid("Wave Gen2 devices do not support CO2 sensor")
|
icon=ICON_RADIOACTIVE,
|
||||||
# Check for TVOC in the base schema config
|
accuracy_decimals=0,
|
||||||
if CONF_TVOC in config:
|
state_class=STATE_CLASS_MEASUREMENT,
|
||||||
raise cv.Invalid("Wave Gen2 devices do not support TVOC sensor")
|
),
|
||||||
return config
|
cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
|
||||||
|
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
|
||||||
|
icon=ICON_RADIOACTIVE,
|
||||||
CONFIG_SCHEMA = cv.All(
|
accuracy_decimals=0,
|
||||||
airthings_wave_base.BASE_SCHEMA.extend(
|
state_class=STATE_CLASS_MEASUREMENT,
|
||||||
{
|
),
|
||||||
cv.GenerateID(): cv.declare_id(AirthingsWavePlus),
|
cv.Optional(CONF_CO2): sensor.sensor_schema(
|
||||||
cv.Optional(CONF_RADON): sensor.sensor_schema(
|
unit_of_measurement=UNIT_PARTS_PER_MILLION,
|
||||||
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
|
accuracy_decimals=0,
|
||||||
icon=ICON_RADIOACTIVE,
|
device_class=DEVICE_CLASS_CARBON_DIOXIDE,
|
||||||
accuracy_decimals=0,
|
state_class=STATE_CLASS_MEASUREMENT,
|
||||||
state_class=STATE_CLASS_MEASUREMENT,
|
),
|
||||||
),
|
cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
|
||||||
cv.Optional(CONF_RADON_LONG_TERM): sensor.sensor_schema(
|
unit_of_measurement=UNIT_LUX,
|
||||||
unit_of_measurement=UNIT_BECQUEREL_PER_CUBIC_METER,
|
accuracy_decimals=0,
|
||||||
icon=ICON_RADIOACTIVE,
|
device_class=DEVICE_CLASS_ILLUMINANCE,
|
||||||
accuracy_decimals=0,
|
state_class=STATE_CLASS_MEASUREMENT,
|
||||||
state_class=STATE_CLASS_MEASUREMENT,
|
),
|
||||||
),
|
}
|
||||||
cv.Optional(CONF_CO2): sensor.sensor_schema(
|
|
||||||
unit_of_measurement=UNIT_PARTS_PER_MILLION,
|
|
||||||
accuracy_decimals=0,
|
|
||||||
device_class=DEVICE_CLASS_CARBON_DIOXIDE,
|
|
||||||
state_class=STATE_CLASS_MEASUREMENT,
|
|
||||||
),
|
|
||||||
cv.Optional(CONF_ILLUMINANCE): sensor.sensor_schema(
|
|
||||||
unit_of_measurement=UNIT_LUX,
|
|
||||||
accuracy_decimals=0,
|
|
||||||
device_class=DEVICE_CLASS_ILLUMINANCE,
|
|
||||||
state_class=STATE_CLASS_MEASUREMENT,
|
|
||||||
),
|
|
||||||
cv.Optional(CONF_DEVICE_TYPE, default="WAVE_PLUS"): cv.enum(
|
|
||||||
DEVICE_TYPES, upper=True
|
|
||||||
),
|
|
||||||
}
|
|
||||||
),
|
|
||||||
validate_wave_gen2_config,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
@ -99,4 +73,3 @@ async def to_code(config):
|
|||||||
if config_illuminance := config.get(CONF_ILLUMINANCE):
|
if config_illuminance := config.get(CONF_ILLUMINANCE):
|
||||||
sens = await sensor.new_sensor(config_illuminance)
|
sens = await sensor.new_sensor(config_illuminance)
|
||||||
cg.add(var.set_illuminance(sens))
|
cg.add(var.set_illuminance(sens))
|
||||||
cg.add(var.set_device_type(config[CONF_DEVICE_TYPE]))
|
|
||||||
|
@ -301,7 +301,8 @@ async def alarm_action_disarm_to_code(config, action_id, template_arg, args):
|
|||||||
)
|
)
|
||||||
async def alarm_action_pending_to_code(config, action_id, template_arg, args):
|
async def alarm_action_pending_to_code(config, action_id, template_arg, args):
|
||||||
paren = await cg.get_variable(config[CONF_ID])
|
paren = await cg.get_variable(config[CONF_ID])
|
||||||
return cg.new_Pvariable(action_id, template_arg, paren)
|
var = cg.new_Pvariable(action_id, template_arg, paren)
|
||||||
|
return var
|
||||||
|
|
||||||
|
|
||||||
@automation.register_action(
|
@automation.register_action(
|
||||||
@ -309,7 +310,8 @@ async def alarm_action_pending_to_code(config, action_id, template_arg, args):
|
|||||||
)
|
)
|
||||||
async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
|
async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
|
||||||
paren = await cg.get_variable(config[CONF_ID])
|
paren = await cg.get_variable(config[CONF_ID])
|
||||||
return cg.new_Pvariable(action_id, template_arg, paren)
|
var = cg.new_Pvariable(action_id, template_arg, paren)
|
||||||
|
return var
|
||||||
|
|
||||||
|
|
||||||
@automation.register_action(
|
@automation.register_action(
|
||||||
@ -317,7 +319,8 @@ async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
|
|||||||
)
|
)
|
||||||
async def alarm_action_chime_to_code(config, action_id, template_arg, args):
|
async def alarm_action_chime_to_code(config, action_id, template_arg, args):
|
||||||
paren = await cg.get_variable(config[CONF_ID])
|
paren = await cg.get_variable(config[CONF_ID])
|
||||||
return cg.new_Pvariable(action_id, template_arg, paren)
|
var = cg.new_Pvariable(action_id, template_arg, paren)
|
||||||
|
return var
|
||||||
|
|
||||||
|
|
||||||
@automation.register_action(
|
@automation.register_action(
|
||||||
@ -330,7 +333,8 @@ async def alarm_action_chime_to_code(config, action_id, template_arg, args):
|
|||||||
)
|
)
|
||||||
async def alarm_action_ready_to_code(config, action_id, template_arg, args):
|
async def alarm_action_ready_to_code(config, action_id, template_arg, args):
|
||||||
paren = await cg.get_variable(config[CONF_ID])
|
paren = await cg.get_variable(config[CONF_ID])
|
||||||
return cg.new_Pvariable(action_id, template_arg, paren)
|
var = cg.new_Pvariable(action_id, template_arg, paren)
|
||||||
|
return var
|
||||||
|
|
||||||
|
|
||||||
@automation.register_condition(
|
@automation.register_condition(
|
||||||
|
@ -90,6 +90,8 @@ bool AM2315C::convert_(uint8_t *data, float &humidity, float &temperature) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
void AM2315C::setup() {
|
void AM2315C::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
|
||||||
// get status
|
// get status
|
||||||
uint8_t status = 0;
|
uint8_t status = 0;
|
||||||
if (this->read(&status, 1) != i2c::ERROR_OK) {
|
if (this->read(&status, 1) != i2c::ERROR_OK) {
|
||||||
|
@ -34,6 +34,7 @@ void AM2320Component::update() {
|
|||||||
this->status_clear_warning();
|
this->status_clear_warning();
|
||||||
}
|
}
|
||||||
void AM2320Component::setup() {
|
void AM2320Component::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
uint8_t data[8];
|
uint8_t data[8];
|
||||||
data[0] = 0;
|
data[0] = 0;
|
||||||
data[1] = 4;
|
data[1] = 4;
|
||||||
|
@ -34,20 +34,17 @@ SetFrameAction = animation_ns.class_(
|
|||||||
"AnimationSetFrameAction", automation.Action, cg.Parented.template(Animation_)
|
"AnimationSetFrameAction", automation.Action, cg.Parented.template(Animation_)
|
||||||
)
|
)
|
||||||
|
|
||||||
CONFIG_SCHEMA = cv.All(
|
CONFIG_SCHEMA = espImage.IMAGE_SCHEMA.extend(
|
||||||
espImage.IMAGE_SCHEMA.extend(
|
{
|
||||||
{
|
cv.Required(CONF_ID): cv.declare_id(Animation_),
|
||||||
cv.Required(CONF_ID): cv.declare_id(Animation_),
|
cv.Optional(CONF_LOOP): cv.All(
|
||||||
cv.Optional(CONF_LOOP): cv.All(
|
{
|
||||||
{
|
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
|
||||||
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
|
cv.Optional(CONF_END_FRAME): cv.positive_int,
|
||||||
cv.Optional(CONF_END_FRAME): cv.positive_int,
|
cv.Optional(CONF_REPEAT): cv.positive_int,
|
||||||
cv.Optional(CONF_REPEAT): cv.positive_int,
|
}
|
||||||
}
|
),
|
||||||
),
|
},
|
||||||
},
|
|
||||||
),
|
|
||||||
espImage.validate_settings,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@ -54,6 +54,8 @@ enum { // APDS9306 registers
|
|||||||
}
|
}
|
||||||
|
|
||||||
void APDS9306::setup() {
|
void APDS9306::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
|
|
||||||
uint8_t id;
|
uint8_t id;
|
||||||
if (!this->read_byte(APDS9306_PART_ID, &id)) { // Part ID register
|
if (!this->read_byte(APDS9306_PART_ID, &id)) { // Part ID register
|
||||||
this->error_code_ = COMMUNICATION_FAILED;
|
this->error_code_ = COMMUNICATION_FAILED;
|
||||||
@ -84,6 +86,8 @@ void APDS9306::setup() {
|
|||||||
|
|
||||||
// Set to active mode
|
// Set to active mode
|
||||||
APDS9306_WRITE_BYTE(APDS9306_MAIN_CTRL, 0x02);
|
APDS9306_WRITE_BYTE(APDS9306_MAIN_CTRL, 0x02);
|
||||||
|
|
||||||
|
ESP_LOGCONFIG(TAG, "APDS9306 setup complete");
|
||||||
}
|
}
|
||||||
|
|
||||||
void APDS9306::dump_config() {
|
void APDS9306::dump_config() {
|
||||||
|
@ -15,6 +15,7 @@ static const char *const TAG = "apds9960";
|
|||||||
#define APDS9960_WRITE_BYTE(reg, value) APDS9960_ERROR_CHECK(this->write_byte(reg, value));
|
#define APDS9960_WRITE_BYTE(reg, value) APDS9960_ERROR_CHECK(this->write_byte(reg, value));
|
||||||
|
|
||||||
void APDS9960::setup() {
|
void APDS9960::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Running setup");
|
||||||
uint8_t id;
|
uint8_t id;
|
||||||
if (!this->read_byte(0x92, &id)) { // ID register
|
if (!this->read_byte(0x92, &id)) { // ID register
|
||||||
this->error_code_ = COMMUNICATION_FAILED;
|
this->error_code_ = COMMUNICATION_FAILED;
|
||||||
|
@ -53,8 +53,6 @@ SERVICE_ARG_NATIVE_TYPES = {
|
|||||||
CONF_ENCRYPTION = "encryption"
|
CONF_ENCRYPTION = "encryption"
|
||||||
CONF_BATCH_DELAY = "batch_delay"
|
CONF_BATCH_DELAY = "batch_delay"
|
||||||
CONF_CUSTOM_SERVICES = "custom_services"
|
CONF_CUSTOM_SERVICES = "custom_services"
|
||||||
CONF_HOMEASSISTANT_SERVICES = "homeassistant_services"
|
|
||||||
CONF_HOMEASSISTANT_STATES = "homeassistant_states"
|
|
||||||
|
|
||||||
|
|
||||||
def validate_encryption_key(value):
|
def validate_encryption_key(value):
|
||||||
@ -120,8 +118,6 @@ CONFIG_SCHEMA = cv.All(
|
|||||||
cv.Range(max=cv.TimePeriod(milliseconds=65535)),
|
cv.Range(max=cv.TimePeriod(milliseconds=65535)),
|
||||||
),
|
),
|
||||||
cv.Optional(CONF_CUSTOM_SERVICES, default=False): cv.boolean,
|
cv.Optional(CONF_CUSTOM_SERVICES, default=False): cv.boolean,
|
||||||
cv.Optional(CONF_HOMEASSISTANT_SERVICES, default=False): cv.boolean,
|
|
||||||
cv.Optional(CONF_HOMEASSISTANT_STATES, default=False): cv.boolean,
|
|
||||||
cv.Optional(CONF_ON_CLIENT_CONNECTED): automation.validate_automation(
|
cv.Optional(CONF_ON_CLIENT_CONNECTED): automation.validate_automation(
|
||||||
single=True
|
single=True
|
||||||
),
|
),
|
||||||
@ -150,12 +146,6 @@ async def to_code(config):
|
|||||||
if config.get(CONF_ACTIONS) or config[CONF_CUSTOM_SERVICES]:
|
if config.get(CONF_ACTIONS) or config[CONF_CUSTOM_SERVICES]:
|
||||||
cg.add_define("USE_API_SERVICES")
|
cg.add_define("USE_API_SERVICES")
|
||||||
|
|
||||||
if config[CONF_HOMEASSISTANT_SERVICES]:
|
|
||||||
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
|
|
||||||
|
|
||||||
if config[CONF_HOMEASSISTANT_STATES]:
|
|
||||||
cg.add_define("USE_API_HOMEASSISTANT_STATES")
|
|
||||||
|
|
||||||
if actions := config.get(CONF_ACTIONS, []):
|
if actions := config.get(CONF_ACTIONS, []):
|
||||||
for conf in actions:
|
for conf in actions:
|
||||||
template_args = []
|
template_args = []
|
||||||
@ -245,7 +235,6 @@ HOMEASSISTANT_ACTION_ACTION_SCHEMA = cv.All(
|
|||||||
HOMEASSISTANT_ACTION_ACTION_SCHEMA,
|
HOMEASSISTANT_ACTION_ACTION_SCHEMA,
|
||||||
)
|
)
|
||||||
async def homeassistant_service_to_code(config, action_id, template_arg, args):
|
async def homeassistant_service_to_code(config, action_id, template_arg, args):
|
||||||
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
|
|
||||||
serv = await cg.get_variable(config[CONF_ID])
|
serv = await cg.get_variable(config[CONF_ID])
|
||||||
var = cg.new_Pvariable(action_id, template_arg, serv, False)
|
var = cg.new_Pvariable(action_id, template_arg, serv, False)
|
||||||
templ = await cg.templatable(config[CONF_ACTION], args, None)
|
templ = await cg.templatable(config[CONF_ACTION], args, None)
|
||||||
@ -289,7 +278,6 @@ HOMEASSISTANT_EVENT_ACTION_SCHEMA = cv.Schema(
|
|||||||
HOMEASSISTANT_EVENT_ACTION_SCHEMA,
|
HOMEASSISTANT_EVENT_ACTION_SCHEMA,
|
||||||
)
|
)
|
||||||
async def homeassistant_event_to_code(config, action_id, template_arg, args):
|
async def homeassistant_event_to_code(config, action_id, template_arg, args):
|
||||||
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
|
|
||||||
serv = await cg.get_variable(config[CONF_ID])
|
serv = await cg.get_variable(config[CONF_ID])
|
||||||
var = cg.new_Pvariable(action_id, template_arg, serv, True)
|
var = cg.new_Pvariable(action_id, template_arg, serv, True)
|
||||||
templ = await cg.templatable(config[CONF_EVENT], args, None)
|
templ = await cg.templatable(config[CONF_EVENT], args, None)
|
||||||
@ -335,10 +323,9 @@ async def api_connected_to_code(config, condition_id, template_arg, args):
|
|||||||
|
|
||||||
|
|
||||||
def FILTER_SOURCE_FILES() -> list[str]:
|
def FILTER_SOURCE_FILES() -> list[str]:
|
||||||
"""Filter out api_pb2_dump.cpp when proto message dumping is not enabled,
|
"""Filter out api_pb2_dump.cpp when proto message dumping is not enabled
|
||||||
user_services.cpp when no services are defined, and protocol-specific
|
and user_services.cpp when no services are defined."""
|
||||||
implementations based on encryption configuration."""
|
files_to_filter = []
|
||||||
files_to_filter: list[str] = []
|
|
||||||
|
|
||||||
# api_pb2_dump.cpp is only needed when HAS_PROTO_MESSAGE_DUMP is defined
|
# api_pb2_dump.cpp is only needed when HAS_PROTO_MESSAGE_DUMP is defined
|
||||||
# This is a particularly large file that still needs to be opened and read
|
# This is a particularly large file that still needs to be opened and read
|
||||||
@ -354,16 +341,4 @@ def FILTER_SOURCE_FILES() -> list[str]:
|
|||||||
if config and not config.get(CONF_ACTIONS) and not config[CONF_CUSTOM_SERVICES]:
|
if config and not config.get(CONF_ACTIONS) and not config[CONF_CUSTOM_SERVICES]:
|
||||||
files_to_filter.append("user_services.cpp")
|
files_to_filter.append("user_services.cpp")
|
||||||
|
|
||||||
# Filter protocol-specific implementations based on encryption configuration
|
|
||||||
encryption_config = config.get(CONF_ENCRYPTION) if config else None
|
|
||||||
|
|
||||||
# If encryption is not configured at all, we only need plaintext
|
|
||||||
if encryption_config is None:
|
|
||||||
files_to_filter.append("api_frame_helper_noise.cpp")
|
|
||||||
# If encryption is configured with a key, we only need noise
|
|
||||||
elif encryption_config.get(CONF_KEY):
|
|
||||||
files_to_filter.append("api_frame_helper_plaintext.cpp")
|
|
||||||
# If encryption is configured but no key is provided, we need both
|
|
||||||
# (this allows a plaintext client to provide a noise key)
|
|
||||||
|
|
||||||
return files_to_filter
|
return files_to_filter
|
||||||
|
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@ -13,36 +13,13 @@
|
|||||||
#include <vector>
|
#include <vector>
|
||||||
#include <functional>
|
#include <functional>
|
||||||
|
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
// Client information structure
|
|
||||||
struct ClientInfo {
|
|
||||||
std::string name; // Client name from Hello message
|
|
||||||
std::string peername; // IP:port from socket
|
|
||||||
|
|
||||||
std::string get_combined_info() const {
|
|
||||||
if (name == peername) {
|
|
||||||
// Before Hello message, both are the same
|
|
||||||
return name;
|
|
||||||
}
|
|
||||||
return name + " (" + peername + ")";
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
// Keepalive timeout in milliseconds
|
// Keepalive timeout in milliseconds
|
||||||
static constexpr uint32_t KEEPALIVE_TIMEOUT_MS = 60000;
|
static constexpr uint32_t KEEPALIVE_TIMEOUT_MS = 60000;
|
||||||
// Maximum number of entities to process in a single batch during initial state/info sending
|
// Maximum number of entities to process in a single batch during initial state/info sending
|
||||||
// This was increased from 20 to 24 after removing the unique_id field from entity info messages,
|
static constexpr size_t MAX_INITIAL_PER_BATCH = 20;
|
||||||
// which reduced message sizes allowing more entities per batch without exceeding packet limits
|
|
||||||
static constexpr size_t MAX_INITIAL_PER_BATCH = 24;
|
|
||||||
// Maximum number of packets to process in a single batch (platform-dependent)
|
|
||||||
// This limit exists to prevent stack overflow from the PacketInfo array in process_batch_
|
|
||||||
// Each PacketInfo is 8 bytes, so 64 * 8 = 512 bytes, 32 * 8 = 256 bytes
|
|
||||||
#if defined(USE_ESP32) || defined(USE_HOST)
|
|
||||||
static constexpr size_t MAX_PACKETS_PER_BATCH = 64; // ESP32 has 8KB+ stack, HOST has plenty
|
|
||||||
#else
|
|
||||||
static constexpr size_t MAX_PACKETS_PER_BATCH = 32; // ESP8266/RP2040/etc have smaller stacks
|
|
||||||
#endif
|
|
||||||
|
|
||||||
class APIConnection : public APIServerConnection {
|
class APIConnection : public APIServerConnection {
|
||||||
public:
|
public:
|
||||||
@ -131,16 +108,15 @@ class APIConnection : public APIServerConnection {
|
|||||||
void media_player_command(const MediaPlayerCommandRequest &msg) override;
|
void media_player_command(const MediaPlayerCommandRequest &msg) override;
|
||||||
#endif
|
#endif
|
||||||
bool try_send_log_message(int level, const char *tag, const char *line, size_t message_len);
|
bool try_send_log_message(int level, const char *tag, const char *line, size_t message_len);
|
||||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
|
||||||
void send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
|
void send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
|
||||||
if (!this->flags_.service_call_subscription)
|
if (!this->flags_.service_call_subscription)
|
||||||
return;
|
return;
|
||||||
this->send_message(call, HomeassistantServiceResponse::MESSAGE_TYPE);
|
this->send_message(call);
|
||||||
}
|
}
|
||||||
#endif
|
|
||||||
#ifdef USE_BLUETOOTH_PROXY
|
#ifdef USE_BLUETOOTH_PROXY
|
||||||
void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
||||||
void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
||||||
|
bool send_bluetooth_le_advertisement(const BluetoothLEAdvertisementResponse &msg);
|
||||||
|
|
||||||
void bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
|
void bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
|
||||||
void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) override;
|
void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) override;
|
||||||
@ -149,14 +125,15 @@ class APIConnection : public APIServerConnection {
|
|||||||
void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) override;
|
void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) override;
|
||||||
void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) override;
|
void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) override;
|
||||||
void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) override;
|
void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) override;
|
||||||
bool send_subscribe_bluetooth_connections_free_response(const SubscribeBluetoothConnectionsFreeRequest &msg) override;
|
BluetoothConnectionsFreeResponse subscribe_bluetooth_connections_free(
|
||||||
|
const SubscribeBluetoothConnectionsFreeRequest &msg) override;
|
||||||
void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) override;
|
void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) override;
|
||||||
|
|
||||||
#endif
|
#endif
|
||||||
#ifdef USE_HOMEASSISTANT_TIME
|
#ifdef USE_HOMEASSISTANT_TIME
|
||||||
void send_time_request() {
|
void send_time_request() {
|
||||||
GetTimeRequest req;
|
GetTimeRequest req;
|
||||||
this->send_message(req, GetTimeRequest::MESSAGE_TYPE);
|
this->send_message(req);
|
||||||
}
|
}
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
@ -167,7 +144,8 @@ class APIConnection : public APIServerConnection {
|
|||||||
void on_voice_assistant_audio(const VoiceAssistantAudio &msg) override;
|
void on_voice_assistant_audio(const VoiceAssistantAudio &msg) override;
|
||||||
void on_voice_assistant_timer_event_response(const VoiceAssistantTimerEventResponse &msg) override;
|
void on_voice_assistant_timer_event_response(const VoiceAssistantTimerEventResponse &msg) override;
|
||||||
void on_voice_assistant_announce_request(const VoiceAssistantAnnounceRequest &msg) override;
|
void on_voice_assistant_announce_request(const VoiceAssistantAnnounceRequest &msg) override;
|
||||||
bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) override;
|
VoiceAssistantConfigurationResponse voice_assistant_get_configuration(
|
||||||
|
const VoiceAssistantConfigurationRequest &msg) override;
|
||||||
void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
|
void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
@ -190,17 +168,15 @@ class APIConnection : public APIServerConnection {
|
|||||||
// we initiated ping
|
// we initiated ping
|
||||||
this->flags_.sent_ping = false;
|
this->flags_.sent_ping = false;
|
||||||
}
|
}
|
||||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
|
||||||
void on_home_assistant_state_response(const HomeAssistantStateResponse &msg) override;
|
void on_home_assistant_state_response(const HomeAssistantStateResponse &msg) override;
|
||||||
#endif
|
|
||||||
#ifdef USE_HOMEASSISTANT_TIME
|
#ifdef USE_HOMEASSISTANT_TIME
|
||||||
void on_get_time_response(const GetTimeResponse &value) override;
|
void on_get_time_response(const GetTimeResponse &value) override;
|
||||||
#endif
|
#endif
|
||||||
bool send_hello_response(const HelloRequest &msg) override;
|
HelloResponse hello(const HelloRequest &msg) override;
|
||||||
bool send_connect_response(const ConnectRequest &msg) override;
|
ConnectResponse connect(const ConnectRequest &msg) override;
|
||||||
bool send_disconnect_response(const DisconnectRequest &msg) override;
|
DisconnectResponse disconnect(const DisconnectRequest &msg) override;
|
||||||
bool send_ping_response(const PingRequest &msg) override;
|
PingResponse ping(const PingRequest &msg) override { return {}; }
|
||||||
bool send_device_info_response(const DeviceInfoRequest &msg) override;
|
DeviceInfoResponse device_info(const DeviceInfoRequest &msg) override;
|
||||||
void list_entities(const ListEntitiesRequest &msg) override { this->list_entities_iterator_.begin(); }
|
void list_entities(const ListEntitiesRequest &msg) override { this->list_entities_iterator_.begin(); }
|
||||||
void subscribe_states(const SubscribeStatesRequest &msg) override {
|
void subscribe_states(const SubscribeStatesRequest &msg) override {
|
||||||
this->flags_.state_subscription = true;
|
this->flags_.state_subscription = true;
|
||||||
@ -211,20 +187,19 @@ class APIConnection : public APIServerConnection {
|
|||||||
if (msg.dump_config)
|
if (msg.dump_config)
|
||||||
App.schedule_dump_config();
|
App.schedule_dump_config();
|
||||||
}
|
}
|
||||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
|
||||||
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
|
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
|
||||||
this->flags_.service_call_subscription = true;
|
this->flags_.service_call_subscription = true;
|
||||||
}
|
}
|
||||||
#endif
|
|
||||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
|
||||||
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
|
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
|
||||||
#endif
|
GetTimeResponse get_time(const GetTimeRequest &msg) override {
|
||||||
bool send_get_time_response(const GetTimeRequest &msg) override;
|
// TODO
|
||||||
|
return {};
|
||||||
|
}
|
||||||
#ifdef USE_API_SERVICES
|
#ifdef USE_API_SERVICES
|
||||||
void execute_service(const ExecuteServiceRequest &msg) override;
|
void execute_service(const ExecuteServiceRequest &msg) override;
|
||||||
#endif
|
#endif
|
||||||
#ifdef USE_API_NOISE
|
#ifdef USE_API_NOISE
|
||||||
bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) override;
|
NoiseEncryptionSetKeyResponse noise_encryption_set_key(const NoiseEncryptionSetKeyRequest &msg) override;
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
bool is_authenticated() override {
|
bool is_authenticated() override {
|
||||||
@ -234,18 +209,8 @@ class APIConnection : public APIServerConnection {
|
|||||||
return static_cast<ConnectionState>(this->flags_.connection_state) == ConnectionState::CONNECTED ||
|
return static_cast<ConnectionState>(this->flags_.connection_state) == ConnectionState::CONNECTED ||
|
||||||
this->is_authenticated();
|
this->is_authenticated();
|
||||||
}
|
}
|
||||||
uint8_t get_log_subscription_level() const { return this->flags_.log_subscription; }
|
|
||||||
|
|
||||||
// Get client API version for feature detection
|
|
||||||
bool client_supports_api_version(uint16_t major, uint16_t minor) const {
|
|
||||||
return this->client_api_version_major_ > major ||
|
|
||||||
(this->client_api_version_major_ == major && this->client_api_version_minor_ >= minor);
|
|
||||||
}
|
|
||||||
|
|
||||||
void on_fatal_error() override;
|
void on_fatal_error() override;
|
||||||
#ifdef USE_API_PASSWORD
|
|
||||||
void on_unauthenticated_access() override;
|
void on_unauthenticated_access() override;
|
||||||
#endif
|
|
||||||
void on_no_setup_connection() override;
|
void on_no_setup_connection() override;
|
||||||
ProtoWriteBuffer create_buffer(uint32_t reserve_size) override {
|
ProtoWriteBuffer create_buffer(uint32_t reserve_size) override {
|
||||||
// FIXME: ensure no recursive writes can happen
|
// FIXME: ensure no recursive writes can happen
|
||||||
@ -295,59 +260,49 @@ class APIConnection : public APIServerConnection {
|
|||||||
bool try_to_clear_buffer(bool log_out_of_space);
|
bool try_to_clear_buffer(bool log_out_of_space);
|
||||||
bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) override;
|
bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) override;
|
||||||
|
|
||||||
std::string get_client_combined_info() const { return this->client_info_.get_combined_info(); }
|
std::string get_client_combined_info() const {
|
||||||
|
if (this->client_info_ == this->client_peername_) {
|
||||||
|
// Before Hello message, both are the same (just IP:port)
|
||||||
|
return this->client_info_;
|
||||||
|
}
|
||||||
|
return this->client_info_ + " (" + this->client_peername_ + ")";
|
||||||
|
}
|
||||||
|
|
||||||
// Buffer allocator methods for batch processing
|
// Buffer allocator methods for batch processing
|
||||||
ProtoWriteBuffer allocate_single_message_buffer(uint16_t size);
|
ProtoWriteBuffer allocate_single_message_buffer(uint16_t size);
|
||||||
ProtoWriteBuffer allocate_batch_message_buffer(uint16_t size);
|
ProtoWriteBuffer allocate_batch_message_buffer(uint16_t size);
|
||||||
|
|
||||||
protected:
|
protected:
|
||||||
// Helper function to handle authentication completion
|
// Helper function to fill common entity info fields
|
||||||
void complete_authentication_();
|
static void fill_entity_info_base(esphome::EntityBase *entity, InfoResponseProtoMessage &response) {
|
||||||
|
// Set common fields that are shared by all entity types
|
||||||
|
response.key = entity->get_object_id_hash();
|
||||||
|
response.object_id = entity->get_object_id();
|
||||||
|
|
||||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
if (entity->has_own_name())
|
||||||
void process_state_subscriptions_();
|
response.name = entity->get_name();
|
||||||
|
|
||||||
|
// Set common EntityBase properties
|
||||||
|
response.icon = entity->get_icon();
|
||||||
|
response.disabled_by_default = entity->is_disabled_by_default();
|
||||||
|
response.entity_category = static_cast<enums::EntityCategory>(entity->get_entity_category());
|
||||||
|
#ifdef USE_DEVICES
|
||||||
|
response.device_id = entity->get_device_id();
|
||||||
#endif
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper function to fill common entity state fields
|
||||||
|
static void fill_entity_state_base(esphome::EntityBase *entity, StateResponseProtoMessage &response) {
|
||||||
|
response.key = entity->get_object_id_hash();
|
||||||
|
#ifdef USE_DEVICES
|
||||||
|
response.device_id = entity->get_device_id();
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
// Non-template helper to encode any ProtoMessage
|
// Non-template helper to encode any ProtoMessage
|
||||||
static uint16_t encode_message_to_buffer(ProtoMessage &msg, uint8_t message_type, APIConnection *conn,
|
static uint16_t encode_message_to_buffer(ProtoMessage &msg, uint8_t message_type, APIConnection *conn,
|
||||||
uint32_t remaining_size, bool is_single);
|
uint32_t remaining_size, bool is_single);
|
||||||
|
|
||||||
// Helper to fill entity state base and encode message
|
|
||||||
static uint16_t fill_and_encode_entity_state(EntityBase *entity, StateResponseProtoMessage &msg, uint8_t message_type,
|
|
||||||
APIConnection *conn, uint32_t remaining_size, bool is_single) {
|
|
||||||
msg.key = entity->get_object_id_hash();
|
|
||||||
#ifdef USE_DEVICES
|
|
||||||
msg.device_id = entity->get_device_id();
|
|
||||||
#endif
|
|
||||||
return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
|
|
||||||
}
|
|
||||||
|
|
||||||
// Helper to fill entity info base and encode message
|
|
||||||
static uint16_t fill_and_encode_entity_info(EntityBase *entity, InfoResponseProtoMessage &msg, uint8_t message_type,
|
|
||||||
APIConnection *conn, uint32_t remaining_size, bool is_single) {
|
|
||||||
// Set common fields that are shared by all entity types
|
|
||||||
msg.key = entity->get_object_id_hash();
|
|
||||||
// IMPORTANT: get_object_id() may return a temporary std::string
|
|
||||||
std::string object_id = entity->get_object_id();
|
|
||||||
msg.set_object_id(StringRef(object_id));
|
|
||||||
|
|
||||||
if (entity->has_own_name()) {
|
|
||||||
msg.set_name(entity->get_name());
|
|
||||||
}
|
|
||||||
|
|
||||||
// Set common EntityBase properties
|
|
||||||
#ifdef USE_ENTITY_ICON
|
|
||||||
msg.set_icon(entity->get_icon_ref());
|
|
||||||
#endif
|
|
||||||
msg.disabled_by_default = entity->is_disabled_by_default();
|
|
||||||
msg.entity_category = static_cast<enums::EntityCategory>(entity->get_entity_category());
|
|
||||||
#ifdef USE_DEVICES
|
|
||||||
msg.device_id = entity->get_device_id();
|
|
||||||
#endif
|
|
||||||
return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
|
|
||||||
}
|
|
||||||
|
|
||||||
#ifdef USE_VOICE_ASSISTANT
|
#ifdef USE_VOICE_ASSISTANT
|
||||||
// Helper to check voice assistant validity and connection ownership
|
// Helper to check voice assistant validity and connection ownership
|
||||||
inline bool check_voice_assistant_api_connection_() const;
|
inline bool check_voice_assistant_api_connection_() const;
|
||||||
@ -508,14 +463,13 @@ class APIConnection : public APIServerConnection {
|
|||||||
std::unique_ptr<camera::CameraImageReader> image_reader_;
|
std::unique_ptr<camera::CameraImageReader> image_reader_;
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
// Group 3: Client info struct (24 bytes on 32-bit: 2 strings × 12 bytes each)
|
// Group 3: Strings (12 bytes each on 32-bit, 4-byte aligned)
|
||||||
ClientInfo client_info_;
|
std::string client_info_;
|
||||||
|
std::string client_peername_;
|
||||||
|
|
||||||
// Group 4: 4-byte types
|
// Group 4: 4-byte types
|
||||||
uint32_t last_traffic_;
|
uint32_t last_traffic_;
|
||||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
|
||||||
int state_subs_at_ = -1;
|
int state_subs_at_ = -1;
|
||||||
#endif
|
|
||||||
|
|
||||||
// Function pointer type for message encoding
|
// Function pointer type for message encoding
|
||||||
using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size, bool is_single);
|
using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size, bool is_single);
|
||||||
@ -743,12 +697,8 @@ class APIConnection : public APIServerConnection {
|
|||||||
this->deferred_batch_.add_item_front(entity, MessageCreator(function_ptr), message_type, estimated_size);
|
this->deferred_batch_.add_item_front(entity, MessageCreator(function_ptr), message_type, estimated_size);
|
||||||
return this->schedule_batch_();
|
return this->schedule_batch_();
|
||||||
}
|
}
|
||||||
|
|
||||||
// Helper function to log API errors with errno
|
|
||||||
void log_warning_(const char *message, APIError err);
|
|
||||||
// Specific helper for duplicated error message
|
|
||||||
void log_socket_operation_failed_(APIError err);
|
|
||||||
};
|
};
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
File diff suppressed because it is too large
Load Diff
@@ -8,17 +8,16 @@
 #include "esphome/core/defines.h"
 #ifdef USE_API
+#ifdef USE_API_NOISE
+#include "noise/protocol.h"
+#endif

+#include "api_noise_context.h"
 #include "esphome/components/socket/socket.h"
 #include "esphome/core/application.h"
-#include "esphome/core/log.h"

-namespace esphome::api {
+namespace esphome {
+namespace api {

-// uncomment to log raw packets
-//#define HELPER_LOG_PACKETS
-
-// Forward declaration
-struct ClientInfo;

 class ProtoWriteBuffer;

@@ -41,6 +40,7 @@ struct PacketInfo {
 enum class APIError : uint16_t {
   OK = 0,
   WOULD_BLOCK = 1001,
+  BAD_HANDSHAKE_PACKET_LEN = 1002,
   BAD_INDICATOR = 1003,
   BAD_DATA_PACKET = 1004,
   TCP_NODELAY_FAILED = 1005,
@@ -51,19 +51,16 @@ enum class APIError : uint16_t {
   BAD_ARG = 1010,
   SOCKET_READ_FAILED = 1011,
   SOCKET_WRITE_FAILED = 1012,
-  OUT_OF_MEMORY = 1018,
-  CONNECTION_CLOSED = 1022,
-#ifdef USE_API_NOISE
-  BAD_HANDSHAKE_PACKET_LEN = 1002,
   HANDSHAKESTATE_READ_FAILED = 1013,
   HANDSHAKESTATE_WRITE_FAILED = 1014,
   HANDSHAKESTATE_BAD_STATE = 1015,
   CIPHERSTATE_DECRYPT_FAILED = 1016,
   CIPHERSTATE_ENCRYPT_FAILED = 1017,
+  OUT_OF_MEMORY = 1018,
   HANDSHAKESTATE_SETUP_FAILED = 1019,
   HANDSHAKESTATE_SPLIT_FAILED = 1020,
   BAD_HANDSHAKE_ERROR_BYTE = 1021,
-#endif
+  CONNECTION_CLOSED = 1022,
 };

 const char *api_error_to_str(APIError err);

@@ -71,8 +68,7 @@ const char *api_error_to_str(APIError err);
 class APIFrameHelper {
  public:
   APIFrameHelper() = default;
-  explicit APIFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
-      : socket_owned_(std::move(socket)), client_info_(client_info) {
+  explicit APIFrameHelper(std::unique_ptr<socket::Socket> socket) : socket_owned_(std::move(socket)) {
     socket_ = socket_owned_.get();
   }
   virtual ~APIFrameHelper() = default;

@@ -98,6 +94,8 @@ class APIFrameHelper {
     }
     return APIError::OK;
   }
+  // Give this helper a name for logging
+  void set_log_info(std::string info) { info_ = std::move(info); }
   virtual APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) = 0;
   // Write multiple protobuf packets in a single operation
   // packets contains (message_type, offset, length) for each message in the buffer

@@ -111,28 +109,29 @@ class APIFrameHelper {
   bool is_socket_ready() const { return socket_ != nullptr && socket_->ready(); }

  protected:
+  // Struct for holding parsed frame data
+  struct ParsedFrame {
+    std::vector<uint8_t> msg;
+  };

   // Buffer containing data to be sent
   struct SendBuffer {
-    std::unique_ptr<uint8_t[]> data;
-    uint16_t size{0};    // Total size of the buffer
-    uint16_t offset{0};  // Current offset within the buffer
+    std::vector<uint8_t> data;
+    uint16_t offset{0};  // Current offset within the buffer (uint16_t to reduce memory usage)

     // Using uint16_t reduces memory usage since ESPHome API messages are limited to UINT16_MAX (65535) bytes
-    uint16_t remaining() const { return size - offset; }
-    const uint8_t *current_data() const { return data.get() + offset; }
+    uint16_t remaining() const { return static_cast<uint16_t>(data.size()) - offset; }
+    const uint8_t *current_data() const { return data.data() + offset; }
   };

   // Common implementation for writing raw data to socket
-  APIError write_raw_(const struct iovec *iov, int iovcnt, uint16_t total_write_len);
+  APIError write_raw_(const struct iovec *iov, int iovcnt);

   // Try to send data from the tx buffer
   APIError try_send_tx_buf_();

   // Helper method to buffer data from IOVs
-  void buffer_data_from_iov_(const struct iovec *iov, int iovcnt, uint16_t total_write_len, uint16_t offset);
+  void buffer_data_from_iov_(const struct iovec *iov, int iovcnt, uint16_t total_write_len);

-  // Common socket write error handling
-  APIError handle_socket_write_error_();
   template<typename StateEnum>
   APIError write_raw_(const struct iovec *iov, int iovcnt, socket::Socket *socket, std::vector<uint8_t> &tx_buf,
                       const std::string &info, StateEnum &state, StateEnum failed_state);

@@ -162,13 +161,10 @@ class APIFrameHelper {

   // Containers (size varies, but typically 12+ bytes on 32-bit)
   std::deque<SendBuffer> tx_buf_;
+  std::string info_;
   std::vector<struct iovec> reusable_iovs_;
   std::vector<uint8_t> rx_buf_;

-  // Pointer to client info (4 bytes on 32-bit)
-  // Note: The pointed-to ClientInfo object must outlive this APIFrameHelper instance.
-  const ClientInfo *client_info_{nullptr};

   // Group smaller types together
   uint16_t rx_buf_len_ = 0;
   State state_{State::INITIALIZE};

@@ -183,6 +179,105 @@ class APIFrameHelper {
   APIError handle_socket_read_result_(ssize_t received);
 };

-}  // namespace esphome::api
-
-#endif  // USE_API
+#ifdef USE_API_NOISE
+class APINoiseFrameHelper : public APIFrameHelper {
+ public:
+  APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx)
+      : APIFrameHelper(std::move(socket)), ctx_(std::move(ctx)) {
+    // Noise header structure:
+    // Pos 0: indicator (0x01)
+    // Pos 1-2: encrypted payload size (16-bit big-endian)
+    // Pos 3-6: encrypted type (16-bit) + data_len (16-bit)
+    // Pos 7+: actual payload data
+    frame_header_padding_ = 7;
+  }
+  ~APINoiseFrameHelper() override;
+  APIError init() override;
+  APIError loop() override;
+  APIError read_packet(ReadPacketBuffer *buffer) override;
+  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
+  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
+  // Get the frame header padding required by this protocol
+  uint8_t frame_header_padding() override { return frame_header_padding_; }
+  // Get the frame footer size required by this protocol
+  uint8_t frame_footer_size() override { return frame_footer_size_; }
+
+ protected:
+  APIError state_action_();
+  APIError try_read_frame_(ParsedFrame *frame);
+  APIError write_frame_(const uint8_t *data, uint16_t len);
+  APIError init_handshake_();
+  APIError check_handshake_finished_();
+  void send_explicit_handshake_reject_(const std::string &reason);
+
+  // Pointers first (4 bytes each)
+  NoiseHandshakeState *handshake_{nullptr};
+  NoiseCipherState *send_cipher_{nullptr};
+  NoiseCipherState *recv_cipher_{nullptr};
+
+  // Shared pointer (8 bytes on 32-bit = 4 bytes control block pointer + 4 bytes object pointer)
+  std::shared_ptr<APINoiseContext> ctx_;
+
+  // Vector (12 bytes on 32-bit)
+  std::vector<uint8_t> prologue_;
+
+  // NoiseProtocolId (size depends on implementation)
+  NoiseProtocolId nid_;
+
+  // Group small types together
+  // Fixed-size header buffer for noise protocol:
+  // 1 byte for indicator + 2 bytes for message size (16-bit value, not varint)
+  // Note: Maximum message size is UINT16_MAX (65535), with a limit of 128 bytes during handshake phase
+  uint8_t rx_header_buf_[3];
+  uint8_t rx_header_buf_len_ = 0;
+  // 4 bytes total, no padding
+};
+#endif  // USE_API_NOISE
+
+#ifdef USE_API_PLAINTEXT
+class APIPlaintextFrameHelper : public APIFrameHelper {
+ public:
+  APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket) : APIFrameHelper(std::move(socket)) {
+    // Plaintext header structure (worst case):
+    // Pos 0: indicator (0x00)
+    // Pos 1-3: payload size varint (up to 3 bytes)
+    // Pos 4-5: message type varint (up to 2 bytes)
+    // Pos 6+: actual payload data
+    frame_header_padding_ = 6;
+  }
+  ~APIPlaintextFrameHelper() override = default;
+  APIError init() override;
+  APIError loop() override;
+  APIError read_packet(ReadPacketBuffer *buffer) override;
+  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
+  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
+  uint8_t frame_header_padding() override { return frame_header_padding_; }
+  // Get the frame footer size required by this protocol
+  uint8_t frame_footer_size() override { return frame_footer_size_; }
+
+ protected:
+  APIError try_read_frame_(ParsedFrame *frame);
+
+  // Group 2-byte aligned types
+  uint16_t rx_header_parsed_type_ = 0;
+  uint16_t rx_header_parsed_len_ = 0;
+
+  // Group 1-byte types together
+  // Fixed-size header buffer for plaintext protocol:
+  // We now store the indicator byte + the two varints.
+  // To match noise protocol's maximum message size (UINT16_MAX = 65535), we need:
+  // 1 byte for indicator + 3 bytes for message size varint (supports up to 2097151) + 2 bytes for message type varint
+  //
+  // While varints could theoretically be up to 10 bytes each for 64-bit values,
+  // attempting to process messages with headers that large would likely crash the
+  // ESP32 due to memory constraints.
+  uint8_t rx_header_buf_[6];  // 1 byte indicator + 5 bytes for varints (3 for size + 2 for type)
+  uint8_t rx_header_buf_pos_ = 0;
+  bool rx_header_parsed_ = false;
+  // 8 bytes total, no padding needed
+};
+#endif
+
+}  // namespace api
+}  // namespace esphome
+
+#endif
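The header comments above fix the wire framing used by both helpers: noise frames start with an indicator byte 0x01 followed by a 16-bit big-endian encrypted-payload size, plaintext frames with 0x00 followed by two varints. As a minimal, self-contained sketch of the noise outer header only (the helper names here are invented for the example and are not part of the ESPHome API):

// Sketch: pack/parse the 3-byte outer noise frame header described above.
#include <cstdint>

static void write_noise_frame_header(uint8_t *header, uint16_t encrypted_len) {
  header[0] = 0x01;                                      // indicator
  header[1] = static_cast<uint8_t>(encrypted_len >> 8);  // size, high byte (big-endian)
  header[2] = static_cast<uint8_t>(encrypted_len);       // size, low byte
}

static uint16_t read_noise_frame_size(const uint8_t *header) {
  // header[0] must be 0x01; during the handshake phase the size is further
  // limited to 128 bytes by the reader.
  return static_cast<uint16_t>(header[1]) << 8 | header[2];
}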
@@ -1,583 +0,0 @@
#include "api_frame_helper_noise.h"
#ifdef USE_API
#ifdef USE_API_NOISE
#include "api_connection.h"  // For ClientInfo struct
#include "esphome/core/application.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "proto.h"
#include <cstring>
#include <cinttypes>

namespace esphome::api {

static const char *const TAG = "api.noise";
static const char *const PROLOGUE_INIT = "NoiseAPIInit";
static constexpr size_t PROLOGUE_INIT_LEN = 12;  // strlen("NoiseAPIInit")

#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s: " msg, this->client_info_->get_combined_info().c_str(), ##__VA_ARGS__)

#ifdef HELPER_LOG_PACKETS
#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
#else
#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
#define LOG_PACKET_SENDING(data, len) ((void) 0)
#endif

/// Convert a noise error code to a readable error
std::string noise_err_to_str(int err) {
  if (err == NOISE_ERROR_NO_MEMORY)
    return "NO_MEMORY";
  if (err == NOISE_ERROR_UNKNOWN_ID)
    return "UNKNOWN_ID";
  if (err == NOISE_ERROR_UNKNOWN_NAME)
    return "UNKNOWN_NAME";
  if (err == NOISE_ERROR_MAC_FAILURE)
    return "MAC_FAILURE";
  if (err == NOISE_ERROR_NOT_APPLICABLE)
    return "NOT_APPLICABLE";
  if (err == NOISE_ERROR_SYSTEM)
    return "SYSTEM";
  if (err == NOISE_ERROR_REMOTE_KEY_REQUIRED)
    return "REMOTE_KEY_REQUIRED";
  if (err == NOISE_ERROR_LOCAL_KEY_REQUIRED)
    return "LOCAL_KEY_REQUIRED";
  if (err == NOISE_ERROR_PSK_REQUIRED)
    return "PSK_REQUIRED";
  if (err == NOISE_ERROR_INVALID_LENGTH)
    return "INVALID_LENGTH";
  if (err == NOISE_ERROR_INVALID_PARAM)
    return "INVALID_PARAM";
  if (err == NOISE_ERROR_INVALID_STATE)
    return "INVALID_STATE";
  if (err == NOISE_ERROR_INVALID_NONCE)
    return "INVALID_NONCE";
  if (err == NOISE_ERROR_INVALID_PRIVATE_KEY)
    return "INVALID_PRIVATE_KEY";
  if (err == NOISE_ERROR_INVALID_PUBLIC_KEY)
    return "INVALID_PUBLIC_KEY";
  if (err == NOISE_ERROR_INVALID_FORMAT)
    return "INVALID_FORMAT";
  if (err == NOISE_ERROR_INVALID_SIGNATURE)
    return "INVALID_SIGNATURE";
  return to_string(err);
}

/// Initialize the frame helper, returns OK if successful.
APIError APINoiseFrameHelper::init() {
  APIError err = init_common_();
  if (err != APIError::OK) {
    return err;
  }

  // init prologue
  size_t old_size = prologue_.size();
  prologue_.resize(old_size + PROLOGUE_INIT_LEN);
  std::memcpy(prologue_.data() + old_size, PROLOGUE_INIT, PROLOGUE_INIT_LEN);

  state_ = State::CLIENT_HELLO;
  return APIError::OK;
}
// Helper for handling handshake frame errors
APIError APINoiseFrameHelper::handle_handshake_frame_error_(APIError aerr) {
  if (aerr == APIError::BAD_INDICATOR) {
    send_explicit_handshake_reject_("Bad indicator byte");
  } else if (aerr == APIError::BAD_HANDSHAKE_PACKET_LEN) {
    send_explicit_handshake_reject_("Bad handshake packet len");
  }
  return aerr;
}

// Helper for handling noise library errors
APIError APINoiseFrameHelper::handle_noise_error_(int err, const char *func_name, APIError api_err) {
  if (err != 0) {
    state_ = State::FAILED;
    HELPER_LOG("%s failed: %s", func_name, noise_err_to_str(err).c_str());
    return api_err;
  }
  return APIError::OK;
}

/// Run through handshake messages (if in that phase)
APIError APINoiseFrameHelper::loop() {
  // During handshake phase, process as many actions as possible until we can't progress
  // socket_->ready() stays true until next main loop, but state_action() will return
  // WOULD_BLOCK when no more data is available to read
  while (state_ != State::DATA && this->socket_->ready()) {
    APIError err = state_action_();
    if (err == APIError::WOULD_BLOCK) {
      break;
    }
    if (err != APIError::OK) {
      return err;
    }
  }

  // Use base class implementation for buffer sending
  return APIFrameHelper::loop();
}

/** Read a packet into the rx_buf_. If successful, stores frame data in the frame parameter
 *
 * @param frame: The struct to hold the frame information in.
 *   msg_start: points to the start of the payload - this pointer is only valid until the next
 *   try_receive_raw_ call
 *
 * @return 0 if a full packet is in rx_buf_
 * @return -1 if error, check errno.
 *
 * errno EWOULDBLOCK: Packet could not be read without blocking. Try again later.
 * errno ENOMEM: Not enough memory for reading packet.
 * errno API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
 * errno API_ERROR_HANDSHAKE_PACKET_LEN: Packet too big for this phase.
 */
APIError APINoiseFrameHelper::try_read_frame_(std::vector<uint8_t> *frame) {
  if (frame == nullptr) {
    HELPER_LOG("Bad argument for try_read_frame_");
    return APIError::BAD_ARG;
  }

  // read header
  if (rx_header_buf_len_ < 3) {
    // no header information yet
    uint8_t to_read = 3 - rx_header_buf_len_;
    ssize_t received = this->socket_->read(&rx_header_buf_[rx_header_buf_len_], to_read);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }
    rx_header_buf_len_ += static_cast<uint8_t>(received);
    if (static_cast<uint8_t>(received) != to_read) {
      // not a full read
      return APIError::WOULD_BLOCK;
    }

    if (rx_header_buf_[0] != 0x01) {
      state_ = State::FAILED;
      HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
      return APIError::BAD_INDICATOR;
    }
    // header reading done
  }

  // read body
  uint16_t msg_size = (((uint16_t) rx_header_buf_[1]) << 8) | rx_header_buf_[2];

  if (state_ != State::DATA && msg_size > 128) {
    // for handshake message only permit up to 128 bytes
    state_ = State::FAILED;
    HELPER_LOG("Bad packet len for handshake: %d", msg_size);
    return APIError::BAD_HANDSHAKE_PACKET_LEN;
  }

  // reserve space for body
  if (rx_buf_.size() != msg_size) {
    rx_buf_.resize(msg_size);
  }

  if (rx_buf_len_ < msg_size) {
    // more data to read
    uint16_t to_read = msg_size - rx_buf_len_;
    ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }
    rx_buf_len_ += static_cast<uint16_t>(received);
    if (static_cast<uint16_t>(received) != to_read) {
      // not all read
      return APIError::WOULD_BLOCK;
    }
  }

  LOG_PACKET_RECEIVED(rx_buf_);
  *frame = std::move(rx_buf_);
  // consume msg
  rx_buf_ = {};
  rx_buf_len_ = 0;
  rx_header_buf_len_ = 0;
  return APIError::OK;
}

/** To be called from read/write methods.
 *
 * This method runs through the internal handshake methods, if in that state.
 *
 * If the handshake is still active when this method returns and a read/write can't take place at
 * the moment, returns WOULD_BLOCK.
 * If an error occurred, returns that error. Only returns OK if the transport is ready for data
 * traffic.
 */
APIError APINoiseFrameHelper::state_action_() {
  int err;
  APIError aerr;
  if (state_ == State::INITIALIZE) {
    HELPER_LOG("Bad state for method: %d", (int) state_);
    return APIError::BAD_STATE;
  }
  if (state_ == State::CLIENT_HELLO) {
    // waiting for client hello
    std::vector<uint8_t> frame;
    aerr = try_read_frame_(&frame);
    if (aerr != APIError::OK) {
      return handle_handshake_frame_error_(aerr);
    }
    // ignore contents, may be used in future for flags
    // Resize for: existing prologue + 2 size bytes + frame data
    size_t old_size = prologue_.size();
    prologue_.resize(old_size + 2 + frame.size());
    prologue_[old_size] = (uint8_t) (frame.size() >> 8);
    prologue_[old_size + 1] = (uint8_t) frame.size();
    std::memcpy(prologue_.data() + old_size + 2, frame.data(), frame.size());

    state_ = State::SERVER_HELLO;
  }
  if (state_ == State::SERVER_HELLO) {
    // send server hello
    const std::string &name = App.get_name();
    const std::string &mac = get_mac_address();

    std::vector<uint8_t> msg;
    // Calculate positions and sizes
    size_t name_len = name.size() + 1;  // including null terminator
    size_t mac_len = mac.size() + 1;    // including null terminator
    size_t name_offset = 1;
    size_t mac_offset = name_offset + name_len;
    size_t total_size = 1 + name_len + mac_len;

    msg.resize(total_size);

    // chosen proto
    msg[0] = 0x01;

    // node name, terminated by null byte
    std::memcpy(msg.data() + name_offset, name.c_str(), name_len);
    // node mac, terminated by null byte
    std::memcpy(msg.data() + mac_offset, mac.c_str(), mac_len);

    aerr = write_frame_(msg.data(), msg.size());
    if (aerr != APIError::OK)
      return aerr;

    // start handshake
    aerr = init_handshake_();
    if (aerr != APIError::OK)
      return aerr;

    state_ = State::HANDSHAKE;
  }
  if (state_ == State::HANDSHAKE) {
    int action = noise_handshakestate_get_action(handshake_);
    if (action == NOISE_ACTION_READ_MESSAGE) {
      // waiting for handshake msg
      std::vector<uint8_t> frame;
      aerr = try_read_frame_(&frame);
      if (aerr != APIError::OK) {
        return handle_handshake_frame_error_(aerr);
      }

      if (frame.empty()) {
        send_explicit_handshake_reject_("Empty handshake message");
        return APIError::BAD_HANDSHAKE_ERROR_BYTE;
      } else if (frame[0] != 0x00) {
        HELPER_LOG("Bad handshake error byte: %u", frame[0]);
        send_explicit_handshake_reject_("Bad handshake error byte");
        return APIError::BAD_HANDSHAKE_ERROR_BYTE;
      }

      NoiseBuffer mbuf;
      noise_buffer_init(mbuf);
      noise_buffer_set_input(mbuf, frame.data() + 1, frame.size() - 1);
      err = noise_handshakestate_read_message(handshake_, &mbuf, nullptr);
      if (err != 0) {
        // Special handling for MAC failure
        send_explicit_handshake_reject_(err == NOISE_ERROR_MAC_FAILURE ? "Handshake MAC failure" : "Handshake error");
        return handle_noise_error_(err, "noise_handshakestate_read_message", APIError::HANDSHAKESTATE_READ_FAILED);
      }

      aerr = check_handshake_finished_();
      if (aerr != APIError::OK)
        return aerr;
    } else if (action == NOISE_ACTION_WRITE_MESSAGE) {
      uint8_t buffer[65];
      NoiseBuffer mbuf;
      noise_buffer_init(mbuf);
      noise_buffer_set_output(mbuf, buffer + 1, sizeof(buffer) - 1);

      err = noise_handshakestate_write_message(handshake_, &mbuf, nullptr);
      APIError aerr_write =
          handle_noise_error_(err, "noise_handshakestate_write_message", APIError::HANDSHAKESTATE_WRITE_FAILED);
      if (aerr_write != APIError::OK)
        return aerr_write;
      buffer[0] = 0x00;  // success

      aerr = write_frame_(buffer, mbuf.size + 1);
      if (aerr != APIError::OK)
        return aerr;
      aerr = check_handshake_finished_();
      if (aerr != APIError::OK)
        return aerr;
    } else {
      // bad state for action
      state_ = State::FAILED;
      HELPER_LOG("Bad action for handshake: %d", action);
      return APIError::HANDSHAKESTATE_BAD_STATE;
    }
  }
  if (state_ == State::CLOSED || state_ == State::FAILED) {
    return APIError::BAD_STATE;
  }
  return APIError::OK;
}
void APINoiseFrameHelper::send_explicit_handshake_reject_(const std::string &reason) {
  std::vector<uint8_t> data;
  data.resize(reason.length() + 1);
  data[0] = 0x01;  // failure

  // Copy error message in bulk
  if (!reason.empty()) {
    std::memcpy(data.data() + 1, reason.c_str(), reason.length());
  }

  // temporarily remove failed state
  auto orig_state = state_;
  state_ = State::EXPLICIT_REJECT;
  write_frame_(data.data(), data.size());
  state_ = orig_state;
}
APIError APINoiseFrameHelper::read_packet(ReadPacketBuffer *buffer) {
  int err;
  APIError aerr;
  aerr = state_action_();
  if (aerr != APIError::OK) {
    return aerr;
  }

  if (state_ != State::DATA) {
    return APIError::WOULD_BLOCK;
  }

  std::vector<uint8_t> frame;
  aerr = try_read_frame_(&frame);
  if (aerr != APIError::OK)
    return aerr;

  NoiseBuffer mbuf;
  noise_buffer_init(mbuf);
  noise_buffer_set_inout(mbuf, frame.data(), frame.size(), frame.size());
  err = noise_cipherstate_decrypt(recv_cipher_, &mbuf);
  APIError decrypt_err = handle_noise_error_(err, "noise_cipherstate_decrypt", APIError::CIPHERSTATE_DECRYPT_FAILED);
  if (decrypt_err != APIError::OK)
    return decrypt_err;

  uint16_t msg_size = mbuf.size;
  uint8_t *msg_data = frame.data();
  if (msg_size < 4) {
    state_ = State::FAILED;
    HELPER_LOG("Bad data packet: size %d too short", msg_size);
    return APIError::BAD_DATA_PACKET;
  }

  uint16_t type = (((uint16_t) msg_data[0]) << 8) | msg_data[1];
  uint16_t data_len = (((uint16_t) msg_data[2]) << 8) | msg_data[3];
  if (data_len > msg_size - 4) {
    state_ = State::FAILED;
    HELPER_LOG("Bad data packet: data_len %u greater than msg_size %u", data_len, msg_size);
    return APIError::BAD_DATA_PACKET;
  }

  buffer->container = std::move(frame);
  buffer->data_offset = 4;
  buffer->data_len = data_len;
  buffer->type = type;
  return APIError::OK;
}
APIError APINoiseFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
  // Resize to include MAC space (required for Noise encryption)
  buffer.get_buffer()->resize(buffer.get_buffer()->size() + frame_footer_size_);
  PacketInfo packet{type, 0,
                    static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_ - frame_footer_size_)};
  return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
}

APIError APINoiseFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
  APIError aerr = state_action_();
  if (aerr != APIError::OK) {
    return aerr;
  }

  if (state_ != State::DATA) {
    return APIError::WOULD_BLOCK;
  }

  if (packets.empty()) {
    return APIError::OK;
  }

  std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
  uint8_t *buffer_data = raw_buffer->data();  // Cache buffer pointer

  this->reusable_iovs_.clear();
  this->reusable_iovs_.reserve(packets.size());
  uint16_t total_write_len = 0;

  // We need to encrypt each packet in place
  for (const auto &packet : packets) {
    // The buffer already has padding at offset
    uint8_t *buf_start = buffer_data + packet.offset;

    // Write noise header
    buf_start[0] = 0x01;  // indicator
    // buf_start[1], buf_start[2] to be set after encryption

    // Write message header (to be encrypted)
    const uint8_t msg_offset = 3;
    buf_start[msg_offset] = static_cast<uint8_t>(packet.message_type >> 8);      // type high byte
    buf_start[msg_offset + 1] = static_cast<uint8_t>(packet.message_type);       // type low byte
    buf_start[msg_offset + 2] = static_cast<uint8_t>(packet.payload_size >> 8);  // data_len high byte
    buf_start[msg_offset + 3] = static_cast<uint8_t>(packet.payload_size);       // data_len low byte
    // payload data is already in the buffer starting at offset + 7

    // Make sure we have space for MAC
    // The buffer should already have been sized appropriately

    // Encrypt the message in place
    NoiseBuffer mbuf;
    noise_buffer_init(mbuf);
    noise_buffer_set_inout(mbuf, buf_start + msg_offset, 4 + packet.payload_size,
                           4 + packet.payload_size + frame_footer_size_);

    int err = noise_cipherstate_encrypt(send_cipher_, &mbuf);
    APIError aerr = handle_noise_error_(err, "noise_cipherstate_encrypt", APIError::CIPHERSTATE_ENCRYPT_FAILED);
    if (aerr != APIError::OK)
      return aerr;

    // Fill in the encrypted size
    buf_start[1] = static_cast<uint8_t>(mbuf.size >> 8);
    buf_start[2] = static_cast<uint8_t>(mbuf.size);

    // Add iovec for this encrypted packet
    size_t packet_len = static_cast<size_t>(3 + mbuf.size);  // indicator + size + encrypted data
    this->reusable_iovs_.push_back({buf_start, packet_len});
    total_write_len += packet_len;
  }

  // Send all encrypted packets in one writev call
  return this->write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
}

APIError APINoiseFrameHelper::write_frame_(const uint8_t *data, uint16_t len) {
  uint8_t header[3];
  header[0] = 0x01;  // indicator
  header[1] = (uint8_t) (len >> 8);
  header[2] = (uint8_t) len;

  struct iovec iov[2];
  iov[0].iov_base = header;
  iov[0].iov_len = 3;
  if (len == 0) {
    return this->write_raw_(iov, 1, 3);  // Just header
  }
  iov[1].iov_base = const_cast<uint8_t *>(data);
  iov[1].iov_len = len;

  return this->write_raw_(iov, 2, 3 + len);  // Header + data
}

/** Initiate the data structures for the handshake.
 *
 * @return 0 on success, -1 on error (check errno)
 */
APIError APINoiseFrameHelper::init_handshake_() {
  int err;
  memset(&nid_, 0, sizeof(nid_));
  // const char *proto = "Noise_NNpsk0_25519_ChaChaPoly_SHA256";
  // err = noise_protocol_name_to_id(&nid_, proto, strlen(proto));
  nid_.pattern_id = NOISE_PATTERN_NN;
  nid_.cipher_id = NOISE_CIPHER_CHACHAPOLY;
  nid_.dh_id = NOISE_DH_CURVE25519;
  nid_.prefix_id = NOISE_PREFIX_STANDARD;
  nid_.hybrid_id = NOISE_DH_NONE;
  nid_.hash_id = NOISE_HASH_SHA256;
  nid_.modifier_ids[0] = NOISE_MODIFIER_PSK0;

  err = noise_handshakestate_new_by_id(&handshake_, &nid_, NOISE_ROLE_RESPONDER);
  APIError aerr = handle_noise_error_(err, "noise_handshakestate_new_by_id", APIError::HANDSHAKESTATE_SETUP_FAILED);
  if (aerr != APIError::OK)
    return aerr;

  const auto &psk = ctx_->get_psk();
  err = noise_handshakestate_set_pre_shared_key(handshake_, psk.data(), psk.size());
  aerr = handle_noise_error_(err, "noise_handshakestate_set_pre_shared_key", APIError::HANDSHAKESTATE_SETUP_FAILED);
  if (aerr != APIError::OK)
    return aerr;

  err = noise_handshakestate_set_prologue(handshake_, prologue_.data(), prologue_.size());
  aerr = handle_noise_error_(err, "noise_handshakestate_set_prologue", APIError::HANDSHAKESTATE_SETUP_FAILED);
  if (aerr != APIError::OK)
    return aerr;
  // set_prologue copies it into handshakestate, so we can get rid of it now
  prologue_ = {};

  err = noise_handshakestate_start(handshake_);
  aerr = handle_noise_error_(err, "noise_handshakestate_start", APIError::HANDSHAKESTATE_SETUP_FAILED);
  if (aerr != APIError::OK)
    return aerr;
  return APIError::OK;
}

APIError APINoiseFrameHelper::check_handshake_finished_() {
  assert(state_ == State::HANDSHAKE);

  int action = noise_handshakestate_get_action(handshake_);
  if (action == NOISE_ACTION_READ_MESSAGE || action == NOISE_ACTION_WRITE_MESSAGE)
    return APIError::OK;
  if (action != NOISE_ACTION_SPLIT) {
    state_ = State::FAILED;
    HELPER_LOG("Bad action for handshake: %d", action);
    return APIError::HANDSHAKESTATE_BAD_STATE;
  }
  int err = noise_handshakestate_split(handshake_, &send_cipher_, &recv_cipher_);
  APIError aerr = handle_noise_error_(err, "noise_handshakestate_split", APIError::HANDSHAKESTATE_SPLIT_FAILED);
  if (aerr != APIError::OK)
    return aerr;

  frame_footer_size_ = noise_cipherstate_get_mac_length(send_cipher_);

  HELPER_LOG("Handshake complete!");
  noise_handshakestate_free(handshake_);
  handshake_ = nullptr;
  state_ = State::DATA;
  return APIError::OK;
}

APINoiseFrameHelper::~APINoiseFrameHelper() {
  if (handshake_ != nullptr) {
    noise_handshakestate_free(handshake_);
    handshake_ = nullptr;
  }
  if (send_cipher_ != nullptr) {
    noise_cipherstate_free(send_cipher_);
    send_cipher_ = nullptr;
  }
  if (recv_cipher_ != nullptr) {
    noise_cipherstate_free(recv_cipher_);
    recv_cipher_ = nullptr;
  }
}

extern "C" {
// declare how noise generates random bytes (here with a good HWRNG based on the RF system)
void noise_rand_bytes(void *output, size_t len) {
  if (!esphome::random_bytes(reinterpret_cast<uint8_t *>(output), len)) {
    ESP_LOGE(TAG, "Acquiring random bytes failed; rebooting");
    arch_restart();
  }
}
}

}  // namespace esphome::api
#endif  // USE_API_NOISE
#endif  // USE_API
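The SERVER_HELLO branch above builds a small unencrypted payload: one protocol byte followed by the node name and MAC address, each NUL-terminated, which is then wrapped in a noise frame by write_frame_(). A minimal standalone sketch of that layout (the example name and MAC are made-up values, not taken from the diff):

// Sketch: server-hello payload = 0x01 + name + '\0' + mac + '\0'
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

static std::vector<uint8_t> build_server_hello(const std::string &name, const std::string &mac) {
  std::vector<uint8_t> msg(1 + name.size() + 1 + mac.size() + 1);
  msg[0] = 0x01;  // chosen proto
  std::memcpy(msg.data() + 1, name.c_str(), name.size() + 1);                   // name + NUL
  std::memcpy(msg.data() + 1 + name.size() + 1, mac.c_str(), mac.size() + 1);   // mac + NUL
  return msg;
}
// e.g. build_server_hello("livingroom", "AA:BB:CC:DD:EE:FF") yields a
// 1 + 11 + 18 = 30 byte payload that is then framed and sent before the handshake starts.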
@@ -1,68 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_NOISE
#include "noise/protocol.h"
#include "api_noise_context.h"

namespace esphome::api {

class APINoiseFrameHelper : public APIFrameHelper {
 public:
  APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx,
                      const ClientInfo *client_info)
      : APIFrameHelper(std::move(socket), client_info), ctx_(std::move(ctx)) {
    // Noise header structure:
    // Pos 0: indicator (0x01)
    // Pos 1-2: encrypted payload size (16-bit big-endian)
    // Pos 3-6: encrypted type (16-bit) + data_len (16-bit)
    // Pos 7+: actual payload data
    frame_header_padding_ = 7;
  }
  ~APINoiseFrameHelper() override;
  APIError init() override;
  APIError loop() override;
  APIError read_packet(ReadPacketBuffer *buffer) override;
  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
  // Get the frame header padding required by this protocol
  uint8_t frame_header_padding() override { return frame_header_padding_; }
  // Get the frame footer size required by this protocol
  uint8_t frame_footer_size() override { return frame_footer_size_; }

 protected:
  APIError state_action_();
  APIError try_read_frame_(std::vector<uint8_t> *frame);
  APIError write_frame_(const uint8_t *data, uint16_t len);
  APIError init_handshake_();
  APIError check_handshake_finished_();
  void send_explicit_handshake_reject_(const std::string &reason);
  APIError handle_handshake_frame_error_(APIError aerr);
  APIError handle_noise_error_(int err, const char *func_name, APIError api_err);

  // Pointers first (4 bytes each)
  NoiseHandshakeState *handshake_{nullptr};
  NoiseCipherState *send_cipher_{nullptr};
  NoiseCipherState *recv_cipher_{nullptr};

  // Shared pointer (8 bytes on 32-bit = 4 bytes control block pointer + 4 bytes object pointer)
  std::shared_ptr<APINoiseContext> ctx_;

  // Vector (12 bytes on 32-bit)
  std::vector<uint8_t> prologue_;

  // NoiseProtocolId (size depends on implementation)
  NoiseProtocolId nid_;

  // Group small types together
  // Fixed-size header buffer for noise protocol:
  // 1 byte for indicator + 2 bytes for message size (16-bit value, not varint)
  // Note: Maximum message size is UINT16_MAX (65535), with a limit of 128 bytes during handshake phase
  uint8_t rx_header_buf_[3];
  uint8_t rx_header_buf_len_ = 0;
  // 4 bytes total, no padding
};

}  // namespace esphome::api
#endif  // USE_API_NOISE
#endif  // USE_API
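The 7 bytes of frame_header_padding_ reserved by the constructor above break down as 1 indicator + 2 size + 2 type + 2 data_len. A quick arithmetic sketch of the resulting on-wire size; the 16-byte MAC length is an assumption for ChaChaPoly here, the real value comes from noise_cipherstate_get_mac_length() after the handshake split:

#include <cstdint>

constexpr uint8_t kNoiseHeaderPadding = 1 + 2 + 2 + 2;  // = 7, matches frame_header_padding_
constexpr uint8_t kAssumedMacLen = 16;                   // assumption; queried at runtime in the real code

constexpr uint32_t noise_frame_size(uint32_t payload_len) {
  // indicator + size + encrypted (type + data_len + payload + MAC)
  return kNoiseHeaderPadding + payload_len + kAssumedMacLen;
}

static_assert(kNoiseHeaderPadding == 7, "header padding is 7 bytes");
static_assert(noise_frame_size(100) == 123, "example: 100-byte payload -> 123 bytes on the wire");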
@@ -1,290 +0,0 @@
#include "api_frame_helper_plaintext.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT
#include "api_connection.h"  // For ClientInfo struct
#include "esphome/core/application.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "proto.h"
#include <cstring>
#include <cinttypes>

namespace esphome::api {

static const char *const TAG = "api.plaintext";

#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s: " msg, this->client_info_->get_combined_info().c_str(), ##__VA_ARGS__)

#ifdef HELPER_LOG_PACKETS
#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
#else
#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
#define LOG_PACKET_SENDING(data, len) ((void) 0)
#endif

/// Initialize the frame helper, returns OK if successful.
APIError APIPlaintextFrameHelper::init() {
  APIError err = init_common_();
  if (err != APIError::OK) {
    return err;
  }

  state_ = State::DATA;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::loop() {
  if (state_ != State::DATA) {
    return APIError::BAD_STATE;
  }
  // Use base class implementation for buffer sending
  return APIFrameHelper::loop();
}

/** Read a packet into the rx_buf_. If successful, stores frame data in the frame parameter
 *
 * @param frame: The struct to hold the frame information in.
 *   msg: store the parsed frame in that struct
 *
 * @return See APIError
 *
 * error API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
 */
APIError APIPlaintextFrameHelper::try_read_frame_(std::vector<uint8_t> *frame) {
  if (frame == nullptr) {
    HELPER_LOG("Bad argument for try_read_frame_");
    return APIError::BAD_ARG;
  }

  // read header
  while (!rx_header_parsed_) {
    // Now that we know when the socket is ready, we can read up to 3 bytes
    // into the rx_header_buf_ before we have to switch back to reading
    // one byte at a time to ensure we don't read past the message and
    // into the next one.

    // Read directly into rx_header_buf_ at the current position
    // Try to get to at least 3 bytes total (indicator + 2 varint bytes), then read one byte at a time
    ssize_t received =
        this->socket_->read(&rx_header_buf_[rx_header_buf_pos_], rx_header_buf_pos_ < 3 ? 3 - rx_header_buf_pos_ : 1);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }

    // If this was the first read, validate the indicator byte
    if (rx_header_buf_pos_ == 0 && received > 0) {
      if (rx_header_buf_[0] != 0x00) {
        state_ = State::FAILED;
        HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
        return APIError::BAD_INDICATOR;
      }
    }

    rx_header_buf_pos_ += received;

    // Check for buffer overflow
    if (rx_header_buf_pos_ >= sizeof(rx_header_buf_)) {
      state_ = State::FAILED;
      HELPER_LOG("Header buffer overflow");
      return APIError::BAD_DATA_PACKET;
    }

    // Need at least 3 bytes total (indicator + 2 varint bytes) before trying to parse
    if (rx_header_buf_pos_ < 3) {
      continue;
    }

    // At this point, we have at least 3 bytes total:
    //   - Validated indicator byte (0x00) stored at position 0
    //   - At least 2 bytes in the buffer for the varints
    // Buffer layout:
    //   [0]: indicator byte (0x00)
    //   [1-3]: Message size varint (variable length)
    //     - 2 bytes would only allow up to 16383, which is less than noise's UINT16_MAX (65535)
    //     - 3 bytes allows up to 2097151, ensuring we support at least as much as noise
    //   [2-5]: Message type varint (variable length)
    // We now attempt to parse both varints. If either is incomplete,
    // we'll continue reading more bytes.

    // Skip indicator byte at position 0
    uint8_t varint_pos = 1;
    uint32_t consumed = 0;

    auto msg_size_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
    if (!msg_size_varint.has_value()) {
      // not enough data there yet
      continue;
    }

    if (msg_size_varint->as_uint32() > std::numeric_limits<uint16_t>::max()) {
      state_ = State::FAILED;
      HELPER_LOG("Bad packet: message size %" PRIu32 " exceeds maximum %u", msg_size_varint->as_uint32(),
                 std::numeric_limits<uint16_t>::max());
      return APIError::BAD_DATA_PACKET;
    }
    rx_header_parsed_len_ = msg_size_varint->as_uint16();

    // Move to next varint position
    varint_pos += consumed;

    auto msg_type_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
    if (!msg_type_varint.has_value()) {
      // not enough data there yet
      continue;
    }
    if (msg_type_varint->as_uint32() > std::numeric_limits<uint16_t>::max()) {
      state_ = State::FAILED;
      HELPER_LOG("Bad packet: message type %" PRIu32 " exceeds maximum %u", msg_type_varint->as_uint32(),
                 std::numeric_limits<uint16_t>::max());
      return APIError::BAD_DATA_PACKET;
    }
    rx_header_parsed_type_ = msg_type_varint->as_uint16();
    rx_header_parsed_ = true;
  }
  // header reading done

  // reserve space for body
  if (rx_buf_.size() != rx_header_parsed_len_) {
    rx_buf_.resize(rx_header_parsed_len_);
  }

  if (rx_buf_len_ < rx_header_parsed_len_) {
    // more data to read
    uint16_t to_read = rx_header_parsed_len_ - rx_buf_len_;
    ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
    APIError err = handle_socket_read_result_(received);
    if (err != APIError::OK) {
      return err;
    }
    rx_buf_len_ += static_cast<uint16_t>(received);
    if (static_cast<uint16_t>(received) != to_read) {
      // not all read
      return APIError::WOULD_BLOCK;
    }
  }

  LOG_PACKET_RECEIVED(rx_buf_);
  *frame = std::move(rx_buf_);
  // consume msg
  rx_buf_ = {};
  rx_buf_len_ = 0;
  rx_header_buf_pos_ = 0;
  rx_header_parsed_ = false;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::read_packet(ReadPacketBuffer *buffer) {
  APIError aerr;

  if (state_ != State::DATA) {
    return APIError::WOULD_BLOCK;
  }

  std::vector<uint8_t> frame;
  aerr = try_read_frame_(&frame);
  if (aerr != APIError::OK) {
    if (aerr == APIError::BAD_INDICATOR) {
      // Make sure to tell the remote that we don't
      // understand the indicator byte so it knows
      // we do not support it.
      struct iovec iov[1];
      // The \x00 first byte is the marker for plaintext.
      //
      // The remote will know how to handle the indicator byte,
      // but it likely won't understand the rest of the message.
      //
      // We must send at least 3 bytes to be read, so we add
      // a message after the indicator byte to ensures its long
      // enough and can aid in debugging.
      const char msg[] = "\x00"
                         "Bad indicator byte";
      iov[0].iov_base = (void *) msg;
      iov[0].iov_len = 19;
      this->write_raw_(iov, 1, 19);
    }
    return aerr;
  }

  buffer->container = std::move(frame);
  buffer->data_offset = 0;
  buffer->data_len = rx_header_parsed_len_;
  buffer->type = rx_header_parsed_type_;
  return APIError::OK;
}
APIError APIPlaintextFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
  PacketInfo packet{type, 0, static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_)};
  return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
}

APIError APIPlaintextFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
  if (state_ != State::DATA) {
    return APIError::BAD_STATE;
  }

  if (packets.empty()) {
    return APIError::OK;
  }

  std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
  uint8_t *buffer_data = raw_buffer->data();  // Cache buffer pointer

  this->reusable_iovs_.clear();
  this->reusable_iovs_.reserve(packets.size());
  uint16_t total_write_len = 0;

  for (const auto &packet : packets) {
    // Calculate varint sizes for header layout
    uint8_t size_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.payload_size));
    uint8_t type_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.message_type));
    uint8_t total_header_len = 1 + size_varint_len + type_varint_len;

    // Calculate where to start writing the header
    // The header starts at the latest possible position to minimize unused padding
    //
    // Example 1 (small values): total_header_len = 3, header_offset = 6 - 3 = 3
    //   [0-2]  - Unused padding
    //   [3]    - 0x00 indicator byte
    //   [4]    - Payload size varint (1 byte, for sizes 0-127)
    //   [5]    - Message type varint (1 byte, for types 0-127)
    //   [6...] - Actual payload data
    //
    // Example 2 (medium values): total_header_len = 4, header_offset = 6 - 4 = 2
    //   [0-1]  - Unused padding
    //   [2]    - 0x00 indicator byte
    //   [3-4]  - Payload size varint (2 bytes, for sizes 128-16383)
    //   [5]    - Message type varint (1 byte, for types 0-127)
    //   [6...] - Actual payload data
    //
    // Example 3 (large values): total_header_len = 6, header_offset = 6 - 6 = 0
    //   [0]    - 0x00 indicator byte
    //   [1-3]  - Payload size varint (3 bytes, for sizes 16384-2097151)
    //   [4-5]  - Message type varint (2 bytes, for types 128-32767)
    //   [6...] - Actual payload data
    //
    // The message starts at offset + frame_header_padding_
    // So we write the header starting at offset + frame_header_padding_ - total_header_len
    uint8_t *buf_start = buffer_data + packet.offset;
    uint32_t header_offset = frame_header_padding_ - total_header_len;

    // Write the plaintext header
    buf_start[header_offset] = 0x00;  // indicator

    // Encode varints directly into buffer
    ProtoVarInt(packet.payload_size).encode_to_buffer_unchecked(buf_start + header_offset + 1, size_varint_len);
    ProtoVarInt(packet.message_type)
        .encode_to_buffer_unchecked(buf_start + header_offset + 1 + size_varint_len, type_varint_len);

    // Add iovec for this packet (header + payload)
    size_t packet_len = static_cast<size_t>(total_header_len + packet.payload_size);
    this->reusable_iovs_.push_back({buf_start + header_offset, packet_len});
    total_write_len += packet_len;
  }

  // Send all packets in one writev call
  return write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
}

}  // namespace esphome::api
#endif  // USE_API_PLAINTEXT
#endif  // USE_API
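The header-offset arithmetic in write_protobuf_packets() above can be checked standalone. The varint_len() helper below is a local stand-in for api::ProtoSize::varint(), written only for this sketch; it reproduces the three worked examples from the comments:

#include <cstdint>

constexpr uint8_t varint_len(uint32_t value) {
  // protobuf varints use 7 bits per byte
  return value < 0x80 ? 1 : value < 0x4000 ? 2 : value < 0x200000 ? 3 : 4;
}

constexpr uint8_t header_offset(uint16_t payload_size, uint16_t message_type) {
  // frame_header_padding_ is 6 for the plaintext helper
  return 6 - (1 + varint_len(payload_size) + varint_len(message_type));
}

static_assert(header_offset(100, 5) == 3, "Example 1: one-byte size and type varints");
static_assert(header_offset(1000, 5) == 2, "Example 2: two-byte size varint");
static_assert(header_offset(20000, 200) == 0, "Example 3: three-byte size, two-byte type");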
@@ -1,53 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT

namespace esphome::api {

class APIPlaintextFrameHelper : public APIFrameHelper {
 public:
  APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
      : APIFrameHelper(std::move(socket), client_info) {
    // Plaintext header structure (worst case):
    // Pos 0: indicator (0x00)
    // Pos 1-3: payload size varint (up to 3 bytes)
    // Pos 4-5: message type varint (up to 2 bytes)
    // Pos 6+: actual payload data
    frame_header_padding_ = 6;
  }
  ~APIPlaintextFrameHelper() override = default;
  APIError init() override;
  APIError loop() override;
  APIError read_packet(ReadPacketBuffer *buffer) override;
  APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
  APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
  uint8_t frame_header_padding() override { return frame_header_padding_; }
  // Get the frame footer size required by this protocol
  uint8_t frame_footer_size() override { return frame_footer_size_; }

 protected:
  APIError try_read_frame_(std::vector<uint8_t> *frame);

  // Group 2-byte aligned types
  uint16_t rx_header_parsed_type_ = 0;
  uint16_t rx_header_parsed_len_ = 0;

  // Group 1-byte types together
  // Fixed-size header buffer for plaintext protocol:
  // We now store the indicator byte + the two varints.
  // To match noise protocol's maximum message size (UINT16_MAX = 65535), we need:
  // 1 byte for indicator + 3 bytes for message size varint (supports up to 2097151) + 2 bytes for message type varint
  //
  // While varints could theoretically be up to 10 bytes each for 64-bit values,
  // attempting to process messages with headers that large would likely crash the
  // ESP32 due to memory constraints.
  uint8_t rx_header_buf_[6];  // 1 byte indicator + 5 bytes for varints (3 for size + 2 for type)
  uint8_t rx_header_buf_pos_ = 0;
  bool rx_header_parsed_ = false;
  // 8 bytes total, no padding needed
};

}  // namespace esphome::api
#endif  // USE_API_PLAINTEXT
#endif  // USE_API
@@ -3,7 +3,8 @@
 #include <cstdint>
 #include "esphome/core/defines.h"

-namespace esphome::api {
+namespace esphome {
+namespace api {

 #ifdef USE_API_NOISE
 using psk_t = std::array<uint8_t, 32>;
@@ -27,4 +28,5 @@ class APINoiseContext {
 };
 #endif  // USE_API_NOISE

-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
@@ -23,37 +23,3 @@ extend google.protobuf.MessageOptions {
   optional bool no_delay = 1040 [default=false];
   optional string base_class = 1041;
 }
-
-extend google.protobuf.FieldOptions {
-  optional string field_ifdef = 1042;
-  optional uint32 fixed_array_size = 50007;
-  optional bool no_zero_copy = 50008 [default=false];
-  optional bool fixed_array_skip_zero = 50009 [default=false];
-  optional string fixed_array_size_define = 50010;
-
-  // container_pointer: Zero-copy optimization for repeated fields.
-  //
-  // When container_pointer is set on a repeated field, the generated message will
-  // store a pointer to an existing container instead of copying the data into the
-  // message's own repeated field. This eliminates heap allocations and improves performance.
-  //
-  // Requirements for safe usage:
-  // 1. The source container must remain valid until the message is encoded
-  // 2. Messages must be encoded immediately (which ESPHome does by default)
-  // 3. The container type must match the field type exactly
-  //
-  // Supported container types:
-  // - "std::vector<T>" for most repeated fields
-  // - "std::set<T>" for unique/sorted data
-  // - Full type specification required for enums (e.g., "std::set<climate::ClimateMode>")
-  //
-  // Example usage in .proto file:
-  //   repeated string supported_modes = 12 [(container_pointer) = "std::set"];
-  //   repeated ColorMode color_modes = 13 [(container_pointer) = "std::set<light::ColorMode>"];
-  //
-  // The corresponding C++ code must provide const reference access to a container
-  // that matches the specified type and remains valid during message encoding.
-  // This is typically done through methods returning const T& or special accessor
-  // methods like get_options() or supported_modes_for_api_().
-  optional string container_pointer = 50001;
-}
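For orientation only (hypothetical names, not generated code): the container_pointer comment block above boils down to the message holding a pointer to a component-owned container instead of copying its elements. A rough sketch of that shape:

#include <set>
#include <string>

// Hypothetical component-side accessor: the container outlives the message,
// which is encoded immediately after being filled, so the pointer stays valid.
class SelectComponentSketch {
 public:
  const std::set<std::string> &get_options() const { return options_; }

 private:
  std::set<std::string> options_{"low", "medium", "high"};
};

// Hypothetical generated-message field when (container_pointer) is set:
// a pointer view instead of an owned repeated field.
struct ListEntitiesSelectResponseSketch {
  const std::set<std::string> *supported_modes = nullptr;  // zero-copy view
};

inline void fill_message(const SelectComponentSketch &select, ListEntitiesSelectResponseSketch &msg) {
  // No per-element copy and no heap allocation: just point at the live container.
  msg.supported_modes = &select.get_options();
}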
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,34 +0,0 @@
#pragma once

#include "esphome/core/defines.h"

// This file provides includes needed by the generated protobuf code
// when using pointer optimizations for component-specific types

#ifdef USE_CLIMATE
#include "esphome/components/climate/climate_mode.h"
#include "esphome/components/climate/climate_traits.h"
#endif

#ifdef USE_LIGHT
#include "esphome/components/light/light_traits.h"
#endif

#ifdef USE_FAN
#include "esphome/components/fan/fan_traits.h"
#endif

#ifdef USE_SELECT
#include "esphome/components/select/select_traits.h"
#endif

// Standard library includes that might be needed
#include <set>
#include <vector>
#include <string>

namespace esphome::api {

// This file only provides includes, no actual code

}  // namespace esphome::api
@@ -3,7 +3,8 @@
 #include "api_pb2_service.h"
 #include "esphome/core/log.h"
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 static const char *const TAG = "api.service";
 
@@ -15,7 +16,7 @@ void APIServerConnectionBase::log_send_message_(const char *name, const std::str
 
 void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) {
   switch (msg_type) {
-    case HelloRequest::MESSAGE_TYPE: {
+    case 1: {
       HelloRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -24,7 +25,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       this->on_hello_request(msg);
       break;
     }
-    case ConnectRequest::MESSAGE_TYPE: {
+    case 3: {
       ConnectRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -33,70 +34,70 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       this->on_connect_request(msg);
       break;
     }
-    case DisconnectRequest::MESSAGE_TYPE: {
+    case 5: {
       DisconnectRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_disconnect_request: %s", msg.dump().c_str());
 #endif
       this->on_disconnect_request(msg);
       break;
     }
-    case DisconnectResponse::MESSAGE_TYPE: {
+    case 6: {
       DisconnectResponse msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_disconnect_response: %s", msg.dump().c_str());
 #endif
       this->on_disconnect_response(msg);
       break;
     }
-    case PingRequest::MESSAGE_TYPE: {
+    case 7: {
       PingRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_ping_request: %s", msg.dump().c_str());
 #endif
       this->on_ping_request(msg);
       break;
     }
-    case PingResponse::MESSAGE_TYPE: {
+    case 8: {
      PingResponse msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_ping_response: %s", msg.dump().c_str());
 #endif
       this->on_ping_response(msg);
       break;
     }
-    case DeviceInfoRequest::MESSAGE_TYPE: {
+    case 9: {
       DeviceInfoRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_device_info_request: %s", msg.dump().c_str());
 #endif
       this->on_device_info_request(msg);
       break;
     }
-    case ListEntitiesRequest::MESSAGE_TYPE: {
+    case 11: {
       ListEntitiesRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_list_entities_request: %s", msg.dump().c_str());
 #endif
       this->on_list_entities_request(msg);
       break;
     }
-    case SubscribeStatesRequest::MESSAGE_TYPE: {
+    case 20: {
       SubscribeStatesRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_subscribe_states_request: %s", msg.dump().c_str());
 #endif
       this->on_subscribe_states_request(msg);
       break;
     }
-    case SubscribeLogsRequest::MESSAGE_TYPE: {
+    case 28: {
       SubscribeLogsRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
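For context, an illustrative sketch (stand-in types, not the generated classes): the left-hand side of the hunks in this file relies on each message class exposing its wire id as a MESSAGE_TYPE constant, so the dispatch switch reads as names while still compiling to the same integer cases:

#include <cstdint>

struct HelloRequestSketch {
  static constexpr uint8_t MESSAGE_TYPE = 1;  // same id as `case 1` above
};
struct ConnectRequestSketch {
  static constexpr uint8_t MESSAGE_TYPE = 3;  // same id as `case 3` above
};

inline const char *describe(uint32_t msg_type) {
  switch (msg_type) {
    case HelloRequestSketch::MESSAGE_TYPE:    // compiles to case 1
      return "hello";
    case ConnectRequestSketch::MESSAGE_TYPE:  // compiles to case 3
      return "connect";
    default:
      return "unknown";
  }
}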
@@ -106,7 +107,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       break;
     }
 #ifdef USE_COVER
-    case CoverCommandRequest::MESSAGE_TYPE: {
+    case 30: {
       CoverCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -117,7 +118,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_FAN
-    case FanCommandRequest::MESSAGE_TYPE: {
+    case 31: {
       FanCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -128,7 +129,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_LIGHT
-    case LightCommandRequest::MESSAGE_TYPE: {
+    case 32: {
       LightCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -139,7 +140,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_SWITCH
-    case SwitchCommandRequest::MESSAGE_TYPE: {
+    case 33: {
       SwitchCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -149,27 +150,25 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       break;
     }
 #endif
-#ifdef USE_API_HOMEASSISTANT_SERVICES
-    case SubscribeHomeassistantServicesRequest::MESSAGE_TYPE: {
+    case 34: {
       SubscribeHomeassistantServicesRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_subscribe_homeassistant_services_request: %s", msg.dump().c_str());
 #endif
       this->on_subscribe_homeassistant_services_request(msg);
       break;
     }
-#endif
-    case GetTimeRequest::MESSAGE_TYPE: {
+    case 36: {
       GetTimeRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_get_time_request: %s", msg.dump().c_str());
 #endif
       this->on_get_time_request(msg);
       break;
     }
-    case GetTimeResponse::MESSAGE_TYPE: {
+    case 37: {
       GetTimeResponse msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -178,19 +177,16 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       this->on_get_time_response(msg);
       break;
     }
-#ifdef USE_API_HOMEASSISTANT_STATES
-    case SubscribeHomeAssistantStatesRequest::MESSAGE_TYPE: {
+    case 38: {
       SubscribeHomeAssistantStatesRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_subscribe_home_assistant_states_request: %s", msg.dump().c_str());
 #endif
       this->on_subscribe_home_assistant_states_request(msg);
       break;
     }
-#endif
-#ifdef USE_API_HOMEASSISTANT_STATES
-    case HomeAssistantStateResponse::MESSAGE_TYPE: {
+    case 40: {
       HomeAssistantStateResponse msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -199,9 +195,8 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
       this->on_home_assistant_state_response(msg);
       break;
     }
-#endif
 #ifdef USE_API_SERVICES
-    case ExecuteServiceRequest::MESSAGE_TYPE: {
+    case 42: {
       ExecuteServiceRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -212,7 +207,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_CAMERA
-    case CameraImageRequest::MESSAGE_TYPE: {
+    case 45: {
       CameraImageRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -223,7 +218,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_CLIMATE
-    case ClimateCommandRequest::MESSAGE_TYPE: {
+    case 48: {
       ClimateCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -234,7 +229,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_NUMBER
-    case NumberCommandRequest::MESSAGE_TYPE: {
+    case 51: {
       NumberCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -245,7 +240,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_SELECT
-    case SelectCommandRequest::MESSAGE_TYPE: {
+    case 54: {
       SelectCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -256,7 +251,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_SIREN
-    case SirenCommandRequest::MESSAGE_TYPE: {
+    case 57: {
       SirenCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -267,7 +262,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_LOCK
-    case LockCommandRequest::MESSAGE_TYPE: {
+    case 60: {
       LockCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -278,7 +273,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BUTTON
-    case ButtonCommandRequest::MESSAGE_TYPE: {
+    case 62: {
       ButtonCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -289,7 +284,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_MEDIA_PLAYER
-    case MediaPlayerCommandRequest::MESSAGE_TYPE: {
+    case 65: {
       MediaPlayerCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -300,7 +295,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case SubscribeBluetoothLEAdvertisementsRequest::MESSAGE_TYPE: {
+    case 66: {
       SubscribeBluetoothLEAdvertisementsRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -311,7 +306,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothDeviceRequest::MESSAGE_TYPE: {
+    case 68: {
       BluetoothDeviceRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -322,7 +317,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTGetServicesRequest::MESSAGE_TYPE: {
+    case 70: {
       BluetoothGATTGetServicesRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -333,7 +328,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTReadRequest::MESSAGE_TYPE: {
+    case 73: {
       BluetoothGATTReadRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -344,7 +339,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTWriteRequest::MESSAGE_TYPE: {
+    case 75: {
       BluetoothGATTWriteRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -355,7 +350,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTReadDescriptorRequest::MESSAGE_TYPE: {
+    case 76: {
       BluetoothGATTReadDescriptorRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -366,7 +361,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTWriteDescriptorRequest::MESSAGE_TYPE: {
+    case 77: {
       BluetoothGATTWriteDescriptorRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -377,7 +372,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothGATTNotifyRequest::MESSAGE_TYPE: {
+    case 78: {
       BluetoothGATTNotifyRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -388,9 +383,9 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case SubscribeBluetoothConnectionsFreeRequest::MESSAGE_TYPE: {
+    case 80: {
       SubscribeBluetoothConnectionsFreeRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_subscribe_bluetooth_connections_free_request: %s", msg.dump().c_str());
 #endif
@@ -399,9 +394,9 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case UnsubscribeBluetoothLEAdvertisementsRequest::MESSAGE_TYPE: {
+    case 87: {
       UnsubscribeBluetoothLEAdvertisementsRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_unsubscribe_bluetooth_le_advertisements_request: %s", msg.dump().c_str());
 #endif
@@ -410,7 +405,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case SubscribeVoiceAssistantRequest::MESSAGE_TYPE: {
+    case 89: {
       SubscribeVoiceAssistantRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -421,7 +416,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantResponse::MESSAGE_TYPE: {
+    case 91: {
       VoiceAssistantResponse msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -432,7 +427,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantEventResponse::MESSAGE_TYPE: {
+    case 92: {
       VoiceAssistantEventResponse msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -443,7 +438,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_ALARM_CONTROL_PANEL
-    case AlarmControlPanelCommandRequest::MESSAGE_TYPE: {
+    case 96: {
       AlarmControlPanelCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -454,7 +449,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_TEXT
-    case TextCommandRequest::MESSAGE_TYPE: {
+    case 99: {
       TextCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -465,7 +460,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_DATETIME_DATE
-    case DateCommandRequest::MESSAGE_TYPE: {
+    case 102: {
       DateCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -476,7 +471,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_DATETIME_TIME
-    case TimeCommandRequest::MESSAGE_TYPE: {
+    case 105: {
       TimeCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -487,7 +482,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantAudio::MESSAGE_TYPE: {
+    case 106: {
       VoiceAssistantAudio msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -498,7 +493,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VALVE
-    case ValveCommandRequest::MESSAGE_TYPE: {
+    case 111: {
       ValveCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -509,7 +504,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_DATETIME_DATETIME
-    case DateTimeCommandRequest::MESSAGE_TYPE: {
+    case 114: {
       DateTimeCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -520,7 +515,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantTimerEventResponse::MESSAGE_TYPE: {
+    case 115: {
       VoiceAssistantTimerEventResponse msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -531,7 +526,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_UPDATE
-    case UpdateCommandRequest::MESSAGE_TYPE: {
+    case 118: {
       UpdateCommandRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -542,7 +537,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantAnnounceRequest::MESSAGE_TYPE: {
+    case 119: {
       VoiceAssistantAnnounceRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -553,9 +548,9 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantConfigurationRequest::MESSAGE_TYPE: {
+    case 121: {
       VoiceAssistantConfigurationRequest msg;
-      // Empty message: no decode needed
+      msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
       ESP_LOGVV(TAG, "on_voice_assistant_configuration_request: %s", msg.dump().c_str());
 #endif
@@ -564,7 +559,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_VOICE_ASSISTANT
-    case VoiceAssistantSetConfiguration::MESSAGE_TYPE: {
+    case 123: {
       VoiceAssistantSetConfiguration msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -575,7 +570,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_API_NOISE
-    case NoiseEncryptionSetKeyRequest::MESSAGE_TYPE: {
+    case 124: {
       NoiseEncryptionSetKeyRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -586,7 +581,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
     }
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-    case BluetoothScannerSetModeRequest::MESSAGE_TYPE: {
+    case 127: {
       BluetoothScannerSetModeRequest msg;
       msg.decode(msg_data, msg_size);
 #ifdef HAS_PROTO_MESSAGE_DUMP
@@ -602,28 +597,35 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
 }
 
 void APIServerConnection::on_hello_request(const HelloRequest &msg) {
-  if (!this->send_hello_response(msg)) {
+  HelloResponse ret = this->hello(msg);
+  if (!this->send_message(ret)) {
     this->on_fatal_error();
   }
 }
 void APIServerConnection::on_connect_request(const ConnectRequest &msg) {
-  if (!this->send_connect_response(msg)) {
+  ConnectResponse ret = this->connect(msg);
+  if (!this->send_message(ret)) {
     this->on_fatal_error();
   }
 }
 void APIServerConnection::on_disconnect_request(const DisconnectRequest &msg) {
-  if (!this->send_disconnect_response(msg)) {
+  DisconnectResponse ret = this->disconnect(msg);
+  if (!this->send_message(ret)) {
    this->on_fatal_error();
   }
 }
 void APIServerConnection::on_ping_request(const PingRequest &msg) {
-  if (!this->send_ping_response(msg)) {
+  PingResponse ret = this->ping(msg);
+  if (!this->send_message(ret)) {
    this->on_fatal_error();
   }
 }
 void APIServerConnection::on_device_info_request(const DeviceInfoRequest &msg) {
-  if (this->check_connection_setup_() && !this->send_device_info_response(msg)) {
-    this->on_fatal_error();
+  if (this->check_connection_setup_()) {
+    DeviceInfoResponse ret = this->device_info(msg);
+    if (!this->send_message(ret)) {
+      this->on_fatal_error();
+    }
   }
 }
 void APIServerConnection::on_list_entities_request(const ListEntitiesRequest &msg) {
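A condensed sketch of the two handler shapes seen in the hunk above (reduced stand-in types, not the real API surface): the `-` side delegates building and sending the reply to one `send_*_response` hook, while the `+` side builds the reply in a dedicated virtual and pushes it through a generic `send_message`, keeping the error path in the generated caller.

struct PingRequestSketch {};
struct PingResponseSketch {};

class ConnectionSketch {
 public:
  // Shape A ("-" side): one virtual both builds and sends the reply.
  virtual bool send_ping_response(const PingRequestSketch &msg) = 0;

  // Shape B ("+" side): build the reply, then send it generically.
  virtual PingResponseSketch ping(const PingRequestSketch &msg) = 0;
  virtual bool send_message(const PingResponseSketch &msg) = 0;
  virtual void on_fatal_error() = 0;
  virtual ~ConnectionSketch() = default;

  void on_ping_request_shape_b(const PingRequestSketch &msg) {
    PingResponseSketch ret = this->ping(msg);
    if (!this->send_message(ret)) {
      this->on_fatal_error();  // same error handling as the generated code above
    }
  }
};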
@@ -641,24 +643,23 @@ void APIServerConnection::on_subscribe_logs_request(const SubscribeLogsRequest &
     this->subscribe_logs(msg);
   }
 }
-#ifdef USE_API_HOMEASSISTANT_SERVICES
 void APIServerConnection::on_subscribe_homeassistant_services_request(
     const SubscribeHomeassistantServicesRequest &msg) {
   if (this->check_authenticated_()) {
     this->subscribe_homeassistant_services(msg);
   }
 }
-#endif
-#ifdef USE_API_HOMEASSISTANT_STATES
 void APIServerConnection::on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) {
   if (this->check_authenticated_()) {
     this->subscribe_home_assistant_states(msg);
   }
 }
-#endif
 void APIServerConnection::on_get_time_request(const GetTimeRequest &msg) {
-  if (this->check_connection_setup_() && !this->send_get_time_response(msg)) {
-    this->on_fatal_error();
+  if (this->check_connection_setup_()) {
+    GetTimeResponse ret = this->get_time(msg);
+    if (!this->send_message(ret)) {
+      this->on_fatal_error();
+    }
   }
 }
 #ifdef USE_API_SERVICES
@@ -670,8 +671,11 @@ void APIServerConnection::on_execute_service_request(const ExecuteServiceRequest
 #endif
 #ifdef USE_API_NOISE
 void APIServerConnection::on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &msg) {
-  if (this->check_authenticated_() && !this->send_noise_encryption_set_key_response(msg)) {
-    this->on_fatal_error();
+  if (this->check_authenticated_()) {
+    NoiseEncryptionSetKeyResponse ret = this->noise_encryption_set_key(msg);
+    if (!this->send_message(ret)) {
+      this->on_fatal_error();
+    }
   }
 }
 #endif
@@ -861,8 +865,11 @@ void APIServerConnection::on_bluetooth_gatt_notify_request(const BluetoothGATTNo
 #ifdef USE_BLUETOOTH_PROXY
 void APIServerConnection::on_subscribe_bluetooth_connections_free_request(
     const SubscribeBluetoothConnectionsFreeRequest &msg) {
-  if (this->check_authenticated_() && !this->send_subscribe_bluetooth_connections_free_response(msg)) {
-    this->on_fatal_error();
+  if (this->check_authenticated_()) {
+    BluetoothConnectionsFreeResponse ret = this->subscribe_bluetooth_connections_free(msg);
+    if (!this->send_message(ret)) {
+      this->on_fatal_error();
+    }
   }
 }
 #endif
@@ -890,8 +897,11 @@ void APIServerConnection::on_subscribe_voice_assistant_request(const SubscribeVo
 #endif
 #ifdef USE_VOICE_ASSISTANT
 void APIServerConnection::on_voice_assistant_configuration_request(const VoiceAssistantConfigurationRequest &msg) {
-  if (this->check_authenticated_() && !this->send_voice_assistant_get_configuration_response(msg)) {
-    this->on_fatal_error();
+  if (this->check_authenticated_()) {
+    VoiceAssistantConfigurationResponse ret = this->voice_assistant_get_configuration(msg);
+    if (!this->send_message(ret)) {
+      this->on_fatal_error();
+    }
   }
 }
 #endif
@@ -910,4 +920,5 @@ void APIServerConnection::on_alarm_control_panel_command_request(const AlarmCont
 }
 #endif
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
@@ -6,7 +6,8 @@
 
 #include "api_pb2.h"
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 class APIServerConnectionBase : public ProtoService {
  public:
@@ -17,11 +18,11 @@ class APIServerConnectionBase : public ProtoService {
  public:
 #endif
 
-  bool send_message(const ProtoMessage &msg, uint8_t message_type) {
+  template<typename T> bool send_message(const T &msg) {
 #ifdef HAS_PROTO_MESSAGE_DUMP
     this->log_send_message_(msg.message_name(), msg.dump());
 #endif
-    return this->send_message_(msg, message_type);
+    return this->send_message_(msg, T::MESSAGE_TYPE);
   }
 
   virtual void on_hello_request(const HelloRequest &value){};
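A minimal sketch (stand-in types) of why the templated overload on the `+` side of the hunk above needs no explicit type argument: the wire id is pulled from the class itself via T::MESSAGE_TYPE at compile time.

#include <cstdint>
#include <iostream>

struct DisconnectRequestSketch {
  static constexpr uint8_t MESSAGE_TYPE = 5;  // id taken from `case 5` earlier in this diff
};

class ServiceSketch {
 public:
  // Caller passes only the message; the type id comes from T itself.
  template<typename T> bool send_message(const T &msg) { return this->send_message_(msg, T::MESSAGE_TYPE); }

 private:
  template<typename T> bool send_message_(const T &, uint8_t message_type) {
    std::cout << "sending message type " << static_cast<int>(message_type) << "\n";
    return true;
  }
};

int main() {
  ServiceSketch svc;
  svc.send_message(DisconnectRequestSketch{});  // resolves to type id 5 with no second argument
}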
@@ -60,17 +61,11 @@ class APIServerConnectionBase : public ProtoService {
   virtual void on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &value){};
 #endif
 
-#ifdef USE_API_HOMEASSISTANT_SERVICES
   virtual void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &value){};
-#endif
 
-#ifdef USE_API_HOMEASSISTANT_STATES
   virtual void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &value){};
-#endif
 
-#ifdef USE_API_HOMEASSISTANT_STATES
   virtual void on_home_assistant_state_response(const HomeAssistantStateResponse &value){};
-#endif
   virtual void on_get_time_request(const GetTimeRequest &value){};
   virtual void on_get_time_response(const GetTimeResponse &value){};
 
@@ -212,26 +207,22 @@ class APIServerConnectionBase : public ProtoService {
 
 class APIServerConnection : public APIServerConnectionBase {
  public:
-  virtual bool send_hello_response(const HelloRequest &msg) = 0;
-  virtual bool send_connect_response(const ConnectRequest &msg) = 0;
-  virtual bool send_disconnect_response(const DisconnectRequest &msg) = 0;
-  virtual bool send_ping_response(const PingRequest &msg) = 0;
-  virtual bool send_device_info_response(const DeviceInfoRequest &msg) = 0;
+  virtual HelloResponse hello(const HelloRequest &msg) = 0;
+  virtual ConnectResponse connect(const ConnectRequest &msg) = 0;
+  virtual DisconnectResponse disconnect(const DisconnectRequest &msg) = 0;
+  virtual PingResponse ping(const PingRequest &msg) = 0;
+  virtual DeviceInfoResponse device_info(const DeviceInfoRequest &msg) = 0;
   virtual void list_entities(const ListEntitiesRequest &msg) = 0;
   virtual void subscribe_states(const SubscribeStatesRequest &msg) = 0;
   virtual void subscribe_logs(const SubscribeLogsRequest &msg) = 0;
-#ifdef USE_API_HOMEASSISTANT_SERVICES
   virtual void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) = 0;
-#endif
-#ifdef USE_API_HOMEASSISTANT_STATES
   virtual void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) = 0;
-#endif
-  virtual bool send_get_time_response(const GetTimeRequest &msg) = 0;
+  virtual GetTimeResponse get_time(const GetTimeRequest &msg) = 0;
 #ifdef USE_API_SERVICES
   virtual void execute_service(const ExecuteServiceRequest &msg) = 0;
 #endif
 #ifdef USE_API_NOISE
-  virtual bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) = 0;
+  virtual NoiseEncryptionSetKeyResponse noise_encryption_set_key(const NoiseEncryptionSetKeyRequest &msg) = 0;
 #endif
 #ifdef USE_BUTTON
   virtual void button_command(const ButtonCommandRequest &msg) = 0;
@@ -312,7 +303,7 @@ class APIServerConnection : public APIServerConnectionBase {
   virtual void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) = 0;
 #endif
 #ifdef USE_BLUETOOTH_PROXY
-  virtual bool send_subscribe_bluetooth_connections_free_response(
+  virtual BluetoothConnectionsFreeResponse subscribe_bluetooth_connections_free(
       const SubscribeBluetoothConnectionsFreeRequest &msg) = 0;
 #endif
 #ifdef USE_BLUETOOTH_PROXY
@@ -325,7 +316,8 @@ class APIServerConnection : public APIServerConnectionBase {
   virtual void subscribe_voice_assistant(const SubscribeVoiceAssistantRequest &msg) = 0;
 #endif
 #ifdef USE_VOICE_ASSISTANT
-  virtual bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) = 0;
+  virtual VoiceAssistantConfigurationResponse voice_assistant_get_configuration(
+      const VoiceAssistantConfigurationRequest &msg) = 0;
 #endif
 #ifdef USE_VOICE_ASSISTANT
   virtual void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) = 0;
@@ -342,12 +334,8 @@ class APIServerConnection : public APIServerConnectionBase {
   void on_list_entities_request(const ListEntitiesRequest &msg) override;
   void on_subscribe_states_request(const SubscribeStatesRequest &msg) override;
   void on_subscribe_logs_request(const SubscribeLogsRequest &msg) override;
-#ifdef USE_API_HOMEASSISTANT_SERVICES
   void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &msg) override;
-#endif
-#ifdef USE_API_HOMEASSISTANT_STATES
   void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) override;
-#endif
   void on_get_time_request(const GetTimeRequest &msg) override;
 #ifdef USE_API_SERVICES
   void on_execute_service_request(const ExecuteServiceRequest &msg) override;
@@ -457,4 +445,5 @@ class APIServerConnection : public APIServerConnectionBase {
 #endif
 };
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
@@ -16,7 +16,8 @@
 
 #include <algorithm>
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 static const char *const TAG = "api";
 
@@ -30,6 +31,7 @@ APIServer::APIServer() {
 }
 
 void APIServer::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   this->setup_controller();
 
 #ifdef USE_API_NOISE
@@ -103,7 +105,7 @@ void APIServer::setup() {
       return;
     }
     for (auto &c : this->clients_) {
-      if (!c->flags_.remove && c->get_log_subscription_level() >= level)
+      if (!c->flags_.remove)
        c->try_send_log_message(level, tag, message, message_len);
     }
   });
@@ -183,9 +185,9 @@ void APIServer::loop() {
 
       // Rare case: handle disconnection
 #ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
-      this->client_disconnected_trigger_->trigger(client->client_info_.name, client->client_info_.peername);
+      this->client_disconnected_trigger_->trigger(client->client_info_, client->client_peername_);
 #endif
-      ESP_LOGV(TAG, "Remove connection %s", client->client_info_.name.c_str());
+      ESP_LOGV(TAG, "Remove connection %s", client->client_info_.c_str());
 
       // Swap with the last element and pop (avoids expensive vector shifts)
       if (client_index < this->clients_.size() - 1) {
@@ -203,20 +205,22 @@ void APIServer::loop() {
 
 void APIServer::dump_config() {
   ESP_LOGCONFIG(TAG,
-                "Server:\n"
+                "API Server:\n"
                 "  Address: %s:%u",
                 network::get_use_address().c_str(), this->port_);
 #ifdef USE_API_NOISE
-  ESP_LOGCONFIG(TAG, "  Noise encryption: %s", YESNO(this->noise_ctx_->has_psk()));
+  ESP_LOGCONFIG(TAG, "  Using noise encryption: %s", YESNO(this->noise_ctx_->has_psk()));
   if (!this->noise_ctx_->has_psk()) {
-    ESP_LOGCONFIG(TAG, "  Supports encryption: YES");
+    ESP_LOGCONFIG(TAG, "  Supports noise encryption: YES");
   }
 #else
-  ESP_LOGCONFIG(TAG, "  Noise encryption: NO");
+  ESP_LOGCONFIG(TAG, "  Using noise encryption: NO");
 #endif
 }
 
 #ifdef USE_API_PASSWORD
+bool APIServer::uses_password() const { return !this->password_.empty(); }
+
 bool APIServer::check_password(const std::string &password) const {
   // depend only on input password length
   const char *a = this->password_.c_str();
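The "depend only on input password length" comment in the context above points at a constant-time style comparison; the rest of that function is cut off by the hunk, so the following is only a hedged sketch of the general idea, not the project's implementation.

#include <cstddef>
#include <string>

// Sketch: walk every byte of the supplied candidate and fold mismatches into
// one accumulator instead of returning early, so timing does not reveal where
// the first wrong character is.
inline bool check_password_sketch(const std::string &stored, const std::string &candidate) {
  unsigned char diff = (stored.size() == candidate.size()) ? 0 : 1;
  for (size_t i = 0; i < candidate.size(); i++) {
    const char s = (i < stored.size()) ? stored[i] : '\0';  // never read out of range
    diff |= static_cast<unsigned char>(s ^ candidate[i]);
  }
  return diff == 0;
}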
@@ -369,15 +373,12 @@ void APIServer::set_password(const std::string &password) { this->password_ = pa
 
 void APIServer::set_batch_delay(uint16_t batch_delay) { this->batch_delay_ = batch_delay; }
 
-#ifdef USE_API_HOMEASSISTANT_SERVICES
 void APIServer::send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
   for (auto &client : this->clients_) {
     client->send_homeassistant_service_call(call);
   }
 }
-#endif
 
-#ifdef USE_API_HOMEASSISTANT_STATES
 void APIServer::subscribe_home_assistant_state(std::string entity_id, optional<std::string> attribute,
                                                std::function<void(std::string)> f) {
   this->state_subs_.push_back(HomeAssistantStateSubscription{
@@ -401,7 +402,6 @@ void APIServer::get_home_assistant_state(std::string entity_id, optional<std::st
 const std::vector<APIServer::HomeAssistantStateSubscription> &APIServer::get_state_subs() const {
   return this->state_subs_;
 }
-#endif
 
 uint16_t APIServer::get_port() const { return this->port_; }
 
@@ -428,11 +428,10 @@ bool APIServer::save_noise_psk(psk_t psk, bool make_active) {
   ESP_LOGD(TAG, "Noise PSK saved");
   if (make_active) {
     this->set_timeout(100, [this, psk]() {
-      ESP_LOGW(TAG, "Disconnecting all clients to reset PSK");
+      ESP_LOGW(TAG, "Disconnecting all clients to reset connections");
       this->set_noise_psk(psk);
       for (auto &c : this->clients_) {
-        DisconnectRequest req;
-        c->send_message(req, DisconnectRequest::MESSAGE_TYPE);
+        c->send_message(DisconnectRequest());
       }
     });
   }
@@ -465,8 +464,7 @@ void APIServer::on_shutdown() {
 
   // Send disconnect requests to all connected clients
   for (auto &c : this->clients_) {
-    DisconnectRequest req;
-    if (!c->send_message(req, DisconnectRequest::MESSAGE_TYPE)) {
+    if (!c->send_message(DisconnectRequest())) {
       // If we can't send the disconnect request directly (tx_buffer full),
       // schedule it at the front of the batch so it will be sent with priority
       c->schedule_message_front_(nullptr, &APIConnection::try_send_disconnect_request, DisconnectRequest::MESSAGE_TYPE,
@@ -486,5 +484,6 @@ bool APIServer::teardown() {
   return this->clients_.empty();
 }
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
 #endif
@@ -18,7 +18,8 @@
 
 #include <vector>
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 #ifdef USE_API_NOISE
 struct SavedNoisePsk {
@@ -38,6 +39,7 @@ class APIServer : public Component, public Controller {
   bool teardown() override;
 #ifdef USE_API_PASSWORD
   bool check_password(const std::string &password) const;
+  bool uses_password() const;
   void set_password(const std::string &password);
 #endif
   void set_port(uint16_t port);
@@ -106,9 +108,7 @@ class APIServer : public Component, public Controller {
 #ifdef USE_MEDIA_PLAYER
   void on_media_player_update(media_player::MediaPlayer *obj) override;
 #endif
-#ifdef USE_API_HOMEASSISTANT_SERVICES
   void send_homeassistant_service_call(const HomeassistantServiceResponse &call);
-#endif
 #ifdef USE_API_SERVICES
   void register_user_service(UserServiceDescriptor *descriptor) { this->user_services_.push_back(descriptor); }
 #endif
@@ -128,7 +128,6 @@ class APIServer : public Component, public Controller {
 
   bool is_connected() const;
 
-#ifdef USE_API_HOMEASSISTANT_STATES
   struct HomeAssistantStateSubscription {
     std::string entity_id;
     optional<std::string> attribute;
@@ -141,7 +140,6 @@ class APIServer : public Component, public Controller {
   void get_home_assistant_state(std::string entity_id, optional<std::string> attribute,
                                 std::function<void(std::string)> f);
   const std::vector<HomeAssistantStateSubscription> &get_state_subs() const;
-#endif
 #ifdef USE_API_SERVICES
   const std::vector<UserServiceDescriptor *> &get_user_services() const { return this->user_services_; }
 #endif
@@ -175,9 +173,7 @@ class APIServer : public Component, public Controller {
   std::string password_;
 #endif
   std::vector<uint8_t> shared_write_buffer_;  // Shared proto write buffer for all connections
-#ifdef USE_API_HOMEASSISTANT_STATES
   std::vector<HomeAssistantStateSubscription> state_subs_;
-#endif
 #ifdef USE_API_SERVICES
   std::vector<UserServiceDescriptor *> user_services_;
 #endif
@@ -201,5 +197,6 @@ template<typename... Ts> class APIConnectedCondition : public Condition<Ts...> {
   bool check(Ts... x) override { return global_api_server->is_connected(); }
 };
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
 #endif
@ -14,8 +14,6 @@ with warnings.catch_warnings():
|
|||||||
from aioesphomeapi import APIClient, parse_log_message
|
from aioesphomeapi import APIClient, parse_log_message
|
||||||
from aioesphomeapi.log_runner import async_run
|
from aioesphomeapi.log_runner import async_run
|
||||||
|
|
||||||
import contextlib
|
|
||||||
|
|
||||||
from esphome.const import CONF_KEY, CONF_PASSWORD, CONF_PORT, __version__
|
from esphome.const import CONF_KEY, CONF_PASSWORD, CONF_PORT, __version__
|
||||||
from esphome.core import CORE
|
from esphome.core import CORE
|
||||||
|
|
||||||
@ -68,5 +66,7 @@ async def async_run_logs(config: dict[str, Any], address: str) -> None:
|
|||||||
|
|
||||||
def run_logs(config: dict[str, Any], address: str) -> None:
|
def run_logs(config: dict[str, Any], address: str) -> None:
|
||||||
"""Run the logs command."""
|
"""Run the logs command."""
|
||||||
with contextlib.suppress(KeyboardInterrupt):
|
try:
|
||||||
asyncio.run(async_run_logs(config, address))
|
asyncio.run(async_run_logs(config, address))
|
||||||
|
except KeyboardInterrupt:
|
||||||
|
pass
|
||||||
|
@ -6,7 +6,8 @@
|
|||||||
#ifdef USE_API_SERVICES
|
#ifdef USE_API_SERVICES
|
||||||
#include "user_services.h"
|
#include "user_services.h"
|
||||||
#endif
|
#endif
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
|
|
||||||
#ifdef USE_API_SERVICES
|
#ifdef USE_API_SERVICES
|
||||||
template<typename T, typename... Ts> class CustomAPIDeviceService : public UserServiceBase<Ts...> {
|
template<typename T, typename... Ts> class CustomAPIDeviceService : public UserServiceBase<Ts...> {
|
||||||
@ -83,7 +84,6 @@ class CustomAPIDevice {
|
|||||||
}
|
}
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
|
||||||
/** Subscribe to the state (or attribute state) of an entity from Home Assistant.
|
/** Subscribe to the state (or attribute state) of an entity from Home Assistant.
|
||||||
*
|
*
|
||||||
* Usage:
|
* Usage:
|
||||||
@ -135,9 +135,7 @@ class CustomAPIDevice {
|
|||||||
auto f = std::bind(callback, (T *) this, entity_id, std::placeholders::_1);
|
auto f = std::bind(callback, (T *) this, entity_id, std::placeholders::_1);
|
||||||
global_api_server->subscribe_home_assistant_state(entity_id, optional<std::string>(attribute), f);
|
global_api_server->subscribe_home_assistant_state(entity_id, optional<std::string>(attribute), f);
|
||||||
}
|
}
|
||||||
#endif
|
|
||||||
|
|
||||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
|
||||||
/** Call a Home Assistant service from ESPHome.
|
/** Call a Home Assistant service from ESPHome.
|
||||||
*
|
*
|
||||||
* Usage:
|
* Usage:
|
||||||
@ -150,7 +148,7 @@ class CustomAPIDevice {
|
|||||||
*/
|
*/
|
||||||
void call_homeassistant_service(const std::string &service_name) {
|
void call_homeassistant_service(const std::string &service_name) {
|
||||||
HomeassistantServiceResponse resp;
|
HomeassistantServiceResponse resp;
|
||||||
resp.set_service(StringRef(service_name));
|
resp.service = service_name;
|
||||||
global_api_server->send_homeassistant_service_call(resp);
|
global_api_server->send_homeassistant_service_call(resp);
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -170,12 +168,12 @@ class CustomAPIDevice {
|
|||||||
*/
|
*/
|
||||||
void call_homeassistant_service(const std::string &service_name, const std::map<std::string, std::string> &data) {
|
void call_homeassistant_service(const std::string &service_name, const std::map<std::string, std::string> &data) {
|
||||||
HomeassistantServiceResponse resp;
|
HomeassistantServiceResponse resp;
|
||||||
resp.set_service(StringRef(service_name));
|
resp.service = service_name;
|
||||||
for (auto &it : data) {
|
for (auto &it : data) {
|
||||||
resp.data.emplace_back();
|
HomeassistantServiceMap kv;
|
||||||
auto &kv = resp.data.back();
|
kv.key = it.first;
|
||||||
kv.set_key(StringRef(it.first));
|
|
||||||
kv.value = it.second;
|
kv.value = it.second;
|
||||||
|
resp.data.push_back(kv);
|
||||||
}
|
}
|
||||||
global_api_server->send_homeassistant_service_call(resp);
|
global_api_server->send_homeassistant_service_call(resp);
|
||||||
}
|
}
|
||||||
@ -192,7 +190,7 @@ class CustomAPIDevice {
|
|||||||
*/
|
*/
|
||||||
void fire_homeassistant_event(const std::string &event_name) {
|
void fire_homeassistant_event(const std::string &event_name) {
|
||||||
HomeassistantServiceResponse resp;
|
HomeassistantServiceResponse resp;
|
||||||
resp.set_service(StringRef(event_name));
|
resp.service = event_name;
|
||||||
resp.is_event = true;
|
resp.is_event = true;
|
||||||
global_api_server->send_homeassistant_service_call(resp);
|
global_api_server->send_homeassistant_service_call(resp);
|
||||||
}
|
}
|
||||||
@ -212,18 +210,18 @@ class CustomAPIDevice {
|
|||||||
*/
|
*/
|
||||||
void fire_homeassistant_event(const std::string &service_name, const std::map<std::string, std::string> &data) {
|
void fire_homeassistant_event(const std::string &service_name, const std::map<std::string, std::string> &data) {
|
||||||
HomeassistantServiceResponse resp;
|
HomeassistantServiceResponse resp;
|
||||||
resp.set_service(StringRef(service_name));
|
resp.service = service_name;
|
||||||
resp.is_event = true;
|
resp.is_event = true;
|
||||||
for (auto &it : data) {
|
for (auto &it : data) {
|
||||||
resp.data.emplace_back();
|
HomeassistantServiceMap kv;
|
||||||
auto &kv = resp.data.back();
|
kv.key = it.first;
|
||||||
kv.set_key(StringRef(it.first));
|
|
||||||
kv.value = it.second;
|
kv.value = it.second;
|
||||||
|
resp.data.push_back(kv);
|
||||||
}
|
}
|
||||||
global_api_server->send_homeassistant_service_call(resp);
|
global_api_server->send_homeassistant_service_call(resp);
|
||||||
}
|
}
|
||||||
#endif
|
|
||||||
};
|
};
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
@ -2,13 +2,13 @@
|
|||||||
|
|
||||||
#include "api_server.h"
|
#include "api_server.h"
|
||||||
#ifdef USE_API
|
#ifdef USE_API
|
||||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
|
||||||
#include "api_pb2.h"
|
#include "api_pb2.h"
|
||||||
#include "esphome/core/automation.h"
|
#include "esphome/core/automation.h"
|
||||||
#include "esphome/core/helpers.h"
|
#include "esphome/core/helpers.h"
|
||||||
#include <vector>
|
#include <vector>
|
||||||
|
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
|
|
||||||
template<typename... X> class TemplatableStringValue : public TemplatableValue<std::string, X...> {
|
template<typename... X> class TemplatableStringValue : public TemplatableValue<std::string, X...> {
|
||||||
private:
|
private:
|
||||||
@ -36,9 +36,6 @@ template<typename... X> class TemplatableStringValue : public TemplatableValue<s
|
|||||||
|
|
||||||
template<typename... Ts> class TemplatableKeyValuePair {
|
template<typename... Ts> class TemplatableKeyValuePair {
|
||||||
public:
|
public:
|
||||||
// Keys are always string literals from YAML dictionary keys (e.g., "code", "event")
|
|
||||||
// and never templatable values or lambdas. Only the value parameter can be a lambda/template.
|
|
||||||
// Using pass-by-value with std::move allows optimal performance for both lvalues and rvalues.
|
|
||||||
template<typename T> TemplatableKeyValuePair(std::string key, T value) : key(std::move(key)), value(value) {}
|
template<typename T> TemplatableKeyValuePair(std::string key, T value) : key(std::move(key)), value(value) {}
|
||||||
std::string key;
|
std::string key;
|
||||||
TemplatableStringValue<Ts...> value;
|
TemplatableStringValue<Ts...> value;
|
||||||
@ -50,39 +47,37 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
|||||||
|
|
||||||
template<typename T> void set_service(T service) { this->service_ = service; }
|
template<typename T> void set_service(T service) { this->service_ = service; }
|
||||||
|
|
||||||
// Keys are always string literals from the Python code generation (e.g., cg.add(var.add_data("tag_id", templ))).
|
template<typename T> void add_data(std::string key, T value) {
|
||||||
// The value parameter can be a lambda/template, but keys are never templatable.
|
this->data_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
|
||||||
// Using pass-by-value allows the compiler to optimize for both lvalues and rvalues.
|
}
|
||||||
template<typename T> void add_data(std::string key, T value) { this->data_.emplace_back(std::move(key), value); }
|
|
||||||
template<typename T> void add_data_template(std::string key, T value) {
|
template<typename T> void add_data_template(std::string key, T value) {
|
||||||
this->data_template_.emplace_back(std::move(key), value);
|
this->data_template_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
|
||||||
}
|
}
|
||||||
template<typename T> void add_variable(std::string key, T value) {
|
template<typename T> void add_variable(std::string key, T value) {
|
||||||
this->variables_.emplace_back(std::move(key), value);
|
this->variables_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
|
||||||
}
|
}
|
||||||
|
|
||||||
void play(Ts... x) override {
|
void play(Ts... x) override {
|
||||||
HomeassistantServiceResponse resp;
|
HomeassistantServiceResponse resp;
|
||||||
std::string service_value = this->service_.value(x...);
|
resp.service = this->service_.value(x...);
|
||||||
resp.set_service(StringRef(service_value));
|
|
||||||
resp.is_event = this->is_event_;
|
resp.is_event = this->is_event_;
|
||||||
for (auto &it : this->data_) {
|
for (auto &it : this->data_) {
|
||||||
resp.data.emplace_back();
|
HomeassistantServiceMap kv;
|
||||||
auto &kv = resp.data.back();
|
kv.key = it.key;
|
||||||
kv.set_key(StringRef(it.key));
|
|
||||||
kv.value = it.value.value(x...);
|
kv.value = it.value.value(x...);
|
||||||
|
resp.data.push_back(kv);
|
||||||
}
|
}
|
||||||
for (auto &it : this->data_template_) {
|
for (auto &it : this->data_template_) {
|
||||||
resp.data_template.emplace_back();
|
HomeassistantServiceMap kv;
|
||||||
auto &kv = resp.data_template.back();
|
kv.key = it.key;
|
||||||
kv.set_key(StringRef(it.key));
|
|
||||||
kv.value = it.value.value(x...);
|
kv.value = it.value.value(x...);
|
||||||
|
resp.data_template.push_back(kv);
|
||||||
}
|
}
|
||||||
for (auto &it : this->variables_) {
|
for (auto &it : this->variables_) {
|
||||||
resp.variables.emplace_back();
|
HomeassistantServiceMap kv;
|
||||||
auto &kv = resp.variables.back();
|
kv.key = it.key;
|
||||||
kv.set_key(StringRef(it.key));
|
|
||||||
kv.value = it.value.value(x...);
|
kv.value = it.value.value(x...);
|
||||||
|
resp.variables.push_back(kv);
|
||||||
}
|
}
|
||||||
this->parent_->send_homeassistant_service_call(resp);
|
this->parent_->send_homeassistant_service_call(resp);
|
||||||
}
|
}
|
||||||
@ -96,6 +91,6 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
|||||||
std::vector<TemplatableKeyValuePair<Ts...>> variables_;
|
std::vector<TemplatableKeyValuePair<Ts...>> variables_;
|
||||||
};
|
};
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
#endif
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
@ -6,7 +6,8 @@
|
|||||||
#include "esphome/core/log.h"
|
#include "esphome/core/log.h"
|
||||||
#include "esphome/core/util.h"
|
#include "esphome/core/util.h"
|
||||||
|
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
|
|
||||||
// Generate entity handler implementations using macros
|
// Generate entity handler implementations using macros
|
||||||
#ifdef USE_BINARY_SENSOR
|
#ifdef USE_BINARY_SENSOR
|
||||||
@ -85,9 +86,10 @@ ListEntitiesIterator::ListEntitiesIterator(APIConnection *client) : client_(clie
|
|||||||
#ifdef USE_API_SERVICES
|
#ifdef USE_API_SERVICES
|
||||||
bool ListEntitiesIterator::on_service(UserServiceDescriptor *service) {
|
bool ListEntitiesIterator::on_service(UserServiceDescriptor *service) {
|
||||||
auto resp = service->encode_list_service_response();
|
auto resp = service->encode_list_service_response();
|
||||||
return this->client_->send_message(resp, ListEntitiesServicesResponse::MESSAGE_TYPE);
|
return this->client_->send_message(resp);
|
||||||
}
|
}
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
@ -4,7 +4,8 @@
|
|||||||
#ifdef USE_API
|
#ifdef USE_API
|
||||||
#include "esphome/core/component.h"
|
#include "esphome/core/component.h"
|
||||||
#include "esphome/core/component_iterator.h"
|
#include "esphome/core/component_iterator.h"
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
|
|
||||||
class APIConnection;
|
class APIConnection;
|
||||||
|
|
||||||
@ -95,5 +96,6 @@ class ListEntitiesIterator : public ComponentIterator {
|
|||||||
APIConnection *client_;
|
APIConnection *client_;
|
||||||
};
|
};
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
#endif
|
#endif
|
||||||
|
@ -3,11 +3,12 @@
|
|||||||
#include "esphome/core/helpers.h"
|
#include "esphome/core/helpers.h"
|
||||||
#include "esphome/core/log.h"
|
#include "esphome/core/log.h"
|
||||||
|
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
|
|
||||||
static const char *const TAG = "api.proto";
|
static const char *const TAG = "api.proto";
|
||||||
|
|
||||||
void ProtoDecodableMessage::decode(const uint8_t *buffer, size_t length) {
|
void ProtoMessage::decode(const uint8_t *buffer, size_t length) {
|
||||||
uint32_t i = 0;
|
uint32_t i = 0;
|
||||||
bool error = false;
|
bool error = false;
|
||||||
while (i < length) {
|
while (i < length) {
|
||||||
@ -88,4 +89,5 @@ std::string ProtoMessage::dump() const {
|
|||||||
}
|
}
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
} // namespace esphome::api
|
} // namespace api
|
||||||
|
} // namespace esphome
|
||||||
|
@ -3,47 +3,16 @@
|
|||||||
#include "esphome/core/component.h"
|
#include "esphome/core/component.h"
|
||||||
#include "esphome/core/helpers.h"
|
#include "esphome/core/helpers.h"
|
||||||
#include "esphome/core/log.h"
|
#include "esphome/core/log.h"
|
||||||
#include "esphome/core/string_ref.h"
|
|
||||||
|
|
||||||
#include <cassert>
|
#include <cassert>
|
||||||
#include <cstring>
|
|
||||||
#include <vector>
|
#include <vector>
|
||||||
|
|
||||||
#ifdef ESPHOME_LOG_HAS_VERY_VERBOSE
|
#ifdef ESPHOME_LOG_HAS_VERY_VERBOSE
|
||||||
#define HAS_PROTO_MESSAGE_DUMP
|
#define HAS_PROTO_MESSAGE_DUMP
|
||||||
#endif
|
#endif
|
||||||
|
|
||||||
namespace esphome::api {
|
namespace esphome {
|
||||||
|
namespace api {
|
||||||
/*
|
|
||||||
* StringRef Ownership Model for API Protocol Messages
|
|
||||||
* ===================================================
|
|
||||||
*
|
|
||||||
* StringRef is used for zero-copy string handling in outgoing (SOURCE_SERVER) messages.
|
|
||||||
* It holds a pointer and length to existing string data without copying.
|
|
||||||
*
|
|
||||||
* CRITICAL: The referenced string data MUST remain valid until message encoding completes.
|
|
||||||
*
|
|
||||||
* Safe StringRef Patterns:
|
|
||||||
* 1. String literals: StringRef("literal") - Always safe (static storage duration)
|
|
||||||
* 2. Member variables: StringRef(this->member_string_) - Safe if object outlives encoding
|
|
||||||
* 3. Global/static strings: StringRef(GLOBAL_CONSTANT) - Always safe
|
|
||||||
* 4. Local variables: Safe ONLY if encoding happens before function returns:
|
|
||||||
* std::string temp = compute_value();
|
|
||||||
* msg.set_field(StringRef(temp));
|
|
||||||
* return this->send_message(msg); // temp is valid during encoding
|
|
||||||
*
|
|
||||||
* Unsafe Patterns (WILL cause crashes/corruption):
|
|
||||||
* 1. Temporaries: msg.set_field(StringRef(obj.get_string())) // get_string() returns by value
|
|
||||||
* 2. Concatenation: msg.set_field(StringRef(str1 + str2)) // Result is temporary
|
|
||||||
*
|
|
||||||
* For unsafe patterns, store in a local variable first:
|
|
||||||
* std::string temp = get_string(); // or str1 + str2
|
|
||||||
* msg.set_field(StringRef(temp));
|
|
||||||
*
|
|
||||||
* The send_*_response pattern ensures proper lifetime management by encoding
|
|
||||||
* within the same function scope where temporaries are created.
|
|
||||||
*/
|
|
||||||
|
|
||||||
/// Representation of a VarInt - in ProtoBuf should be 64bit but we only use 32bit
|
/// Representation of a VarInt - in ProtoBuf should be 64bit but we only use 32bit
|
||||||
class ProtoVarInt {
|
class ProtoVarInt {
|
||||||
@ -166,7 +135,6 @@ class ProtoVarInt {
|
|||||||
|
|
||||||
// Forward declaration for decode_to_message and encode_to_writer
|
// Forward declaration for decode_to_message and encode_to_writer
|
||||||
class ProtoMessage;
|
class ProtoMessage;
|
||||||
class ProtoDecodableMessage;
|
|
||||||
|
|
||||||
class ProtoLengthDelimited {
|
class ProtoLengthDelimited {
|
||||||
public:
|
public:
|
||||||
@ -174,15 +142,15 @@ class ProtoLengthDelimited {
|
|||||||
std::string as_string() const { return std::string(reinterpret_cast<const char *>(this->value_), this->length_); }
|
std::string as_string() const { return std::string(reinterpret_cast<const char *>(this->value_), this->length_); }
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Decode the length-delimited data into an existing ProtoDecodableMessage instance.
|
* Decode the length-delimited data into an existing ProtoMessage instance.
|
||||||
*
|
*
|
||||||
* This method allows decoding without templates, enabling use in contexts
|
* This method allows decoding without templates, enabling use in contexts
|
||||||
* where the message type is not known at compile time. The ProtoDecodableMessage's
|
* where the message type is not known at compile time. The ProtoMessage's
|
||||||
* decode() method will be called with the raw data and length.
|
* decode() method will be called with the raw data and length.
|
||||||
*
|
*
|
||||||
* @param msg The ProtoDecodableMessage instance to decode into
|
* @param msg The ProtoMessage instance to decode into
|
||||||
*/
|
*/
|
||||||
void decode_to_message(ProtoDecodableMessage &msg) const;
|
void decode_to_message(ProtoMessage &msg) const;
|
||||||
|
|
||||||
protected:
|
protected:
|
||||||
const uint8_t *const value_;
|
const uint8_t *const value_;
|
||||||
@ -207,7 +175,23 @@ class Proto32Bit {
|
|||||||
const uint32_t value_;
|
const uint32_t value_;
|
||||||
};
|
};
|
||||||
|
|
||||||
// NOTE: Proto64Bit class removed - wire type 1 (64-bit fixed) not supported
|
class Proto64Bit {
|
||||||
|
public:
|
||||||
|
explicit Proto64Bit(uint64_t value) : value_(value) {}
|
||||||
|
uint64_t as_fixed64() const { return this->value_; }
|
||||||
|
int64_t as_sfixed64() const { return static_cast<int64_t>(this->value_); }
|
||||||
|
double as_double() const {
|
||||||
|
union {
|
||||||
|
uint64_t raw;
|
||||||
|
double value;
|
||||||
|
} s{};
|
||||||
|
s.raw = this->value_;
|
||||||
|
return s.value;
|
||||||
|
}
|
||||||
|
|
||||||
|
protected:
|
||||||
|
const uint64_t value_;
|
||||||
|
};
|
||||||
|
|
||||||
class ProtoWriteBuffer {
|
class ProtoWriteBuffer {
|
||||||
public:
|
public:
|
||||||
@ -221,9 +205,9 @@ class ProtoWriteBuffer {
|
|||||||
* @param field_id Field number (tag) in the protobuf message
|
* @param field_id Field number (tag) in the protobuf message
|
||||||
* @param type Wire type value:
|
* @param type Wire type value:
|
||||||
* - 0: Varint (int32, int64, uint32, uint64, sint32, sint64, bool, enum)
|
* - 0: Varint (int32, int64, uint32, uint64, sint32, sint64, bool, enum)
|
||||||
|
* - 1: 64-bit (fixed64, sfixed64, double)
|
||||||
* - 2: Length-delimited (string, bytes, embedded messages, packed repeated fields)
|
* - 2: Length-delimited (string, bytes, embedded messages, packed repeated fields)
|
||||||
* - 5: 32-bit (fixed32, sfixed32, float)
|
* - 5: 32-bit (fixed32, sfixed32, float)
|
||||||
* - Note: Wire type 1 (64-bit fixed) is not supported
|
|
||||||
*
|
*
|
||||||
* Following https://protobuf.dev/programming-guides/encoding/#structure
|
* Following https://protobuf.dev/programming-guides/encoding/#structure
|
||||||
*/
|
*/
|
||||||
@ -237,20 +221,12 @@ class ProtoWriteBuffer {
|
|||||||
|
|
||||||
this->encode_field_raw(field_id, 2); // type 2: Length-delimited string
|
this->encode_field_raw(field_id, 2); // type 2: Length-delimited string
|
||||||
this->encode_varint_raw(len);
|
this->encode_varint_raw(len);
|
||||||
|
auto *data = reinterpret_cast<const uint8_t *>(string);
|
||||||
// Using resize + memcpy instead of insert provides significant performance improvement:
|
this->buffer_->insert(this->buffer_->end(), data, data + len);
|
||||||
// ~10-11x faster for 16-32 byte strings, ~3x faster for 64-byte strings
|
|
||||||
// as it avoids iterator checks and potential element moves that insert performs
|
|
||||||
size_t old_size = this->buffer_->size();
|
|
||||||
this->buffer_->resize(old_size + len);
|
|
||||||
std::memcpy(this->buffer_->data() + old_size, string, len);
|
|
||||||
}
|
}
|
||||||
void encode_string(uint32_t field_id, const std::string &value, bool force = false) {
|
void encode_string(uint32_t field_id, const std::string &value, bool force = false) {
|
||||||
this->encode_string(field_id, value.data(), value.size(), force);
|
this->encode_string(field_id, value.data(), value.size(), force);
|
||||||
}
|
}
|
||||||
void encode_string(uint32_t field_id, const StringRef &ref, bool force = false) {
|
|
||||||
this->encode_string(field_id, ref.c_str(), ref.size(), force);
|
|
||||||
}
|
|
||||||
void encode_bytes(uint32_t field_id, const uint8_t *data, size_t len, bool force = false) {
|
void encode_bytes(uint32_t field_id, const uint8_t *data, size_t len, bool force = false) {
|
||||||
this->encode_string(field_id, reinterpret_cast<const char *>(data), len, force);
|
this->encode_string(field_id, reinterpret_cast<const char *>(data), len, force);
|
||||||
}
|
}
|
||||||
@ -282,10 +258,20 @@ class ProtoWriteBuffer {
|
|||||||
this->write((value >> 16) & 0xFF);
|
this->write((value >> 16) & 0xFF);
|
||||||
this->write((value >> 24) & 0xFF);
|
this->write((value >> 24) & 0xFF);
|
||||||
}
|
}
|
||||||
// NOTE: Wire type 1 (64-bit fixed: double, fixed64, sfixed64) is intentionally
|
void encode_fixed64(uint32_t field_id, uint64_t value, bool force = false) {
|
||||||
// not supported to reduce overhead on embedded systems. All ESPHome devices are
|
if (value == 0 && !force)
|
||||||
// 32-bit microcontrollers where 64-bit operations are expensive. If 64-bit support
|
return;
|
||||||
// is needed in the future, the necessary encoding/decoding functions must be added.
|
|
||||||
|
this->encode_field_raw(field_id, 1); // type 1: 64-bit fixed64
|
||||||
|
this->write((value >> 0) & 0xFF);
|
||||||
|
this->write((value >> 8) & 0xFF);
|
||||||
|
this->write((value >> 16) & 0xFF);
|
||||||
|
this->write((value >> 24) & 0xFF);
|
||||||
|
this->write((value >> 32) & 0xFF);
|
||||||
|
this->write((value >> 40) & 0xFF);
|
||||||
|
this->write((value >> 48) & 0xFF);
|
||||||
|
this->write((value >> 56) & 0xFF);
|
||||||
|
}
|
||||||
void encode_float(uint32_t field_id, float value, bool force = false) {
|
void encode_float(uint32_t field_id, float value, bool force = false) {
|
||||||
if (value == 0.0f && !force)
|
if (value == 0.0f && !force)
|
||||||
return;
|
return;
|
||||||
@ -333,62 +319,46 @@ class ProtoWriteBuffer {
|
|||||||
std::vector<uint8_t> *buffer_;
|
std::vector<uint8_t> *buffer_;
|
||||||
};
|
};
|
||||||
|
|
||||||
// Forward declaration
|
|
||||||
class ProtoSize;
|
|
||||||
|
|
||||||
class ProtoMessage {
|
class ProtoMessage {
|
||||||
public:
|
public:
|
||||||
virtual ~ProtoMessage() = default;
|
virtual ~ProtoMessage() = default;
|
||||||
// Default implementation for messages with no fields
|
// Default implementation for messages with no fields
|
||||||
virtual void encode(ProtoWriteBuffer buffer) const {}
|
virtual void encode(ProtoWriteBuffer buffer) const {}
|
||||||
|
void decode(const uint8_t *buffer, size_t length);
|
||||||
// Default implementation for messages with no fields
|
// Default implementation for messages with no fields
|
||||||
virtual void calculate_size(ProtoSize &size) const {}
|
virtual void calculate_size(uint32_t &total_size) const {}
|
||||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||||
std::string dump() const;
|
std::string dump() const;
|
||||||
virtual void dump_to(std::string &out) const = 0;
|
virtual void dump_to(std::string &out) const = 0;
|
||||||
virtual const char *message_name() const { return "unknown"; }
|
virtual const char *message_name() const { return "unknown"; }
|
||||||
#endif
|
#endif
|
||||||
};
|
|
||||||
|
|
||||||
// Base class for messages that support decoding
|
|
||||||
class ProtoDecodableMessage : public ProtoMessage {
|
|
||||||
public:
|
|
||||||
void decode(const uint8_t *buffer, size_t length);
|
|
||||||
|
|
||||||
protected:
|
protected:
|
||||||
virtual bool decode_varint(uint32_t field_id, ProtoVarInt value) { return false; }
|
virtual bool decode_varint(uint32_t field_id, ProtoVarInt value) { return false; }
|
||||||
virtual bool decode_length(uint32_t field_id, ProtoLengthDelimited value) { return false; }
|
virtual bool decode_length(uint32_t field_id, ProtoLengthDelimited value) { return false; }
|
||||||
virtual bool decode_32bit(uint32_t field_id, Proto32Bit value) { return false; }
|
virtual bool decode_32bit(uint32_t field_id, Proto32Bit value) { return false; }
|
||||||
// NOTE: decode_64bit removed - wire type 1 not supported
|
virtual bool decode_64bit(uint32_t field_id, Proto64Bit value) { return false; }
|
||||||
};
|
};
|
||||||
|
|
||||||
class ProtoSize {
|
class ProtoSize {
|
||||||
private:
|
|
||||||
uint32_t total_size_ = 0;
|
|
||||||
|
|
||||||
public:
|
public:
|
||||||
/**
|
/**
|
||||||
* @brief ProtoSize class for Protocol Buffer serialization size calculation
|
* @brief ProtoSize class for Protocol Buffer serialization size calculation
|
||||||
*
|
*
|
||||||
* This class provides methods to calculate the exact byte counts needed
|
* This class provides static methods to calculate the exact byte counts needed
|
||||||
* for encoding various Protocol Buffer field types. The class now uses an
|
* for encoding various Protocol Buffer field types. All methods are designed to be
|
||||||
* object-based approach to reduce parameter passing overhead while keeping
|
* efficient for the common case where many fields have default values.
|
||||||
* varint calculation methods static for external use.
|
|
||||||
*
|
*
|
||||||
* Implements Protocol Buffer encoding size calculation according to:
|
* Implements Protocol Buffer encoding size calculation according to:
|
||||||
* https://protobuf.dev/programming-guides/encoding/
|
* https://protobuf.dev/programming-guides/encoding/
|
||||||
*
|
*
|
||||||
* Key features:
|
* Key features:
|
||||||
* - Object-based approach reduces flash usage by eliminating parameter passing
|
|
||||||
* - Early-return optimization for zero/default values
|
* - Early-return optimization for zero/default values
|
||||||
* - Static varint methods for external callers
|
* - Direct total_size updates to avoid unnecessary additions
|
||||||
* - Specialized handling for different field types according to protobuf spec
|
* - Specialized handling for different field types according to protobuf spec
|
||||||
|
* - Templated helpers for repeated fields and messages
|
||||||
*/
|
*/
|
||||||
|
|
||||||
ProtoSize() = default;
|
|
||||||
|
|
||||||
uint32_t get_size() const { return total_size_; }
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates the size in bytes needed to encode a uint32_t value as a varint
|
* @brief Calculates the size in bytes needed to encode a uint32_t value as a varint
|
||||||
*
|
*
|
||||||
@ -489,7 +459,9 @@ class ProtoSize {
|
|||||||
* @brief Common parameters for all add_*_field methods
|
* @brief Common parameters for all add_*_field methods
|
||||||
*
|
*
|
||||||
* All add_*_field methods follow these common patterns:
|
* All add_*_field methods follow these common patterns:
|
||||||
* * @param field_id_size Pre-calculated size of the field ID in bytes
|
*
|
||||||
|
* @param total_size Reference to the total message size to update
|
||||||
|
* @param field_id_size Pre-calculated size of the field ID in bytes
|
||||||
* @param value The value to calculate size for (type varies)
|
* @param value The value to calculate size for (type varies)
|
||||||
* @param force Whether to calculate size even if the value is default/zero/empty
|
* @param force Whether to calculate size even if the value is default/zero/empty
|
||||||
*
|
*
|
||||||
@ -502,181 +474,244 @@ class ProtoSize {
|
|||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of an int32 field to the total message size
|
* @brief Calculates and adds the size of an int32 field to the total message size
|
||||||
*/
|
*/
|
||||||
inline void add_int32(uint32_t field_id_size, int32_t value) {
|
static inline void add_int32_field(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
add_int32_force(field_id_size, value);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
if (value < 0) {
|
||||||
|
// Negative values are encoded as 10-byte varints in protobuf
|
||||||
|
total_size += field_id_size + 10;
|
||||||
|
} else {
|
||||||
|
// For non-negative values, use the standard varint size
|
||||||
|
total_size += field_id_size + varint(static_cast<uint32_t>(value));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of an int32 field to the total message size (force version)
|
* @brief Calculates and adds the size of an int32 field to the total message size (repeated field version)
|
||||||
*/
|
*/
|
||||||
inline void add_int32_force(uint32_t field_id_size, int32_t value) {
|
static inline void add_int32_field_repeated(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
|
||||||
// Always calculate size when forced
|
// Always calculate size for repeated fields
|
||||||
// Negative values are encoded as 10-byte varints in protobuf
|
if (value < 0) {
|
||||||
total_size_ += field_id_size + (value < 0 ? 10 : varint(static_cast<uint32_t>(value)));
|
// Negative values are encoded as 10-byte varints in protobuf
|
||||||
|
total_size += field_id_size + 10;
|
||||||
|
} else {
|
||||||
|
// For non-negative values, use the standard varint size
|
||||||
|
total_size += field_id_size + varint(static_cast<uint32_t>(value));
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a uint32 field to the total message size
|
* @brief Calculates and adds the size of a uint32 field to the total message size
|
||||||
*/
|
*/
|
||||||
inline void add_uint32(uint32_t field_id_size, uint32_t value) {
|
static inline void add_uint32_field(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
add_uint32_force(field_id_size, value);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a uint32 field to the total message size (force version)
|
* @brief Calculates and adds the size of a uint32 field to the total message size (repeated field version)
|
||||||
*/
|
*/
|
||||||
inline void add_uint32_force(uint32_t field_id_size, uint32_t value) {
|
static inline void add_uint32_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
total_size_ += field_id_size + varint(value);
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a boolean field to the total message size
|
* @brief Calculates and adds the size of a boolean field to the total message size
|
||||||
*/
|
*/
|
||||||
inline void add_bool(uint32_t field_id_size, bool value) {
|
static inline void add_bool_field(uint32_t &total_size, uint32_t field_id_size, bool value) {
|
||||||
if (value) {
|
// Skip calculation if value is false
|
||||||
// Boolean fields always use 1 byte when true
|
if (!value) {
|
||||||
total_size_ += field_id_size + 1;
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Boolean fields always use 1 byte when true
|
||||||
|
total_size += field_id_size + 1;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a boolean field to the total message size (force version)
|
* @brief Calculates and adds the size of a boolean field to the total message size (repeated field version)
|
||||||
*/
|
*/
|
||||||
inline void add_bool_force(uint32_t field_id_size, bool value) {
|
static inline void add_bool_field_repeated(uint32_t &total_size, uint32_t field_id_size, bool value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
// Boolean fields always use 1 byte
|
// Boolean fields always use 1 byte
|
||||||
total_size_ += field_id_size + 1;
|
total_size += field_id_size + 1;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a float field to the total message size
|
* @brief Calculates and adds the size of a fixed field to the total message size
|
||||||
|
*
|
||||||
|
* Fixed fields always take exactly N bytes (4 for fixed32/float, 8 for fixed64/double).
|
||||||
|
*
|
||||||
|
* @tparam NumBytes The number of bytes for this fixed field (4 or 8)
|
||||||
|
* @param is_nonzero Whether the value is non-zero
|
||||||
*/
|
*/
|
||||||
inline void add_float(uint32_t field_id_size, float value) {
|
template<uint32_t NumBytes>
|
||||||
if (value != 0.0f) {
|
static inline void add_fixed_field(uint32_t &total_size, uint32_t field_id_size, bool is_nonzero) {
|
||||||
total_size_ += field_id_size + 4;
|
// Skip calculation if value is zero
|
||||||
|
if (!is_nonzero) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
// NOTE: add_double_field removed - wire type 1 (64-bit: double) not supported
|
// Fixed fields always take exactly NumBytes
|
||||||
// to reduce overhead on embedded systems
|
total_size += field_id_size + NumBytes;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a fixed32 field to the total message size
|
* @brief Calculates and adds the size of an enum field to the total message size
|
||||||
|
*
|
||||||
|
* Enum fields are encoded as uint32 varints.
|
||||||
*/
|
*/
|
||||||
inline void add_fixed32(uint32_t field_id_size, uint32_t value) {
|
static inline void add_enum_field(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
total_size_ += field_id_size + 4;
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
// NOTE: add_fixed64_field removed - wire type 1 (64-bit: fixed64) not supported
|
// Enums are encoded as uint32
|
||||||
// to reduce overhead on embedded systems
|
total_size += field_id_size + varint(value);
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a sfixed32 field to the total message size
|
* @brief Calculates and adds the size of an enum field to the total message size (repeated field version)
|
||||||
|
*
|
||||||
|
* Enum fields are encoded as uint32 varints.
|
||||||
*/
|
*/
|
||||||
inline void add_sfixed32(uint32_t field_id_size, int32_t value) {
|
static inline void add_enum_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
|
||||||
if (value != 0) {
|
// Always calculate size for repeated fields
|
||||||
total_size_ += field_id_size + 4;
|
// Enums are encoded as uint32
|
||||||
}
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
// NOTE: add_sfixed64_field removed - wire type 1 (64-bit: sfixed64) not supported
|
|
||||||
// to reduce overhead on embedded systems
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a sint32 field to the total message size
|
* @brief Calculates and adds the size of a sint32 field to the total message size
|
||||||
*
|
*
|
||||||
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
|
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
|
||||||
*/
|
*/
|
||||||
inline void add_sint32(uint32_t field_id_size, int32_t value) {
|
static inline void add_sint32_field(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
add_sint32_force(field_id_size, value);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ZigZag encoding for sint32: (n << 1) ^ (n >> 31)
|
||||||
|
uint32_t zigzag = (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
|
||||||
|
total_size += field_id_size + varint(zigzag);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a sint32 field to the total message size (force version)
|
* @brief Calculates and adds the size of a sint32 field to the total message size (repeated field version)
|
||||||
*
|
*
|
||||||
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
|
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
|
||||||
*/
|
*/
|
||||||
inline void add_sint32_force(uint32_t field_id_size, int32_t value) {
|
static inline void add_sint32_field_repeated(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
// ZigZag encoding for sint32: (n << 1) ^ (n >> 31)
|
// ZigZag encoding for sint32: (n << 1) ^ (n >> 31)
|
||||||
uint32_t zigzag = (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
|
uint32_t zigzag = (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
|
||||||
total_size_ += field_id_size + varint(zigzag);
|
total_size += field_id_size + varint(zigzag);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of an int64 field to the total message size
|
* @brief Calculates and adds the size of an int64 field to the total message size
|
||||||
*/
|
*/
|
||||||
inline void add_int64(uint32_t field_id_size, int64_t value) {
|
static inline void add_int64_field(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
add_int64_force(field_id_size, value);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of an int64 field to the total message size (force version)
|
* @brief Calculates and adds the size of an int64 field to the total message size (repeated field version)
|
||||||
*/
|
*/
|
||||||
inline void add_int64_force(uint32_t field_id_size, int64_t value) {
|
static inline void add_int64_field_repeated(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
total_size_ += field_id_size + varint(value);
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a uint64 field to the total message size
|
* @brief Calculates and adds the size of a uint64 field to the total message size
|
||||||
*/
|
*/
|
||||||
inline void add_uint64(uint32_t field_id_size, uint64_t value) {
|
static inline void add_uint64_field(uint32_t &total_size, uint32_t field_id_size, uint64_t value) {
|
||||||
if (value != 0) {
|
// Skip calculation if value is zero
|
||||||
add_uint64_force(field_id_size, value);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a uint64 field to the total message size (force version)
|
* @brief Calculates and adds the size of a uint64 field to the total message size (repeated field version)
|
||||||
*/
|
*/
|
||||||
inline void add_uint64_force(uint32_t field_id_size, uint64_t value) {
|
static inline void add_uint64_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint64_t value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
total_size_ += field_id_size + varint(value);
|
total_size += field_id_size + varint(value);
|
||||||
}
|
}
|
||||||
|
|
||||||
// NOTE: sint64 support functions (add_sint64_field, add_sint64_field_force) removed
|
|
||||||
// sint64 type is not supported by ESPHome API to reduce overhead on embedded systems
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a length-delimited field (string/bytes) to the total message size
|
* @brief Calculates and adds the size of a sint64 field to the total message size
|
||||||
|
*
|
||||||
|
* Sint64 fields use ZigZag encoding, which is more efficient for negative values.
|
||||||
*/
|
*/
|
||||||
inline void add_length(uint32_t field_id_size, size_t len) {
|
static inline void add_sint64_field(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
|
||||||
if (len != 0) {
|
// Skip calculation if value is zero
|
||||||
add_length_force(field_id_size, len);
|
if (value == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ZigZag encoding for sint64: (n << 1) ^ (n >> 63)
|
||||||
|
uint64_t zigzag = (static_cast<uint64_t>(value) << 1) ^ (static_cast<uint64_t>(value >> 63));
|
||||||
|
total_size += field_id_size + varint(zigzag);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a length-delimited field (string/bytes) to the total message size (repeated
|
* @brief Calculates and adds the size of a sint64 field to the total message size (repeated field version)
|
||||||
* field version)
|
*
|
||||||
|
* Sint64 fields use ZigZag encoding, which is more efficient for negative values.
|
||||||
*/
|
*/
|
||||||
inline void add_length_force(uint32_t field_id_size, size_t len) {
|
static inline void add_sint64_field_repeated(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
// Field ID + length varint + data bytes
|
// ZigZag encoding for sint64: (n << 1) ^ (n >> 63)
|
||||||
total_size_ += field_id_size + varint(static_cast<uint32_t>(len)) + static_cast<uint32_t>(len);
|
uint64_t zigzag = (static_cast<uint64_t>(value) << 1) ^ (static_cast<uint64_t>(value >> 63));
|
||||||
|
total_size += field_id_size + varint(zigzag);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Adds a pre-calculated size directly to the total
|
* @brief Calculates and adds the size of a string/bytes field to the total message size
|
||||||
*
|
|
||||||
* This is used when we can calculate the total size by multiplying the number
|
|
||||||
* of elements by the bytes per element (for repeated fixed-size types like float, fixed32, etc.)
|
|
||||||
*
|
|
||||||
* @param size The pre-calculated total size to add
|
|
||||||
*/
|
*/
|
||||||
inline void add_precalculated_size(uint32_t size) { total_size_ += size; }
|
static inline void add_string_field(uint32_t &total_size, uint32_t field_id_size, const std::string &str) {
|
||||||
|
// Skip calculation if string is empty
|
||||||
|
if (str.empty()) {
|
||||||
|
return; // No need to update total_size
|
||||||
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
const uint32_t str_size = static_cast<uint32_t>(str.size());
|
||||||
|
total_size += field_id_size + varint(str_size) + str_size;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @brief Calculates and adds the size of a string/bytes field to the total message size (repeated field version)
|
||||||
|
*/
|
||||||
|
static inline void add_string_field_repeated(uint32_t &total_size, uint32_t field_id_size, const std::string &str) {
|
||||||
|
// Always calculate size for repeated fields
|
||||||
|
const uint32_t str_size = static_cast<uint32_t>(str.size());
|
||||||
|
total_size += field_id_size + varint(str_size) + str_size;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a nested message field to the total message size
|
* @brief Calculates and adds the size of a nested message field to the total message size
|
||||||
@ -686,21 +721,26 @@ class ProtoSize {
|
|||||||
*
|
*
|
||||||
* @param nested_size The pre-calculated size of the nested message
|
* @param nested_size The pre-calculated size of the nested message
|
||||||
*/
|
*/
|
||||||
inline void add_message_field(uint32_t field_id_size, uint32_t nested_size) {
|
static inline void add_message_field(uint32_t &total_size, uint32_t field_id_size, uint32_t nested_size) {
|
||||||
if (nested_size != 0) {
|
// Skip calculation if nested message is empty
|
||||||
add_message_field_force(field_id_size, nested_size);
|
if (nested_size == 0) {
|
||||||
|
return; // No need to update total_size
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Calculate and directly add to total_size
|
||||||
|
// Field ID + length varint + nested message content
|
||||||
|
total_size += field_id_size + varint(nested_size) + nested_size;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a nested message field to the total message size (force version)
|
* @brief Calculates and adds the size of a nested message field to the total message size (repeated field version)
|
||||||
*
|
*
|
||||||
* @param nested_size The pre-calculated size of the nested message
|
* @param nested_size The pre-calculated size of the nested message
|
||||||
*/
|
*/
|
||||||
inline void add_message_field_force(uint32_t field_id_size, uint32_t nested_size) {
|
static inline void add_message_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t nested_size) {
|
||||||
// Always calculate size when force is true
|
// Always calculate size for repeated fields
|
||||||
// Field ID + length varint + nested message content
|
// Field ID + length varint + nested message content
|
||||||
total_size_ += field_id_size + varint(nested_size) + nested_size;
|
total_size += field_id_size + varint(nested_size) + nested_size;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@ -712,29 +752,26 @@ class ProtoSize {
|
|||||||
*
|
*
|
||||||
* @param message The nested message object
|
* @param message The nested message object
|
||||||
*/
|
*/
|
||||||
inline void add_message_object(uint32_t field_id_size, const ProtoMessage &message) {
|
static inline void add_message_object(uint32_t &total_size, uint32_t field_id_size, const ProtoMessage &message) {
|
||||||
// Calculate nested message size by creating a temporary ProtoSize
|
uint32_t nested_size = 0;
|
||||||
ProtoSize nested_calc;
|
message.calculate_size(nested_size);
|
||||||
message.calculate_size(nested_calc);
|
|
||||||
uint32_t nested_size = nested_calc.get_size();
|
|
||||||
|
|
||||||
// Use the base implementation with the calculated nested_size
|
// Use the base implementation with the calculated nested_size
|
||||||
add_message_field(field_id_size, nested_size);
|
add_message_field(total_size, field_id_size, nested_size);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* @brief Calculates and adds the size of a nested message field to the total message size (force version)
|
* @brief Calculates and adds the size of a nested message field to the total message size (repeated field version)
|
||||||
*
|
*
|
||||||
* @param message The nested message object
|
* @param message The nested message object
|
||||||
*/
|
*/
|
||||||
inline void add_message_object_force(uint32_t field_id_size, const ProtoMessage &message) {
|
static inline void add_message_object_repeated(uint32_t &total_size, uint32_t field_id_size,
|
||||||
// Calculate nested message size by creating a temporary ProtoSize
|
const ProtoMessage &message) {
|
||||||
ProtoSize nested_calc;
|
uint32_t nested_size = 0;
|
||||||
message.calculate_size(nested_calc);
|
message.calculate_size(nested_size);
|
||||||
uint32_t nested_size = nested_calc.get_size();
|
|
||||||
|
|
||||||
// Use the base implementation with the calculated nested_size
|
// Use the base implementation with the calculated nested_size
|
||||||
add_message_field_force(field_id_size, nested_size);
|
add_message_field_repeated(total_size, field_id_size, nested_size);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@ -747,15 +784,16 @@ class ProtoSize {
|
|||||||
* @param messages Vector of message objects
|
* @param messages Vector of message objects
|
||||||
*/
|
*/
|
||||||
template<typename MessageType>
|
template<typename MessageType>
|
||||||
inline void add_repeated_message(uint32_t field_id_size, const std::vector<MessageType> &messages) {
|
static inline void add_repeated_message(uint32_t &total_size, uint32_t field_id_size,
|
||||||
|
const std::vector<MessageType> &messages) {
|
||||||
// Skip if the vector is empty
|
// Skip if the vector is empty
|
||||||
if (messages.empty()) {
|
if (messages.empty()) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
// Use the force version for all messages in the repeated field
|
// Use the repeated field version for all messages
|
||||||
for (const auto &message : messages) {
|
for (const auto &message : messages) {
|
||||||
add_message_object_force(field_id_size, message);
|
add_message_object_repeated(total_size, field_id_size, message);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
@ -765,9 +803,8 @@ inline void ProtoWriteBuffer::encode_message(uint32_t field_id, const ProtoMessa
|
|||||||
this->encode_field_raw(field_id, 2); // type 2: Length-delimited message
|
this->encode_field_raw(field_id, 2); // type 2: Length-delimited message
|
||||||
|
|
||||||
// Calculate the message size first
|
// Calculate the message size first
|
||||||
ProtoSize msg_size;
|
uint32_t msg_length_bytes = 0;
|
||||||
value.calculate_size(msg_size);
|
value.calculate_size(msg_length_bytes);
|
||||||
uint32_t msg_length_bytes = msg_size.get_size();
|
|
||||||
|
|
||||||
// Calculate how many bytes the length varint needs
|
// Calculate how many bytes the length varint needs
|
||||||
uint32_t varint_length_bytes = ProtoSize::varint(msg_length_bytes);
|
uint32_t varint_length_bytes = ProtoSize::varint(msg_length_bytes);
|
||||||
@ -786,8 +823,8 @@ inline void ProtoWriteBuffer::encode_message(uint32_t field_id, const ProtoMessa
|
|||||||
assert(this->buffer_->size() == begin + varint_length_bytes + msg_length_bytes);
|
assert(this->buffer_->size() == begin + varint_length_bytes + msg_length_bytes);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Implementation of decode_to_message - must be after ProtoDecodableMessage is defined
|
// Implementation of decode_to_message - must be after ProtoMessage is defined
|
||||||
inline void ProtoLengthDelimited::decode_to_message(ProtoDecodableMessage &msg) const {
|
inline void ProtoLengthDelimited::decode_to_message(ProtoMessage &msg) const {
|
||||||
msg.decode(this->value_, this->length_);
|
msg.decode(this->value_, this->length_);
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -799,9 +836,7 @@ class ProtoService {
|
|||||||
virtual bool is_authenticated() = 0;
|
virtual bool is_authenticated() = 0;
|
||||||
virtual bool is_connection_setup() = 0;
|
virtual bool is_connection_setup() = 0;
|
||||||
virtual void on_fatal_error() = 0;
|
virtual void on_fatal_error() = 0;
|
||||||
#ifdef USE_API_PASSWORD
|
|
||||||
virtual void on_unauthenticated_access() = 0;
|
virtual void on_unauthenticated_access() = 0;
|
||||||
#endif
|
|
||||||
virtual void on_no_setup_connection() = 0;
|
virtual void on_no_setup_connection() = 0;
|
||||||
/**
|
/**
|
||||||
* Create a buffer with a reserved size.
|
* Create a buffer with a reserved size.
|
||||||
@ -816,9 +851,8 @@ class ProtoService {
|
|||||||
|
|
||||||
// Optimized method that pre-allocates buffer based on message size
|
// Optimized method that pre-allocates buffer based on message size
|
||||||
bool send_message_(const ProtoMessage &msg, uint8_t message_type) {
|
bool send_message_(const ProtoMessage &msg, uint8_t message_type) {
|
||||||
ProtoSize size;
|
uint32_t msg_size = 0;
|
||||||
msg.calculate_size(size);
|
msg.calculate_size(msg_size);
|
||||||
uint32_t msg_size = size.get_size();
|
|
||||||
|
|
||||||
// Create a pre-sized buffer
|
// Create a pre-sized buffer
|
||||||
auto buffer = this->create_buffer(msg_size);
|
auto buffer = this->create_buffer(msg_size);
|
||||||
@@ -840,7 +874,6 @@ class ProtoService {
   }
 
   bool check_authenticated_() {
-#ifdef USE_API_PASSWORD
     if (!this->check_connection_setup_()) {
       return false;
     }
@@ -849,10 +882,8 @@ class ProtoService {
       return false;
     }
     return true;
-#else
-    return this->check_connection_setup_();
-#endif
   }
 };
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
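Taken together, the two `ProtoService` hunks above show that the dev branch wraps the credential check in `USE_API_PASSWORD` and falls back to the bare connection check when no password is compiled in. The following compilable sketch shows that conditional-compilation pattern with illustrative helpers; it is not the real `ProtoService` internals.

```cpp
// Sketch of the #ifdef USE_API_PASSWORD pattern; fields and helpers are illustrative.
#include <cstdio>

#define USE_API_PASSWORD  // comment out to compile the no-password variant

struct ServiceSketch {
  bool connection_ready{true};
  bool credentials_ok{false};

  bool check_connection_setup_() { return connection_ready; }

  bool check_authenticated_() {
#ifdef USE_API_PASSWORD
    if (!this->check_connection_setup_()) {
      return false;
    }
    if (!this->credentials_ok) {
      return false;  // the real service would report unauthenticated access here
    }
    return true;
#else
    // Without a password, the only requirement is a completed connection setup.
    return this->check_connection_setup_();
#endif
  }
};

int main() {
  ServiceSketch s;
  std::printf("authenticated: %d\n", s.check_authenticated_());
  return 0;
}
```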
@@ -3,7 +3,8 @@
 #include "api_connection.h"
 #include "esphome/core/log.h"
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 // Generate entity handler implementations using macros
 #ifdef USE_BINARY_SENSOR
@@ -68,5 +69,6 @@ INITIAL_STATE_HANDLER(update, update::UpdateEntity)
 
 InitialStateIterator::InitialStateIterator(APIConnection *client) : client_(client) {}
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
 #endif
@@ -5,7 +5,8 @@
 #include "esphome/core/component.h"
 #include "esphome/core/component_iterator.h"
 #include "esphome/core/controller.h"
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 class APIConnection;
 
@@ -88,5 +89,6 @@ class InitialStateIterator : public ComponentIterator {
   APIConnection *client_;
 };
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
 #endif
@@ -1,7 +1,8 @@
 #include "user_services.h"
 #include "esphome/core/log.h"
 
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 template<> bool get_execute_arg_value<bool>(const ExecuteServiceArgument &arg) { return arg.bool_; }
 template<> int32_t get_execute_arg_value<int32_t>(const ExecuteServiceArgument &arg) {
@@ -39,4 +40,5 @@ template<> enums::ServiceArgType to_service_arg_type<std::vector<std::string>>()
   return enums::SERVICE_ARG_TYPE_STRING_ARRAY;
 }
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
@@ -8,15 +8,14 @@
 #include "api_pb2.h"
 
 #ifdef USE_API_SERVICES
-namespace esphome::api {
+namespace esphome {
+namespace api {
 
 class UserServiceDescriptor {
  public:
   virtual ListEntitiesServicesResponse encode_list_service_response() = 0;
 
   virtual bool execute_service(const ExecuteServiceRequest &req) = 0;
-
-  bool is_internal() { return false; }
 };
 
 template<typename T> T get_execute_arg_value(const ExecuteServiceArgument &arg);
@@ -32,14 +31,14 @@ template<typename... Ts> class UserServiceBase : public UserServiceDescriptor {
 
   ListEntitiesServicesResponse encode_list_service_response() override {
     ListEntitiesServicesResponse msg;
-    msg.set_name(StringRef(this->name_));
+    msg.name = this->name_;
     msg.key = this->key_;
     std::array<enums::ServiceArgType, sizeof...(Ts)> arg_types = {to_service_arg_type<Ts>()...};
     for (int i = 0; i < sizeof...(Ts); i++) {
-      msg.args.emplace_back();
-      auto &arg = msg.args.back();
+      ListEntitiesServicesArgument arg;
       arg.type = arg_types[i];
-      arg.set_name(StringRef(this->arg_names_[i]));
+      arg.name = this->arg_names_[i];
+      msg.args.push_back(arg);
     }
     return msg;
   }
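The change in `encode_list_service_response` swaps a construct-then-`push_back` loop for `emplace_back` followed by filling the element through a reference, which avoids building and copying a temporary. A small self-contained illustration of the two styles, using a plain struct and `std::vector` rather than the generated protobuf types:

```cpp
// Two ways of appending to a repeated field / vector, as seen in the hunk above.
#include <string>
#include <vector>

struct Argument {
  int type{0};
  std::string name;
};

int main() {
  std::vector<Argument> args;

  // Construct-then-push: builds a local temporary, then copies/moves it into the vector.
  Argument arg;
  arg.type = 1;
  arg.name = "level";
  args.push_back(arg);

  // Emplace-then-fill: constructs the element in place and fills it via a reference,
  // skipping the temporary entirely.
  args.emplace_back();
  auto &in_place = args.back();
  in_place.type = 2;
  in_place.name = "duration";

  return args.size() == 2 ? 0 : 1;
}
```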
@@ -73,5 +72,6 @@ template<typename... Ts> class UserServiceTrigger : public UserServiceBase<Ts...
   void execute(Ts... x) override { this->trigger(x...); }  // NOLINT
 };
 
-}  // namespace esphome::api
+}  // namespace api
+}  // namespace esphome
 #endif  // USE_API_SERVICES
@@ -7,6 +7,8 @@ namespace as3935 {
 static const char *const TAG = "as3935";
 
 void AS3935Component::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
+
   this->irq_pin_->setup();
   LOG_PIN(" IRQ Pin: ", this->irq_pin_);
 
@@ -7,7 +7,9 @@ namespace as3935_spi {
 static const char *const TAG = "as3935_spi";
 
 void SPIAS3935Component::setup() {
+  ESP_LOGI(TAG, "SPIAS3935Component setup started!");
   this->spi_setup();
+  ESP_LOGI(TAG, "SPI setup finished!");
   AS3935Component::setup();
 }
 
@@ -23,6 +23,8 @@ static const uint8_t REGISTER_AGC = 0x1A; // 8 bytes / R
 static const uint8_t REGISTER_MAGNITUDE = 0x1B; // 16 bytes / R
 
 void AS5600Component::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
+
   if (!this->read_byte(REGISTER_STATUS).has_value()) {
     this->mark_failed();
     return;
@@ -8,6 +8,7 @@ namespace as7341 {
 static const char *const TAG = "as7341";
 
 void AS7341Component::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   LOG_I2C_DEVICE(this);
 
   // Verify device ID
@@ -71,7 +71,7 @@ bool AT581XComponent::i2c_read_reg(uint8_t addr, uint8_t &data) {
   return this->read_register(addr, &data, 1) == esphome::i2c::NO_ERROR;
 }
 
-void AT581XComponent::setup() {}
+void AT581XComponent::setup() { ESP_LOGCONFIG(TAG, "Running setup"); }
 void AT581XComponent::dump_config() { LOG_I2C_DEVICE(this); }
 #define ARRAY_SIZE(X) (sizeof(X) / sizeof((X)[0]))
 bool AT581XComponent::i2c_write_config() {
@@ -41,6 +41,7 @@ void ATM90E26Component::update() {
 }
 
 void ATM90E26Component::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   this->spi_setup();
 
   uint16_t mmode = 0x422; // default values for everything but L/N line current gains
@@ -109,6 +109,7 @@ void ATM90E32Component::update() {
 }
 
 void ATM90E32Component::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   this->spi_setup();
 
   uint16_t mmode0 = 0x87; // 3P4W 50Hz
@@ -15,7 +15,7 @@ class AudioStreamInfo {
  * - An audio sample represents a unit of audio for one channel.
  * - A frame represents a unit of audio with a sample for every channel.
  *
- * In general, converting between bytes, samples, and frames shouldn't result in rounding errors so long as frames
+ * In gneneral, converting between bytes, samples, and frames shouldn't result in rounding errors so long as frames
  * are used as the main unit when transferring audio data. Durations may result in rounding for certain sample rates;
  * e.g., 44.1 KHz. The ``frames_to_milliseconds_with_remainder`` function should be used for accuracy, as it takes
  * into account the remainder rather than just ignoring any rounding.
@ -76,7 +76,7 @@ class AudioStreamInfo {
|
|||||||
|
|
||||||
/// @brief Computes the duration, in microseconds, the given amount of frames represents.
|
/// @brief Computes the duration, in microseconds, the given amount of frames represents.
|
||||||
/// @param frames Number of audio frames
|
/// @param frames Number of audio frames
|
||||||
/// @return Duration in microseconds `frames` represents. May be slightly inaccurate due to integer division rounding
|
/// @return Duration in microseconds `frames` respresents. May be slightly inaccurate due to integer divison rounding
|
||||||
/// for certain sample rates.
|
/// for certain sample rates.
|
||||||
uint32_t frames_to_microseconds(uint32_t frames) const;
|
uint32_t frames_to_microseconds(uint32_t frames) const;
|
||||||
|
|
||||||
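The doc comments above warn that converting frame counts to durations involves integer division, which truncates for sample rates such as 44.1 kHz. The sketch below works through that arithmetic; the formula and function names are assumptions for illustration, not the `AudioStreamInfo` implementation.

```cpp
// Frame-to-duration arithmetic sketch; not the AudioStreamInfo implementation.
#include <cstdint>
#include <cstdio>

// Duration in microseconds for `frames` at `sample_rate` Hz (integer division truncates).
static uint32_t frames_to_us(uint32_t frames, uint32_t sample_rate) {
  return static_cast<uint32_t>((static_cast<uint64_t>(frames) * 1000000ULL) / sample_rate);
}

// Milliseconds with an explicit carried remainder, so repeated conversions don't drift.
// The carry is kept in frame-millisecond units and fed back into the next call.
static uint32_t frames_to_ms_with_carry(uint32_t frames, uint32_t sample_rate, uint32_t *carry) {
  uint64_t total = static_cast<uint64_t>(frames) * 1000ULL + *carry;
  *carry = static_cast<uint32_t>(total % sample_rate);
  return static_cast<uint32_t>(total / sample_rate);
}

int main() {
  // 441 frames at 44100 Hz is exactly 10 ms; 100 frames is 2.267... ms and truncates.
  std::printf("%u us\n", frames_to_us(441, 44100));  // prints 10000
  std::printf("%u us\n", frames_to_us(100, 44100));  // prints 2267 (truncated)

  uint32_t carry = 0;
  uint32_t ms = frames_to_ms_with_carry(100, 44100, &carry);
  std::printf("%u ms, carry %u\n", ms, carry);       // prints 2 ms, carry 11800
  return 0;
}
```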
@@ -17,6 +17,7 @@ constexpr static const uint8_t AXS_READ_TOUCHPAD[11] = {0xb5, 0xab, 0xa5, 0x5a,
 }
 
 void AXS15231Touchscreen::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
   if (this->reset_pin_ != nullptr) {
     this->reset_pin_->setup();
     this->reset_pin_->digital_write(false);
@ -35,6 +36,7 @@ void AXS15231Touchscreen::setup() {
|
|||||||
if (this->y_raw_max_ == 0) {
|
if (this->y_raw_max_ == 0) {
|
||||||
this->y_raw_max_ = this->display_->get_native_height();
|
this->y_raw_max_ = this->display_->get_native_height();
|
||||||
}
|
}
|
||||||
|
ESP_LOGCONFIG(TAG, "AXS15231 Touchscreen setup complete");
|
||||||
}
|
}
|
||||||
|
|
||||||
void AXS15231Touchscreen::update_touches() {
|
void AXS15231Touchscreen::update_touches() {
|
||||||
|
@@ -121,6 +121,8 @@ void spi_dma_tx_finish_callback(unsigned int param) {
 }
 
 void BekenSPILEDStripLightOutput::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup");
+
   size_t buffer_size = this->get_buffer_size_();
   size_t dma_buffer_size = (buffer_size * 8) + (2 * 64);
 
@@ -38,6 +38,7 @@ MTreg:
 */
 
 void BH1750Sensor::setup() {
+  ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->name_.c_str());
   uint8_t turn_on = BH1750_COMMAND_POWER_ON;
   if (this->write(&turn_on, 1) != i2c::ERROR_OK) {
     this->mark_failed();
@@ -266,10 +266,8 @@ async def delayed_off_filter_to_code(config, filter_id):
 async def autorepeat_filter_to_code(config, filter_id):
     timings = []
     if len(config) > 0:
-        timings.extend(
-            (conf[CONF_DELAY], conf[CONF_TIME_OFF], conf[CONF_TIME_ON])
-            for conf in config
-        )
+        for conf in config:
+            timings.append((conf[CONF_DELAY], conf[CONF_TIME_OFF], conf[CONF_TIME_ON]))
     else:
         timings.append(
             (
@ -516,7 +514,6 @@ def binary_sensor_schema(
|
|||||||
icon: str = cv.UNDEFINED,
|
icon: str = cv.UNDEFINED,
|
||||||
entity_category: str = cv.UNDEFINED,
|
entity_category: str = cv.UNDEFINED,
|
||||||
device_class: str = cv.UNDEFINED,
|
device_class: str = cv.UNDEFINED,
|
||||||
filters: list = cv.UNDEFINED,
|
|
||||||
) -> cv.Schema:
|
) -> cv.Schema:
|
||||||
schema = {}
|
schema = {}
|
||||||
|
|
||||||
@ -528,7 +525,6 @@ def binary_sensor_schema(
|
|||||||
(CONF_ICON, icon, cv.icon),
|
(CONF_ICON, icon, cv.icon),
|
||||||
(CONF_ENTITY_CATEGORY, entity_category, cv.entity_category),
|
(CONF_ENTITY_CATEGORY, entity_category, cv.entity_category),
|
||||||
(CONF_DEVICE_CLASS, device_class, validate_device_class),
|
(CONF_DEVICE_CLASS, device_class, validate_device_class),
|
||||||
(CONF_FILTERS, filters, validate_filters),
|
|
||||||
]:
|
]:
|
||||||
if default is not cv.UNDEFINED:
|
if default is not cv.UNDEFINED:
|
||||||
schema[cv.Optional(key, default=default)] = validator
|
schema[cv.Optional(key, default=default)] = validator
|
||||||
@ -577,15 +573,16 @@ async def setup_binary_sensor_core_(var, config):
|
|||||||
await automation.build_automation(trigger, [], conf)
|
await automation.build_automation(trigger, [], conf)
|
||||||
|
|
||||||
for conf in config.get(CONF_ON_MULTI_CLICK, []):
|
for conf in config.get(CONF_ON_MULTI_CLICK, []):
|
||||||
timings = [
|
timings = []
|
||||||
cg.StructInitializer(
|
for tim in conf[CONF_TIMING]:
|
||||||
MultiClickTriggerEvent,
|
timings.append(
|
||||||
("state", tim[CONF_STATE]),
|
cg.StructInitializer(
|
||||||
("min_length", tim[CONF_MIN_LENGTH]),
|
MultiClickTriggerEvent,
|
||||||
("max_length", tim.get(CONF_MAX_LENGTH, 4294967294)),
|
("state", tim[CONF_STATE]),
|
||||||
|
("min_length", tim[CONF_MIN_LENGTH]),
|
||||||
|
("max_length", tim.get(CONF_MAX_LENGTH, 4294967294)),
|
||||||
|
)
|
||||||
)
|
)
|
||||||
for tim in conf[CONF_TIMING]
|
|
||||||
]
|
|
||||||
trigger = cg.new_Pvariable(conf[CONF_TRIGGER_ID], var, timings)
|
trigger = cg.new_Pvariable(conf[CONF_TRIGGER_ID], var, timings)
|
||||||
if CONF_INVALID_COOLDOWN in conf:
|
if CONF_INVALID_COOLDOWN in conf:
|
||||||
cg.add(trigger.set_invalid_cooldown(conf[CONF_INVALID_COOLDOWN]))
|
cg.add(trigger.set_invalid_cooldown(conf[CONF_INVALID_COOLDOWN]))
|
||||||
|
@@ -175,7 +175,8 @@ BLE_REMOVE_BOND_ACTION_SCHEMA = cv.Schema(
 )
 async def ble_disconnect_to_code(config, action_id, template_arg, args):
     parent = await cg.get_variable(config[CONF_ID])
-    return cg.new_Pvariable(action_id, template_arg, parent)
+    var = cg.new_Pvariable(action_id, template_arg, parent)
+    return var
 
 
 @automation.register_action(
@ -183,7 +184,8 @@ async def ble_disconnect_to_code(config, action_id, template_arg, args):
|
|||||||
)
|
)
|
||||||
async def ble_connect_to_code(config, action_id, template_arg, args):
|
async def ble_connect_to_code(config, action_id, template_arg, args):
|
||||||
parent = await cg.get_variable(config[CONF_ID])
|
parent = await cg.get_variable(config[CONF_ID])
|
||||||
return cg.new_Pvariable(action_id, template_arg, parent)
|
var = cg.new_Pvariable(action_id, template_arg, parent)
|
||||||
|
return var
|
||||||
|
|
||||||
|
|
||||||
@automation.register_action(
|
@automation.register_action(
|
||||||
@@ -280,7 +282,9 @@ async def passkey_reply_to_code(config, action_id, template_arg, args):
 )
 async def remove_bond_to_code(config, action_id, template_arg, args):
     parent = await cg.get_variable(config[CONF_ID])
-    return cg.new_Pvariable(action_id, template_arg, parent)
+    var = cg.new_Pvariable(action_id, template_arg, parent)
+
+    return var
 
 
 async def to_code(config):
Some files were not shown because too many files have changed in this diff.