mirror of
https://github.com/esphome/esphome.git
synced 2025-07-24 12:16:35 +00:00
Compare commits
No commits in common. "dev" and "2024.11.0b1" have entirely different histories.
dev
...
2024.11.0b1
@ -1,222 +0,0 @@

# ESPHome AI Collaboration Guide

This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.
## 1. Project Overview & Purpose

* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.
## 2. Core Technologies & Stack

* **Languages:** Python (>=3.10), C++ (gnu++20).
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
  * **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
  * **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Managers:** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for the native API), MQTT, HTTP.
## 3. Architectural Patterns

* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.

* **Directory Structure Philosophy:**
  * `/esphome`: Contains the core Python source code for the ESPHome application.
  * `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
  * `/tests`: Contains all unit and integration tests for the Python code.
  * `/docker`: Contains Docker-related files for building and running ESPHome in a container.
  * `/script`: Contains helper scripts for development and maintenance.

* **Core Architectural Components:**
  1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
  2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python to C++ code generation, template processing, and build flag management.
  3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
  4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
  5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.

* **Platform Support:**
  1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (S2, S3, C3, etc.) and both IDF and Arduino frameworks.
  2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
  3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
  4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.
## 4. Coding Conventions & Style Guide

* **Formatting:**
  * **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
  * **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.

* **Naming Conventions:**
  * **Python:** Follows PEP 8. Use clear, descriptive names in snake_case.
  * **C++:** Follows the Google C++ Style Guide.
* **Component Structure:**
  * **Standard Files:**

    ```
    components/[component_name]/
    ├── __init__.py        # Component configuration schema and code generation
    ├── [component].h      # C++ header file (if needed)
    ├── [component].cpp    # C++ implementation (if needed)
    └── [platform]/        # Platform-specific implementations
        ├── __init__.py    # Platform-specific configuration
        ├── [platform].h   # Platform C++ header
        └── [platform].cpp # Platform C++ implementation
    ```
* **Component Metadata:**
  - `DEPENDENCIES`: List of required components
  - `AUTO_LOAD`: Components to automatically load
  - `CONFLICTS_WITH`: Incompatible components
  - `CODEOWNERS`: GitHub usernames responsible for maintenance
  - `MULTI_CONF`: Whether multiple instances are allowed
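  A minimal sketch of how this metadata typically appears at the top of a component's `__init__.py`; the component name and values below are hypothetical, not taken from the repository:

  ```python
  # Hypothetical metadata block for an illustrative "my_component" integration.
  CODEOWNERS = ["@your-github-username"]   # placeholder GitHub handle
  DEPENDENCIES = ["i2c"]                   # components that must be configured first
  AUTO_LOAD = ["sensor"]                   # components pulled in automatically
  CONFLICTS_WITH = ["other_component"]     # placeholder incompatible component
  MULTI_CONF = True                        # allow multiple instances of this component
  ```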
* **Code Generation & Common Patterns:**
  * **Configuration Schema Pattern:**

    ```python
    import esphome.codegen as cg
    import esphome.config_validation as cv
    from esphome.const import CONF_KEY, CONF_ID

    CONF_PARAM = "param"  # A constant that does not yet exist in esphome/const.py

    my_component_ns = cg.esphome_ns.namespace("my_component")
    MyComponent = my_component_ns.class_("MyComponent", cg.Component)

    CONFIG_SCHEMA = cv.Schema({
        cv.GenerateID(): cv.declare_id(MyComponent),
        cv.Required(CONF_KEY): cv.string,
        cv.Optional(CONF_PARAM, default=42): cv.int_,
    }).extend(cv.COMPONENT_SCHEMA)


    async def to_code(config):
        var = cg.new_Pvariable(config[CONF_ID])
        await cg.register_component(var, config)
        cg.add(var.set_key(config[CONF_KEY]))
        cg.add(var.set_param(config[CONF_PARAM]))
    ```
* **C++ Class Pattern:**

  ```cpp
  namespace esphome {
  namespace my_component {

  class MyComponent : public Component {
   public:
    void setup() override;
    void loop() override;
    void dump_config() override;

    void set_key(const std::string &key) { this->key_ = key; }
    void set_param(int param) { this->param_ = param; }

   protected:
    std::string key_;
    int param_{0};
  };

  }  // namespace my_component
  }  // namespace esphome
  ```
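  A matching implementation file (hypothetically `my_component.cpp`) would typically look like the following minimal sketch; the log lines are illustrative and not copied from the repository:

  ```cpp
  #include "my_component.h"
  #include "esphome/core/log.h"

  namespace esphome {
  namespace my_component {

  static const char *const TAG = "my_component";

  void MyComponent::setup() {
    // One-time initialization (pin setup, bus probing, etc.) goes here.
  }

  void MyComponent::loop() {
    // Called repeatedly from the main application loop; keep this non-blocking.
  }

  void MyComponent::dump_config() {
    ESP_LOGCONFIG(TAG, "MyComponent:");
    ESP_LOGCONFIG(TAG, "  Key: %s", this->key_.c_str());
    ESP_LOGCONFIG(TAG, "  Param: %d", this->param_);
  }

  }  // namespace my_component
  }  // namespace esphome
  ```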
* **Common Component Examples:**
  - **Sensor:**

    ```python
    from esphome.components import sensor

    CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(cv.polling_component_schema("60s"))

    async def to_code(config):
        var = await sensor.new_sensor(config)
        await cg.register_component(var, config)
    ```

  - **Binary Sensor:**

    ```python
    from esphome.components import binary_sensor

    CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })

    async def to_code(config):
        var = await binary_sensor.new_binary_sensor(config)
    ```

  - **Switch:**

    ```python
    from esphome.components import switch

    CONFIG_SCHEMA = switch.switch_schema().extend({ ... })

    async def to_code(config):
        var = await switch.new_switch(config)
    ```
* **Configuration Validation:**
  * **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
  * **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
  * **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `cv.only_with_arduino`.
  * **Schema Extensions:**

    ```python
    CONFIG_SCHEMA = (
        cv.Schema({ ... })
        .extend(cv.COMPONENT_SCHEMA)
        .extend(uart.UART_DEVICE_SCHEMA)
        .extend(i2c.i2c_device_schema(0x48))
        .extend(spi.spi_device_schema(cs_pin_required=True))
    )
    ```
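  As a rough illustration of how these pieces compose, here is a hedged sketch of a hypothetical I2C sensor platform's `__init__.py`. The `my_sensor`/`MySensor` names are invented, and the `i2c` registration helper is assumed to follow the same pattern as the other helpers shown above:

  ```python
  # Hedged sketch: a hypothetical "my_sensor" platform combining the schema helpers above.
  import esphome.codegen as cg
  import esphome.config_validation as cv
  from esphome.components import i2c, sensor

  DEPENDENCIES = ["i2c"]

  my_sensor_ns = cg.esphome_ns.namespace("my_sensor")
  MySensor = my_sensor_ns.class_("MySensor", cg.PollingComponent, i2c.I2CDevice)

  CONFIG_SCHEMA = (
      sensor.sensor_schema(MySensor)
      .extend(cv.polling_component_schema("60s"))
      .extend(i2c.i2c_device_schema(0x48))  # 0x48 as the default address, as in the example above
  )


  async def to_code(config):
      var = await sensor.new_sensor(config)
      await cg.register_component(var, config)
      await i2c.register_i2c_device(var, config)
  ```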
## 5. Key Files & Entrypoints

* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
  * `pyproject.toml`: Defines the Python project metadata and dependencies.
  * `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
  * `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.
## 6. Development & Testing Workflow

* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
  * **Python:** Run unit tests with `pytest`.
  * **C++:** Use `clang-tidy` for static analysis.
  * **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:

    ```
    tests/
    ├── test_build_components/    # Base test configurations
    └── components/[component]/   # Component-specific tests
    ```

    Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
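  For the Python side, unit tests can exercise validators and schemas directly. A minimal, purely illustrative example (the test name is invented, and `cv.Invalid` is assumed to be the validation error type):

  ```python
  # Illustrative pytest for a config validator; run with `pytest`.
  import pytest

  import esphome.config_validation as cv


  def test_int_range_rejects_out_of_bounds_values():
      validator = cv.int_range(min=0, max=100)
      assert validator(42) == 42
      with pytest.raises(cv.Invalid):
          validator(101)
  ```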
* **Debugging and Troubleshooting:**
  * **Debug Tools:**
    - `esphome config <file>.yaml` to validate configuration.
    - `esphome compile <file>.yaml` to compile without uploading.
    - Check the Dashboard for real-time logs.
    - Use component-specific debug logging.
  * **Common Issues:**
    - **Import Errors**: Check component dependencies and `PYTHONPATH`.
    - **Validation Errors**: Review configuration schema definitions.
    - **Build Errors**: Check platform compatibility and library versions.
    - **Runtime Errors**: Review generated C++ code and component logic.
## 7. Specific Instructions for AI Collaboration

* **Contribution Workflow (Pull Request Process):**
  1. **Fork & Branch:** Create a new branch in your fork.
  2. **Make Changes:** Adhere to all coding conventions and patterns.
  3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
  4. **Lint:** Run `pre-commit` to ensure code is compliant.
  5. **Commit:** Commit your changes. There is no strict format for commit messages.
  6. **Pull Request:** Submit a PR against the `dev` branch. The pull request title should be prefixed with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation, examples, and add `CODEOWNERS` entries as needed. Pull requests should always be made with the PULL_REQUEST_TEMPLATE.md template filled out correctly.

* **Documentation Contributions:**
  * Documentation is hosted in the separate `esphome/esphome-docs` repository.
  * The contribution workflow is the same as for the codebase.

* **Best Practices:**
  * **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
  * **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
  * **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.

* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.
* **Dependencies & Build System Integration:**
  * **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and `pyproject.toml`.
  * **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
  * **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags.
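  For example, a component's `to_code` might register a C++ library and a compile-time define roughly as follows; the library name, version, and flag are placeholders, not values used by the project:

  ```python
  # Hedged sketch: registering a library and a build flag from a component's to_code().
  import esphome.codegen as cg


  async def to_code(config):
      cg.add_library("SomeVendor/SomeLib", "1.0.0")     # placeholder library and version
      cg.add_build_flag("-DMY_COMPONENT_ENABLE_EXTRA")  # placeholder compile-time define
  ```
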
36  .clang-tidy
@ -7,39 +7,28 @@ Checks: >-
  -boost-*,
  -bugprone-easily-swappable-parameters,
  -bugprone-implicit-widening-of-multiplication-result,
  -bugprone-multi-level-implicit-pointer-conversion,
  -bugprone-narrowing-conversions,
  -bugprone-signed-char-misuse,
  -bugprone-switch-missing-default-case,
  -cert-dcl50-cpp,
  -cert-err33-c,
  -cert-err58-cpp,
  -cert-oop57-cpp,
  -cert-str34-c,
  -clang-analyzer-optin.core.EnumCastOutOfRange,
  -clang-analyzer-optin.cplusplus.UninitializedObject,
  -clang-analyzer-osx.*,
  -clang-diagnostic-delete-abstract-non-virtual-dtor,
  -clang-diagnostic-delete-non-abstract-non-virtual-dtor,
  -clang-diagnostic-deprecated-declarations,
  -clang-diagnostic-ignored-optimization-argument,
  -clang-diagnostic-missing-field-initializers,
  -clang-diagnostic-shadow-field,
  -clang-diagnostic-unused-const-variable,
  -clang-diagnostic-unused-parameter,
  -clang-diagnostic-vla-cxx-extension,
  -concurrency-*,
  -cppcoreguidelines-avoid-c-arrays,
  -cppcoreguidelines-avoid-const-or-ref-data-members,
  -cppcoreguidelines-avoid-do-while,
  -cppcoreguidelines-avoid-magic-numbers,
  -cppcoreguidelines-init-variables,
  -cppcoreguidelines-macro-to-enum,
  -cppcoreguidelines-macro-usage,
  -cppcoreguidelines-missing-std-forward,
  -cppcoreguidelines-narrowing-conversions,
  -cppcoreguidelines-non-private-member-variables-in-classes,
  -cppcoreguidelines-owning-memory,
  -cppcoreguidelines-prefer-member-initializer,
  -cppcoreguidelines-pro-bounds-array-to-pointer-decay,
  -cppcoreguidelines-pro-bounds-constant-array-index,
@ -51,9 +40,7 @@ Checks: >-
  -cppcoreguidelines-pro-type-static-cast-downcast,
  -cppcoreguidelines-pro-type-union-access,
  -cppcoreguidelines-pro-type-vararg,
  -cppcoreguidelines-rvalue-reference-param-not-moved,
  -cppcoreguidelines-special-member-functions,
  -cppcoreguidelines-use-default-member-init,
  -cppcoreguidelines-virtual-class-destructor,
  -fuchsia-multiple-inheritance,
  -fuchsia-overloaded-operator,
@ -73,32 +60,20 @@ Checks: >-
  -llvm-include-order,
  -llvm-qualified-auto,
  -llvmlibc-*,
  -misc-const-correctness,
  -misc-include-cleaner,
  -misc-no-recursion,
  -misc-non-private-member-variables-in-classes,
  -misc-unused-parameters,
  -misc-use-anonymous-namespace,
  -modernize-avoid-bind,
  -modernize-avoid-c-arrays,
  -modernize-concat-nested-namespaces,
  -modernize-macro-to-enum,
  -modernize-return-braced-init-list,
  -modernize-type-traits,
  -modernize-use-auto,
  -modernize-use-constraints,
  -modernize-use-default-member-init,
  -modernize-use-equals-default,
  -modernize-use-nodiscard,
  -modernize-use-nullptr,
  -modernize-use-trailing-return-type,
  -mpi-*,
  -objc-*,
  -performance-enum-size,
  -readability-avoid-nested-conditional-operator,
  -readability-container-contains,
  -readability-container-data-pointer,
  -readability-convert-member-functions-to-static,
  -readability-else-after-return,
@ -107,14 +82,11 @@ Checks: >-
  -readability-isolate-declaration,
  -readability-magic-numbers,
  -readability-make-member-function-const,
  -readability-named-parameter,
  -readability-redundant-casting,
  -readability-redundant-inline-specifier,
  -readability-redundant-member-init,
  -readability-redundant-string-init,
  -readability-uppercase-literal-suffix,
  -readability-use-anyofallof,
WarningsAsErrors: '*'
AnalyzeTemporaryDtors: false
FormatStyle: google
CheckOptions:
  - key: google-readability-function-size.StatementThreshold
@ -1 +0,0 @@
8016e9cbe199bf1a65a0c90e7a37d768ec774f4fff70de530a9b55708af71e74
@ -1,4 +1,2 @@
 [run]
-omit =
-    esphome/components/*
-    tests/integration/*
+omit = esphome/components/*
@ -1,37 +0,0 @@
ARG BUILD_BASE_VERSION=2025.04.0


FROM ghcr.io/esphome/docker-base:debian-${BUILD_BASE_VERSION} AS base

RUN git config --system --add safe.directory "*"

RUN apt update \
    && apt install -y \
    protobuf-compiler

RUN pip install uv

RUN useradd esphome -m

USER esphome
ENV VIRTUAL_ENV=/home/esphome/.local/esphome-venv
RUN uv venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# Override this set to true in the docker-base image
ENV UV_SYSTEM_PYTHON=false

WORKDIR /tmp

COPY requirements.txt ./
RUN uv pip install -r requirements.txt
COPY requirements_dev.txt requirements_test.txt ./
RUN uv pip install -r requirements_dev.txt -r requirements_test.txt

RUN \
    platformio settings set enable_telemetry No \
    && platformio settings set check_platformio_interval 1000000

COPY script/platformio_install_deps.py platformio.ini ./
RUN ./platformio_install_deps.py platformio.ini --libraries --platforms --tools

WORKDIR /workspaces
@ -1,17 +1,18 @@
 {
   "name": "ESPHome Dev",
-  "context": "..",
-  "dockerFile": "Dockerfile",
+  "image": "ghcr.io/esphome/esphome-lint:dev",
   "postCreateCommand": [
     "script/devcontainer-post-create"
   ],
-  "features": {
-    "ghcr.io/devcontainers/features/github-cli:1": {}
+  "containerEnv": {
+    "DEVCONTAINER": "1",
+    "PIP_BREAK_SYSTEM_PACKAGES": "1",
+    "PIP_ROOT_USER_ACTION": "ignore"
   },
   "runArgs": [
     "--privileged",
     "-e",
-    "GIT_EDITOR=code --wait"
+    "ESPHOME_DASHBOARD_USE_PING=1"
     // uncomment and edit the path in order to pass though local USB serial to the conatiner
     // , "--device=/dev/ttyACM0"
   ],
@ -30,7 +31,7 @@
     "ms-python.python",
     "ms-python.pylint",
     "ms-python.flake8",
-    "charliermarsh.ruff",
+    "ms-python.black-formatter",
     "visualstudioexptteam.vscodeintellicode",
     // yaml
     "redhat.vscode-yaml",
@ -48,11 +49,14 @@
     "flake8.args": [
       "--config=${workspaceFolder}/.flake8"
     ],
-    "ruff.configuration": "${workspaceFolder}/pyproject.toml",
+    "black-formatter.args": [
+      "--config",
+      "${workspaceFolder}/pyproject.toml"
+    ],
     "[python]": {
       // VS will say "Value is not accepted" before building the devcontainer, but the warning
       // should go away after build is completed.
-      "editor.defaultFormatter": "charliermarsh.ruff"
+      "editor.defaultFormatter": "ms-python.black-formatter"
     },
     "editor.formatOnPaste": false,
     "editor.formatOnSave": true,
@ -114,5 +114,4 @@ config/
 examples/
 Dockerfile
 .git/
-tests/
-.*
+tests/build/
4  .github/FUNDING.yml
@ -0,0 +1,4 @@
---
# These are supported funding model platforms

custom: https://www.nabucasa.com
92  .github/ISSUE_TEMPLATE/bug_report.yml
@ -1,92 +0,0 @@
name: Report an issue with ESPHome
description: Report an issue with ESPHome.
body:
  - type: markdown
    attributes:
      value: |
        This issue form is for reporting bugs only!

        If you have a feature request or enhancement, please [request them here instead][fr].

        [fr]: https://github.com/orgs/esphome/discussions
  - type: textarea
    validations:
      required: true
    id: problem
    attributes:
      label: The problem
      description: >-
        Describe the issue you are experiencing here to communicate to the
        maintainers. Tell us what you were trying to do and what happened.

        Provide a clear and concise description of what the problem is.

  - type: markdown
    attributes:
      value: |
        ## Environment
  - type: input
    id: version
    validations:
      required: true
    attributes:
      label: Which version of ESPHome has the issue?
      description: >
        ESPHome version like 1.19, 2025.6.0 or 2025.XX.X-dev.
  - type: dropdown
    validations:
      required: true
    id: installation
    attributes:
      label: What type of installation are you using?
      options:
        - Home Assistant Add-on
        - Docker
        - pip
  - type: dropdown
    validations:
      required: true
    id: platform
    attributes:
      label: What platform are you using?
      options:
        - ESP8266
        - ESP32
        - RP2040
        - BK72XX
        - RTL87XX
        - LN882X
        - Host
        - Other
  - type: input
    id: component_name
    attributes:
      label: Component causing the issue
      description: >
        The name of the component or platform. For example, api/i2c or ultrasonic.

  - type: markdown
    attributes:
      value: |
        # Details
  - type: textarea
    id: config
    attributes:
      label: YAML Config
      description: |
        Include a complete YAML configuration file demonstrating the problem here. Preferably post the *entire* file - don't make assumptions about what is unimportant. However, if it's a large or complicated config then you will need to reduce it to the smallest possible file *that still demonstrates the problem*. If you don't provide enough information to *easily* reproduce the problem, it's unlikely your bug report will get any attention. Logs do not belong here, attach them below.
      render: yaml
  - type: textarea
    id: logs
    attributes:
      label: Anything in the logs that might be useful for us?
      description: For example, error message, or stack traces. Serial or USB logs are much more useful than WiFi logs.
      render: txt
  - type: textarea
    id: additional
    attributes:
      label: Additional information
      description: >
        If you have any additional information for us, use the field below.
        Please note, you can attach screenshots or screen recordings here, by
        dragging and dropping files in the field below.
26  .github/ISSUE_TEMPLATE/config.yml
@ -1,21 +1,15 @@
 ---
 blank_issues_enabled: false
 contact_links:
-  - name: Report an issue with the ESPHome documentation
-    url: https://github.com/esphome/esphome-docs/issues/new/choose
-    about: Report an issue with the ESPHome documentation.
-  - name: Report an issue with the ESPHome web server
-    url: https://github.com/esphome/esphome-webserver/issues/new/choose
-    about: Report an issue with the ESPHome web server.
-  - name: Report an issue with the ESPHome Builder / Dashboard
-    url: https://github.com/esphome/dashboard/issues/new/choose
-    about: Report an issue with the ESPHome Builder / Dashboard.
-  - name: Report an issue with the ESPHome API client
-    url: https://github.com/esphome/aioesphomeapi/issues/new/choose
-    about: Report an issue with the ESPHome API client.
-  - name: Make a Feature Request
-    url: https://github.com/orgs/esphome/discussions
-    about: Please create feature requests in the dedicated feature request tracker.
+  - name: Issue Tracker
+    url: https://github.com/esphome/issues
+    about: Please create bug reports in the dedicated issue tracker.
+  - name: Feature Request Tracker
+    url: https://github.com/esphome/feature-requests
+    about: |
+      Please create feature requests in the dedicated feature request tracker.
   - name: Frequently Asked Question
     url: https://esphome.io/guides/faq.html
-    about: Please view the FAQ for common questions and what to include in a bug report.
+    about: |
+      Please view the FAQ for common questions and what
+      to include in a bug report.
1  .github/PULL_REQUEST_TEMPLATE.md
@ -26,7 +26,6 @@
 - [ ] RP2040
 - [ ] BK72xx
 - [ ] RTL87xx
-- [ ] nRF52840

 ## Example entry for `config.yaml`:

33  .github/actions/build-image/action.yaml
@ -1,11 +1,15 @@
 name: Build Image
 inputs:
+  platform:
+    description: "Platform to build for"
+    required: true
+    example: "linux/amd64"
   target:
     description: "Target to build"
     required: true
     example: "docker"
-  build_type:
-    description: "Build type"
+  baseimg:
+    description: "Base image type"
     required: true
     example: "docker"
   suffix:
@ -15,11 +19,6 @@ inputs:
     description: "Version to build"
     required: true
     example: "2023.12.0"
-  base_os:
-    description: "Base OS to use"
-    required: false
-    default: "debian"
-    example: "debian"
 runs:
   using: "composite"
   steps:
@ -47,52 +46,52 @@ runs:

     - name: Build and push to ghcr by digest
       id: build-ghcr
-      uses: docker/build-push-action@v6.18.0
+      uses: docker/build-push-action@v6.9.0
       env:
         DOCKER_BUILD_SUMMARY: false
         DOCKER_BUILD_RECORD_UPLOAD: false
       with:
         context: .
         file: ./docker/Dockerfile
+        platforms: ${{ inputs.platform }}
         target: ${{ inputs.target }}
         cache-from: type=gha
         cache-to: ${{ steps.cache-to.outputs.value }}
         build-args: |
-          BUILD_TYPE=${{ inputs.build_type }}
+          BASEIMGTYPE=${{ inputs.baseimg }}
           BUILD_VERSION=${{ inputs.version }}
-          BUILD_OS=${{ inputs.base_os }}
         outputs: |
           type=image,name=ghcr.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true

     - name: Export ghcr digests
       shell: bash
       run: |
-        mkdir -p /tmp/digests/${{ inputs.build_type }}/ghcr
+        mkdir -p /tmp/digests/${{ inputs.target }}/ghcr
         digest="${{ steps.build-ghcr.outputs.digest }}"
-        touch "/tmp/digests/${{ inputs.build_type }}/ghcr/${digest#sha256:}"
+        touch "/tmp/digests/${{ inputs.target }}/ghcr/${digest#sha256:}"

     - name: Build and push to dockerhub by digest
       id: build-dockerhub
-      uses: docker/build-push-action@v6.18.0
+      uses: docker/build-push-action@v6.9.0
       env:
         DOCKER_BUILD_SUMMARY: false
         DOCKER_BUILD_RECORD_UPLOAD: false
       with:
         context: .
         file: ./docker/Dockerfile
+        platforms: ${{ inputs.platform }}
         target: ${{ inputs.target }}
         cache-from: type=gha
         cache-to: ${{ steps.cache-to.outputs.value }}
         build-args: |
-          BUILD_TYPE=${{ inputs.build_type }}
+          BASEIMGTYPE=${{ inputs.baseimg }}
           BUILD_VERSION=${{ inputs.version }}
-          BUILD_OS=${{ inputs.base_os }}
         outputs: |
           type=image,name=docker.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true

     - name: Export dockerhub digests
       shell: bash
       run: |
-        mkdir -p /tmp/digests/${{ inputs.build_type }}/dockerhub
+        mkdir -p /tmp/digests/${{ inputs.target }}/dockerhub
         digest="${{ steps.build-dockerhub.outputs.digest }}"
-        touch "/tmp/digests/${{ inputs.build_type }}/dockerhub/${digest#sha256:}"
+        touch "/tmp/digests/${{ inputs.target }}/dockerhub/${digest#sha256:}"
10  .github/actions/restore-python/action.yml
@ -17,12 +17,12 @@ runs:
   steps:
     - name: Set up Python ${{ inputs.python-version }}
       id: python
-      uses: actions/setup-python@v5.6.0
+      uses: actions/setup-python@v5.3.0
       with:
         python-version: ${{ inputs.python-version }}
     - name: Restore Python virtual environment
       id: cache-venv
-      uses: actions/cache/restore@v4.2.3
+      uses: actions/cache/restore@v4.1.2
       with:
         path: venv
         # yamllint disable-line rule:line-length
@ -34,14 +34,14 @@ runs:
         python -m venv venv
         source venv/bin/activate
         python --version
-        pip install -r requirements.txt -r requirements_test.txt
+        pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
         pip install -e .
     - name: Create Python virtual environment
       if: steps.cache-venv.outputs.cache-hit != 'true' && runner.os == 'Windows'
       shell: bash
       run: |
         python -m venv venv
-        source ./venv/Scripts/activate
+        ./venv/Scripts/activate
         python --version
-        pip install -r requirements.txt -r requirements_test.txt
+        pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
         pip install -e .
1  .github/copilot-instructions.md
@ -1 +0,0 @@
../.ai/instructions.md
10  .github/dependabot.yml
@ -9,9 +9,6 @@ updates:
       # Hypotehsis is only used for testing and is updated quite often
       - dependency-name: hypothesis
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/"
     schedule:
       interval: daily
@ -20,20 +17,15 @@ updates:
       docker-actions:
         applies-to: version-updates
         patterns:
+          - "docker/setup-qemu-action"
           - "docker/login-action"
           - "docker/setup-buildx-action"
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/build-image"
     schedule:
       interval: daily
     open-pull-requests-limit: 10
   - package-ecosystem: github-actions
-    labels:
-      - "dependencies"
-      - "github-actions"
     directory: "/.github/actions/restore-python"
     schedule:
       interval: daily
598  .github/workflows/auto-label-pr.yml
@ -1,598 +0,0 @@
|
|||||||
name: Auto Label PR
|
|
||||||
|
|
||||||
on:
|
|
||||||
# Runs only on pull_request_target due to having access to a App token.
|
|
||||||
# This means PRs from forks will not be able to alter this workflow to get the tokens
|
|
||||||
pull_request_target:
|
|
||||||
types: [labeled, opened, reopened, synchronize, edited]
|
|
||||||
|
|
||||||
permissions:
|
|
||||||
pull-requests: write
|
|
||||||
contents: read
|
|
||||||
|
|
||||||
env:
|
|
||||||
SMALL_PR_THRESHOLD: 30
|
|
||||||
MAX_LABELS: 15
|
|
||||||
TOO_BIG_THRESHOLD: 1000
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
label:
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
|
|
||||||
steps:
|
|
||||||
- name: Checkout
|
|
||||||
uses: actions/checkout@v4.2.2
|
|
||||||
|
|
||||||
- name: Generate a token
|
|
||||||
id: generate-token
|
|
||||||
uses: actions/create-github-app-token@v2
|
|
||||||
with:
|
|
||||||
app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
|
|
||||||
private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}
|
|
||||||
|
|
||||||
- name: Auto Label PR
|
|
||||||
uses: actions/github-script@v7.0.1
|
|
||||||
with:
|
|
||||||
github-token: ${{ steps.generate-token.outputs.token }}
|
|
||||||
script: |
|
|
||||||
const fs = require('fs');
|
|
||||||
|
|
||||||
// Constants
|
|
||||||
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
|
|
||||||
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
|
|
||||||
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
|
|
||||||
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
|
|
||||||
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
|
|
||||||
const TOO_BIG_MARKER = '<!-- too-big-request -->';
|
|
||||||
|
|
||||||
const MANAGED_LABELS = [
|
|
||||||
'new-component',
|
|
||||||
'new-platform',
|
|
||||||
'new-target-platform',
|
|
||||||
'merging-to-release',
|
|
||||||
'merging-to-beta',
|
|
||||||
'core',
|
|
||||||
'small-pr',
|
|
||||||
'dashboard',
|
|
||||||
'github-actions',
|
|
||||||
'by-code-owner',
|
|
||||||
'has-tests',
|
|
||||||
'needs-tests',
|
|
||||||
'needs-docs',
|
|
||||||
'needs-codeowners',
|
|
||||||
'too-big',
|
|
||||||
'labeller-recheck'
|
|
||||||
];
|
|
||||||
|
|
||||||
const DOCS_PR_PATTERNS = [
|
|
||||||
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
|
|
||||||
/esphome\/esphome-docs#\d+/
|
|
||||||
];
|
|
||||||
|
|
||||||
// Global state
|
|
||||||
const { owner, repo } = context.repo;
|
|
||||||
const pr_number = context.issue.number;
|
|
||||||
|
|
||||||
// Get current labels and PR data
|
|
||||||
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
issue_number: pr_number
|
|
||||||
});
|
|
||||||
const currentLabels = currentLabelsData.map(label => label.name);
|
|
||||||
const managedLabels = currentLabels.filter(label =>
|
|
||||||
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
|
|
||||||
);
|
|
||||||
|
|
||||||
const { data: prFiles } = await github.rest.pulls.listFiles({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
pull_number: pr_number
|
|
||||||
});
|
|
||||||
|
|
||||||
// Calculate data from PR files
|
|
||||||
const changedFiles = prFiles.map(file => file.filename);
|
|
||||||
const totalChanges = prFiles.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
|
|
||||||
console.log('Current labels:', currentLabels.join(', '));
|
|
||||||
console.log('Changed files:', changedFiles.length);
|
|
||||||
console.log('Total changes:', totalChanges);
|
|
||||||
|
|
||||||
// Fetch API data
|
|
||||||
async function fetchApiData() {
|
|
||||||
try {
|
|
||||||
const response = await fetch('https://data.esphome.io/components.json');
|
|
||||||
const componentsData = await response.json();
|
|
||||||
return {
|
|
||||||
targetPlatforms: componentsData.target_platforms || [],
|
|
||||||
platformComponents: componentsData.platform_components || []
|
|
||||||
};
|
|
||||||
} catch (error) {
|
|
||||||
console.log('Failed to fetch components data from API:', error.message);
|
|
||||||
return { targetPlatforms: [], platformComponents: [] };
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Merge branch detection
|
|
||||||
async function detectMergeBranch() {
|
|
||||||
const labels = new Set();
|
|
||||||
const baseRef = context.payload.pull_request.base.ref;
|
|
||||||
|
|
||||||
if (baseRef === 'release') {
|
|
||||||
labels.add('merging-to-release');
|
|
||||||
} else if (baseRef === 'beta') {
|
|
||||||
labels.add('merging-to-beta');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Component and platform labeling
|
|
||||||
async function detectComponentPlatforms(apiData) {
|
|
||||||
const labels = new Set();
|
|
||||||
const componentRegex = /^esphome\/components\/([^\/]+)\//;
|
|
||||||
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
|
|
||||||
|
|
||||||
for (const file of changedFiles) {
|
|
||||||
const componentMatch = file.match(componentRegex);
|
|
||||||
if (componentMatch) {
|
|
||||||
labels.add(`component: ${componentMatch[1]}`);
|
|
||||||
}
|
|
||||||
|
|
||||||
const platformMatch = file.match(targetPlatformRegex);
|
|
||||||
if (platformMatch) {
|
|
||||||
labels.add(`platform: ${platformMatch[1]}`);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: New component detection
|
|
||||||
async function detectNewComponents() {
|
|
||||||
const labels = new Set();
|
|
||||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
|
||||||
|
|
||||||
for (const file of addedFiles) {
|
|
||||||
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
|
|
||||||
if (componentMatch) {
|
|
||||||
try {
|
|
||||||
const content = fs.readFileSync(file, 'utf8');
|
|
||||||
if (content.includes('IS_TARGET_PLATFORM = True')) {
|
|
||||||
labels.add('new-target-platform');
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.log(`Failed to read content of ${file}:`, error.message);
|
|
||||||
}
|
|
||||||
labels.add('new-component');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: New platform detection
|
|
||||||
async function detectNewPlatforms(apiData) {
|
|
||||||
const labels = new Set();
|
|
||||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
|
||||||
|
|
||||||
for (const file of addedFiles) {
|
|
||||||
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
|
|
||||||
if (platformFileMatch) {
|
|
||||||
const [, component, platform] = platformFileMatch;
|
|
||||||
if (apiData.platformComponents.includes(platform)) {
|
|
||||||
labels.add('new-platform');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
|
|
||||||
if (platformDirMatch) {
|
|
||||||
const [, component, platform] = platformDirMatch;
|
|
||||||
if (apiData.platformComponents.includes(platform)) {
|
|
||||||
labels.add('new-platform');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Core files detection
|
|
||||||
async function detectCoreChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const coreFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('esphome/core/') ||
|
|
||||||
(file.startsWith('esphome/') && file.split('/').length === 2)
|
|
||||||
);
|
|
||||||
|
|
||||||
if (coreFiles.length > 0) {
|
|
||||||
labels.add('core');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: PR size detection
|
|
||||||
async function detectPRSize() {
|
|
||||||
const labels = new Set();
|
|
||||||
const testChanges = prFiles
|
|
||||||
.filter(file => file.filename.startsWith('tests/'))
|
|
||||||
.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
|
|
||||||
const nonTestChanges = totalChanges - testChanges;
|
|
||||||
|
|
||||||
if (totalChanges <= SMALL_PR_THRESHOLD) {
|
|
||||||
labels.add('small-pr');
|
|
||||||
}
|
|
||||||
|
|
||||||
if (nonTestChanges > TOO_BIG_THRESHOLD) {
|
|
||||||
labels.add('too-big');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Dashboard changes
|
|
||||||
async function detectDashboardChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const dashboardFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('esphome/dashboard/') ||
|
|
||||||
file.startsWith('esphome/components/dashboard_import/')
|
|
||||||
);
|
|
||||||
|
|
||||||
if (dashboardFiles.length > 0) {
|
|
||||||
labels.add('dashboard');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: GitHub Actions changes
|
|
||||||
async function detectGitHubActionsChanges() {
|
|
||||||
const labels = new Set();
|
|
||||||
const githubActionsFiles = changedFiles.filter(file =>
|
|
||||||
file.startsWith('.github/workflows/')
|
|
||||||
);
|
|
||||||
|
|
||||||
if (githubActionsFiles.length > 0) {
|
|
||||||
labels.add('github-actions');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Code owner detection
|
|
||||||
async function detectCodeOwner() {
|
|
||||||
const labels = new Set();
|
|
||||||
|
|
||||||
try {
|
|
||||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
|
||||||
owner,
|
|
||||||
repo,
|
|
||||||
path: 'CODEOWNERS',
|
|
||||||
});
|
|
||||||
|
|
||||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
|
||||||
const prAuthor = context.payload.pull_request.user.login;
|
|
||||||
|
|
||||||
const codeownersLines = codeownersContent.split('\n')
|
|
||||||
.map(line => line.trim())
|
|
||||||
.filter(line => line && !line.startsWith('#'));
|
|
||||||
|
|
||||||
const codeownersRegexes = codeownersLines.map(line => {
|
|
||||||
const parts = line.split(/\s+/);
|
|
||||||
const pattern = parts[0];
|
|
||||||
const owners = parts.slice(1);
|
|
||||||
|
|
||||||
let regex;
|
|
||||||
if (pattern.endsWith('*')) {
|
|
||||||
const dir = pattern.slice(0, -1);
|
|
||||||
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
|
|
||||||
} else if (pattern.includes('*')) {
|
|
||||||
const regexPattern = pattern
|
|
||||||
.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
|
|
||||||
.replace(/\\*/g, '.*');
|
|
||||||
regex = new RegExp(`^${regexPattern}$`);
|
|
||||||
} else {
|
|
||||||
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
|
|
||||||
}
|
|
||||||
|
|
||||||
return { regex, owners };
|
|
||||||
});
|
|
||||||
|
|
||||||
for (const file of changedFiles) {
|
|
||||||
for (const { regex, owners } of codeownersRegexes) {
|
|
||||||
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
|
|
||||||
labels.add('by-code-owner');
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.log('Failed to read or parse CODEOWNERS file:', error.message);
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Test detection
|
|
||||||
async function detectTests() {
|
|
||||||
const labels = new Set();
|
|
||||||
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
|
|
||||||
|
|
||||||
if (testFiles.length > 0) {
|
|
||||||
labels.add('has-tests');
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Strategy: Requirements detection
|
|
||||||
async function detectRequirements(allLabels) {
|
|
||||||
const labels = new Set();
|
|
||||||
|
|
||||||
// Check for missing tests
|
|
||||||
if ((allLabels.has('new-component') || allLabels.has('new-platform')) && !allLabels.has('has-tests')) {
|
|
||||||
labels.add('needs-tests');
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check for missing docs
|
|
||||||
if (allLabels.has('new-component') || allLabels.has('new-platform')) {
|
|
||||||
const prBody = context.payload.pull_request.body || '';
|
|
||||||
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
|
|
||||||
|
|
||||||
if (!hasDocsLink) {
|
|
||||||
labels.add('needs-docs');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check for missing CODEOWNERS
|
|
||||||
if (allLabels.has('new-component')) {
|
|
||||||
const codeownersModified = prFiles.some(file =>
|
|
||||||
file.filename === 'CODEOWNERS' &&
|
|
||||||
(file.status === 'modified' || file.status === 'added') &&
|
|
||||||
(file.additions || 0) > 0
|
|
||||||
);
|
|
||||||
|
|
||||||
if (!codeownersModified) {
|
|
||||||
labels.add('needs-codeowners');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return labels;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Generate review messages
|
|
||||||
function generateReviewMessages(finalLabels) {
|
|
||||||
const messages = [];
|
|
||||||
const prAuthor = context.payload.pull_request.user.login;
|
|
||||||
|
|
||||||
// Too big message
|
|
||||||
if (finalLabels.includes('too-big')) {
|
|
||||||
const testChanges = prFiles
|
|
||||||
.filter(file => file.filename.startsWith('tests/'))
|
|
||||||
.reduce((sum, file) => sum + (file.additions || 0) + (file.deletions || 0), 0);
|
|
||||||
const nonTestChanges = totalChanges - testChanges;
|
|
||||||
|
|
||||||
const tooManyLabels = finalLabels.length > MAX_LABELS;
|
|
||||||
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
|
|
||||||
|
|
||||||
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
|
|
||||||
|
|
||||||
    if (tooManyLabels && tooManyChanges) {
      message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
    } else if (tooManyLabels) {
      message += `This PR affects ${finalLabels.length} different components/areas.`;
    } else {
      message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
    }

    message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
    message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;

    messages.push(message);
  }

  // CODEOWNERS message
  if (finalLabels.includes('needs-codeowners')) {
    const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
      `Hey there @${prAuthor},\n` +
      `Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
      `This way we can notify you if a bug report for this integration is reported.\n\n` +
      `In \`__init__.py\` of the integration, please add:\n\n` +
      `\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
      `And run \`script/build_codeowners.py\``;

    messages.push(message);
  }

  return messages;
}

// Handle reviews
async function handleReviews(finalLabels) {
  const reviewMessages = generateReviewMessages(finalLabels);
  const hasReviewableLabels = finalLabels.some(label =>
    ['too-big', 'needs-codeowners'].includes(label)
  );

  const { data: reviews } = await github.rest.pulls.listReviews({
    owner,
    repo,
    pull_number: pr_number
  });

  const botReviews = reviews.filter(review =>
    review.user.type === 'Bot' &&
    review.state === 'CHANGES_REQUESTED' &&
    review.body && review.body.includes(BOT_COMMENT_MARKER)
  );

  if (hasReviewableLabels) {
    const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;

    if (botReviews.length > 0) {
      // Update existing review
      await github.rest.pulls.updateReview({
        owner,
        repo,
        pull_number: pr_number,
        review_id: botReviews[0].id,
        body: reviewBody
      });
      console.log('Updated existing bot review');
    } else {
      // Create new review
      await github.rest.pulls.createReview({
        owner,
        repo,
        pull_number: pr_number,
        body: reviewBody,
        event: 'REQUEST_CHANGES'
      });
      console.log('Created new bot review');
    }
  } else if (botReviews.length > 0) {
    // Dismiss existing reviews
    for (const review of botReviews) {
      try {
        await github.rest.pulls.dismissReview({
          owner,
          repo,
          pull_number: pr_number,
          review_id: review.id,
          message: 'Review dismissed: All requirements have been met'
        });
        console.log(`Dismissed bot review ${review.id}`);
      } catch (error) {
        console.log(`Failed to dismiss review ${review.id}:`, error.message);
      }
    }
  }
}

// Main execution
const apiData = await fetchApiData();
const baseRef = context.payload.pull_request.base.ref;

// Early exit for non-dev branches
if (baseRef !== 'dev') {
  const branchLabels = await detectMergeBranch();
  const finalLabels = Array.from(branchLabels);

  console.log('Computed labels (merge branch only):', finalLabels.join(', '));

  // Apply labels
  if (finalLabels.length > 0) {
    await github.rest.issues.addLabels({
      owner,
      repo,
      issue_number: pr_number,
      labels: finalLabels
    });
  }

  // Remove old managed labels
  const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
  for (const label of labelsToRemove) {
    try {
      await github.rest.issues.removeLabel({
        owner,
        repo,
        issue_number: pr_number,
        name: label
      });
    } catch (error) {
      console.log(`Failed to remove label ${label}:`, error.message);
    }
  }

  return;
}

// Run all strategies
const [
  branchLabels,
  componentLabels,
  newComponentLabels,
  newPlatformLabels,
  coreLabels,
  sizeLabels,
  dashboardLabels,
  actionsLabels,
  codeOwnerLabels,
  testLabels
] = await Promise.all([
  detectMergeBranch(),
  detectComponentPlatforms(apiData),
  detectNewComponents(),
  detectNewPlatforms(apiData),
  detectCoreChanges(),
  detectPRSize(),
  detectDashboardChanges(),
  detectGitHubActionsChanges(),
  detectCodeOwner(),
  detectTests()
]);

// Combine all labels
const allLabels = new Set([
  ...branchLabels,
  ...componentLabels,
  ...newComponentLabels,
  ...newPlatformLabels,
  ...coreLabels,
  ...sizeLabels,
  ...dashboardLabels,
  ...actionsLabels,
  ...codeOwnerLabels,
  ...testLabels
]);

// Detect requirements based on all other labels
const requirementLabels = await detectRequirements(allLabels);
for (const label of requirementLabels) {
  allLabels.add(label);
}

let finalLabels = Array.from(allLabels);

// Handle too many labels
const isMegaPR = currentLabels.includes('mega-pr');
const tooManyLabels = finalLabels.length > MAX_LABELS;

if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
  finalLabels = ['too-big'];
}

console.log('Computed labels:', finalLabels.join(', '));

// Handle reviews
await handleReviews(finalLabels);

// Apply labels
if (finalLabels.length > 0) {
  console.log(`Adding labels: ${finalLabels.join(', ')}`);
  await github.rest.issues.addLabels({
    owner,
    repo,
    issue_number: pr_number,
    labels: finalLabels
  });
}

// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
for (const label of labelsToRemove) {
  console.log(`Removing label: ${label}`);
  try {
    await github.rest.issues.removeLabel({
      owner,
      repo,
      issue_number: pr_number,
      name: label
    });
  } catch (error) {
    console.log(`Failed to remove label ${label}:`, error.message);
  }
}
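The script above applies the freshly computed label set and then strips any bot-managed labels that are no longer warranted. As a minimal sketch (not part of the workflow, names are illustrative only), the add/remove decision can be expressed as a pure helper:

```js
// Sketch: given the labels this bot manages, the labels currently on the PR,
// and the freshly computed set, decide what to add and what to remove.
function reconcileLabels(managedLabels, currentLabels, computedLabels) {
  const computed = new Set(computedLabels);
  // Add only labels that are not already present on the PR.
  const toAdd = [...computed].filter((label) => !currentLabels.includes(label));
  // Remove only bot-managed labels that are no longer in the computed set.
  const toRemove = managedLabels.filter(
    (label) => currentLabels.includes(label) && !computed.has(label)
  );
  return { toAdd, toRemove };
}

// Example:
// reconcileLabels(['too-big'], ['too-big', 'bugfix'], ['component: api'])
//   -> { toAdd: ['component: api'], toRemove: ['too-big'] }
```

The real script simply re-adds all computed labels (GitHub de-duplicates on its side) and removes the stale managed ones; the helper only makes that reconciliation explicit.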
.github/workflows/ci-api-proto.yml (15 changed lines)

@@ -21,9 +21,9 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.3.0
         with:
           python-version: "3.11"

@@ -57,17 +57,6 @@ jobs:
             event: 'REQUEST_CHANGES',
             body: 'You have altered the generated proto files but they do not match what is expected.\nPlease run "script/api_protobuf/api_protobuf.py" and commit the changes.'
           })
-      - if: failure()
-        name: Show changes
-        run: git diff
-      - if: failure()
-        name: Archive artifacts
-        uses: actions/upload-artifact@v4.6.2
-        with:
-          name: generated-proto-files
-          path: |
-            esphome/components/api/api_pb2.*
-            esphome/components/api/api_pb2_service.*
       - if: success()
         name: Dismiss review
         uses: actions/github-script@v7.0.1
.github/workflows/ci-clang-tidy-hash.yml (75 lines, present only on dev)

@@ -1,75 +0,0 @@
name: Clang-tidy Hash CI

on:
  pull_request:
    paths:
      - ".clang-tidy"
      - "platformio.ini"
      - "requirements_dev.txt"
      - ".clang-tidy.hash"
      - "script/clang_tidy_hash.py"
      - ".github/workflows/ci-clang-tidy-hash.yml"

permissions:
  contents: read
  pull-requests: write

jobs:
  verify-hash:
    name: Verify clang-tidy hash
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4.2.2

      - name: Set up Python
        uses: actions/setup-python@v5.6.0
        with:
          python-version: "3.11"

      - name: Verify hash
        run: |
          python script/clang_tidy_hash.py --verify

      - if: failure()
        name: Show hash details
        run: |
          python script/clang_tidy_hash.py
          echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
          echo "You have modified clang-tidy configuration but have not updated the hash." | tee -a $GITHUB_STEP_SUMMARY
          echo "Please run 'script/clang_tidy_hash.py --update' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY

      - if: failure()
        name: Request changes
        uses: actions/github-script@v7.0.1
        with:
          script: |
            await github.rest.pulls.createReview({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              event: 'REQUEST_CHANGES',
              body: 'You have modified clang-tidy configuration but have not updated the hash.\nPlease run `script/clang_tidy_hash.py --update` and commit the changes.'
            })

      - if: success()
        name: Dismiss review
        uses: actions/github-script@v7.0.1
        with:
          script: |
            let reviews = await github.rest.pulls.listReviews({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo
            });
            for (let review of reviews.data) {
              if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
                await github.rest.pulls.dismissReview({
                  pull_number: context.issue.number,
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  review_id: review.id,
                  message: 'Clang-tidy hash now matches configuration.'
                });
              }
            }
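The workflow above fails the job when the committed `.clang-tidy.hash` no longer matches the clang-tidy related configuration. A rough sketch of that idea, hashing the inputs and comparing against the stored digest, could look like the following. The exact file list and hash format used by `script/clang_tidy_hash.py` are assumptions here, not taken from the real script:

```js
// Illustrative sketch only: hash the clang-tidy related inputs and compare the
// digest with a committed hash file. File list and output format are assumed.
const { createHash } = require('node:crypto');
const { readFileSync } = require('node:fs');

function computeConfigHash(files) {
  const hash = createHash('sha256');
  for (const file of files) {
    hash.update(readFileSync(file)); // order of `files` matters for the digest
  }
  return hash.digest('hex');
}

const current = computeConfigHash(['.clang-tidy', 'platformio.ini', 'requirements_dev.txt']);
const stored = readFileSync('.clang-tidy.hash', 'utf8').trim();
if (current !== stored) {
  console.error('clang-tidy configuration changed but the hash was not updated');
  process.exit(1);
}
```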
.github/workflows/ci-docker.yml (21 changed lines)

@@ -33,23 +33,22 @@ concurrency:
 jobs:
   check-docker:
     name: Build docker containers
-    runs-on: ${{ matrix.os }}
+    runs-on: ubuntu-latest
     strategy:
       fail-fast: false
       matrix:
-        os: ["ubuntu-24.04", "ubuntu-24.04-arm"]
-        build_type:
-          - "ha-addon"
-          - "docker"
-          # - "lint"
+        arch: [amd64, armv7, aarch64]
+        build_type: ["ha-addon", "docker", "lint"]
     steps:
-      - uses: actions/checkout@v4.2.2
+      - uses: actions/checkout@v4.1.7
       - name: Set up Python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.3.0
         with:
-          python-version: "3.11"
+          python-version: "3.9"
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3.11.1
+        uses: docker/setup-buildx-action@v3.7.1
+      - name: Set up QEMU
+        uses: docker/setup-qemu-action@v3.2.0

       - name: Set TAG
         run: |

@@ -59,6 +58,6 @@ jobs:
         run: |
           docker/build.py \
             --tag "${TAG}" \
-            --arch "${{ matrix.os == 'ubuntu-24.04-arm' && 'aarch64' || 'amd64' }}" \
+            --arch "${{ matrix.arch }}" \
             --build-type "${{ matrix.build_type }}" \
             build
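On the dev side, the `--arch` argument is derived from the runner OS with the expression `${{ matrix.os == 'ubuntu-24.04-arm' && 'aarch64' || 'amd64' }}`. GitHub expressions have no ternary operator, so `cond && a || b` stands in for one. As a small illustration of what that expression evaluates to:

```js
// Sketch of the arch selection performed by the GitHub expression above.
function archForRunner(runnerOs) {
  return runnerOs === 'ubuntu-24.04-arm' ? 'aarch64' : 'amd64';
}

console.log(archForRunner('ubuntu-24.04-arm')); // "aarch64"
console.log(archForRunner('ubuntu-24.04'));     // "amd64"
```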
.github/workflows/ci.yml (364 changed lines)

@@ -13,15 +13,14 @@ on:
       - ".github/workflows/ci.yml"
       - "!.yamllint"
       - "!.github/dependabot.yml"
-      - "!docker/**"
   merge_group:

 permissions:
   contents: read

 env:
-  DEFAULT_PYTHON: "3.11"
-  PYUPGRADE_TARGET: "--py311-plus"
+  DEFAULT_PYTHON: "3.9"
+  PYUPGRADE_TARGET: "--py39-plus"

 concurrency:
   # yamllint disable-line rule:line-length

@@ -31,23 +30,23 @@ concurrency:
 jobs:
   common:
     name: Create common environment
-    runs-on: ubuntu-24.04
+    runs-on: ubuntu-latest
     outputs:
       cache-key: ${{ steps.cache-key.outputs.key }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Generate cache-key
         id: cache-key
-        run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
+        run: echo key="${{ hashFiles('requirements.txt', 'requirements_optional.txt', 'requirements_test.txt') }}" >> $GITHUB_OUTPUT
       - name: Set up Python ${{ env.DEFAULT_PYTHON }}
         id: python
-        uses: actions/setup-python@v5.6.0
+        uses: actions/setup-python@v5.3.0
         with:
           python-version: ${{ env.DEFAULT_PYTHON }}
       - name: Restore Python virtual environment
         id: cache-venv
-        uses: actions/cache@v4.2.3
+        uses: actions/cache@v4.1.2
         with:
           path: venv
           # yamllint disable-line rule:line-length

@@ -58,19 +57,59 @@ jobs:
           python -m venv venv
           . venv/bin/activate
           python --version
-          pip install -r requirements.txt -r requirements_test.txt pre-commit
+          pip install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
           pip install -e .

+  black:
+    name: Check black
+    runs-on: ubuntu-latest
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run black
+        run: |
+          . venv/bin/activate
+          black --verbose esphome tests
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
+  flake8:
+    name: Check flake8
+    runs-on: ubuntu-latest
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run flake8
+        run: |
+          . venv/bin/activate
+          flake8 esphome
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
   pylint:
     name: Check pylint
-    runs-on: ubuntu-24.04
+    runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.python-linters == 'true'
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@@ -84,14 +123,35 @@ jobs:
         run: script/ci-suggest-changes
         if: always()

-  ci-custom:
-    name: Run script/ci-custom
-    runs-on: ubuntu-24.04
+  pyupgrade:
+    name: Check pyupgrade
+    runs-on: ubuntu-latest
     needs:
       - common
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Run pyupgrade
+        run: |
+          . venv/bin/activate
+          pyupgrade ${{ env.PYUPGRADE_TARGET }} `find esphome -name "*.py" -type f`
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()
+
+  ci-custom:
+    name: Run script/ci-custom
+    runs-on: ubuntu-latest
+    needs:
+      - common
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@@ -104,7 +164,6 @@ jobs:
           . venv/bin/activate
           script/ci-custom.py
           script/build_codeowners.py --check
-          script/build_language_schema.py --check

   pytest:
     name: Run pytest

@@ -112,9 +171,10 @@ jobs:
       fail-fast: false
       matrix:
         python-version:
+          - "3.9"
+          - "3.10"
           - "3.11"
           - "3.12"
-          - "3.13"
         os:
           - ubuntu-latest
           - macOS-latest

@@ -123,22 +183,25 @@ jobs:
           # Minimize CI resource usage
           # by only running the Python version
           # version used for docker images on Windows and macOS
-          - python-version: "3.13"
-            os: windows-latest
           - python-version: "3.12"
             os: windows-latest
-          - python-version: "3.13"
+          - python-version: "3.10"
+            os: windows-latest
+          - python-version: "3.9"
+            os: windows-latest
+          - python-version: "3.12"
             os: macOS-latest
-          - python-version: "3.12"
+          - python-version: "3.10"
+            os: macOS-latest
+          - python-version: "3.9"
             os: macOS-latest
     runs-on: ${{ matrix.os }}
     needs:
       - common
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
-        id: restore-python
         uses: ./.github/actions/restore-python
         with:
           python-version: ${{ matrix.python-version }}

@@ -148,108 +211,56 @@ jobs:
       - name: Run pytest
         if: matrix.os == 'windows-latest'
         run: |
-          . ./venv/Scripts/activate.ps1
-          pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
+          ./venv/Scripts/activate
+          pytest -vv --cov-report=xml --tb=native tests
       - name: Run pytest
         if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'
         run: |
           . venv/bin/activate
-          pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
+          pytest -vv --cov-report=xml --tb=native tests
       - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v5.4.3
+        uses: codecov/codecov-action@v4
         with:
           token: ${{ secrets.CODECOV_TOKEN }}
-      - name: Save Python virtual environment cache
-        if: github.ref == 'refs/heads/dev'
-        uses: actions/cache/save@v4.2.3
-        with:
-          path: venv
-          key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}

-  determine-jobs:
-    name: Determine which jobs to run
-    runs-on: ubuntu-24.04
+  clang-format:
+    name: Check clang-format
+    runs-on: ubuntu-latest
     needs:
       - common
-    outputs:
-      integration-tests: ${{ steps.determine.outputs.integration-tests }}
-      clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
-      python-linters: ${{ steps.determine.outputs.python-linters }}
-      changed-components: ${{ steps.determine.outputs.changed-components }}
-      component-test-count: ${{ steps.determine.outputs.component-test-count }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-        with:
-          # Fetch enough history to find the merge base
-          fetch-depth: 2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:
           python-version: ${{ env.DEFAULT_PYTHON }}
           cache-key: ${{ needs.common.outputs.cache-key }}
-      - name: Determine which tests to run
-        id: determine
-        env:
-          GH_TOKEN: ${{ github.token }}
-        run: |
-          . venv/bin/activate
-          output=$(python script/determine-jobs.py)
-          echo "Test determination output:"
-          echo "$output" | jq
-
-          # Extract individual fields
-          echo "integration-tests=$(echo "$output" | jq -r '.integration_tests')" >> $GITHUB_OUTPUT
-          echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
-          echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
-          echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
-          echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
-
-  integration-tests:
-    name: Run integration tests
-    runs-on: ubuntu-latest
-    needs:
-      - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.integration-tests == 'true'
-    steps:
-      - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-      - name: Set up Python 3.13
-        id: python
-        uses: actions/setup-python@v5.6.0
-        with:
-          python-version: "3.13"
-      - name: Restore Python virtual environment
-        id: cache-venv
-        uses: actions/cache@v4.2.3
-        with:
-          path: venv
-          key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
-      - name: Create Python virtual environment
-        if: steps.cache-venv.outputs.cache-hit != 'true'
-        run: |
-          python -m venv venv
-          . venv/bin/activate
-          python --version
-          pip install -r requirements.txt -r requirements_test.txt
-          pip install -e .
-      - name: Register matcher
-        run: echo "::add-matcher::.github/workflows/matchers/pytest.json"
-      - name: Run integration tests
-        run: |
-          . venv/bin/activate
-          pytest -vv --no-cov --tb=native -n auto tests/integration/
+      - name: Install clang-format
+        run: |
+          . venv/bin/activate
+          pip install clang-format -c requirements_dev.txt
+      - name: Run clang-format
+        run: |
+          . venv/bin/activate
+          script/clang-format -i
+          git diff-index --quiet HEAD --
+      - name: Suggested changes
+        run: script/ci-suggest-changes
+        if: always()

   clang-tidy:
     name: ${{ matrix.name }}
-    runs-on: ubuntu-24.04
+    runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
-    if: needs.determine-jobs.outputs.clang-tidy == 'true'
-    env:
-      GH_TOKEN: ${{ github.token }}
+      - black
+      - ci-custom
+      - clang-format
+      - flake8
+      - pylint
+      - pytest
+      - pyupgrade
     strategy:
       fail-fast: false
       max-parallel: 2

@@ -279,19 +290,10 @@ jobs:
           name: Run script/clang-tidy for ESP32 IDF
           options: --environment esp32-idf-tidy --grep USE_ESP_IDF
           pio_cache_key: tidyesp32-idf
-        - id: clang-tidy
-          name: Run script/clang-tidy for ZEPHYR
-          options: --environment nrf52-tidy --grep USE_ZEPHYR
-          pio_cache_key: tidy-zephyr
-          ignore_errors: false

     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-        with:
-          # Need history for HEAD~1 to work for checking changed files
-          fetch-depth: 2
+        uses: actions/checkout@v4.1.7

       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@@ -300,17 +302,22 @@ jobs:

       - name: Cache platformio
         if: github.ref == 'refs/heads/dev'
-        uses: actions/cache@v4.2.3
+        uses: actions/cache@v4.1.2
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}

       - name: Cache platformio
         if: github.ref != 'refs/heads/dev'
-        uses: actions/cache/restore@v4.2.3
+        uses: actions/cache/restore@v4.1.2
         with:
           path: ~/.platformio
-          key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
+          key: platformio-${{ matrix.pio_cache_key }}

+      - name: Install clang-tidy
+        run: |
+          sudo apt-get update
+          sudo apt-get install clang-tidy-14
+
       - name: Register problem matchers
         run: |

@@ -324,49 +331,72 @@ jobs:
           mkdir -p .temp
           pio run --list-targets -e esp32-idf-tidy

-      - name: Check if full clang-tidy scan needed
-        id: check_full_scan
-        run: |
-          . venv/bin/activate
-          if python script/clang_tidy_hash.py --check; then
-            echo "full_scan=true" >> $GITHUB_OUTPUT
-            echo "reason=hash_changed" >> $GITHUB_OUTPUT
-          else
-            echo "full_scan=false" >> $GITHUB_OUTPUT
-            echo "reason=normal" >> $GITHUB_OUTPUT
-          fi
-
       - name: Run clang-tidy
         run: |
           . venv/bin/activate
-          if [ "${{ steps.check_full_scan.outputs.full_scan }}" = "true" ]; then
-            echo "Running FULL clang-tidy scan (hash changed)"
-            script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          else
-            echo "Running clang-tidy on changed files only"
-            script/clang-tidy --all-headers --fix --changed ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
-          fi
+          script/clang-tidy --all-headers --fix ${{ matrix.options }}
         env:
           # Also cache libdeps, store them in a ~/.platformio subfolder
           PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps

       - name: Suggested changes
-        run: script/ci-suggest-changes ${{ matrix.ignore_errors && '|| true' || '' }}
+        run: script/ci-suggest-changes
         # yamllint disable-line rule:line-length
         if: always()

-  test-build-components:
-    name: Component test ${{ matrix.file }}
-    runs-on: ubuntu-24.04
+  list-components:
+    runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0 && fromJSON(needs.determine-jobs.outputs.component-test-count) < 100
+    if: github.event_name == 'pull_request'
+    outputs:
+      components: ${{ steps.list-components.outputs.components }}
+      count: ${{ steps.list-components.outputs.count }}
+    steps:
+      - name: Check out code from GitHub
+        uses: actions/checkout@v4.1.7
+        with:
+          # Fetch enough history so `git merge-base refs/remotes/origin/dev HEAD` works.
+          fetch-depth: 500
+      - name: Get target branch
+        id: target-branch
+        run: |
+          echo "branch=${{ github.event.pull_request.base.ref }}" >> $GITHUB_OUTPUT
+      - name: Fetch ${{ steps.target-branch.outputs.branch }} branch
+        run: |
+          git -c protocol.version=2 fetch --no-tags --prune --no-recurse-submodules --depth=1 origin +refs/heads/${{ steps.target-branch.outputs.branch }}:refs/remotes/origin/${{ steps.target-branch.outputs.branch }}
+          git merge-base refs/remotes/origin/${{ steps.target-branch.outputs.branch }} HEAD
+      - name: Restore Python
+        uses: ./.github/actions/restore-python
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON }}
+          cache-key: ${{ needs.common.outputs.cache-key }}
+      - name: Find changed components
+        id: list-components
+        run: |
+          . venv/bin/activate
+          components=$(script/list-components.py --changed --branch ${{ steps.target-branch.outputs.branch }})
+          output_components=$(echo "$components" | jq -R -s -c 'split("\n")[:-1] | map(select(length > 0))')
+          count=$(echo "$output_components" | jq length)
+
+          echo "components=$output_components" >> $GITHUB_OUTPUT
+          echo "count=$count" >> $GITHUB_OUTPUT
+
+          echo "$count Components:"
+          echo "$output_components" | jq
+
+  test-build-components:
+    name: Component test ${{ matrix.file }}
+    runs-on: ubuntu-latest
+    needs:
+      - common
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) > 0 && fromJSON(needs.list-components.outputs.count) < 100
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        file: ${{ fromJson(needs.determine-jobs.outputs.changed-components) }}
+        file: ${{ fromJson(needs.list-components.outputs.components) }}
     steps:
       - name: Install dependencies
         run: |

@@ -374,7 +404,7 @@ jobs:
           sudo apt-get install libsdl2-dev

       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@@ -391,30 +421,30 @@ jobs:

   test-build-components-splitter:
     name: Split components for testing into 20 groups maximum
-    runs-on: ubuntu-24.04
+    runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+      - list-components
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     outputs:
       matrix: ${{ steps.split.outputs.components }}
     steps:
       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Split components into 20 groups
         id: split
         run: |
-          components=$(echo '${{ needs.determine-jobs.outputs.changed-components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
+          components=$(echo '${{ needs.list-components.outputs.components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
           echo "components=$components" >> $GITHUB_OUTPUT

   test-build-components-split:
     name: Test split components
-    runs-on: ubuntu-24.04
+    runs-on: ubuntu-latest
     needs:
       - common
-      - determine-jobs
+      - list-components
       - test-build-components-splitter
-    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
+    if: github.event_name == 'pull_request' && fromJSON(needs.list-components.outputs.count) >= 100
     strategy:
       fail-fast: false
       max-parallel: 4

@@ -430,7 +460,7 @@ jobs:
           sudo apt-get install libsdl2-dev

       - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
+        uses: actions/checkout@v4.1.7
       - name: Restore Python
         uses: ./.github/actions/restore-python
         with:

@@ -451,41 +481,23 @@ jobs:
             ./script/test_build_components -e compile -c $component
           done

-  pre-commit-ci-lite:
-    name: pre-commit.ci lite
+  ci-status:
+    name: CI Status
     runs-on: ubuntu-latest
     needs:
       - common
-    if: github.event_name == 'pull_request' && github.base_ref != 'beta' && github.base_ref != 'release'
-    steps:
-      - name: Check out code from GitHub
-        uses: actions/checkout@v4.2.2
-      - name: Restore Python
-        uses: ./.github/actions/restore-python
-        with:
-          python-version: ${{ env.DEFAULT_PYTHON }}
-          cache-key: ${{ needs.common.outputs.cache-key }}
-      - uses: pre-commit/action@v3.0.1
-        env:
-          SKIP: pylint,clang-tidy-hash
-      - uses: pre-commit-ci/lite-action@v1.1.0
-        if: always()
-
-  ci-status:
-    name: CI Status
-    runs-on: ubuntu-24.04
-    needs:
-      - common
+      - black
       - ci-custom
+      - clang-format
+      - flake8
       - pylint
       - pytest
-      - integration-tests
+      - pyupgrade
       - clang-tidy
-      - determine-jobs
+      - list-components
       - test-build-components
       - test-build-components-splitter
       - test-build-components-split
-      - pre-commit-ci-lite
     if: always()
     steps:
       - name: Success
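The splitter job above shuffles the changed-component list and groups it with the jq pipeline `jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]'`. A rough JavaScript equivalent of that step, kept only as an illustration (the batch size of 20 mirrors the value in the step; everything else is made up), would be:

```js
// Sketch: shuffle the component list, then emit space-joined batches,
// roughly what the jq/shuf pipeline in the splitter step produces.
function splitIntoBatches(components, size = 20) {
  const shuffled = [...components];
  for (let i = shuffled.length - 1; i > 0; i--) { // Fisher-Yates shuffle, like `shuf`
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  const batches = [];
  for (let i = 0; i < shuffled.length; i += size) {
    batches.push(shuffled.slice(i, i + size).join(' '));
  }
  return batches;
}

// splitIntoBatches(['uart', 'i2c', 'wifi'], 2) -> e.g. ['i2c wifi', 'uart']
```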
.github/workflows/codeowner-review-request.yml (324 lines, present only on dev)

@@ -1,324 +0,0 @@
# This workflow automatically requests reviews from codeowners when:
# 1. A PR is opened, reopened, or synchronized (updated)
# 2. A PR is marked as ready for review
#
# It reads the CODEOWNERS file and matches all changed files in the PR against
# the codeowner patterns, then requests reviews from the appropriate owners
# while avoiding duplicate requests for users who have already been requested
# or have already reviewed the PR.

name: Request Codeowner Reviews

on:
  # Needs to be pull_request_target to get write permissions
  pull_request_target:
    types: [opened, reopened, synchronize, ready_for_review]

permissions:
  pull-requests: write
  contents: read

jobs:
  request-codeowner-reviews:
    name: Run
    if: ${{ !github.event.pull_request.draft }}
    runs-on: ubuntu-latest
    steps:
      - name: Request reviews from component codeowners
        uses: actions/github-script@v7.0.1
        with:
          script: |
            const owner = context.repo.owner;
            const repo = context.repo.repo;
            const pr_number = context.payload.pull_request.number;

            console.log(`Processing PR #${pr_number} for codeowner review requests`);

            // Hidden marker to identify bot comments from this workflow
            const BOT_COMMENT_MARKER = '<!-- codeowner-review-request-bot -->';

            try {
              // Get the list of changed files in this PR
              const { data: files } = await github.rest.pulls.listFiles({
                owner,
                repo,
                pull_number: pr_number
              });

              const changedFiles = files.map(file => file.filename);
              console.log(`Found ${changedFiles.length} changed files`);

              if (changedFiles.length === 0) {
                console.log('No changed files found, skipping codeowner review requests');
                return;
              }

              // Fetch CODEOWNERS file from root
              const { data: codeownersFile } = await github.rest.repos.getContent({
                owner,
                repo,
                path: 'CODEOWNERS',
                ref: context.payload.pull_request.base.sha
              });
              const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');

              // Parse CODEOWNERS file to extract all patterns and their owners
              const codeownersLines = codeownersContent.split('\n')
                .map(line => line.trim())
                .filter(line => line && !line.startsWith('#'));

              const codeownersPatterns = [];

              // Convert CODEOWNERS pattern to regex (robust glob handling)
              function globToRegex(pattern) {
                // Escape regex special characters except for glob wildcards
                let regexStr = pattern
                  .replace(/([.+^=!:${}()|[\]\\])/g, '\\$1') // escape regex chars
                  .replace(/\*\*/g, '.*') // globstar
                  .replace(/\*/g, '[^/]*') // single star
                  .replace(/\?/g, '.'); // question mark
                return new RegExp('^' + regexStr + '$');
              }

              // Helper function to create comment body
              function createCommentBody(reviewersList, teamsList, matchedFileCount, isSuccessful = true) {
                const reviewerMentions = reviewersList.map(r => `@${r}`);
                const teamMentions = teamsList.map(t => `@${owner}/${t}`);
                const allMentions = [...reviewerMentions, ...teamMentions].join(', ');

                if (isSuccessful) {
                  return `${BOT_COMMENT_MARKER}\n👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
                } else {
                  return `${BOT_COMMENT_MARKER}\n👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
                }
              }

              for (const line of codeownersLines) {
                const parts = line.split(/\s+/);
                if (parts.length < 2) continue;

                const pattern = parts[0];
                const owners = parts.slice(1);

                // Use robust glob-to-regex conversion
                const regex = globToRegex(pattern);
                codeownersPatterns.push({ pattern, regex, owners });
              }

              console.log(`Parsed ${codeownersPatterns.length} codeowner patterns`);

              // Match changed files against CODEOWNERS patterns
              const matchedOwners = new Set();
              const matchedTeams = new Set();
              const fileMatches = new Map(); // Track which files matched which patterns

              for (const file of changedFiles) {
                for (const { pattern, regex, owners } of codeownersPatterns) {
                  if (regex.test(file)) {
                    console.log(`File '${file}' matches pattern '${pattern}' with owners: ${owners.join(', ')}`);

                    if (!fileMatches.has(file)) {
                      fileMatches.set(file, []);
                    }
                    fileMatches.get(file).push({ pattern, owners });

                    // Add owners to the appropriate set (remove @ prefix)
                    for (const owner of owners) {
                      const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
                      if (cleanOwner.includes('/')) {
                        // Team mention (org/team-name)
                        const teamName = cleanOwner.split('/')[1];
                        matchedTeams.add(teamName);
                      } else {
                        // Individual user
                        matchedOwners.add(cleanOwner);
                      }
                    }
                  }
                }
              }

              if (matchedOwners.size === 0 && matchedTeams.size === 0) {
                console.log('No codeowners found for any changed files');
                return;
              }

              // Remove the PR author from reviewers
              const prAuthor = context.payload.pull_request.user.login;
              matchedOwners.delete(prAuthor);

              // Get current reviewers to avoid duplicate requests (but still mention them)
              const { data: prData } = await github.rest.pulls.get({
                owner,
                repo,
                pull_number: pr_number
              });

              const currentReviewers = new Set();
              const currentTeams = new Set();

              if (prData.requested_reviewers) {
                prData.requested_reviewers.forEach(reviewer => {
                  currentReviewers.add(reviewer.login);
                });
              }

              if (prData.requested_teams) {
                prData.requested_teams.forEach(team => {
                  currentTeams.add(team.slug);
                });
              }

              // Check for completed reviews to avoid re-requesting users who have already reviewed
              const { data: reviews } = await github.rest.pulls.listReviews({
                owner,
                repo,
                pull_number: pr_number
              });

              const reviewedUsers = new Set();
              reviews.forEach(review => {
                reviewedUsers.add(review.user.login);
              });

              // Check for previous comments from this workflow to avoid duplicate pings
              const comments = await github.paginate(
                github.rest.issues.listComments,
                {
                  owner,
                  repo,
                  issue_number: pr_number
                }
              );

              const previouslyPingedUsers = new Set();
              const previouslyPingedTeams = new Set();

              // Look for comments from github-actions bot that contain our bot marker
              const workflowComments = comments.filter(comment =>
                comment.user.type === 'Bot' &&
                comment.body.includes(BOT_COMMENT_MARKER)
              );

              // Extract previously mentioned users and teams from workflow comments
              for (const comment of workflowComments) {
                // Match @username patterns (not team mentions)
                const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
                userMentions.forEach(mention => {
                  const username = mention.slice(1); // remove @
                  previouslyPingedUsers.add(username);
                });

                // Match @org/team patterns
                const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/([a-zA-Z0-9_.-]+)/g) || [];
                teamMentions.forEach(mention => {
                  const teamName = mention.split('/')[1];
                  previouslyPingedTeams.add(teamName);
                });
              }

              console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams`);

              // Remove users who have already been pinged in previous workflow comments
              previouslyPingedUsers.forEach(user => {
                matchedOwners.delete(user);
              });

              previouslyPingedTeams.forEach(team => {
                matchedTeams.delete(team);
              });

              // Remove only users who have already submitted reviews (not just requested reviewers)
              reviewedUsers.forEach(reviewer => {
                matchedOwners.delete(reviewer);
              });

              // For teams, we'll still remove already requested teams to avoid API errors
              currentTeams.forEach(team => {
                matchedTeams.delete(team);
              });

              const reviewersList = Array.from(matchedOwners);
              const teamsList = Array.from(matchedTeams);

              if (reviewersList.length === 0 && teamsList.length === 0) {
                console.log('No eligible reviewers found (all may already be requested, reviewed, or previously pinged)');
                return;
              }

              const totalReviewers = reviewersList.length + teamsList.length;
              console.log(`Requesting reviews from ${reviewersList.length} users and ${teamsList.length} teams for ${fileMatches.size} matched files`);

              // Request reviews
              try {
                const requestParams = {
                  owner,
                  repo,
                  pull_number: pr_number
                };

                // Filter out users who are already requested reviewers for the API call
                const newReviewers = reviewersList.filter(reviewer => !currentReviewers.has(reviewer));
                const newTeams = teamsList.filter(team => !currentTeams.has(team));

                if (newReviewers.length > 0) {
                  requestParams.reviewers = newReviewers;
                }

                if (newTeams.length > 0) {
                  requestParams.team_reviewers = newTeams;
                }

                // Only make the API call if there are new reviewers to request
                if (newReviewers.length > 0 || newTeams.length > 0) {
                  await github.rest.pulls.requestReviewers(requestParams);
                  console.log(`Successfully requested reviews from ${newReviewers.length} new users and ${newTeams.length} new teams`);
                } else {
                  console.log('All codeowners are already requested reviewers or have reviewed');
                }

                // Only add a comment if there are new codeowners to mention (not previously pinged)
                if (reviewersList.length > 0 || teamsList.length > 0) {
                  const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);

                  await github.rest.issues.createComment({
                    owner,
                    repo,
                    issue_number: pr_number,
                    body: commentBody
                  });
                  console.log(`Added comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
                } else {
                  console.log('No new codeowners to mention in comment (all previously pinged)');
                }
              } catch (error) {
                if (error.status === 422) {
                  console.log('Some reviewers may already be requested or unavailable:', error.message);

                  // Only try to add a comment if there are new codeowners to mention
                  if (reviewersList.length > 0 || teamsList.length > 0) {
                    const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);

                    try {
                      await github.rest.issues.createComment({
                        owner,
                        repo,
                        issue_number: pr_number,
                        body: commentBody
                      });
                      console.log(`Added fallback comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
                    } catch (commentError) {
                      console.log('Failed to add comment:', commentError.message);
                    }
                  } else {
                    console.log('No new codeowners to mention in fallback comment');
                  }
                } else {
                  throw error;
                }
              }

            } catch (error) {
              console.log('Failed to process codeowner review requests:', error.message);
              console.error(error);
            }
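The `globToRegex()` helper in the workflow above turns a CODEOWNERS glob into an anchored regular expression: `*` matches within a single path segment, `**` matches across segments. A short usage sketch, assuming that function is in scope and using made-up sample patterns and paths:

```js
// Usage sketch for the globToRegex() helper defined in the script above.
const single = globToRegex('esphome/components/api/*');
console.log(single.test('esphome/components/api/api_connection.cpp')); // true
console.log(single.test('esphome/components/api/proto/defs.h'));       // false: `*` does not cross `/`

const globstar = globToRegex('esphome/components/**');
console.log(globstar.test('esphome/components/api/proto/defs.h'));     // true: `**` does cross `/`
```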
.github/workflows/external-component-bot.yml (157 lines, present only on dev)

@@ -1,157 +0,0 @@
name: Add External Component Comment

on:
  pull_request_target:
    types: [opened, synchronize]

permissions:
  contents: read # Needed to fetch PR details
  issues: write # Needed to create and update comments (PR comments are managed via the issues REST API)
  pull-requests: write # also needed?

jobs:
  external-comment:
    name: External component comment
    runs-on: ubuntu-latest
    steps:
      - name: Add external component comment
        uses: actions/github-script@v7.0.1
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            // Generate external component usage instructions
            function generateExternalComponentInstructions(prNumber, componentNames, owner, repo) {
              let source;
              if (owner === 'esphome' && repo === 'esphome')
                source = `github://pr#${prNumber}`;
              else
                source = `github://${owner}/${repo}@pull/${prNumber}/head`;
              return `To use the changes from this PR as an external component, add the following to your ESPHome configuration YAML file:

            \`\`\`yaml
            external_components:
              - source: ${source}
                components: [${componentNames.join(', ')}]
                refresh: 1h
            \`\`\``;
            }

            // Generate repo clone instructions
            function generateRepoInstructions(prNumber, owner, repo, branch) {
              return `To use the changes in this PR:

            \`\`\`bash
            # Clone the repository:
            git clone https://github.com/${owner}/${repo}
            cd ${repo}

            # Checkout the PR branch:
            git fetch origin pull/${prNumber}/head:${branch}
            git checkout ${branch}

            # Install the development version:
            script/setup

            # Activate the development version:
            source venv/bin/activate
            \`\`\`

            Now you can run \`esphome\` as usual to test the changes in this PR.
            `;
            }

            async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
              const commentMarker = "<!-- This comment was generated automatically by the external-component-bot workflow. -->";
              const legacyCommentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
              let commentBody;
              if (esphomeChanges.length === 1) {
                commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
              } else {
                commentBody = generateRepoInstructions(prNumber, owner, repo, context.payload.pull_request.head.ref);
              }
              commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;

              // Check for existing bot comment
              const comments = await github.paginate(
                github.rest.issues.listComments,
                {
                  owner: owner,
                  repo: repo,
                  issue_number: prNumber,
                  per_page: 100,
                }
              );

              const sorted = comments.sort((a, b) => new Date(b.updated_at) - new Date(a.updated_at));

              const botComment = sorted.find(comment =>
                (
                  comment.body.includes(commentMarker) ||
                  comment.body.includes(legacyCommentMarker)
                ) && comment.user.type === "Bot"
              );

              if (botComment && botComment.body === commentBody) {
                // No changes in the comment, do nothing
                return;
              }

              if (botComment) {
                // Update existing comment
                await github.rest.issues.updateComment({
                  owner: owner,
                  repo: repo,
                  comment_id: botComment.id,
                  body: commentBody,
                });
              } else {
                // Create new comment
                await github.rest.issues.createComment({
                  owner: owner,
                  repo: repo,
                  issue_number: prNumber,
                  body: commentBody,
                });
              }
            }

            async function getEsphomeAndComponentChanges(github, owner, repo, prNumber) {
              const changedFiles = await github.rest.pulls.listFiles({
                owner: owner,
                repo: repo,
                pull_number: prNumber,
              });

              const esphomeChanges = changedFiles.data
                .filter(file => file.filename !== "esphome/core/defines.h" && file.filename.startsWith('esphome/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/([^/]+)/);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              if (esphomeChanges.length === 0) {
                return {esphomeChanges: [], componentChanges: []};
              }

              const uniqueEsphomeChanges = [...new Set(esphomeChanges)];
              const componentChanges = changedFiles.data
                .filter(file => file.filename.startsWith('esphome/components/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/components\/([^/]+)\//);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              return {esphomeChanges: uniqueEsphomeChanges, componentChanges: [...new Set(componentChanges)]};
            }

            // Start of main code.

            const prNumber = context.payload.pull_request.number;
            const {owner, repo} = context.repo;

            const {esphomeChanges, componentChanges} = await getEsphomeAndComponentChanges(github, owner, repo, prNumber);
            if (componentChanges.length !== 0) {
              await createComment(github, owner, repo, prNumber, esphomeChanges, componentChanges);
            }
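As a quick usage sketch of the `generateExternalComponentInstructions()` helper defined in the script above (the PR number and component names here are purely hypothetical, and the call assumes the function is in scope):

```js
// Hypothetical call; output shape follows the template string in the helper.
const instructions = generateExternalComponentInstructions(12345, ['api', 'wifi'], 'esphome', 'esphome');
console.log(instructions);
// Produces YAML along these lines:
//   external_components:
//     - source: github://pr#12345
//       components: [api, wifi]
//       refresh: 1h
```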
.github/workflows/issue-codeowner-notify.yml  (vendored, 163 changed lines)
@@ -1,163 +0,0 @@
# This workflow automatically notifies codeowners when an issue is labeled with component labels.
# It reads the CODEOWNERS file to find the maintainers for the labeled components
# and posts a comment mentioning them to ensure they're aware of the issue.

name: Notify Issue Codeowners

on:
  issues:
    types: [labeled]

permissions:
  issues: write
  contents: read

jobs:
  notify-codeowners:
    name: Run
    if: ${{ startsWith(github.event.label.name, format('component{0} ', ':')) }}
    runs-on: ubuntu-latest
    steps:
      - name: Notify codeowners for component issues
        uses: actions/github-script@v7.0.1
        with:
          script: |
            const owner = context.repo.owner;
            const repo = context.repo.repo;
            const issue_number = context.payload.issue.number;
            const labelName = context.payload.label.name;

            console.log(`Processing issue #${issue_number} with label: ${labelName}`);

            // Hidden marker to identify bot comments from this workflow
            const BOT_COMMENT_MARKER = '<!-- issue-codeowner-notify-bot -->';

            // Extract component name from label
            const componentName = labelName.replace('component: ', '');
            console.log(`Component: ${componentName}`);

            try {
              // Fetch CODEOWNERS file from root
              const { data: codeownersFile } = await github.rest.repos.getContent({
                owner,
                repo,
                path: 'CODEOWNERS'
              });
              const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');

              // Parse CODEOWNERS file to extract component mappings
              const codeownersLines = codeownersContent.split('\n')
                .map(line => line.trim())
                .filter(line => line && !line.startsWith('#'));

              let componentOwners = null;

              for (const line of codeownersLines) {
                const parts = line.split(/\s+/);
                if (parts.length < 2) continue;

                const pattern = parts[0];
                const owners = parts.slice(1);

                // Look for component patterns: esphome/components/{component}/*
                const componentMatch = pattern.match(/^esphome\/components\/([^\/]+)\/\*$/);
                if (componentMatch && componentMatch[1] === componentName) {
                  componentOwners = owners;
                  break;
                }
              }

              if (!componentOwners) {
                console.log(`No codeowners found for component: ${componentName}`);
                return;
              }

              console.log(`Found codeowners for '${componentName}': ${componentOwners.join(', ')}`);

              // Separate users and teams
              const userOwners = [];
              const teamOwners = [];

              for (const owner of componentOwners) {
                const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
                if (cleanOwner.includes('/')) {
                  // Team mention (org/team-name)
                  teamOwners.push(`@${cleanOwner}`);
                } else {
                  // Individual user
                  userOwners.push(`@${cleanOwner}`);
                }
              }

              // Remove issue author from mentions to avoid self-notification
              const issueAuthor = context.payload.issue.user.login;
              const filteredUserOwners = userOwners.filter(mention =>
                mention !== `@${issueAuthor}`
              );

              // Check for previous comments from this workflow to avoid duplicate pings
              const comments = await github.paginate(
                github.rest.issues.listComments,
                {
                  owner,
                  repo,
                  issue_number: issue_number
                }
              );

              const previouslyPingedUsers = new Set();
              const previouslyPingedTeams = new Set();

              // Look for comments from github-actions bot that contain codeowner pings for this component
              const workflowComments = comments.filter(comment =>
                comment.user.type === 'Bot' &&
                comment.body.includes(BOT_COMMENT_MARKER) &&
                comment.body.includes(`component: ${componentName}`)
              );

              // Extract previously mentioned users and teams from workflow comments
              for (const comment of workflowComments) {
                // Match @username patterns (not team mentions)
                const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
                userMentions.forEach(mention => {
                  previouslyPingedUsers.add(mention); // Keep @ prefix for easy comparison
                });

                // Match @org/team patterns
                const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+/g) || [];
                teamMentions.forEach(mention => {
                  previouslyPingedTeams.add(mention);
                });
              }

              console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams for component ${componentName}`);

              // Remove previously pinged users and teams
              const newUserOwners = filteredUserOwners.filter(mention => !previouslyPingedUsers.has(mention));
              const newTeamOwners = teamOwners.filter(mention => !previouslyPingedTeams.has(mention));

              const allMentions = [...newUserOwners, ...newTeamOwners];

              if (allMentions.length === 0) {
                console.log('No new codeowners to notify (all previously pinged or issue author is the only codeowner)');
                return;
              }

              // Create comment body
              const mentionString = allMentions.join(', ');
              const commentBody = `${BOT_COMMENT_MARKER}\n👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;

              // Post comment
              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: issue_number,
                body: commentBody
              });

              console.log(`Successfully notified new codeowners: ${mentionString}`);

            } catch (error) {
              console.log('Failed to process codeowner notifications:', error.message);
              console.error(error);
            }
.github/workflows/lock.yml  (vendored, 23 changed lines)
@@ -1,11 +1,28 @@
  ---
- name: Lock closed issues and PRs
+ name: Lock

  on:
    schedule:
-     - cron: "30 0 * * *"  # Run daily at 00:30 UTC
+     - cron: "30 0 * * *"
    workflow_dispatch:

+ permissions:
+   issues: write
+   pull-requests: write
+
+ concurrency:
+   group: lock
+
  jobs:
    lock:
-     uses: esphome/workflows/.github/workflows/lock.yml@main
+     runs-on: ubuntu-latest
+     steps:
+       - uses: dessant/lock-threads@v5.0.1
+         with:
+           pr-inactive-days: "1"
+           pr-lock-reason: ""
+           exclude-any-pr-labels: keep-open
+
+           issue-inactive-days: "7"
+           issue-lock-reason: ""
+           exclude-any-issue-labels: keep-open
.github/workflows/matchers/lint-python.json  (vendored, 4 changed lines)
@@ -1,11 +1,11 @@
  {
    "problemMatcher": [
      {
-       "owner": "ruff",
+       "owner": "black",
        "severity": "error",
        "pattern": [
          {
-           "regexp": "^(.*): (Please format this file with the ruff formatter)",
+           "regexp": "^(.*): (Please format this file with the black formatter)",
            "file": 1,
            "message": 2
          }
.github/workflows/release.yml  (vendored, 133 changed lines)
@@ -18,9 +18,8 @@ jobs:
    outputs:
      tag: ${{ steps.tag.outputs.tag }}
      branch_build: ${{ steps.tag.outputs.branch_build }}
-     deploy_env: ${{ steps.tag.outputs.deploy_env }}
    steps:
-     - uses: actions/checkout@v4.2.2
+     - uses: actions/checkout@v4.1.7
      - name: Get tag
        id: tag
        # yamllint disable rule:line-length
@@ -28,11 +27,6 @@ jobs:
          if [[ "${{ github.event_name }}" = "release" ]]; then
            TAG="${{ github.event.release.tag_name}}"
            BRANCH_BUILD="false"
-           if [[ "${{ github.event.release.prerelease }}" = "true" ]]; then
-             ENVIRONMENT="beta"
-           else
-             ENVIRONMENT="production"
-           fi
          else
            TAG=$(cat esphome/const.py | sed -n -E "s/^__version__\s+=\s+\"(.+)\"$/\1/p")
            today="$(date --utc '+%Y%m%d')"
@@ -41,15 +35,12 @@ jobs:
            if [[ "$BRANCH" != "dev" ]]; then
              TAG="${TAG}-${BRANCH}"
              BRANCH_BUILD="true"
-             ENVIRONMENT=""
            else
              BRANCH_BUILD="false"
-             ENVIRONMENT="dev"
            fi
          fi
          echo "tag=${TAG}" >> $GITHUB_OUTPUT
          echo "branch_build=${BRANCH_BUILD}" >> $GITHUB_OUTPUT
-         echo "deploy_env=${ENVIRONMENT}" >> $GITHUB_OUTPUT
          # yamllint enable rule:line-length

  deploy-pypi:
@@ -60,54 +51,57 @@ jobs:
    steps:
-     - uses: actions/checkout@v4.2.2
+     - uses: actions/checkout@v4.1.7
      - name: Set up Python
-       uses: actions/setup-python@v5.6.0
+       uses: actions/setup-python@v5.3.0
        with:
          python-version: "3.x"
+     - name: Set up python environment
+       env:
+         ESPHOME_NO_VENV: 1
+       run: script/setup
      - name: Publish
-       uses: pypa/gh-action-pypi-publish@v1.12.4
-       with:
-         skip-existing: true
+       uses: pypa/gh-action-pypi-publish@v1.12.2

  deploy-docker:
-   name: Build ESPHome ${{ matrix.platform.arch }}
+   name: Build ESPHome ${{ matrix.platform }}
-   runs-on: ${{ matrix.platform.os }}
+   runs-on: ubuntu-latest
    strategy:
      matrix:
        platform:
-         - arch: amd64
-           os: "ubuntu-24.04"
-         - arch: arm64
-           os: "ubuntu-24.04-arm"
+         - linux/amd64
+         - linux/arm/v7
+         - linux/arm64
    steps:
-     - uses: actions/checkout@v4.2.2
+     - uses: actions/checkout@v4.1.7
      - name: Set up Python
-       uses: actions/setup-python@v5.6.0
+       uses: actions/setup-python@v5.3.0
        with:
-         python-version: "3.11"
+         python-version: "3.9"
      - name: Set up Docker Buildx
-       uses: docker/setup-buildx-action@v3.11.1
+       uses: docker/setup-buildx-action@v3.7.1
+     - name: Set up QEMU
+       if: matrix.platform != 'linux/amd64'
+       uses: docker/setup-qemu-action@v3.2.0
      - name: Log in to docker hub
-       uses: docker/login-action@v3.4.0
+       uses: docker/login-action@v3.3.0
      - name: Log in to the GitHub container registry
-       uses: docker/login-action@v3.4.0
+       uses: docker/login-action@v3.3.0
@@ -116,36 +110,45 @@ jobs:
      - name: Build docker
        uses: ./.github/actions/build-image
        with:
-         target: final
-         build_type: docker
+         platform: ${{ matrix.platform }}
+         target: docker
+         baseimg: docker
          suffix: ""
          version: ${{ needs.init.outputs.tag }}
      - name: Build ha-addon
        uses: ./.github/actions/build-image
        with:
-         target: final
-         build_type: ha-addon
+         platform: ${{ matrix.platform }}
+         target: hassio
+         baseimg: hassio
          suffix: "hassio"
          version: ${{ needs.init.outputs.tag }}
-     # - name: Build lint
-     #   uses: ./.github/actions/build-image
-     #   with:
-     #     target: lint
-     #     build_type: lint
-     #     suffix: lint
-     #     version: ${{ needs.init.outputs.tag }}
+     - name: Build lint
+       uses: ./.github/actions/build-image
+       with:
+         platform: ${{ matrix.platform }}
+         target: lint
+         baseimg: docker
+         suffix: lint
+         version: ${{ needs.init.outputs.tag }}
+     - name: Sanitize platform name
+       id: sanitize
+       run: |
+         echo "${{ matrix.platform }}" | sed 's|/|-|g' > /tmp/platform
+         echo name=$(cat /tmp/platform) >> $GITHUB_OUTPUT
      - name: Upload digests
-       uses: actions/upload-artifact@v4.6.2
+       uses: actions/upload-artifact@v4.4.3
        with:
-         name: digests-${{ matrix.platform.arch }}
+         name: digests-${{ steps.sanitize.outputs.name }}
          path: /tmp/digests
          retention-days: 1

  deploy-manifest:
-   name: Publish ESPHome ${{ matrix.image.build_type }} to ${{ matrix.registry }}
+   name: Publish ESPHome ${{ matrix.image.title }} to ${{ matrix.registry }}
@@ -158,37 +161,40 @@ jobs:
      matrix:
        image:
-         - build_type: "docker"
-           suffix: ""
-         - build_type: "ha-addon"
-           suffix: "hassio"
-         # - build_type: "lint"
-         #   suffix: "lint"
+         - title: "ha-addon"
+           target: "hassio"
+           suffix: "hassio"
+         - title: "docker"
+           target: "docker"
+           suffix: ""
+         - title: "lint"
+           target: "lint"
+           suffix: "lint"
        registry:
          - ghcr
          - dockerhub
    steps:
-     - uses: actions/checkout@v4.2.2
+     - uses: actions/checkout@v4.1.7
      - name: Download digests
-       uses: actions/download-artifact@v4.3.0
+       uses: actions/download-artifact@v4.1.8
      - name: Set up Docker Buildx
-       uses: docker/setup-buildx-action@v3.11.1
+       uses: docker/setup-buildx-action@v3.7.1
      - name: Log in to docker hub
-       uses: docker/login-action@v3.4.0
+       uses: docker/login-action@v3.3.0
      - name: Log in to the GitHub container registry
-       uses: docker/login-action@v3.4.0
+       uses: docker/login-action@v3.3.0
@@ -207,7 +213,7 @@ jobs:
      - name: Create manifest list and push
-       working-directory: /tmp/digests/${{ matrix.image.build_type }}/${{ matrix.registry }}
+       working-directory: /tmp/digests/${{ matrix.image.target }}/${{ matrix.registry }}
        run: |
          docker buildx imagetools create $(jq -Rcnr 'inputs | . / "," | map("-t " + .) | join(" ")' <<< "${{ steps.tags.outputs.tags}}") \
            $(printf '${{ steps.tags.outputs.image }}@sha256:%s ' *)
@@ -238,24 +244,3 @@ jobs:
            content: description
          }
        })

- deploy-esphome-schema:
-   if: github.repository == 'esphome/esphome' && needs.init.outputs.branch_build == 'false'
-   runs-on: ubuntu-latest
-   needs: [init]
-   environment: ${{ needs.init.outputs.deploy_env }}
-   steps:
-     - name: Trigger Workflow
-       uses: actions/github-script@v7.0.1
-       with:
-         github-token: ${{ secrets.DEPLOY_ESPHOME_SCHEMA_REPO_TOKEN }}
-         script: |
-           github.rest.actions.createWorkflowDispatch({
-             owner: "esphome",
-             repo: "esphome-schema",
-             workflow_id: "generate-schemas.yml",
-             ref: "main",
-             inputs: {
-               version: "${{ needs.init.outputs.tag }}",
-             }
-           })
.github/workflows/stale.yml  (vendored, 4 changed lines)
@@ -17,7 +17,7 @@ jobs:
    stale:
      runs-on: ubuntu-latest
      steps:
-       - uses: actions/stale@v9.1.0
+       - uses: actions/stale@v9.0.0
          with:
            days-before-pr-stale: 90
            days-before-pr-close: 7
@@ -37,7 +37,7 @@ jobs:
    close-issues:
      runs-on: ubuntu-latest
      steps:
-       - uses: actions/stale@v9.1.0
+       - uses: actions/stale@v9.0.0
          with:
            days-before-pr-stale: -1
            days-before-pr-close: -1
.github/workflows/sync-device-classes.yml  (vendored, 14 changed lines)
@@ -13,18 +13,18 @@ jobs:
    if: github.repository == 'esphome/esphome'
    steps:
      - name: Checkout
-       uses: actions/checkout@v4.2.2
+       uses: actions/checkout@v4.1.7

      - name: Checkout Home Assistant
-       uses: actions/checkout@v4.2.2
+       uses: actions/checkout@v4.1.7
        with:
          repository: home-assistant/core
          path: lib/home-assistant

      - name: Setup Python
-       uses: actions/setup-python@v5.6.0
+       uses: actions/setup-python@v5.3.0
        with:
-         python-version: 3.13
+         python-version: 3.12

      - name: Install Home Assistant
        run: |
@@ -36,11 +36,11 @@ jobs:
          python ./script/sync-device_class.py

      - name: Commit changes
-       uses: peter-evans/create-pull-request@v7.0.8
+       uses: peter-evans/create-pull-request@v7.0.5
        with:
          commit-message: "Synchronise Device Classes from Home Assistant"
-         committer: esphomebot <esphome@openhomefoundation.org>
-         author: esphomebot <esphome@openhomefoundation.org>
+         committer: esphomebot <esphome@nabucasa.com>
+         author: esphomebot <esphome@nabucasa.com>
          branch: sync/device-classes
          delete-branch: true
          title: "Synchronise Device Classes from Home Assistant"
.github/workflows/yaml-lint.yml  (vendored, new file, 25 lines)
@@ -0,0 +1,25 @@
---
name: YAML lint

on:
  push:
    branches: [dev, beta, release]
    paths:
      - "**.yaml"
      - "**.yml"
  pull_request:
    paths:
      - "**.yaml"
      - "**.yml"

jobs:
  yamllint:
    name: yamllint
    runs-on: ubuntu-latest
    steps:
      - name: Check out code from GitHub
        uses: actions/checkout@v4.1.7
      - name: Run yamllint
        uses: frenck/action-yamllint@v1.5.0
        with:
          strict: true
.gitignore  (vendored, 1 changed line)
@@ -143,4 +143,3 @@ sdkconfig.*
  /components
  /managed_components

- api-docs/
.pre-commit-config.yaml
@@ -1,51 +1,49 @@
  ---
  # See https://pre-commit.com for more information
  # See https://pre-commit.com/hooks.html for more hooks

- ci:
-   autoupdate_commit_msg: 'pre-commit: autoupdate'
-   autoupdate_schedule: off  # Disabled until ruff versions are synced between deps and pre-commit
-   # Skip hooks that have issues in pre-commit CI environment
-   skip: [pylint, clang-tidy-hash]

  repos:
    - repo: https://github.com/astral-sh/ruff-pre-commit
      # Ruff version.
-     rev: v0.12.4
+     rev: v0.5.4
      hooks:
        # Run the linter.
        - id: ruff
          args: [--fix]
        # Run the formatter.
        - id: ruff-format
+   - repo: https://github.com/psf/black-pre-commit-mirror
+     rev: 24.4.2
+     hooks:
+       - id: black
+         args:
+           - --safe
+           - --quiet
+         files: ^((esphome|script|tests)/.+)?[^/]+\.py$
    - repo: https://github.com/PyCQA/flake8
-     rev: 7.3.0
+     rev: 6.1.0
      hooks:
        - id: flake8
          additional_dependencies:
-           - flake8-docstrings==1.7.0
+           - flake8-docstrings==1.5.0
            - pydocstyle==5.1.1
          files: ^(esphome|tests)/.+\.py$
    - repo: https://github.com/pre-commit/pre-commit-hooks
-     rev: v5.0.0
+     rev: v3.4.0
      hooks:
        - id: no-commit-to-branch
          args:
            - --branch=dev
            - --branch=release
            - --branch=beta
-       - id: end-of-file-fixer
-       - id: trailing-whitespace
    - repo: https://github.com/asottile/pyupgrade
-     rev: v3.20.0
+     rev: v3.15.2
      hooks:
        - id: pyupgrade
-         args: [--py311-plus]
+         args: [--py39-plus]
    - repo: https://github.com/adrienverge/yamllint.git
-     rev: v1.37.1
+     rev: v1.35.1
      hooks:
        - id: yamllint
-         exclude: ^(\.clang-format|\.clang-tidy)$
    - repo: https://github.com/pre-commit/mirrors-clang-format
      rev: v13.0.1
      hooks:
@@ -55,13 +53,6 @@ repos:
      hooks:
        - id: pylint
          name: pylint
-         entry: python3 script/run-in-env.py pylint
-         language: system
+         entry: script/run-in-env.sh pylint
+         language: script
          types: [python]
-       - id: clang-tidy-hash
-         name: Update clang-tidy hash
-         entry: python script/clang_tidy_hash.py --update-if-changed
-         language: python
-         files: ^(\.clang-tidy|platformio\.ini|requirements_dev\.txt)$
-         pass_filenames: false
-         additional_dependencies: []
CODEOWNERS  (65 changed lines)
@@ -9,7 +9,6 @@
- .github/** @esphome/core
@@ -29,7 +28,7 @@ esphome/components/aic3204/* @kbx81
- esphome/components/airthings_wave_plus/* @jeromelaban @precurse
+ esphome/components/airthings_wave_plus/* @jeromelaban
@@ -50,7 +49,6 @@ esphome/components/atc_mithermometer/* @ahpohl
- esphome/components/audio_adc/* @kbx81
@@ -88,20 +86,16 @@ esphome/components/bp1658cj/* @Cossid
- esphome/components/camera/* @DT-art1 @bdraco
- esphome/components/chsc6x/* @kkosik20
- esphome/components/cm1106/* @andrewjswan
- esphome/components/const/* @esphome/core
@@ -126,7 +120,6 @@ esphome/components/dht/* @OttoWinter
- esphome/components/ds2484/* @mrk-its
@@ -138,26 +131,19 @@ esphome/components/ens160_base/* @latonita @vincentscode
- esphome/components/es7210/* @kahrendt
- esphome/components/es7243e/* @kbx81
- esphome/components/es8156/* @kbx81
- esphome/components/es8388/* @P4uLT
- esphome/components/esp32_hosted/* @swoboda1337
- esphome/components/esp_ldo/* @clydebarrow
- esphome/components/event_emitter/* @Rapsssito
@@ -171,13 +157,12 @@ esphome/components/ft5x06/* @clydebarrow
- esphome/components/gl_r01_i2c/* @pkejval
- esphome/components/gps/* @coogle @ximex
+ esphome/components/gps/* @coogle
@@ -194,7 +179,6 @@ esphome/components/haier/text_sensor/* @paveldn
- esphome/components/hbridge/switch/* @dwmw2
@@ -241,29 +225,22 @@ esphome/components/kamstrup_kmp/* @cfeenstra1024
- esphome/components/lc709203f/* @ilikecake
- esphome/components/ld2450/* @hareeshmu
- esphome/components/ld24xx/* @kbx81
- esphome/components/ln882x/* @lamauny
- esphome/components/logger/select/* @clydebarrow
- esphome/components/lps22/* @nagisa
- esphome/components/mapping/* @clydebarrow
@@ -280,7 +257,6 @@ esphome/components/mcp23x17_base/* @jesserockz
- esphome/components/mcp4461/* @p1ngb4ck
@@ -290,13 +266,11 @@ esphome/components/mdns/* @esphome/core
- esphome/components/microphone/* @jesserockz @kahrendt
+ esphome/components/microphone/* @jesserockz
- esphome/components/mipi_spi/* @clydebarrow
- esphome/components/mixer/speaker/* @kahrendt
@@ -315,7 +289,6 @@ esphome/components/mopeka_std_check/* @Fabian-Schmidt
- esphome/components/msa3xx/* @latonita
@@ -326,27 +299,20 @@ esphome/components/nextion/text_sensor/* @senexcrenshaw
- esphome/components/nrf52/* @tomaszduda23
- esphome/components/online_image/* @clydebarrow @guillempages
+ esphome/components/online_image/* @guillempages
- esphome/components/openthread/* @mrene
- esphome/components/opt3001/* @ccutrer
- esphome/components/packet_transport/* @clydebarrow
- esphome/components/pi4ioe5v6408/* @jesserockz
- esphome/components/pm2005/* @andrewjswan
- esphome/components/pmsx003/* @ximex
@@ -371,7 +337,7 @@ esphome/components/radon_eye_rd200/* @jeffeb3
- esphome/components/resampler/speaker/* @kahrendt
+ esphome/components/resistance_sampler/* @jesserockz
@@ -381,16 +347,13 @@ esphome/components/rp2040_pwm/* @jesserockz
- esphome/components/runtime_stats/* @bdraco
- esphome/components/sdl/* @bdm310 @clydebarrow
+ esphome/components/sdl/* @clydebarrow
- esphome/components/seeed_mr60bha2/* @limengdu
- esphome/components/seeed_mr60fda2/* @limengdu
@@ -416,9 +379,7 @@ esphome/components/smt100/* @piechade
- esphome/components/sound_level/* @kahrendt
- esphome/components/speaker/media_player/* @kahrendt @synesthesiam
@@ -447,10 +408,6 @@ esphome/components/substitutions/* @esphome/core
- esphome/components/switch/binary_sensor/* @ssieb
- esphome/components/sx126x/* @swoboda1337
- esphome/components/sx127x/* @swoboda1337
- esphome/components/syslog/* @clydebarrow
@@ -476,7 +433,6 @@ esphome/components/tmp102/* @timsavage
- esphome/components/tormatic/* @ti-mo
@@ -490,25 +446,21 @@ esphome/components/tuya/switch/* @jesserockz
- esphome/components/uart/packet_transport/* @clydebarrow
- esphome/components/usb_host/* @clydebarrow
- esphome/components/usb_uart/* @clydebarrow
- esphome/components/voice_assistant/* @jesserockz @kahrendt
+ esphome/components/voice_assistant/* @jesserockz
- esphome/components/web_server/ota/* @esphome/core
@@ -535,10 +487,7 @@ esphome/components/xiaomi_lywsd03mmc/* @ahpohl
- esphome/components/xiaomi_xmwsdj04mmc/* @medusalix
- esphome/components/xxtea/* @clydebarrow
- esphome/components/zephyr/* @tomaszduda23
CODE_OF_CONDUCT.md
@@ -34,7 +34,7 @@ This Code of Conduct applies both within project spaces and in public spaces whe

  ## Enforcement

- Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at esphome@openhomefoundation.org. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
+ Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at esphome@nabucasa.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

  Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
CONTRIBUTING.md
@@ -1,14 +1,12 @@
- # Contributing to ESPHome [](https://discord.gg/KhAMKrd) [](https://GitHub.com/esphome/esphome/releases/)
-
- We welcome contributions to the ESPHome suite of code and documentation!
-
- Please read our [contributing guide](https://esphome.io/guides/contributing.html) if you wish to contribute to the
- project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).
-
- **See also:**
-
- [Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
-
- ---
-
- [](https://www.openhomefoundation.org/)
+ # Contributing to ESPHome
+
+ For a detailed guide, please see https://esphome.io/guides/contributing.html#contributing-to-esphome
+
+ Things to note when contributing:
+
+ - Please test your changes :)
+ - If a new feature is added or an existing user-facing feature is changed, you should also
+   update the [docs](https://github.com/esphome/esphome-docs). See [contributing to esphome-docs](https://esphome.io/guides/contributing.html#contributing-to-esphomedocs)
+   for more information.
+ - Please also update the tests in the `tests/` folder. You can do so by just adding a line in one of the YAML files
+   which checks if your new feature compiles correctly.
README.md  (13 changed lines)
@@ -1,16 +1,11 @@
  # ESPHome [](https://discord.gg/KhAMKrd) [](https://GitHub.com/esphome/esphome/releases/)

- <a href="https://esphome.io/">
-   <picture>
-     <source media="(prefers-color-scheme: dark)" srcset="https://esphome.io/_static/logo-text-on-dark.svg", alt="ESPHome Logo">
-     <img src="https://esphome.io/_static/logo-text-on-light.svg" alt="ESPHome Logo">
-   </picture>
- </a>
-
- ---
-
- [Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
-
- ---
+ [](https://esphome.io/)
+
+ **Documentation:** https://esphome.io/
+
+ For issues, please go to [the issue tracker](https://github.com/esphome/issues/issues).
+
+ For feature requests, please see [feature requests](https://github.com/esphome/feature-requests/issues).

  [](https://www.openhomefoundation.org/)
@@ -1,56 +1,153 @@
-ARG BUILD_VERSION=dev
-ARG BUILD_OS=alpine
-ARG BUILD_BASE_VERSION=2025.04.0
-ARG BUILD_TYPE=docker
+# Build these with the build.py script
+# Example:
+# python3 docker/build.py --tag dev --arch amd64 --build-type docker build

-FROM ghcr.io/esphome/docker-base:${BUILD_OS}-${BUILD_BASE_VERSION} AS base-source-docker
-FROM ghcr.io/esphome/docker-base:${BUILD_OS}-ha-addon-${BUILD_BASE_VERSION} AS base-source-ha-addon
-
-ARG BUILD_TYPE
-FROM base-source-${BUILD_TYPE} AS base
-
-RUN git config --system --add safe.directory "*"
+# One of "docker", "hassio"
+ARG BASEIMGTYPE=docker

-ENV PIP_DISABLE_PIP_VERSION_CHECK=1
+# https://github.com/hassio-addons/addon-debian-base/releases
+FROM ghcr.io/hassio-addons/debian-base:7.2.0 AS base-hassio
+# https://hub.docker.com/_/debian?tab=tags&page=1&name=bookworm
+FROM debian:12.2-slim AS base-docker

-RUN pip install --no-cache-dir -U pip uv==0.6.14
+FROM base-${BASEIMGTYPE} AS base

-COPY requirements.txt /
+ARG TARGETARCH
+ARG TARGETVARIANT

+# Note that --break-system-packages is used below because
+# https://peps.python.org/pep-0668/ added a safety check that prevents
+# installing packages with the same name as a system package. This is
+# not a problem for us because we are not concerned about overwriting
+# system packages because we are running in an isolated container.

 RUN \
-uv pip install --no-cache-dir \
--r /requirements.txt
+apt-get update \
+# Use pinned versions so that we get updates with build caching
+&& apt-get install -y --no-install-recommends \
+python3-pip=23.0.1+dfsg-1 \
+python3-setuptools=66.1.1-1 \
+python3-venv=3.11.2-1+b1 \
+python3-wheel=0.38.4-2 \
+iputils-ping=3:20221126-1+deb12u1 \
+git=1:2.39.5-0+deb12u1 \
+curl=7.88.1-10+deb12u8 \
+openssh-client=1:9.2p1-2+deb12u3 \
+python3-cffi=1.15.1-5 \
+libcairo2=1.16.0-7 \
+libmagic1=1:5.44-3 \
+patch=2.7.6-7 \
+&& rm -rf \
+/tmp/* \
+/var/{cache,log}/* \
+/var/lib/apt/lists/*
+
+ENV \
+# Fix click python3 lang warning https://click.palletsprojects.com/en/7.x/python3/
+LANG=C.UTF-8 LC_ALL=C.UTF-8 \
+# Store globally installed pio libs in /piolibs
+PLATFORMIO_GLOBALLIB_DIR=/piolibs
+
+# Support legacy binaries on Debian multiarch system. There is no "correct" way
+# to do this, other than using properly built toolchains...
+# See: https://unix.stackexchange.com/questions/553743/correct-way-to-add-lib-ld-linux-so-3-in-debian
+RUN \
+if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]; then \
+ln -s /lib/arm-linux-gnueabihf/ld-linux-armhf.so.3 /lib/ld-linux.so.3; \
+fi

 RUN \
-platformio settings set enable_telemetry No \
+# Ubuntu python3-pip is missing wheel
+if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]; then \
+export PIP_EXTRA_INDEX_URL="https://www.piwheels.org/simple"; \
+fi; \
+pip3 install \
+--break-system-packages --no-cache-dir \
+# Keep platformio version in sync with requirements.txt
+platformio==6.1.16 \
+# Change some platformio settings
+&& platformio settings set enable_telemetry No \
 && platformio settings set check_platformio_interval 1000000 \
 && mkdir -p /piolibs

+# First install requirements to leverage caching when requirements don't change
+# tmpfs is for https://github.com/rust-lang/cargo/issues/8719
+COPY requirements.txt requirements_optional.txt /
+RUN --mount=type=tmpfs,target=/root/.cargo <<END-OF-RUN
+# Fail on any non-zero status
+set -e
+
+if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]
+then
+curl -L https://www.piwheels.org/cp311/cryptography-43.0.0-cp37-abi3-linux_armv7l.whl -o /tmp/cryptography-43.0.0-cp37-abi3-linux_armv7l.whl
+pip3 install --break-system-packages --no-cache-dir /tmp/cryptography-43.0.0-cp37-abi3-linux_armv7l.whl
+rm /tmp/cryptography-43.0.0-cp37-abi3-linux_armv7l.whl
+export PIP_EXTRA_INDEX_URL="https://www.piwheels.org/simple";
+fi
+
+# install build tools in case wheels are not available
+BUILD_DEPS="
+build-essential=12.9
+python3-dev=3.11.2-1+b1
+zlib1g-dev=1:1.2.13.dfsg-1
+libjpeg-dev=1:2.1.5-2
+libfreetype-dev=2.12.1+dfsg-5+deb12u3
+libssl-dev=3.0.15-1~deb12u1
+libffi-dev=3.4.4-1
+libopenjp2-7=2.5.0-2
+libtiff6=4.5.0-6+deb12u1
+cargo=0.66.0+ds1-1
+pkg-config=1.8.1-1
+"
+if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ] || [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]
+then
+apt-get update
+apt-get install -y --no-install-recommends $BUILD_DEPS
+fi
+
+CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse CARGO_HOME=/root/.cargo
+pip3 install --break-system-packages --no-cache-dir -r /requirements.txt -r /requirements_optional.txt
+
+if [ "$TARGETARCH$TARGETVARIANT" = "arm64" ] || [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]
+then
+apt-get remove -y --purge --auto-remove $BUILD_DEPS
+rm -rf /tmp/* /var/{cache,log}/* /var/lib/apt/lists/*
+fi
+END-OF-RUN

 COPY script/platformio_install_deps.py platformio.ini /
 RUN /platformio_install_deps.py /platformio.ini --libraries

-ARG BUILD_VERSION
-
-LABEL \
-org.opencontainers.image.authors="The ESPHome Authors" \
-org.opencontainers.image.title="ESPHome" \
-org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
-org.opencontainers.image.url="https://esphome.io/" \
-org.opencontainers.image.documentation="https://esphome.io/" \
-org.opencontainers.image.source="https://github.com/esphome/esphome" \
-org.opencontainers.image.licenses="ESPHome" \
-org.opencontainers.image.version=${BUILD_VERSION}
+# Avoid unsafe git error when container user and file config volume permissions don't match
+RUN git config --system --add safe.directory '*'

 # ======================= docker-type image =======================
-FROM base AS base-docker
+FROM base AS docker
+
+# Copy esphome and install
+COPY . /esphome
+RUN if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]; then \
+export PIP_EXTRA_INDEX_URL="https://www.piwheels.org/simple"; \
+fi; \
+pip3 install \
+--break-system-packages --no-cache-dir -e /esphome
+
+# Settings for dashboard
+ENV USERNAME="" PASSWORD=""

 # Expose the dashboard to Docker
 EXPOSE 6052

 # Run healthcheck (heartbeat)
 HEALTHCHECK --interval=30s --timeout=30s \
 CMD curl --fail http://localhost:6052/version -A "HealthCheck" || exit 1

 COPY docker/docker_entrypoint.sh /entrypoint.sh

@@ -64,23 +161,73 @@ ENTRYPOINT ["/entrypoint.sh"]
 CMD ["dashboard", "/config"]

-# ======================= ha-addon-type image =======================
-FROM base AS base-ha-addon
+# ======================= hassio-type image =======================
+FROM base AS hassio
+
+RUN \
+apt-get update \
+# Use pinned versions so that we get updates with build caching
+&& apt-get install -y --no-install-recommends \
+nginx-light=1.22.1-9 \
+&& rm -rf \
+/tmp/* \
+/var/{cache,log}/* \
+/var/lib/apt/lists/*
+
+ARG BUILD_VERSION=dev

 # Copy root filesystem
 COPY docker/ha-addon-rootfs/ /

-ARG BUILD_VERSION
+# Copy esphome and install
+COPY . /esphome
+RUN if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]; then \
+export PIP_EXTRA_INDEX_URL="https://www.piwheels.org/simple"; \
+fi; \
+pip3 install \
+--break-system-packages --no-cache-dir -e /esphome

+# Labels
 LABEL \
 io.hass.name="ESPHome" \
-io.hass.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
+io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
 io.hass.type="addon" \
 io.hass.version="${BUILD_VERSION}"
 # io.hass.arch is inherited from addon-debian-base

-ARG BUILD_TYPE
-FROM base-${BUILD_TYPE} AS final
-
-# Copy esphome and install
-COPY . /esphome
-RUN uv pip install --no-cache-dir -e /esphome
+# ======================= lint-type image =======================
+FROM base AS lint
+
+ENV \
+PLATFORMIO_CORE_DIR=/esphome/.temp/platformio
+
+RUN \
+apt-get update \
+# Use pinned versions so that we get updates with build caching
+&& apt-get install -y --no-install-recommends \
+clang-format-13=1:13.0.1-11+b2 \
+clang-tidy-14=1:14.0.6-12 \
+patch=2.7.6-7 \
+software-properties-common=0.99.30-4.1~deb12u1 \
+nano=7.2-1+deb12u1 \
+build-essential=12.9 \
+python3-dev=3.11.2-1+b1 \
+&& rm -rf \
+/tmp/* \
+/var/{cache,log}/* \
+/var/lib/apt/lists/*
+
+COPY requirements_test.txt /
+RUN if [ "$TARGETARCH$TARGETVARIANT" = "armv7" ]; then \
+export PIP_EXTRA_INDEX_URL="https://www.piwheels.org/simple"; \
+fi; \
+pip3 install \
+--break-system-packages --no-cache-dir -r /requirements_test.txt
+
+VOLUME ["/esphome"]
+WORKDIR /esphome
@@ -1,19 +1,22 @@
 #!/usr/bin/env python3
-import argparse
 from dataclasses import dataclass
-import re
-import shlex
 import subprocess
+import argparse
+from platform import machine
+import shlex
+import re
 import sys

 CHANNEL_DEV = "dev"
 CHANNEL_BETA = "beta"
 CHANNEL_RELEASE = "release"
 CHANNELS = [CHANNEL_DEV, CHANNEL_BETA, CHANNEL_RELEASE]

 ARCH_AMD64 = "amd64"
+ARCH_ARMV7 = "armv7"
 ARCH_AARCH64 = "aarch64"
-ARCHS = [ARCH_AMD64, ARCH_AARCH64]
+ARCHS = [ARCH_AMD64, ARCH_ARMV7, ARCH_AARCH64]

 TYPE_DOCKER = "docker"
 TYPE_HA_ADDON = "ha-addon"
@@ -54,7 +57,7 @@ manifest_parser = subparsers.add_parser(
 class DockerParams:
 build_to: str
 manifest_to: str
-build_type: str
+baseimgtype: str
 platform: str
 target: str

@@ -66,19 +69,25 @@ class DockerParams:
 TYPE_LINT: "esphome/esphome-lint",
 }[build_type]
 build_to = f"{prefix}-{arch}"
+baseimgtype = {
+TYPE_DOCKER: "docker",
+TYPE_HA_ADDON: "hassio",
+TYPE_LINT: "docker",
+}[build_type]
 platform = {
 ARCH_AMD64: "linux/amd64",
+ARCH_ARMV7: "linux/arm/v7",
 ARCH_AARCH64: "linux/arm64",
 }[arch]
 target = {
-TYPE_DOCKER: "final",
-TYPE_HA_ADDON: "final",
+TYPE_DOCKER: "docker",
+TYPE_HA_ADDON: "hassio",
 TYPE_LINT: "lint",
 }[build_type]
 return cls(
 build_to=build_to,
 manifest_to=prefix,
-build_type=build_type,
+baseimgtype=baseimgtype,
 platform=platform,
 target=target,
 )
@@ -140,7 +149,7 @@ def main():
 "buildx",
 "build",
 "--build-arg",
-f"BUILD_TYPE={params.build_type}",
+f"BASEIMGTYPE={params.baseimgtype}",
 "--build-arg",
 f"BUILD_VERSION={args.tag}",
 "--cache-from",
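The DockerParams change above is the heart of this hunk: on the dev side the script passes a single BUILD_TYPE build argument and always builds the `final` Dockerfile stage, while on the 2024.11.0b1 side it derives a separate BASEIMGTYPE plus a per-type target stage and still knows about the armv7 architecture. A minimal sketch of the 2024.11.0b1-side mappings follows, assuming the `platform` and `target` fields end up as `--platform`/`--target` on the `docker buildx build` call (the real script also passes further flags such as `--cache-from`); it mirrors the tables above but is not a copy of docker/build.py.

```python
# Illustrative only: mirrors the 2024.11.0b1-side DockerParams tables shown above.
ARCH_TO_PLATFORM = {"amd64": "linux/amd64", "armv7": "linux/arm/v7", "aarch64": "linux/arm64"}
TYPE_TO_BASEIMGTYPE = {"docker": "docker", "ha-addon": "hassio", "lint": "docker"}
TYPE_TO_TARGET = {"docker": "docker", "ha-addon": "hassio", "lint": "lint"}


def buildx_args(arch: str, build_type: str, tag: str) -> list[str]:
    """Values fed to `docker buildx build` (placement of --platform/--target is assumed)."""
    return [
        "--build-arg", f"BASEIMGTYPE={TYPE_TO_BASEIMGTYPE[build_type]}",
        "--build-arg", f"BUILD_VERSION={tag}",
        "--platform", ARCH_TO_PLATFORM[arch],
        "--target", TYPE_TO_TARGET[build_type],
    ]


print(buildx_args("armv7", "ha-addon", "2024.11.0b1"))
# ['--build-arg', 'BASEIMGTYPE=hassio', '--build-arg', 'BUILD_VERSION=2024.11.0b1',
#  '--platform', 'linux/arm/v7', '--target', 'hassio']
```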
@@ -1,6 +1,6 @@
 #!/usr/bin/env python3
-import argparse
 import re
+import argparse

 CHANNEL_DEV = "dev"
 CHANNEL_BETA = "beta"
@@ -23,6 +23,10 @@ if bashio::config.true 'streamer_mode'; then
 export ESPHOME_STREAMER_MODE=true
 fi

+if bashio::config.true 'status_use_ping'; then
+export ESPHOME_DASHBOARD_USE_PING=true
+fi
+
 if bashio::config.has_value 'relative_url'; then
 export ESPHOME_DASHBOARD_RELATIVE_URL=$(bashio::config 'relative_url')
 fi
@@ -2,7 +2,6 @@
 import argparse
 from datetime import datetime
 import functools
-import importlib
 import logging
 import os
 import re
@@ -34,15 +33,16 @@ from esphome.const import (
 CONF_PORT,
 CONF_SUBSTITUTIONS,
 CONF_TOPIC,
-ENV_NOGITIGNORE,
+PLATFORM_BK72XX,
 PLATFORM_ESP32,
 PLATFORM_ESP8266,
 PLATFORM_RP2040,
+PLATFORM_RTL87XX,
 SECRETS_FILES,
 )
 from esphome.core import CORE, EsphomeError, coroutine
 from esphome.helpers import get_bool_env, indent, is_ip_address
-from esphome.log import AnsiFore, color, setup_log
+from esphome.log import Fore, color, setup_log
 from esphome.util import (
 get_serial_ports,
 list_yaml_files,
@@ -66,7 +66,7 @@ def choose_prompt(options, purpose: str = None):
 return options[0][1]

 safe_print(
-f"Found multiple options{f' for {purpose}' if purpose else ''}, please choose one:"
+f'Found multiple options{f" for {purpose}" if purpose else ""}, please choose one:'
 )
 for i, (desc, _) in enumerate(options):
 safe_print(f"  [{i + 1}] {desc}")
@@ -82,7 +82,7 @@ def choose_prompt(options, purpose: str = None):
 raise ValueError
 break
 except ValueError:
-safe_print(color(AnsiFore.RED, f"Invalid option: '{opt}'"))
+safe_print(color(Fore.RED, f"Invalid option: '{opt}'"))
 return options[opt - 1][1]

@@ -132,8 +132,7 @@ def get_port_type(port):
 return "NETWORK"

-def run_miniterm(config, port, args):
-from aioesphomeapi import LogParser
+def run_miniterm(config, port):
 import serial

 from esphome import platformio_api
@@ -154,11 +153,10 @@ def run_miniterm(config, port, args):
 # We can't set to False by default since it leads to toggling and hence
 # ESP32 resets on some platforms.
-if config["logger"][CONF_DEASSERT_RTS_DTR] or args.reset:
+if config["logger"][CONF_DEASSERT_RTS_DTR]:
 ser.dtr = False
 ser.rts = False

-parser = LogParser()
 tries = 0
 while tries < 5:
 try:
@@ -175,7 +173,8 @@ def run_miniterm(config, port, args):
 .decode("utf8", "backslashreplace")
 )
 time_str = datetime.now().time().strftime("[%H:%M:%S]")
-safe_print(parser.parse_line(line, time_str))
+message = time_str + line
+safe_print(message)

 backtrace_state = platformio_api.process_stacktrace(
 config, line, backtrace_state=backtrace_state
@@ -210,9 +209,6 @@ def wrap_to_code(name, comp):

 def write_cpp(config):
-if not get_bool_env(ENV_NOGITIGNORE):
-writer.write_gitignore()
-
 generate_cpp_contents(config)
 return write_cpp_file()

@@ -229,13 +225,10 @@ def generate_cpp_contents(config):

 def write_cpp_file():
+writer.write_platformio_project()
+
 code_s = indent(CORE.cpp_main_section)
 writer.write_cpp(code_s)

-from esphome.build_gen import platformio
-
-platformio.write_project()
-
 return 0

@@ -250,11 +243,11 @@ def compile_program(args, config):
 return 0 if idedata is not None else 1

-def upload_using_esptool(config, port, file, speed):
+def upload_using_esptool(config, port, file):
 from esphome import platformio_api

-first_baudrate = speed or config[CONF_ESPHOME][CONF_PLATFORMIO_OPTIONS].get(
-"upload_speed", os.getenv("ESPHOME_UPLOAD_SPEED", "460800")
+first_baudrate = config[CONF_ESPHOME][CONF_PLATFORMIO_OPTIONS].get(
+"upload_speed", 460800
 )

 if file is not None:
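The `upload_using_esptool` change above also changes how the ESP32/ESP8266 upload baud rate is picked: the dev side lets a `--upload_speed` CLI value win, then falls back to the `upload_speed` PlatformIO option and finally to the `ESPHOME_UPLOAD_SPEED` environment variable, while 2024.11.0b1 only knows the PlatformIO option with a hard-coded 460800 default. A small sketch of the dev-side fallback chain, with `pio_options` standing in for `config[CONF_ESPHOME][CONF_PLATFORMIO_OPTIONS]`:

```python
import os


def first_baudrate(speed: str | None, pio_options: dict) -> str:
    # Same precedence as the dev-side line above: CLI flag, then the PlatformIO
    # option, then ESPHOME_UPLOAD_SPEED, then "460800".
    return speed or pio_options.get("upload_speed", os.getenv("ESPHOME_UPLOAD_SPEED", "460800"))


print(first_baudrate(None, {}))                              # "460800" unless ESPHOME_UPLOAD_SPEED is set
print(first_baudrate("115200", {"upload_speed": "921600"}))  # "115200": the CLI flag wins
```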
@@ -343,23 +336,16 @@ def check_permissions(port):

 def upload_program(config, args, host):
-try:
-module = importlib.import_module("esphome.components." + CORE.target_platform)
-if getattr(module, "upload_program")(config, args, host):
-return 0
-except AttributeError:
-pass
-
 if get_port_type(host) == "SERIAL":
 check_permissions(host)
 if CORE.target_platform in (PLATFORM_ESP32, PLATFORM_ESP8266):
 file = getattr(args, "file", None)
-return upload_using_esptool(config, host, file, args.upload_speed)
+return upload_using_esptool(config, host, file)

 if CORE.target_platform in (PLATFORM_RP2040):
 return upload_using_platformio(config, args.device)

-if CORE.is_libretiny:
+if CORE.target_platform in (PLATFORM_BK72XX, PLATFORM_RTL87XX):
 return upload_using_platformio(config, host)

 return 1  # Unknown target platform
@@ -377,16 +363,14 @@ def upload_program(config, args, host):

 from esphome import espota2

-remote_port = int(ota_conf[CONF_PORT])
+remote_port = ota_conf[CONF_PORT]
 password = ota_conf.get(CONF_PASSWORD, "")

 if (
-CONF_MQTT in config  # pylint: disable=too-many-boolean-expressions
+not is_ip_address(CORE.address)  # pylint: disable=too-many-boolean-expressions
+and (get_port_type(host) == "MQTT" or config[CONF_MDNS][CONF_DISABLED])
+and CONF_MQTT in config
 and (not args.device or args.device in ("MQTT", "OTA"))
-and (
-((config[CONF_MDNS][CONF_DISABLED]) and not is_ip_address(CORE.address))
-or get_port_type(host) == "MQTT"
-)
 ):
 from esphome import mqtt

@@ -405,7 +389,7 @@ def show_logs(config, args, port):
 raise EsphomeError("Logger is not configured!")
 if get_port_type(port) == "SERIAL":
 check_permissions(port)
-return run_miniterm(config, port, args)
+return run_miniterm(config, port)
 if get_port_type(port) == "NETWORK" and "api" in config:
 if config[CONF_MDNS][CONF_DISABLED] and CONF_MQTT in config:
 from esphome import mqtt
@@ -599,38 +583,33 @@ def command_update_all(args):
 middle_text = f" {middle_text} "
 width = len(click.unstyle(middle_text))
 half_line = "=" * ((twidth - width) // 2)
-safe_print(f"{half_line}{middle_text}{half_line}")
+click.echo(f"{half_line}{middle_text}{half_line}")

 for f in files:
-safe_print(f"Updating {color(AnsiFore.CYAN, f)}")
-safe_print("-" * twidth)
-safe_print()
-if CORE.dashboard:
-rc = run_external_process(
-"esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
-)
-else:
-rc = run_external_process(
-"esphome", "run", f, "--no-logs", "--device", "OTA"
-)
+print(f"Updating {color(Fore.CYAN, f)}")
+print("-" * twidth)
+print()
+rc = run_external_process(
+"esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
+)
 if rc == 0:
-print_bar(f"[{color(AnsiFore.BOLD_GREEN, 'SUCCESS')}] {f}")
+print_bar(f"[{color(Fore.BOLD_GREEN, 'SUCCESS')}] {f}")
 success[f] = True
 else:
-print_bar(f"[{color(AnsiFore.BOLD_RED, 'ERROR')}] {f}")
+print_bar(f"[{color(Fore.BOLD_RED, 'ERROR')}] {f}")
 success[f] = False

-safe_print()
-safe_print()
-safe_print()
+print()
+print()
+print()

-print_bar(f"[{color(AnsiFore.BOLD_WHITE, 'SUMMARY')}]")
+print_bar(f"[{color(Fore.BOLD_WHITE, 'SUMMARY')}]")
 failed = 0
 for f in files:
 if success[f]:
-safe_print(f"  - {f}: {color(AnsiFore.GREEN, 'SUCCESS')}")
+print(f"  - {f}: {color(Fore.GREEN, 'SUCCESS')}")
 else:
-safe_print(f"  - {f}: {color(AnsiFore.BOLD_RED, 'FAILED')}")
+print(f"  - {f}: {color(Fore.BOLD_RED, 'FAILED')}")
 failed += 1
 return failed

@@ -656,7 +635,7 @@ def command_rename(args, config):
 if c not in ALLOWED_NAME_CHARS:
 print(
 color(
-AnsiFore.BOLD_RED,
+Fore.BOLD_RED,
 f"'{c}' is an invalid character for names. Valid characters are: "
 f"{ALLOWED_NAME_CHARS} (lowercase, no spaces)",
 )
@@ -669,9 +648,7 @@ def command_rename(args, config):
 yaml = yaml_util.load_yaml(CORE.config_path)
 if CONF_ESPHOME not in yaml or CONF_NAME not in yaml[CONF_ESPHOME]:
 print(
-color(
-AnsiFore.BOLD_RED, "Complex YAML files cannot be automatically renamed."
-)
+color(Fore.BOLD_RED, "Complex YAML files cannot be automatically renamed.")
 )
 return 1
 old_name = yaml[CONF_ESPHOME][CONF_NAME]
@@ -694,7 +671,7 @@ def command_rename(args, config):
 )
 > 1
 ):
-print(color(AnsiFore.BOLD_RED, "Too many matches in YAML to safely rename"))
+print(color(Fore.BOLD_RED, "Too many matches in YAML to safely rename"))
 return 1

 new_raw = re.sub(
@@ -706,7 +683,7 @@ def command_rename(args, config):

 new_path = os.path.join(CORE.config_dir, args.name + ".yaml")
 print(
-f"Updating {color(AnsiFore.CYAN, CORE.config_path)} to {color(AnsiFore.CYAN, new_path)}"
+f"Updating {color(Fore.CYAN, CORE.config_path)} to {color(Fore.CYAN, new_path)}"
 )
 print()

@@ -715,7 +692,7 @@ def command_rename(args, config):

 rc = run_external_process("esphome", "config", new_path)
 if rc != 0:
-print(color(AnsiFore.BOLD_RED, "Rename failed. Reverting changes."))
+print(color(Fore.BOLD_RED, "Rename failed. Reverting changes."))
 os.remove(new_path)
 return 1

@@ -741,7 +718,7 @@ def command_rename(args, config):
 if CORE.config_path != new_path:
 os.remove(CORE.config_path)

-print(color(AnsiFore.BOLD_GREEN, "SUCCESS"))
+print(color(Fore.BOLD_GREEN, "SUCCESS"))
 print()
 return 0

@@ -781,14 +758,6 @@ def parse_args(argv):
 options_parser.add_argument(
 "-q", "--quiet", help="Disable all ESPHome logs.", action="store_true"
 )
-options_parser.add_argument(
-"-l",
-"--log-level",
-help="Set the log level.",
-default=os.getenv("ESPHOME_LOG_LEVEL", "INFO"),
-action="store",
-choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
-)
 options_parser.add_argument(
 "--dashboard", help=argparse.SUPPRESS, action="store_true"
 )
@@ -857,10 +826,6 @@ def parse_args(argv):
 "--device",
 help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
 )
-parser_upload.add_argument(
-"--upload_speed",
-help="Override the default or configured upload speed.",
-)
 parser_upload.add_argument(
 "--file",
 help="Manually specify the binary file to upload.",
@@ -879,13 +844,6 @@ def parse_args(argv):
 "--device",
 help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
 )
-parser_logs.add_argument(
-"--reset",
-"-r",
-action="store_true",
-help="Reset the device before starting serial logs.",
-default=os.getenv("ESPHOME_SERIAL_LOGGING_RESET"),
-)

 parser_discover = subparsers.add_parser(
 "discover",
@@ -908,20 +866,9 @@ def parse_args(argv):
 "--device",
 help="Manually specify the serial port/address to use, for example /dev/ttyUSB0.",
 )
-parser_run.add_argument(
-"--upload_speed",
-help="Override the default or configured upload speed.",
-)
 parser_run.add_argument(
 "--no-logs", help="Disable starting logs.", action="store_true"
 )
-parser_run.add_argument(
-"--reset",
-"-r",
-action="store_true",
-help="Reset the device before starting serial logs.",
-default=os.getenv("ESPHOME_SERIAL_LOGGING_RESET"),
-)

 parser_clean = subparsers.add_parser(
 "clean-mqtt",
@@ -1040,16 +987,11 @@ def run_esphome(argv):
 args = parse_args(argv)
 CORE.dashboard = args.dashboard

-# Override log level if verbose is set
-if args.verbose:
-args.log_level = "DEBUG"
-elif args.quiet:
-args.log_level = "CRITICAL"
-
 setup_log(
-log_level=args.log_level,
+args.verbose,
+args.quiet,
 # Show timestamp for dashboard access logs
-include_timestamp=args.command == "dashboard",
+args.command == "dashboard",
 )

 if args.command in PRE_CONFIG_ACTIONS:
@@ -1,102 +0,0 @@
-import os
-
-from esphome.const import __version__
-from esphome.core import CORE
-from esphome.helpers import mkdir_p, read_file, write_file_if_changed
-from esphome.writer import find_begin_end, update_storage_json
-
-INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
-INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"
-
-INI_BASE_FORMAT = (
-"""; Auto generated code by esphome
-
-[common]
-lib_deps =
-build_flags =
-upload_flags =
-
-""",
-"""
-
-""",
-)
-
-
-def format_ini(data: dict[str, str | list[str]]) -> str:
-content = ""
-for key, value in sorted(data.items()):
-if isinstance(value, list):
-content += f"{key} =\n"
-for x in value:
-content += f" {x}\n"
-else:
-content += f"{key} = {value}\n"
-return content
-
-
-def get_ini_content():
-CORE.add_platformio_option(
-"lib_deps",
-[x.as_lib_dep for x in CORE.platformio_libraries.values()]
-+ ["${common.lib_deps}"],
-)
-# Sort to avoid changing build flags order
-CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))
-
-# Sort to avoid changing build unflags order
-CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))
-
-# Add extra script for C++ flags
-CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])
-
-content = "[platformio]\n"
-content += f"description = ESPHome {__version__}\n"
-
-content += f"[env:{CORE.name}]\n"
-content += format_ini(CORE.platformio_options)
-
-return content
-
-
-def write_ini(content):
-update_storage_json()
-path = CORE.relative_build_path("platformio.ini")
-
-if os.path.isfile(path):
-text = read_file(path)
-content_format = find_begin_end(
-text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
-)
-else:
-content_format = INI_BASE_FORMAT
-full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
-full_file += INI_AUTO_GENERATE_END + content_format[1]
-write_file_if_changed(path, full_file)
-
-
-def write_project():
-mkdir_p(CORE.build_path)
-
-content = get_ini_content()
-write_ini(content)
-
-# Write extra script for C++ specific flags
-write_cxx_flags_script()
-
-
-CXX_FLAGS_FILE_NAME = "cxx_flags.py"
-CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
-Import("env")
-
-# Add C++ specific flags
-"""
-
-
-def write_cxx_flags_script() -> None:
-path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
-contents = CXX_FLAGS_FILE_CONTENTS
-if not CORE.is_host:
-contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
-contents += "\n"
-write_file_if_changed(path, contents)
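For reference, the removed `format_ini()` helper above simply renders a dict of PlatformIO options into INI text, expanding list values onto indented continuation lines (keys are emitted in sorted order). A quick, self-contained usage sketch with made-up input values; the indentation width is approximate:

```python
def format_ini(data: dict[str, str | list[str]]) -> str:
    # Same logic as the removed helper above.
    content = ""
    for key, value in sorted(data.items()):
        if isinstance(value, list):
            content += f"{key} =\n"
            for x in value:
                content += f"  {x}\n"
        else:
            content += f"{key} = {value}\n"
    return content


# "board" sorts before "build_flags", so it is emitted first:
#   board = esp32dev
#   build_flags =
#     -DUSE_FOO
#     -Os
print(format_ini({"board": "esp32dev", "build_flags": ["-DUSE_FOO", "-Os"]}))
```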
@@ -22,7 +22,6 @@ from esphome.cpp_generator import (  # noqa: F401
 TemplateArguments,
 add,
 add_build_flag,
-add_build_unflag,
 add_define,
 add_global,
 add_library,
@@ -35,7 +34,6 @@ from esphome.cpp_generator import (  # noqa: F401
 process_lambda,
 progmem_array,
 safe_exp,
-set_cpp_standard,
 statement,
 static_const_array,
 templatable,
@@ -1,10 +1,10 @@
 import esphome.codegen as cg
 from esphome.components import sensor, uart
 from esphome.const import (
-DEVICE_CLASS_DISTANCE,
-ICON_ARROW_EXPAND_VERTICAL,
 STATE_CLASS_MEASUREMENT,
 UNIT_METER,
+ICON_ARROW_EXPAND_VERTICAL,
+DEVICE_CLASS_DISTANCE,
 )

 CODEOWNERS = ["@MrSuicideParrot"]
@@ -1,9 +1,9 @@
 import esphome.codegen as cg
 from esphome.components import sensor, uart
 from esphome.const import (
-DEVICE_CLASS_DISTANCE,
-ICON_ARROW_EXPAND_VERTICAL,
 STATE_CLASS_MEASUREMENT,
+ICON_ARROW_EXPAND_VERTICAL,
+DEVICE_CLASS_DISTANCE,
 UNIT_MILLIMETER,
 )

@@ -7,7 +7,7 @@ namespace a4988 {
 static const char *const TAG = "a4988.stepper";

 void A4988::setup() {
-ESP_LOGCONFIG(TAG, "Running setup");
+ESP_LOGCONFIG(TAG, "Setting up A4988...");
 if (this->sleep_pin_ != nullptr) {
 this->sleep_pin_->setup();
 this->sleep_pin_->digital_write(false);
@@ -1,9 +1,10 @@
 from esphome import pins
-import esphome.codegen as cg
 from esphome.components import stepper
 import esphome.config_validation as cv
+import esphome.codegen as cg
 from esphome.const import CONF_DIR_PIN, CONF_ID, CONF_SLEEP_PIN, CONF_STEP_PIN


 a4988_ns = cg.esphome_ns.namespace("a4988")
 A4988 = a4988_ns.class_("A4988", stepper.Stepper, cg.Component)

@@ -7,7 +7,7 @@ namespace absolute_humidity {
 static const char *const TAG = "absolute_humidity.sensor";

 void AbsoluteHumidityComponent::setup() {
-ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
+ESP_LOGCONFIG(TAG, "Setting up absolute humidity '%s'...", this->get_name().c_str());

 ESP_LOGD(TAG, "  Added callback for temperature '%s'", this->temperature_sensor_->get_name().c_str());
 this->temperature_sensor_->add_on_state_callback([this](float state) { this->temperature_callback_(state); });
@@ -40,11 +40,9 @@ void AbsoluteHumidityComponent::dump_config() {
 break;
 }

-ESP_LOGCONFIG(TAG,
-"Sources\n"
-"  Temperature: '%s'\n"
-"  Relative Humidity: '%s'",
-this->temperature_sensor_->get_name().c_str(), this->humidity_sensor_->get_name().c_str());
+ESP_LOGCONFIG(TAG, "Sources");
+ESP_LOGCONFIG(TAG, "  Temperature: '%s'", this->temperature_sensor_->get_name().c_str());
+ESP_LOGCONFIG(TAG, "  Relative Humidity: '%s'", this->humidity_sensor_->get_name().c_str());
 }

 float AbsoluteHumidityComponent::get_setup_priority() const { return setup_priority::DATA; }
@@ -1,12 +1,12 @@
 import esphome.codegen as cg
-from esphome.components import sensor
 import esphome.config_validation as cv
+from esphome.components import sensor
 from esphome.const import (
-CONF_EQUATION,
 CONF_HUMIDITY,
 CONF_TEMPERATURE,
-ICON_WATER,
 STATE_CLASS_MEASUREMENT,
+CONF_EQUATION,
+ICON_WATER,
 UNIT_GRAMS_PER_CUBIC_METER,
 )

@@ -4,7 +4,6 @@
 #include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
 #include <cmath>
-#include <numbers>

 #ifdef USE_ESP8266
 #include <core_esp8266_waveform.h>
@@ -115,14 +114,13 @@ void IRAM_ATTR HOT AcDimmerDataStore::gpio_intr() {
 // fully off, disable output immediately
 this->gate_pin.digital_write(false);
 } else {
-auto min_us = this->cycle_time_us * this->min_power / 1000;
 if (this->method == DIM_METHOD_TRAILING) {
 this->enable_time_us = 1;  // cannot be 0
-// calculate time until disable in µs with integer arithmetic and take into account min_power
-this->disable_time_us = std::max((uint32_t) 10, this->value * (this->cycle_time_us - min_us) / 65535 + min_us);
+this->disable_time_us = std::max((uint32_t) 10, this->value * this->cycle_time_us / 65535);
 } else {
 // calculate time until enable in µs: (1.0-value)*cycle_time, but with integer arithmetic
 // also take into account min_power
+auto min_us = this->cycle_time_us * this->min_power / 1000;
 this->enable_time_us = std::max((uint32_t) 1, ((65535 - this->value) * (this->cycle_time_us - min_us)) / 65535);

 if (this->method == DIM_METHOD_LEADING_PULSE) {
@@ -194,17 +192,18 @@ void AcDimmer::setup() {
 setTimer1Callback(&timer_interrupt);
 #endif
 #ifdef USE_ESP32
-// timer frequency of 1mhz
-dimmer_timer = timerBegin(1000000);
-timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr);
+// 80 Divider -> 1 count=1µs
+dimmer_timer = timerBegin(0, 80, true);
+timerAttachInterrupt(dimmer_timer, &AcDimmerDataStore::s_timer_intr, true);
 // For ESP32, we can't use dynamic interval calculation because the timerX functions
 // are not callable from ISR (placed in flash storage).
 // Here we just use an interrupt firing every 50 µs.
-timerAlarm(dimmer_timer, 50, true, 0);
+timerAlarmWrite(dimmer_timer, 50, true);
+timerAlarmEnable(dimmer_timer);
 #endif
 }
 void AcDimmer::write_state(float state) {
-state = std::acos(1 - (2 * state)) / std::numbers::pi;  // RMS power compensation
+state = std::acos(1 - (2 * state)) / 3.14159;  // RMS power compensation
 auto new_value = static_cast<uint16_t>(roundf(state * 65535));
 if (new_value != 0 && this->store_.value == 0)
 this->store_.init_cycle = this->init_with_half_cycle_;
@@ -214,10 +213,8 @@ void AcDimmer::dump_config() {
 ESP_LOGCONFIG(TAG, "AcDimmer:");
 LOG_PIN("  Output Pin: ", this->gate_pin_);
 LOG_PIN("  Zero-Cross Pin: ", this->zero_cross_pin_);
-ESP_LOGCONFIG(TAG,
-"  Min Power: %.1f%%\n"
-"  Init with half cycle: %s",
-this->store_.min_power / 10.0f, YESNO(this->init_with_half_cycle_));
+ESP_LOGCONFIG(TAG, "  Min Power: %.1f%%", this->store_.min_power / 10.0f);
+ESP_LOGCONFIG(TAG, "  Init with half cycle: %s", YESNO(this->init_with_half_cycle_));
 if (method_ == DIM_METHOD_LEADING_PULSE) {
 ESP_LOGCONFIG(TAG, "  Method: leading pulse");
 } else if (method_ == DIM_METHOD_LEADING) {
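The `write_state()` change above only swaps the literal 3.14159 for `std::numbers::pi`; in both versions the requested brightness p is remapped with an arccos before being scaled to the 16-bit dimmer value (the "RMS power compensation" in the comment). A quick check of the mapping's endpoints, assuming p ranges over [0, 1]:

```latex
d(p) = \frac{\arccos(1 - 2p)}{\pi},
\qquad d(0) = 0, \quad d\!\left(\tfrac{1}{2}\right) = \tfrac{1}{2}, \quad d(1) = 1 .
```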
@@ -1,8 +1,8 @@
-from esphome import pins
 import esphome.codegen as cg
-from esphome.components import output
 import esphome.config_validation as cv
-from esphome.const import CONF_ID, CONF_METHOD, CONF_MIN_POWER
+from esphome import pins
+from esphome.components import output
+from esphome.const import CONF_ID, CONF_MIN_POWER, CONF_METHOD

 CODEOWNERS = ["@glmnet"]

@@ -1,8 +1,8 @@
 import esphome.codegen as cg
-from esphome.components import uart
-from esphome.components.light.effects import register_addressable_effect
-from esphome.components.light.types import AddressableLightEffect
 import esphome.config_validation as cv
+from esphome.components import uart
+from esphome.components.light.types import AddressableLightEffect
+from esphome.components.light.effects import register_addressable_effect
 from esphome.const import CONF_NAME, CONF_UART_ID

 DEPENDENCIES = ["uart"]
@ -1,26 +1,20 @@
|
|||||||
from esphome import pins
|
|
||||||
import esphome.codegen as cg
|
import esphome.codegen as cg
|
||||||
|
import esphome.config_validation as cv
|
||||||
|
from esphome import pins
|
||||||
|
from esphome.const import CONF_ANALOG, CONF_INPUT, CONF_NUMBER
|
||||||
|
|
||||||
|
from esphome.core import CORE
|
||||||
from esphome.components.esp32 import get_esp32_variant
|
from esphome.components.esp32 import get_esp32_variant
|
||||||
|
from esphome.const import PLATFORM_ESP8266
|
||||||
from esphome.components.esp32.const import (
|
from esphome.components.esp32.const import (
|
||||||
VARIANT_ESP32,
|
VARIANT_ESP32,
|
||||||
VARIANT_ESP32C2,
|
VARIANT_ESP32C2,
|
||||||
VARIANT_ESP32C3,
|
VARIANT_ESP32C3,
|
||||||
VARIANT_ESP32C5,
|
|
||||||
VARIANT_ESP32C6,
|
VARIANT_ESP32C6,
|
||||||
VARIANT_ESP32H2,
|
VARIANT_ESP32H2,
|
||||||
VARIANT_ESP32S2,
|
VARIANT_ESP32S2,
|
||||||
VARIANT_ESP32S3,
|
VARIANT_ESP32S3,
|
||||||
)
|
)
|
||||||
from esphome.config_helpers import filter_source_files_from_platform
|
|
||||||
import esphome.config_validation as cv
|
|
||||||
from esphome.const import (
|
|
||||||
CONF_ANALOG,
|
|
||||||
CONF_INPUT,
|
|
||||||
CONF_NUMBER,
|
|
||||||
PLATFORM_ESP8266,
|
|
||||||
PlatformFramework,
|
|
||||||
)
|
|
||||||
from esphome.core import CORE
|
|
||||||
|
|
||||||
CODEOWNERS = ["@esphome/core"]
|
CODEOWNERS = ["@esphome/core"]
|
||||||
|
|
||||||
@ -44,160 +38,122 @@ ATTENUATION_MODES = {
|
|||||||
"auto": "auto",
|
"auto": "auto",
|
||||||
}
|
}
|
||||||
|
|
||||||
sampling_mode = adc_ns.enum("SamplingMode", is_class=True)
|
adc1_channel_t = cg.global_ns.enum("adc1_channel_t")
|
||||||
|
adc2_channel_t = cg.global_ns.enum("adc2_channel_t")
|
||||||
SAMPLING_MODES = {
|
|
||||||
"avg": sampling_mode.AVG,
|
|
||||||
"min": sampling_mode.MIN,
|
|
||||||
"max": sampling_mode.MAX,
|
|
||||||
}
|
|
||||||
|
|
||||||
adc_unit_t = cg.global_ns.enum("adc_unit_t", is_class=True)
|
|
||||||
|
|
||||||
adc_channel_t = cg.global_ns.enum("adc_channel_t", is_class=True)
|
|
||||||
|
|
||||||
|
# From https://github.com/espressif/esp-idf/blob/master/components/driver/include/driver/adc_common.h
|
||||||
# pin to adc1 channel mapping
|
# pin to adc1 channel mapping
|
||||||
# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
|
|
||||||
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
|
ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32: {
|
VARIANT_ESP32: {
|
||||||
36: adc_channel_t.ADC_CHANNEL_0,
|
36: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
37: adc_channel_t.ADC_CHANNEL_1,
|
37: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
38: adc_channel_t.ADC_CHANNEL_2,
|
38: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
39: adc_channel_t.ADC_CHANNEL_3,
|
39: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
32: adc_channel_t.ADC_CHANNEL_4,
|
32: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
33: adc_channel_t.ADC_CHANNEL_5,
|
33: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
34: adc_channel_t.ADC_CHANNEL_6,
|
34: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
35: adc_channel_t.ADC_CHANNEL_7,
|
35: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
},
|
},
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C2: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C3: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# ESP32-C5 ADC1 pin mapping - based on official ESP-IDF documentation
|
|
||||||
# https://docs.espressif.com/projects/esp-idf/en/latest/esp32c5/api-reference/peripherals/gpio.html
|
|
||||||
VARIANT_ESP32C5: {
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32C6: {
|
|
||||||
0: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_5,
|
|
||||||
6: adc_channel_t.ADC_CHANNEL_6,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32H2: {
|
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
|
||||||
},
|
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32S2: {
|
VARIANT_ESP32S2: {
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
1: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
2: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
3: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
4: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
5: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
6: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
7: adc_channel_t.ADC_CHANNEL_6,
|
7: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
8: adc_channel_t.ADC_CHANNEL_7,
|
8: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
9: adc_channel_t.ADC_CHANNEL_8,
|
9: adc1_channel_t.ADC1_CHANNEL_8,
|
||||||
10: adc_channel_t.ADC_CHANNEL_9,
|
10: adc1_channel_t.ADC1_CHANNEL_9,
|
||||||
},
|
},
|
||||||
# https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
|
|
||||||
VARIANT_ESP32S3: {
|
VARIANT_ESP32S3: {
|
||||||
1: adc_channel_t.ADC_CHANNEL_0,
|
1: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
2: adc_channel_t.ADC_CHANNEL_1,
|
2: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
3: adc_channel_t.ADC_CHANNEL_2,
|
3: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
4: adc_channel_t.ADC_CHANNEL_3,
|
4: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
5: adc_channel_t.ADC_CHANNEL_4,
|
5: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
6: adc_channel_t.ADC_CHANNEL_5,
|
6: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
7: adc_channel_t.ADC_CHANNEL_6,
|
7: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
8: adc_channel_t.ADC_CHANNEL_7,
|
8: adc1_channel_t.ADC1_CHANNEL_7,
|
||||||
9: adc_channel_t.ADC_CHANNEL_8,
|
9: adc1_channel_t.ADC1_CHANNEL_8,
|
||||||
10: adc_channel_t.ADC_CHANNEL_9,
|
10: adc1_channel_t.ADC1_CHANNEL_9,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C3: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C2: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32C6: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
|
5: adc1_channel_t.ADC1_CHANNEL_5,
|
||||||
|
6: adc1_channel_t.ADC1_CHANNEL_6,
|
||||||
|
},
|
||||||
|
VARIANT_ESP32H2: {
|
||||||
|
0: adc1_channel_t.ADC1_CHANNEL_0,
|
||||||
|
1: adc1_channel_t.ADC1_CHANNEL_1,
|
||||||
|
2: adc1_channel_t.ADC1_CHANNEL_2,
|
||||||
|
3: adc1_channel_t.ADC1_CHANNEL_3,
|
||||||
|
4: adc1_channel_t.ADC1_CHANNEL_4,
|
||||||
},
|
},
|
||||||
}
|
}
|
||||||
 
-# pin to adc2 channel mapping
-# https://github.com/espressif/esp-idf/blob/v4.4.8/components/driver/include/driver/adc.h
 ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32/include/soc/adc_channel.h
+    # TODO: add other variants
     VARIANT_ESP32: {
-        4: adc_channel_t.ADC_CHANNEL_0,
+        4: adc2_channel_t.ADC2_CHANNEL_0,
-        0: adc_channel_t.ADC_CHANNEL_1,
+        0: adc2_channel_t.ADC2_CHANNEL_1,
-        2: adc_channel_t.ADC_CHANNEL_2,
+        2: adc2_channel_t.ADC2_CHANNEL_2,
-        15: adc_channel_t.ADC_CHANNEL_3,
+        15: adc2_channel_t.ADC2_CHANNEL_3,
-        13: adc_channel_t.ADC_CHANNEL_4,
+        13: adc2_channel_t.ADC2_CHANNEL_4,
-        12: adc_channel_t.ADC_CHANNEL_5,
+        12: adc2_channel_t.ADC2_CHANNEL_5,
-        14: adc_channel_t.ADC_CHANNEL_6,
+        14: adc2_channel_t.ADC2_CHANNEL_6,
-        27: adc_channel_t.ADC_CHANNEL_7,
+        27: adc2_channel_t.ADC2_CHANNEL_7,
-        25: adc_channel_t.ADC_CHANNEL_8,
+        25: adc2_channel_t.ADC2_CHANNEL_8,
-        26: adc_channel_t.ADC_CHANNEL_9,
+        26: adc2_channel_t.ADC2_CHANNEL_9,
     },
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c2/include/soc/adc_channel.h
-    VARIANT_ESP32C2: {
-        5: adc_channel_t.ADC_CHANNEL_0,
-    },
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c3/include/soc/adc_channel.h
-    VARIANT_ESP32C3: {
-        5: adc_channel_t.ADC_CHANNEL_0,
-    },
-    # ESP32-C5 has no ADC2 channels
-    VARIANT_ESP32C5: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32c6/include/soc/adc_channel.h
-    VARIANT_ESP32C6: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32h2/include/soc/adc_channel.h
-    VARIANT_ESP32H2: {},  # no ADC2
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s2/include/soc/adc_channel.h
     VARIANT_ESP32S2: {
-        11: adc_channel_t.ADC_CHANNEL_0,
+        11: adc2_channel_t.ADC2_CHANNEL_0,
-        12: adc_channel_t.ADC_CHANNEL_1,
+        12: adc2_channel_t.ADC2_CHANNEL_1,
-        13: adc_channel_t.ADC_CHANNEL_2,
+        13: adc2_channel_t.ADC2_CHANNEL_2,
-        14: adc_channel_t.ADC_CHANNEL_3,
+        14: adc2_channel_t.ADC2_CHANNEL_3,
-        15: adc_channel_t.ADC_CHANNEL_4,
+        15: adc2_channel_t.ADC2_CHANNEL_4,
-        16: adc_channel_t.ADC_CHANNEL_5,
+        16: adc2_channel_t.ADC2_CHANNEL_5,
-        17: adc_channel_t.ADC_CHANNEL_6,
+        17: adc2_channel_t.ADC2_CHANNEL_6,
-        18: adc_channel_t.ADC_CHANNEL_7,
+        18: adc2_channel_t.ADC2_CHANNEL_7,
-        19: adc_channel_t.ADC_CHANNEL_8,
+        19: adc2_channel_t.ADC2_CHANNEL_8,
-        20: adc_channel_t.ADC_CHANNEL_9,
+        20: adc2_channel_t.ADC2_CHANNEL_9,
     },
-    # https://github.com/espressif/esp-idf/blob/master/components/soc/esp32s3/include/soc/adc_channel.h
     VARIANT_ESP32S3: {
-        11: adc_channel_t.ADC_CHANNEL_0,
+        11: adc2_channel_t.ADC2_CHANNEL_0,
-        12: adc_channel_t.ADC_CHANNEL_1,
+        12: adc2_channel_t.ADC2_CHANNEL_1,
-        13: adc_channel_t.ADC_CHANNEL_2,
+        13: adc2_channel_t.ADC2_CHANNEL_2,
-        14: adc_channel_t.ADC_CHANNEL_3,
+        14: adc2_channel_t.ADC2_CHANNEL_3,
-        15: adc_channel_t.ADC_CHANNEL_4,
+        15: adc2_channel_t.ADC2_CHANNEL_4,
-        16: adc_channel_t.ADC_CHANNEL_5,
+        16: adc2_channel_t.ADC2_CHANNEL_5,
-        17: adc_channel_t.ADC_CHANNEL_6,
+        17: adc2_channel_t.ADC2_CHANNEL_6,
-        18: adc_channel_t.ADC_CHANNEL_7,
+        18: adc2_channel_t.ADC2_CHANNEL_7,
-        19: adc_channel_t.ADC_CHANNEL_8,
+        19: adc2_channel_t.ADC2_CHANNEL_8,
-        20: adc_channel_t.ADC_CHANNEL_9,
+        20: adc2_channel_t.ADC2_CHANNEL_9,
     },
+    VARIANT_ESP32C3: {
+        5: adc2_channel_t.ADC2_CHANNEL_0,
+    },
+    VARIANT_ESP32C2: {},
+    VARIANT_ESP32C6: {},
+    VARIANT_ESP32H2: {},
 }
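
Taken together, `ESP32_VARIANT_ADC1_PIN_TO_CHANNEL` and `ESP32_VARIANT_ADC2_PIN_TO_CHANNEL` map a GPIO number to the IDF channel enum for each ESP32 variant; `to_code()` in `sensor.py` later prefers ADC1 if the pin appears there and falls back to ADC2 otherwise. Below is a minimal, self-contained sketch of that lookup pattern; the helper name and the tiny inline dicts are illustrative only, not part of the component.

```python
# Illustrative only: how a pin number could be resolved to an (ADC unit, channel)
# pair using mappings shaped like the two dicts above. The real component does
# this lookup inside to_code() in sensor.py.
ADC1 = {"ESP32": {36: "ADC1_CHANNEL_0", 39: "ADC1_CHANNEL_3"}}
ADC2 = {"ESP32": {4: "ADC2_CHANNEL_0", 0: "ADC2_CHANNEL_1"}}


def lookup_adc_channel(variant: str, pin: int):
    """Return ("adc1" | "adc2", channel) for a GPIO, or raise if unsupported."""
    if pin in ADC1.get(variant, {}):
        return "adc1", ADC1[variant][pin]
    if pin in ADC2.get(variant, {}):
        return "adc2", ADC2[variant][pin]
    raise ValueError(f"GPIO{pin} is not an ADC pin on {variant}")


print(lookup_adc_channel("ESP32", 39))  # ('adc1', 'ADC1_CHANNEL_3')
```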
@@ -250,20 +206,3 @@ def validate_adc_pin(value):
     )(value)
 
     raise NotImplementedError
 
 
-FILTER_SOURCE_FILES = filter_source_files_from_platform(
-    {
-        "adc_sensor_esp32.cpp": {
-            PlatformFramework.ESP32_ARDUINO,
-            PlatformFramework.ESP32_IDF,
-        },
-        "adc_sensor_esp8266.cpp": {PlatformFramework.ESP8266_ARDUINO},
-        "adc_sensor_rp2040.cpp": {PlatformFramework.RP2040_ARDUINO},
-        "adc_sensor_libretiny.cpp": {
-            PlatformFramework.BK72XX_ARDUINO,
-            PlatformFramework.RTL87XX_ARDUINO,
-            PlatformFramework.LN882X_ARDUINO,
-        },
-    }
-)
esphome/components/adc/adc_sensor.cpp (new file, 332 lines)
@@ -0,0 +1,332 @@
|
#include "adc_sensor.h"
|
||||||
|
#include "esphome/core/helpers.h"
|
||||||
|
#include "esphome/core/log.h"
|
||||||
|
|
||||||
|
#ifdef USE_ESP8266
|
||||||
|
#ifdef USE_ADC_SENSOR_VCC
|
||||||
|
#include <Esp.h>
|
||||||
|
ADC_MODE(ADC_VCC)
|
||||||
|
#else
|
||||||
|
#include <Arduino.h>
|
||||||
|
#endif
|
||||||
|
#endif
|
||||||
|
|
||||||
|
#ifdef USE_RP2040
|
||||||
|
#ifdef CYW43_USES_VSYS_PIN
|
||||||
|
#include "pico/cyw43_arch.h"
|
||||||
|
#endif
|
||||||
|
#include <hardware/adc.h>
|
||||||
|
#endif
|
||||||
|
|
||||||
|
namespace esphome {
|
||||||
|
namespace adc {
|
||||||
|
|
||||||
|
static const char *const TAG = "adc";
|
||||||
|
|
||||||
|
// 13-bit for S2, 12-bit for all other ESP32 variants
|
||||||
|
#ifdef USE_ESP32
|
||||||
|
static const adc_bits_width_t ADC_WIDTH_MAX_SOC_BITS = static_cast<adc_bits_width_t>(ADC_WIDTH_MAX - 1);
|
||||||
|
|
||||||
|
#ifndef SOC_ADC_RTC_MAX_BITWIDTH
|
||||||
|
#if USE_ESP32_VARIANT_ESP32S2
|
||||||
|
static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 13;
|
||||||
|
#else
|
||||||
|
static const int32_t SOC_ADC_RTC_MAX_BITWIDTH = 12;
|
||||||
|
#endif
|
||||||
|
#endif
|
||||||
|
|
||||||
|
static const int ADC_MAX = (1 << SOC_ADC_RTC_MAX_BITWIDTH) - 1; // 4095 (12 bit) or 8191 (13 bit)
|
||||||
|
static const int ADC_HALF = (1 << SOC_ADC_RTC_MAX_BITWIDTH) >> 1; // 2048 (12 bit) or 4096 (13 bit)
|
||||||
|
#endif
|
||||||
|
|
||||||
|
#ifdef USE_RP2040
|
||||||
|
extern "C"
|
||||||
|
#endif
|
||||||
|
void
|
||||||
|
ADCSensor::setup() {
|
||||||
|
ESP_LOGCONFIG(TAG, "Setting up ADC '%s'...", this->get_name().c_str());
|
||||||
|
#if !defined(USE_ADC_SENSOR_VCC) && !defined(USE_RP2040)
|
||||||
|
this->pin_->setup();
|
||||||
|
#endif
|
||||||
|
|
||||||
|
#ifdef USE_ESP32
|
||||||
|
if (this->channel1_ != ADC1_CHANNEL_MAX) {
|
||||||
|
adc1_config_width(ADC_WIDTH_MAX_SOC_BITS);
|
||||||
|
if (!this->autorange_) {
|
||||||
|
adc1_config_channel_atten(this->channel1_, this->attenuation_);
|
||||||
|
}
|
||||||
|
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
|
||||||
|
if (!this->autorange_) {
|
||||||
|
adc2_config_channel_atten(this->channel2_, this->attenuation_);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// load characteristics for each attenuation
|
||||||
|
for (int32_t i = 0; i <= ADC_ATTEN_DB_12_COMPAT; i++) {
|
||||||
|
auto adc_unit = this->channel1_ != ADC1_CHANNEL_MAX ? ADC_UNIT_1 : ADC_UNIT_2;
|
||||||
|
auto cal_value = esp_adc_cal_characterize(adc_unit, (adc_atten_t) i, ADC_WIDTH_MAX_SOC_BITS,
|
||||||
|
1100, // default vref
|
||||||
|
&this->cal_characteristics_[i]);
|
||||||
|
switch (cal_value) {
|
||||||
|
case ESP_ADC_CAL_VAL_EFUSE_VREF:
|
||||||
|
ESP_LOGV(TAG, "Using eFuse Vref for calibration");
|
||||||
|
break;
|
||||||
|
case ESP_ADC_CAL_VAL_EFUSE_TP:
|
||||||
|
ESP_LOGV(TAG, "Using two-point eFuse Vref for calibration");
|
||||||
|
break;
|
||||||
|
case ESP_ADC_CAL_VAL_DEFAULT_VREF:
|
||||||
|
default:
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#endif // USE_ESP32
|
||||||
|
|
||||||
|
#ifdef USE_RP2040
|
||||||
|
static bool initialized = false;
|
||||||
|
if (!initialized) {
|
||||||
|
adc_init();
|
||||||
|
initialized = true;
|
||||||
|
}
|
||||||
|
#endif
|
||||||
|
|
||||||
|
ESP_LOGCONFIG(TAG, "ADC '%s' setup finished!", this->get_name().c_str());
|
||||||
|
}
|
||||||
|
|
||||||
|
void ADCSensor::dump_config() {
|
||||||
|
LOG_SENSOR("", "ADC Sensor", this);
|
||||||
|
#if defined(USE_ESP8266) || defined(USE_LIBRETINY)
|
||||||
|
#ifdef USE_ADC_SENSOR_VCC
|
||||||
|
ESP_LOGCONFIG(TAG, " Pin: VCC");
|
||||||
|
#else
|
||||||
|
LOG_PIN(" Pin: ", this->pin_);
|
||||||
|
#endif
|
||||||
|
#endif // USE_ESP8266 || USE_LIBRETINY
|
||||||
|
|
||||||
|
#ifdef USE_ESP32
|
||||||
|
LOG_PIN(" Pin: ", this->pin_);
|
||||||
|
if (this->autorange_) {
|
||||||
|
ESP_LOGCONFIG(TAG, " Attenuation: auto");
|
||||||
|
} else {
|
||||||
|
switch (this->attenuation_) {
|
||||||
|
case ADC_ATTEN_DB_0:
|
||||||
|
ESP_LOGCONFIG(TAG, " Attenuation: 0db");
|
||||||
|
break;
|
||||||
|
case ADC_ATTEN_DB_2_5:
|
||||||
|
ESP_LOGCONFIG(TAG, " Attenuation: 2.5db");
|
||||||
|
break;
|
||||||
|
case ADC_ATTEN_DB_6:
|
||||||
|
ESP_LOGCONFIG(TAG, " Attenuation: 6db");
|
||||||
|
break;
|
||||||
|
case ADC_ATTEN_DB_12_COMPAT:
|
||||||
|
ESP_LOGCONFIG(TAG, " Attenuation: 12db");
|
||||||
|
break;
|
||||||
|
default: // This is to satisfy the unused ADC_ATTEN_MAX
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
#endif // USE_ESP32
|
||||||
|
|
||||||
|
#ifdef USE_RP2040
|
||||||
|
if (this->is_temperature_) {
|
||||||
|
ESP_LOGCONFIG(TAG, " Pin: Temperature");
|
||||||
|
} else {
|
||||||
|
#ifdef USE_ADC_SENSOR_VCC
|
||||||
|
ESP_LOGCONFIG(TAG, " Pin: VCC");
|
||||||
|
#else
|
||||||
|
LOG_PIN(" Pin: ", this->pin_);
|
||||||
|
#endif // USE_ADC_SENSOR_VCC
|
||||||
|
}
|
||||||
|
#endif // USE_RP2040
|
||||||
|
ESP_LOGCONFIG(TAG, " Samples: %i", this->sample_count_);
|
||||||
|
LOG_UPDATE_INTERVAL(this);
|
||||||
|
}
|
||||||
|
|
||||||
|
float ADCSensor::get_setup_priority() const { return setup_priority::DATA; }
|
||||||
|
void ADCSensor::update() {
|
||||||
|
float value_v = this->sample();
|
||||||
|
ESP_LOGV(TAG, "'%s': Got voltage=%.4fV", this->get_name().c_str(), value_v);
|
||||||
|
this->publish_state(value_v);
|
||||||
|
}
|
||||||
|
|
||||||
|
void ADCSensor::set_sample_count(uint8_t sample_count) {
|
||||||
|
if (sample_count != 0) {
|
||||||
|
this->sample_count_ = sample_count;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#ifdef USE_ESP8266
|
||||||
|
float ADCSensor::sample() {
|
||||||
|
uint32_t raw = 0;
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
#ifdef USE_ADC_SENSOR_VCC
|
||||||
|
raw += ESP.getVcc(); // NOLINT(readability-static-accessed-through-instance)
|
||||||
|
#else
|
||||||
|
raw += analogRead(this->pin_->get_pin()); // NOLINT
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
raw = (raw + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
if (this->output_raw_) {
|
||||||
|
return raw;
|
||||||
|
}
|
||||||
|
return raw / 1024.0f;
|
||||||
|
}
|
||||||
|
#endif
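
Each platform-specific `sample()` above folds `sample_count_` raw readings into one value with `(raw + (sample_count_ >> 1)) / sample_count_`, i.e. an integer division rounded to the nearest value rather than truncated. A small sketch of that idiom, with made-up sample values:

```python
# Sketch of the rounding-average idiom used by the C++ sample() methods above:
# adding half the divisor before the integer division rounds to nearest instead
# of truncating toward zero.
def rounded_mean(samples: list[int]) -> int:
    count = len(samples)
    total = sum(samples)
    return (total + (count >> 1)) // count  # integer divide, rounded to nearest


print(rounded_mean([100, 101, 101]))  # 101 (a plain // division would give 100)
```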
|
||||||
|
|
||||||
|
#ifdef USE_ESP32
|
||||||
|
float ADCSensor::sample() {
|
||||||
|
if (!this->autorange_) {
|
||||||
|
uint32_t sum = 0;
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
int raw = -1;
|
||||||
|
if (this->channel1_ != ADC1_CHANNEL_MAX) {
|
||||||
|
raw = adc1_get_raw(this->channel1_);
|
||||||
|
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
|
||||||
|
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw);
|
||||||
|
}
|
||||||
|
if (raw == -1) {
|
||||||
|
return NAN;
|
||||||
|
}
|
||||||
|
sum += raw;
|
||||||
|
}
|
||||||
|
sum = (sum + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
if (this->output_raw_) {
|
||||||
|
return sum;
|
||||||
|
}
|
||||||
|
uint32_t mv = esp_adc_cal_raw_to_voltage(sum, &this->cal_characteristics_[(int32_t) this->attenuation_]);
|
||||||
|
return mv / 1000.0f;
|
||||||
|
}
|
||||||
|
|
||||||
|
int raw12 = ADC_MAX, raw6 = ADC_MAX, raw2 = ADC_MAX, raw0 = ADC_MAX;
|
||||||
|
|
||||||
|
if (this->channel1_ != ADC1_CHANNEL_MAX) {
|
||||||
|
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_12_COMPAT);
|
||||||
|
raw12 = adc1_get_raw(this->channel1_);
|
||||||
|
if (raw12 < ADC_MAX) {
|
||||||
|
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_6);
|
||||||
|
raw6 = adc1_get_raw(this->channel1_);
|
||||||
|
if (raw6 < ADC_MAX) {
|
||||||
|
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_2_5);
|
||||||
|
raw2 = adc1_get_raw(this->channel1_);
|
||||||
|
if (raw2 < ADC_MAX) {
|
||||||
|
adc1_config_channel_atten(this->channel1_, ADC_ATTEN_DB_0);
|
||||||
|
raw0 = adc1_get_raw(this->channel1_);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (this->channel2_ != ADC2_CHANNEL_MAX) {
|
||||||
|
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_12_COMPAT);
|
||||||
|
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw12);
|
||||||
|
if (raw12 < ADC_MAX) {
|
||||||
|
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_6);
|
||||||
|
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw6);
|
||||||
|
if (raw6 < ADC_MAX) {
|
||||||
|
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_2_5);
|
||||||
|
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw2);
|
||||||
|
if (raw2 < ADC_MAX) {
|
||||||
|
adc2_config_channel_atten(this->channel2_, ADC_ATTEN_DB_0);
|
||||||
|
adc2_get_raw(this->channel2_, ADC_WIDTH_MAX_SOC_BITS, &raw0);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (raw0 == -1 || raw2 == -1 || raw6 == -1 || raw12 == -1) {
|
||||||
|
return NAN;
|
||||||
|
}
|
||||||
|
|
||||||
|
uint32_t mv12 = esp_adc_cal_raw_to_voltage(raw12, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_12_COMPAT]);
|
||||||
|
uint32_t mv6 = esp_adc_cal_raw_to_voltage(raw6, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_6]);
|
||||||
|
uint32_t mv2 = esp_adc_cal_raw_to_voltage(raw2, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_2_5]);
|
||||||
|
uint32_t mv0 = esp_adc_cal_raw_to_voltage(raw0, &this->cal_characteristics_[(int32_t) ADC_ATTEN_DB_0]);
|
||||||
|
|
||||||
|
// Contribution of each value, in range 0-2048 (12 bit ADC) or 0-4096 (13 bit ADC)
|
||||||
|
uint32_t c12 = std::min(raw12, ADC_HALF);
|
||||||
|
uint32_t c6 = ADC_HALF - std::abs(raw6 - ADC_HALF);
|
||||||
|
uint32_t c2 = ADC_HALF - std::abs(raw2 - ADC_HALF);
|
||||||
|
uint32_t c0 = std::min(ADC_MAX - raw0, ADC_HALF);
|
||||||
|
// max theoretical csum value is 4096*4 = 16384
|
||||||
|
uint32_t csum = c12 + c6 + c2 + c0;
|
||||||
|
|
||||||
|
// each mv is max 3900; so max value is 3900*4096*4, fits in unsigned32
|
||||||
|
uint32_t mv_scaled = (mv12 * c12) + (mv6 * c6) + (mv2 * c2) + (mv0 * c0);
|
||||||
|
return mv_scaled / (float) (csum * 1000U);
|
||||||
|
}
|
||||||
|
#endif // USE_ESP32
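
In the autoranging branch above, the channel is read once per attenuation level and the calibrated millivolt values are blended with weights that measure how close each raw reading sits to the usable part of its range. With `H = ADC_HALF` and `R = ADC_MAX`, the blend implemented by those lines can be written as:

$$c_{12} = \min(r_{12}, H), \qquad c_{6} = H - |r_{6} - H|, \qquad c_{2} = H - |r_{2} - H|, \qquad c_{0} = \min(R - r_{0}, H)$$

$$V = \frac{c_{12}\,v_{12} + c_{6}\,v_{6} + c_{2}\,v_{2} + c_{0}\,v_{0}}{1000\,(c_{12} + c_{6} + c_{2} + c_{0})}$$

where the $r_x$ are the raw readings, the $v_x$ the calibrated millivolt values, and the division by 1000 converts the weighted millivolt average to volts.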
|
||||||
|
|
||||||
|
#ifdef USE_RP2040
|
||||||
|
float ADCSensor::sample() {
|
||||||
|
if (this->is_temperature_) {
|
||||||
|
adc_set_temp_sensor_enabled(true);
|
||||||
|
delay(1);
|
||||||
|
adc_select_input(4);
|
||||||
|
uint32_t raw = 0;
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
raw += adc_read();
|
||||||
|
}
|
||||||
|
raw = (raw + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
adc_set_temp_sensor_enabled(false);
|
||||||
|
if (this->output_raw_) {
|
||||||
|
return raw;
|
||||||
|
}
|
||||||
|
return raw * 3.3f / 4096.0f;
|
||||||
|
} else {
|
||||||
|
uint8_t pin = this->pin_->get_pin();
|
||||||
|
#ifdef CYW43_USES_VSYS_PIN
|
||||||
|
if (pin == PICO_VSYS_PIN) {
|
||||||
|
// Measuring VSYS on Raspberry Pico W needs to be wrapped with
|
||||||
|
// `cyw43_thread_enter()`/`cyw43_thread_exit()` as discussed in
|
||||||
|
// https://github.com/raspberrypi/pico-sdk/issues/1222, since Wifi chip and
|
||||||
|
// VSYS ADC both share GPIO29
|
||||||
|
cyw43_thread_enter();
|
||||||
|
}
|
||||||
|
#endif // CYW43_USES_VSYS_PIN
|
||||||
|
|
||||||
|
adc_gpio_init(pin);
|
||||||
|
adc_select_input(pin - 26);
|
||||||
|
|
||||||
|
uint32_t raw = 0;
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
raw += adc_read();
|
||||||
|
}
|
||||||
|
raw = (raw + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
|
||||||
|
#ifdef CYW43_USES_VSYS_PIN
|
||||||
|
if (pin == PICO_VSYS_PIN) {
|
||||||
|
cyw43_thread_exit();
|
||||||
|
}
|
||||||
|
#endif // CYW43_USES_VSYS_PIN
|
||||||
|
|
||||||
|
if (this->output_raw_) {
|
||||||
|
return raw;
|
||||||
|
}
|
||||||
|
float coeff = pin == PICO_VSYS_PIN ? 3.0 : 1.0;
|
||||||
|
return raw * 3.3f / 4096.0f * coeff;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
#endif
|
||||||
|
|
||||||
|
#ifdef USE_LIBRETINY
|
||||||
|
float ADCSensor::sample() {
|
||||||
|
uint32_t raw = 0;
|
||||||
|
if (this->output_raw_) {
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
raw += analogRead(this->pin_->get_pin()); // NOLINT
|
||||||
|
}
|
||||||
|
raw = (raw + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
return raw;
|
||||||
|
}
|
||||||
|
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
||||||
|
raw += analogReadVoltage(this->pin_->get_pin()); // NOLINT
|
||||||
|
}
|
||||||
|
raw = (raw + (this->sample_count_ >> 1)) / this->sample_count_; // NOLINT(clang-analyzer-core.DivideZero)
|
||||||
|
return raw / 1000.0f;
|
||||||
|
}
|
||||||
|
#endif // USE_LIBRETINY
|
||||||
|
|
||||||
|
#ifdef USE_ESP8266
|
||||||
|
std::string ADCSensor::unique_id() { return get_mac_address() + "-adc"; }
|
||||||
|
#endif
|
||||||
|
|
||||||
|
} // namespace adc
|
||||||
|
} // namespace esphome
|
@@ -7,18 +7,17 @@
 #include "esphome/core/hal.h"
 
 #ifdef USE_ESP32
-#include "esp_adc/adc_cali.h"
+#include <esp_adc_cal.h>
-#include "esp_adc/adc_cali_scheme.h"
+#include "driver/adc.h"
-#include "esp_adc/adc_oneshot.h"
+#endif
-#include "hal/adc_types.h"  // This defines ADC_CHANNEL_MAX
-#endif  // USE_ESP32
 
 namespace esphome {
 namespace adc {
 
 #ifdef USE_ESP32
 // clang-format off
-#if (ESP_IDF_VERSION_MAJOR == 5 && \
+#if (ESP_IDF_VERSION_MAJOR == 4 && ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(4, 4, 7)) || \
+    (ESP_IDF_VERSION_MAJOR == 5 && \
     ((ESP_IDF_VERSION_MINOR == 0 && ESP_IDF_VERSION_PATCH >= 5) || \
      (ESP_IDF_VERSION_MINOR == 1 && ESP_IDF_VERSION_PATCH >= 3) || \
      (ESP_IDF_VERSION_MINOR >= 2)) \
@@ -30,127 +29,62 @@ static const adc_atten_t ADC_ATTEN_DB_12_COMPAT = ADC_ATTEN_DB_11;
 #endif
 #endif  // USE_ESP32
 
-enum class SamplingMode : uint8_t {
-  AVG = 0,
-  MIN = 1,
-  MAX = 2,
-};
-
-const LogString *sampling_mode_to_str(SamplingMode mode);
-
-class Aggregator {
- public:
-  Aggregator(SamplingMode mode);
-  void add_sample(uint32_t value);
-  uint32_t aggregate();
-
- protected:
-  uint32_t aggr_{0};
-  uint32_t samples_{0};
-  SamplingMode mode_{SamplingMode::AVG};
-};
-
 class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage_sampler::VoltageSampler {
  public:
-  /// Update the sensor's state by reading the current ADC value.
+#ifdef USE_ESP32
-  /// This method is called periodically based on the update interval.
+  /// Set the attenuation for this pin. Only available on the ESP32.
+  void set_attenuation(adc_atten_t attenuation) { this->attenuation_ = attenuation; }
+  void set_channel1(adc1_channel_t channel) {
+    this->channel1_ = channel;
+    this->channel2_ = ADC2_CHANNEL_MAX;
+  }
+  void set_channel2(adc2_channel_t channel) {
+    this->channel2_ = channel;
+    this->channel1_ = ADC1_CHANNEL_MAX;
+  }
+  void set_autorange(bool autorange) { this->autorange_ = autorange; }
+#endif
+
+  /// Update ADC values
   void update() override;
+  /// Setup ADC
-  /// Set up the ADC sensor by initializing hardware and calibration parameters.
-  /// This method is called once during device initialization.
   void setup() override;
-  /// Output the configuration details of the ADC sensor for debugging purposes.
-  /// This method is called during the ESPHome setup process to log the configuration.
   void dump_config() override;
+  /// `HARDWARE_LATE` setup priority
-  /// Return the setup priority for this component.
-  /// Components with higher priority are initialized earlier during setup.
-  /// @return A float representing the setup priority.
   float get_setup_priority() const override;
 
-  /// Set the GPIO pin to be used by the ADC sensor.
-  /// @param pin Pointer to an InternalGPIOPin representing the ADC input pin.
   void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
 
-  /// Enable or disable the output of raw ADC values (unprocessed data).
-  /// @param output_raw Boolean indicating whether to output raw ADC values (true) or processed values (false).
   void set_output_raw(bool output_raw) { this->output_raw_ = output_raw; }
 
-  /// Set the number of samples to be taken for ADC readings to improve accuracy.
-  /// A higher sample count reduces noise but increases the reading time.
-  /// @param sample_count The number of samples (e.g., 1, 4, 8).
   void set_sample_count(uint8_t sample_count);
 
-  /// Set the sampling mode for how multiple ADC samples are combined into a single measurement.
-  ///
-  /// When multiple samples are taken (controlled by set_sample_count), they can be combined
-  /// in one of three ways:
-  /// - SamplingMode::AVG: Compute the average (default)
-  /// - SamplingMode::MIN: Use the lowest sample value
-  /// - SamplingMode::MAX: Use the highest sample value
-  /// @param sampling_mode The desired sampling mode to use for aggregating ADC samples.
-  void set_sampling_mode(SamplingMode sampling_mode);
-
-  /// Perform a single ADC sampling operation and return the measured value.
-  /// This function handles raw readings, calibration, and averaging as needed.
-  /// @return The sampled value as a float.
   float sample() override;
 
-#ifdef USE_ESP32
+#ifdef USE_ESP8266
-  /// Set the ADC attenuation level to adjust the input voltage range.
+  std::string unique_id() override;
-  /// This determines how the ADC interprets input voltages, allowing for greater precision
+#endif
-  /// or the ability to measure higher voltages depending on the chosen attenuation level.
-  /// @param attenuation The desired ADC attenuation level (e.g., ADC_ATTEN_DB_0, ADC_ATTEN_DB_11).
-  void set_attenuation(adc_atten_t attenuation) { this->attenuation_ = attenuation; }
-
-  /// Configure the ADC to use a specific channel on a specific ADC unit.
-  /// This sets the channel for single-shot or continuous ADC measurements.
-  /// @param unit The ADC unit to use (ADC_UNIT_1 or ADC_UNIT_2).
-  /// @param channel The ADC channel to configure, such as ADC_CHANNEL_0, ADC_CHANNEL_3, etc.
-  void set_channel(adc_unit_t unit, adc_channel_t channel) {
-    this->adc_unit_ = unit;
-    this->channel_ = channel;
-  }
-
-  /// Set whether autoranging should be enabled for the ADC.
-  /// Autoranging automatically adjusts the attenuation level to handle a wide range of input voltages.
-  /// @param autorange Boolean indicating whether to enable autoranging.
-  void set_autorange(bool autorange) { this->autorange_ = autorange; }
-#endif  // USE_ESP32
-
 #ifdef USE_RP2040
   void set_is_temperature() { this->is_temperature_ = true; }
-#endif  // USE_RP2040
+#endif
 
  protected:
-  uint8_t sample_count_{1};
-  bool output_raw_{false};
   InternalGPIOPin *pin_;
-  SamplingMode sampling_mode_{SamplingMode::AVG};
+  bool output_raw_{false};
+  uint8_t sample_count_{1};
-#ifdef USE_ESP32
-  float sample_autorange_();
-  float sample_fixed_attenuation_();
-  bool autorange_{false};
-  adc_oneshot_unit_handle_t adc_handle_{nullptr};
-  adc_cali_handle_t calibration_handle_{nullptr};
-  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
-  adc_channel_t channel_;
-  adc_unit_t adc_unit_;
-  struct SetupFlags {
-    uint8_t init_complete : 1;
-    uint8_t config_complete : 1;
-    uint8_t handle_init_complete : 1;
-    uint8_t calibration_complete : 1;
-    uint8_t reserved : 4;
-  } setup_flags_{};
-  static adc_oneshot_unit_handle_t shared_adc_handles[2];
-#endif  // USE_ESP32
-
 #ifdef USE_RP2040
   bool is_temperature_{false};
-#endif  // USE_RP2040
+#endif
+
+#ifdef USE_ESP32
+  adc_atten_t attenuation_{ADC_ATTEN_DB_0};
+  adc1_channel_t channel1_{ADC1_CHANNEL_MAX};
+  adc2_channel_t channel2_{ADC2_CHANNEL_MAX};
+  bool autorange_{false};
+#if ESP_IDF_VERSION_MAJOR >= 5
+  esp_adc_cal_characteristics_t cal_characteristics_[SOC_ADC_ATTEN_NUM] = {};
+#else
+  esp_adc_cal_characteristics_t cal_characteristics_[ADC_ATTEN_MAX] = {};
+#endif
+#endif
 };
 
 }  // namespace adc
@ -1,79 +0,0 @@
|
|||||||
#include "adc_sensor.h"
|
|
||||||
#include "esphome/core/log.h"
|
|
||||||
|
|
||||||
namespace esphome {
|
|
||||||
namespace adc {
|
|
||||||
|
|
||||||
static const char *const TAG = "adc.common";
|
|
||||||
|
|
||||||
const LogString *sampling_mode_to_str(SamplingMode mode) {
|
|
||||||
switch (mode) {
|
|
||||||
case SamplingMode::AVG:
|
|
||||||
return LOG_STR("average");
|
|
||||||
case SamplingMode::MIN:
|
|
||||||
return LOG_STR("minimum");
|
|
||||||
case SamplingMode::MAX:
|
|
||||||
return LOG_STR("maximum");
|
|
||||||
}
|
|
||||||
return LOG_STR("unknown");
|
|
||||||
}
|
|
||||||
|
|
||||||
Aggregator::Aggregator(SamplingMode mode) {
|
|
||||||
this->mode_ = mode;
|
|
||||||
// set to max uint if mode is "min"
|
|
||||||
if (mode == SamplingMode::MIN) {
|
|
||||||
this->aggr_ = UINT32_MAX;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
void Aggregator::add_sample(uint32_t value) {
|
|
||||||
this->samples_ += 1;
|
|
||||||
|
|
||||||
switch (this->mode_) {
|
|
||||||
case SamplingMode::AVG:
|
|
||||||
this->aggr_ += value;
|
|
||||||
break;
|
|
||||||
|
|
||||||
case SamplingMode::MIN:
|
|
||||||
if (value < this->aggr_) {
|
|
||||||
this->aggr_ = value;
|
|
||||||
}
|
|
||||||
break;
|
|
||||||
|
|
||||||
case SamplingMode::MAX:
|
|
||||||
if (value > this->aggr_) {
|
|
||||||
this->aggr_ = value;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
uint32_t Aggregator::aggregate() {
|
|
||||||
if (this->mode_ == SamplingMode::AVG) {
|
|
||||||
if (this->samples_ == 0) {
|
|
||||||
return this->aggr_;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (this->aggr_ + (this->samples_ >> 1)) / this->samples_; // NOLINT(clang-analyzer-core.DivideZero)
|
|
||||||
}
|
|
||||||
|
|
||||||
return this->aggr_;
|
|
||||||
}
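
`Aggregator` above folds successive raw samples into a single value as a rounded average, a minimum, or a maximum, depending on the configured sampling mode. A Python sketch with the same semantics, re-expressed here only for illustration:

```python
# Python sketch of the Aggregator semantics implemented in the C++ above
# (rounded average, minimum, or maximum of the added samples).
class Aggregator:
    def __init__(self, mode: str = "avg"):
        self.mode = mode
        self.samples = 0
        self.aggr = 2**32 - 1 if mode == "min" else 0  # "min" starts at UINT32_MAX

    def add_sample(self, value: int) -> None:
        self.samples += 1
        if self.mode == "avg":
            self.aggr += value
        elif self.mode == "min":
            self.aggr = min(self.aggr, value)
        else:  # "max"
            self.aggr = max(self.aggr, value)

    def aggregate(self) -> int:
        if self.mode == "avg" and self.samples:
            return (self.aggr + (self.samples >> 1)) // self.samples
        return self.aggr


a = Aggregator("min")
for v in (812, 790, 805):
    a.add_sample(v)
print(a.aggregate())  # 790
```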
|
|
||||||
|
|
||||||
void ADCSensor::update() {
|
|
||||||
float value_v = this->sample();
|
|
||||||
ESP_LOGV(TAG, "'%s': Voltage=%.4fV", this->get_name().c_str(), value_v);
|
|
||||||
this->publish_state(value_v);
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::set_sample_count(uint8_t sample_count) {
|
|
||||||
if (sample_count != 0) {
|
|
||||||
this->sample_count_ = sample_count;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::set_sampling_mode(SamplingMode sampling_mode) { this->sampling_mode_ = sampling_mode; }
|
|
||||||
|
|
||||||
float ADCSensor::get_setup_priority() const { return setup_priority::DATA; }
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
@ -1,346 +0,0 @@
|
|||||||
#ifdef USE_ESP32
|
|
||||||
|
|
||||||
#include "adc_sensor.h"
|
|
||||||
#include "esphome/core/log.h"
|
|
||||||
|
|
||||||
namespace esphome {
|
|
||||||
namespace adc {
|
|
||||||
|
|
||||||
static const char *const TAG = "adc.esp32";
|
|
||||||
|
|
||||||
adc_oneshot_unit_handle_t ADCSensor::shared_adc_handles[2] = {nullptr, nullptr};
|
|
||||||
|
|
||||||
const LogString *attenuation_to_str(adc_atten_t attenuation) {
|
|
||||||
switch (attenuation) {
|
|
||||||
case ADC_ATTEN_DB_0:
|
|
||||||
return LOG_STR("0 dB");
|
|
||||||
case ADC_ATTEN_DB_2_5:
|
|
||||||
return LOG_STR("2.5 dB");
|
|
||||||
case ADC_ATTEN_DB_6:
|
|
||||||
return LOG_STR("6 dB");
|
|
||||||
case ADC_ATTEN_DB_12_COMPAT:
|
|
||||||
return LOG_STR("12 dB");
|
|
||||||
default:
|
|
||||||
return LOG_STR("Unknown Attenuation");
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const LogString *adc_unit_to_str(adc_unit_t unit) {
|
|
||||||
switch (unit) {
|
|
||||||
case ADC_UNIT_1:
|
|
||||||
return LOG_STR("ADC1");
|
|
||||||
case ADC_UNIT_2:
|
|
||||||
return LOG_STR("ADC2");
|
|
||||||
default:
|
|
||||||
return LOG_STR("Unknown ADC Unit");
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::setup() {
|
|
||||||
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
|
|
||||||
// Check if another sensor already initialized this ADC unit
|
|
||||||
if (ADCSensor::shared_adc_handles[this->adc_unit_] == nullptr) {
|
|
||||||
adc_oneshot_unit_init_cfg_t init_config = {}; // Zero initialize
|
|
||||||
init_config.unit_id = this->adc_unit_;
|
|
||||||
init_config.ulp_mode = ADC_ULP_MODE_DISABLE;
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
init_config.clk_src = ADC_DIGI_CLK_SRC_DEFAULT;
|
|
||||||
#endif // USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 ||
|
|
||||||
// USE_ESP32_VARIANT_ESP32H2
|
|
||||||
esp_err_t err = adc_oneshot_new_unit(&init_config, &ADCSensor::shared_adc_handles[this->adc_unit_]);
|
|
||||||
if (err != ESP_OK) {
|
|
||||||
ESP_LOGE(TAG, "Error initializing %s: %d", LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)), err);
|
|
||||||
this->mark_failed();
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
this->adc_handle_ = ADCSensor::shared_adc_handles[this->adc_unit_];
|
|
||||||
|
|
||||||
this->setup_flags_.handle_init_complete = true;
|
|
||||||
|
|
||||||
adc_oneshot_chan_cfg_t config = {
|
|
||||||
.atten = this->attenuation_,
|
|
||||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
|
||||||
};
|
|
||||||
esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
|
|
||||||
if (err != ESP_OK) {
|
|
||||||
ESP_LOGE(TAG, "Error configuring channel: %d", err);
|
|
||||||
this->mark_failed();
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
this->setup_flags_.config_complete = true;
|
|
||||||
|
|
||||||
// Initialize ADC calibration
|
|
||||||
if (this->calibration_handle_ == nullptr) {
|
|
||||||
adc_cali_handle_t handle = nullptr;
|
|
||||||
esp_err_t err;
|
|
||||||
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
// RISC-V variants and S3 use curve fitting calibration
|
|
||||||
adc_cali_curve_fitting_config_t cali_config = {}; // Zero initialize first
|
|
||||||
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
|
|
||||||
cali_config.chan = this->channel_;
|
|
||||||
#endif // ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
|
|
||||||
cali_config.unit_id = this->adc_unit_;
|
|
||||||
cali_config.atten = this->attenuation_;
|
|
||||||
cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
|
|
||||||
|
|
||||||
err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
|
|
||||||
if (err == ESP_OK) {
|
|
||||||
this->calibration_handle_ = handle;
|
|
||||||
this->setup_flags_.calibration_complete = true;
|
|
||||||
ESP_LOGV(TAG, "Using curve fitting calibration");
|
|
||||||
} else {
|
|
||||||
ESP_LOGW(TAG, "Curve fitting calibration failed with error %d, will use uncalibrated readings", err);
|
|
||||||
this->setup_flags_.calibration_complete = false;
|
|
||||||
}
|
|
||||||
#else // Other ESP32 variants use line fitting calibration
|
|
||||||
adc_cali_line_fitting_config_t cali_config = {
|
|
||||||
.unit_id = this->adc_unit_,
|
|
||||||
.atten = this->attenuation_,
|
|
||||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
|
||||||
#if !defined(USE_ESP32_VARIANT_ESP32S2)
|
|
||||||
.default_vref = 1100, // Default reference voltage in mV
|
|
||||||
#endif // !defined(USE_ESP32_VARIANT_ESP32S2)
|
|
||||||
};
|
|
||||||
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
|
|
||||||
if (err == ESP_OK) {
|
|
||||||
this->calibration_handle_ = handle;
|
|
||||||
this->setup_flags_.calibration_complete = true;
|
|
||||||
ESP_LOGV(TAG, "Using line fitting calibration");
|
|
||||||
} else {
|
|
||||||
ESP_LOGW(TAG, "Line fitting calibration failed with error %d, will use uncalibrated readings", err);
|
|
||||||
this->setup_flags_.calibration_complete = false;
|
|
||||||
}
|
|
||||||
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
|
|
||||||
}
|
|
||||||
|
|
||||||
this->setup_flags_.init_complete = true;
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::dump_config() {
|
|
||||||
LOG_SENSOR("", "ADC Sensor", this);
|
|
||||||
LOG_PIN(" Pin: ", this->pin_);
|
|
||||||
ESP_LOGCONFIG(TAG,
|
|
||||||
" Channel: %d\n"
|
|
||||||
" Unit: %s\n"
|
|
||||||
" Attenuation: %s\n"
|
|
||||||
" Samples: %i\n"
|
|
||||||
" Sampling mode: %s",
|
|
||||||
this->channel_, LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)),
|
|
||||||
this->autorange_ ? "Auto" : LOG_STR_ARG(attenuation_to_str(this->attenuation_)), this->sample_count_,
|
|
||||||
LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
|
|
||||||
|
|
||||||
ESP_LOGCONFIG(
|
|
||||||
TAG,
|
|
||||||
" Setup Status:\n"
|
|
||||||
" Handle Init: %s\n"
|
|
||||||
" Config: %s\n"
|
|
||||||
" Calibration: %s\n"
|
|
||||||
" Overall Init: %s",
|
|
||||||
this->setup_flags_.handle_init_complete ? "OK" : "FAILED", this->setup_flags_.config_complete ? "OK" : "FAILED",
|
|
||||||
this->setup_flags_.calibration_complete ? "OK" : "FAILED", this->setup_flags_.init_complete ? "OK" : "FAILED");
|
|
||||||
|
|
||||||
LOG_UPDATE_INTERVAL(this);
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample() {
|
|
||||||
if (this->autorange_) {
|
|
||||||
return this->sample_autorange_();
|
|
||||||
} else {
|
|
||||||
return this->sample_fixed_attenuation_();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample_fixed_attenuation_() {
|
|
||||||
auto aggr = Aggregator(this->sampling_mode_);
|
|
||||||
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
int raw;
|
|
||||||
esp_err_t err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
|
|
||||||
|
|
||||||
if (err != ESP_OK) {
|
|
||||||
ESP_LOGW(TAG, "ADC read failed with error %d", err);
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (raw == -1) {
|
|
||||||
ESP_LOGW(TAG, "Invalid ADC reading");
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
|
|
||||||
uint32_t final_value = aggr.aggregate();
|
|
||||||
|
|
||||||
if (this->output_raw_) {
|
|
||||||
return final_value;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (this->calibration_handle_ != nullptr) {
|
|
||||||
int voltage_mv;
|
|
||||||
esp_err_t err = adc_cali_raw_to_voltage(this->calibration_handle_, final_value, &voltage_mv);
|
|
||||||
if (err == ESP_OK) {
|
|
||||||
return voltage_mv / 1000.0f;
|
|
||||||
} else {
|
|
||||||
ESP_LOGW(TAG, "ADC calibration conversion failed with error %d, disabling calibration", err);
|
|
||||||
if (this->calibration_handle_ != nullptr) {
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
|
|
||||||
#else // Other ESP32 variants use line fitting calibration
|
|
||||||
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
|
|
||||||
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32S3 || ESP32H2
|
|
||||||
this->calibration_handle_ = nullptr;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return final_value * 3.3f / 4095.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample_autorange_() {
|
|
||||||
// Auto-range mode
|
|
||||||
auto read_atten = [this](adc_atten_t atten) -> std::pair<int, float> {
|
|
||||||
// First reconfigure the attenuation for this reading
|
|
||||||
adc_oneshot_chan_cfg_t config = {
|
|
||||||
.atten = atten,
|
|
||||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
|
||||||
};
|
|
||||||
|
|
||||||
esp_err_t err = adc_oneshot_config_channel(this->adc_handle_, this->channel_, &config);
|
|
||||||
|
|
||||||
if (err != ESP_OK) {
|
|
||||||
ESP_LOGW(TAG, "Error configuring ADC channel for autorange: %d", err);
|
|
||||||
return {-1, 0.0f};
|
|
||||||
}
|
|
||||||
|
|
||||||
// Need to recalibrate for the new attenuation
|
|
||||||
if (this->calibration_handle_ != nullptr) {
|
|
||||||
// Delete old calibration handle
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
|
|
||||||
#else
|
|
||||||
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
|
|
||||||
#endif
|
|
||||||
this->calibration_handle_ = nullptr;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Create new calibration handle for this attenuation
|
|
||||||
adc_cali_handle_t handle = nullptr;
|
|
||||||
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
adc_cali_curve_fitting_config_t cali_config = {};
|
|
||||||
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
|
|
||||||
cali_config.chan = this->channel_;
|
|
||||||
#endif
|
|
||||||
cali_config.unit_id = this->adc_unit_;
|
|
||||||
cali_config.atten = atten;
|
|
||||||
cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
|
|
||||||
|
|
||||||
err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
|
|
||||||
#else
|
|
||||||
adc_cali_line_fitting_config_t cali_config = {
|
|
||||||
.unit_id = this->adc_unit_,
|
|
||||||
.atten = atten,
|
|
||||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
|
||||||
#if !defined(USE_ESP32_VARIANT_ESP32S2)
|
|
||||||
.default_vref = 1100,
|
|
||||||
#endif
|
|
||||||
};
|
|
||||||
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
|
|
||||||
#endif
|
|
||||||
|
|
||||||
int raw;
|
|
||||||
err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
|
|
||||||
|
|
||||||
if (err != ESP_OK) {
|
|
||||||
ESP_LOGW(TAG, "ADC read failed in autorange with error %d", err);
|
|
||||||
if (handle != nullptr) {
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
adc_cali_delete_scheme_curve_fitting(handle);
|
|
||||||
#else
|
|
||||||
adc_cali_delete_scheme_line_fitting(handle);
|
|
||||||
#endif
|
|
||||||
}
|
|
||||||
return {-1, 0.0f};
|
|
||||||
}
|
|
||||||
|
|
||||||
float voltage = 0.0f;
|
|
||||||
if (handle != nullptr) {
|
|
||||||
int voltage_mv;
|
|
||||||
err = adc_cali_raw_to_voltage(handle, raw, &voltage_mv);
|
|
||||||
if (err == ESP_OK) {
|
|
||||||
voltage = voltage_mv / 1000.0f;
|
|
||||||
} else {
|
|
||||||
voltage = raw * 3.3f / 4095.0f;
|
|
||||||
}
|
|
||||||
// Clean up calibration handle
|
|
||||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
|
||||||
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
|
|
||||||
adc_cali_delete_scheme_curve_fitting(handle);
|
|
||||||
#else
|
|
||||||
adc_cali_delete_scheme_line_fitting(handle);
|
|
||||||
#endif
|
|
||||||
} else {
|
|
||||||
voltage = raw * 3.3f / 4095.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
return {raw, voltage};
|
|
||||||
};
|
|
||||||
|
|
||||||
auto [raw12, mv12] = read_atten(ADC_ATTEN_DB_12);
|
|
||||||
if (raw12 == -1) {
|
|
||||||
ESP_LOGE(TAG, "Failed to read ADC in autorange mode");
|
|
||||||
return NAN;
|
|
||||||
}
|
|
||||||
|
|
||||||
int raw6 = 4095, raw2 = 4095, raw0 = 4095;
|
|
||||||
float mv6 = 0, mv2 = 0, mv0 = 0;
|
|
||||||
|
|
||||||
if (raw12 < 4095) {
|
|
||||||
auto [raw6_val, mv6_val] = read_atten(ADC_ATTEN_DB_6);
|
|
||||||
raw6 = raw6_val;
|
|
||||||
mv6 = mv6_val;
|
|
||||||
|
|
||||||
if (raw6 < 4095 && raw6 != -1) {
|
|
||||||
auto [raw2_val, mv2_val] = read_atten(ADC_ATTEN_DB_2_5);
|
|
||||||
raw2 = raw2_val;
|
|
||||||
mv2 = mv2_val;
|
|
||||||
|
|
||||||
if (raw2 < 4095 && raw2 != -1) {
|
|
||||||
auto [raw0_val, mv0_val] = read_atten(ADC_ATTEN_DB_0);
|
|
||||||
raw0 = raw0_val;
|
|
||||||
mv0 = mv0_val;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (raw0 == -1 || raw2 == -1 || raw6 == -1 || raw12 == -1) {
|
|
||||||
return NAN;
|
|
||||||
}
|
|
||||||
|
|
||||||
const int adc_half = 2048;
|
|
||||||
uint32_t c12 = std::min(raw12, adc_half);
|
|
||||||
uint32_t c6 = adc_half - std::abs(raw6 - adc_half);
|
|
||||||
uint32_t c2 = adc_half - std::abs(raw2 - adc_half);
|
|
||||||
uint32_t c0 = std::min(4095 - raw0, adc_half);
|
|
||||||
uint32_t csum = c12 + c6 + c2 + c0;
|
|
||||||
|
|
||||||
if (csum == 0) {
|
|
||||||
ESP_LOGE(TAG, "Invalid weight sum in autorange calculation");
|
|
||||||
return NAN;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum;
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
||||||
|
|
||||||
#endif // USE_ESP32
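
`ADCSensor::setup()` in the listing above creates the one-shot driver handle only for the first sensor on a given ADC unit and stores it in `shared_adc_handles`, so later sensors on the same unit reuse it. A sketch of that per-unit sharing pattern; the factory callback stands in for `adc_oneshot_new_unit()`, and none of these names are ESP-IDF APIs:

```python
# Sketch of the per-unit handle sharing used by ADCSensor::setup() above: the
# first sensor on an ADC unit creates the driver handle, later sensors on the
# same unit get the cached one. Purely illustrative.
_shared_handles: dict[int, object] = {}


def get_unit_handle(unit: int, create_unit_handle=lambda u: object()):
    if unit not in _shared_handles:
        _shared_handles[unit] = create_unit_handle(unit)  # first sensor initializes
    return _shared_handles[unit]


h1 = get_unit_handle(0)
h2 = get_unit_handle(0)
print(h1 is h2)  # True: both sensors share one ADC1 handle
```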
|
|
@ -1,62 +0,0 @@
|
|||||||
#ifdef USE_ESP8266
|
|
||||||
|
|
||||||
#include "adc_sensor.h"
|
|
||||||
#include "esphome/core/helpers.h"
|
|
||||||
#include "esphome/core/log.h"
|
|
||||||
|
|
||||||
#ifdef USE_ADC_SENSOR_VCC
|
|
||||||
#include <Esp.h>
|
|
||||||
ADC_MODE(ADC_VCC)
|
|
||||||
#else
|
|
||||||
#include <Arduino.h>
|
|
||||||
#endif // USE_ADC_SENSOR_VCC
|
|
||||||
|
|
||||||
namespace esphome {
|
|
||||||
namespace adc {
|
|
||||||
|
|
||||||
static const char *const TAG = "adc.esp8266";
|
|
||||||
|
|
||||||
void ADCSensor::setup() {
|
|
||||||
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
|
|
||||||
#ifndef USE_ADC_SENSOR_VCC
|
|
||||||
this->pin_->setup();
|
|
||||||
#endif
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::dump_config() {
|
|
||||||
LOG_SENSOR("", "ADC Sensor", this);
|
|
||||||
#ifdef USE_ADC_SENSOR_VCC
|
|
||||||
ESP_LOGCONFIG(TAG, " Pin: VCC");
|
|
||||||
#else
|
|
||||||
LOG_PIN(" Pin: ", this->pin_);
|
|
||||||
#endif // USE_ADC_SENSOR_VCC
|
|
||||||
ESP_LOGCONFIG(TAG,
|
|
||||||
" Samples: %i\n"
|
|
||||||
" Sampling mode: %s",
|
|
||||||
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
|
|
||||||
LOG_UPDATE_INTERVAL(this);
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample() {
|
|
||||||
auto aggr = Aggregator(this->sampling_mode_);
|
|
||||||
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
uint32_t raw = 0;
|
|
||||||
#ifdef USE_ADC_SENSOR_VCC
|
|
||||||
raw = ESP.getVcc(); // NOLINT(readability-static-accessed-through-instance)
|
|
||||||
#else
|
|
||||||
raw = analogRead(this->pin_->get_pin()); // NOLINT
|
|
||||||
#endif // USE_ADC_SENSOR_VCC
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (this->output_raw_) {
|
|
||||||
return aggr.aggregate();
|
|
||||||
}
|
|
||||||
return aggr.aggregate() / 1024.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
||||||
|
|
||||||
#endif // USE_ESP8266
|
|
@ -1,55 +0,0 @@
|
|||||||
#ifdef USE_LIBRETINY
|
|
||||||
|
|
||||||
#include "adc_sensor.h"
|
|
||||||
#include "esphome/core/log.h"
|
|
||||||
|
|
||||||
namespace esphome {
|
|
||||||
namespace adc {
|
|
||||||
|
|
||||||
static const char *const TAG = "adc.libretiny";
|
|
||||||
|
|
||||||
void ADCSensor::setup() {
|
|
||||||
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
|
|
||||||
#ifndef USE_ADC_SENSOR_VCC
|
|
||||||
this->pin_->setup();
|
|
||||||
#endif // !USE_ADC_SENSOR_VCC
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::dump_config() {
|
|
||||||
LOG_SENSOR("", "ADC Sensor", this);
|
|
||||||
#ifdef USE_ADC_SENSOR_VCC
|
|
||||||
ESP_LOGCONFIG(TAG, " Pin: VCC");
|
|
||||||
#else // USE_ADC_SENSOR_VCC
|
|
||||||
LOG_PIN(" Pin: ", this->pin_);
|
|
||||||
#endif // USE_ADC_SENSOR_VCC
|
|
||||||
ESP_LOGCONFIG(TAG,
|
|
||||||
" Samples: %i\n"
|
|
||||||
" Sampling mode: %s",
|
|
||||||
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
|
|
||||||
LOG_UPDATE_INTERVAL(this);
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample() {
|
|
||||||
uint32_t raw = 0;
|
|
||||||
auto aggr = Aggregator(this->sampling_mode_);
|
|
||||||
|
|
||||||
if (this->output_raw_) {
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
raw = analogRead(this->pin_->get_pin()); // NOLINT
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
return aggr.aggregate();
|
|
||||||
}
|
|
||||||
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
raw = analogReadVoltage(this->pin_->get_pin()); // NOLINT
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
|
|
||||||
return aggr.aggregate() / 1000.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
||||||
|
|
||||||
#endif // USE_LIBRETINY
|
|
@ -1,98 +0,0 @@
|
|||||||
#ifdef USE_RP2040
|
|
||||||
|
|
||||||
#include "adc_sensor.h"
|
|
||||||
#include "esphome/core/log.h"
|
|
||||||
|
|
||||||
#ifdef CYW43_USES_VSYS_PIN
|
|
||||||
#include "pico/cyw43_arch.h"
|
|
||||||
#endif // CYW43_USES_VSYS_PIN
|
|
||||||
#include <hardware/adc.h>
|
|
||||||
|
|
||||||
namespace esphome {
|
|
||||||
namespace adc {
|
|
||||||
|
|
||||||
static const char *const TAG = "adc.rp2040";
|
|
||||||
|
|
||||||
void ADCSensor::setup() {
|
|
||||||
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
|
|
||||||
static bool initialized = false;
|
|
||||||
if (!initialized) {
|
|
||||||
adc_init();
|
|
||||||
initialized = true;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
void ADCSensor::dump_config() {
|
|
||||||
LOG_SENSOR("", "ADC Sensor", this);
|
|
||||||
if (this->is_temperature_) {
|
|
||||||
ESP_LOGCONFIG(TAG, " Pin: Temperature");
|
|
||||||
} else {
|
|
||||||
#ifdef USE_ADC_SENSOR_VCC
|
|
||||||
ESP_LOGCONFIG(TAG, " Pin: VCC");
|
|
||||||
#else
|
|
||||||
LOG_PIN(" Pin: ", this->pin_);
|
|
||||||
#endif // USE_ADC_SENSOR_VCC
|
|
||||||
}
|
|
||||||
ESP_LOGCONFIG(TAG,
|
|
||||||
" Samples: %i\n"
|
|
||||||
" Sampling mode: %s",
|
|
||||||
this->sample_count_, LOG_STR_ARG(sampling_mode_to_str(this->sampling_mode_)));
|
|
||||||
LOG_UPDATE_INTERVAL(this);
|
|
||||||
}
|
|
||||||
|
|
||||||
float ADCSensor::sample() {
|
|
||||||
uint32_t raw = 0;
|
|
||||||
auto aggr = Aggregator(this->sampling_mode_);
|
|
||||||
|
|
||||||
if (this->is_temperature_) {
|
|
||||||
adc_set_temp_sensor_enabled(true);
|
|
||||||
delay(1);
|
|
||||||
adc_select_input(4);
|
|
||||||
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
raw = adc_read();
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
adc_set_temp_sensor_enabled(false);
|
|
||||||
if (this->output_raw_) {
|
|
||||||
return aggr.aggregate();
|
|
||||||
}
|
|
||||||
return aggr.aggregate() * 3.3f / 4096.0f;
|
|
||||||
}
|
|
||||||
|
|
||||||
uint8_t pin = this->pin_->get_pin();
|
|
||||||
#ifdef CYW43_USES_VSYS_PIN
|
|
||||||
if (pin == PICO_VSYS_PIN) {
|
|
||||||
// Measuring VSYS on Raspberry Pico W needs to be wrapped with
|
|
||||||
// `cyw43_thread_enter()`/`cyw43_thread_exit()` as discussed in
|
|
||||||
// https://github.com/raspberrypi/pico-sdk/issues/1222, since Wifi chip and
|
|
||||||
// VSYS ADC both share GPIO29
|
|
||||||
cyw43_thread_enter();
|
|
||||||
}
|
|
||||||
#endif // CYW43_USES_VSYS_PIN
|
|
||||||
|
|
||||||
adc_gpio_init(pin);
|
|
||||||
adc_select_input(pin - 26);
|
|
||||||
|
|
||||||
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
|
|
||||||
raw = adc_read();
|
|
||||||
aggr.add_sample(raw);
|
|
||||||
}
|
|
||||||
|
|
||||||
#ifdef CYW43_USES_VSYS_PIN
|
|
||||||
if (pin == PICO_VSYS_PIN) {
|
|
||||||
cyw43_thread_exit();
|
|
||||||
}
|
|
||||||
#endif // CYW43_USES_VSYS_PIN
|
|
||||||
|
|
||||||
if (this->output_raw_) {
|
|
||||||
return aggr.aggregate();
|
|
||||||
}
|
|
||||||
float coeff = pin == PICO_VSYS_PIN ? 3.0f : 1.0f;
|
|
||||||
return aggr.aggregate() * 3.3f / 4096.0f * coeff;
|
|
||||||
}
|
|
||||||
|
|
||||||
} // namespace adc
|
|
||||||
} // namespace esphome
|
|
||||||
|
|
||||||
#endif // USE_RP2040
|
|
@@ -1,28 +1,27 @@
 import logging
 
 import esphome.codegen as cg
+import esphome.config_validation as cv
+import esphome.final_validate as fv
+from esphome.core import CORE
 from esphome.components import sensor, voltage_sampler
 from esphome.components.esp32 import get_esp32_variant
-import esphome.config_validation as cv
 from esphome.const import (
     CONF_ATTENUATION,
     CONF_ID,
     CONF_NUMBER,
     CONF_PIN,
     CONF_RAW,
+    CONF_WIFI,
     DEVICE_CLASS_VOLTAGE,
     STATE_CLASS_MEASUREMENT,
     UNIT_VOLT,
 )
-from esphome.core import CORE
 
 from . import (
     ATTENUATION_MODES,
     ESP32_VARIANT_ADC1_PIN_TO_CHANNEL,
     ESP32_VARIANT_ADC2_PIN_TO_CHANNEL,
-    SAMPLING_MODES,
     adc_ns,
-    adc_unit_t,
     validate_adc_pin,
 )
 
@@ -31,11 +30,9 @@ _LOGGER = logging.getLogger(__name__)
 AUTO_LOAD = ["voltage_sampler"]
 
 CONF_SAMPLES = "samples"
-CONF_SAMPLING_MODE = "sampling_mode"
 
 
 _attenuation = cv.enum(ATTENUATION_MODES, lower=True)
-_sampling_mode = cv.enum(SAMPLING_MODES, lower=True)
 
 
 def validate_config(config):
@@ -56,6 +53,21 @@ def validate_config(config):
     return config
 
 
+def final_validate_config(config):
+    if CORE.is_esp32:
+        variant = get_esp32_variant()
+        if (
+            CONF_WIFI in fv.full_config.get()
+            and config[CONF_PIN][CONF_NUMBER]
+            in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
+        ):
+            raise cv.Invalid(
+                f"{variant} doesn't support ADC on this pin when Wi-Fi is configured"
+            )
+
+    return config
 
 
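
`final_validate_config` above rejects a configuration that places the sensor on an ADC2-capable pin while Wi-Fi is also configured, since ADC2 is shared with the radio on these chips. A standalone sketch of the same check; the dict literal and function name are placeholders for illustration, not ESPHome APIs:

```python
# Standalone sketch of the ADC2-vs-Wi-Fi conflict check performed by
# final_validate_config above. Pin sets and names are illustrative.
ADC2_PINS = {"ESP32": {4, 0, 2, 15, 13, 12, 14, 27, 25, 26}}


def check_adc2_wifi_conflict(variant: str, pin: int, wifi_configured: bool) -> None:
    if wifi_configured and pin in ADC2_PINS.get(variant, set()):
        raise ValueError(
            f"{variant} doesn't support ADC on GPIO{pin} when Wi-Fi is configured"
        )


check_adc2_wifi_conflict("ESP32", 36, wifi_configured=True)  # OK: GPIO36 is ADC1
# check_adc2_wifi_conflict("ESP32", 25, wifi_configured=True)  # would raise
```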
 ADCSensor = adc_ns.class_(
     "ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
 )
@@ -76,13 +88,14 @@ CONFIG_SCHEMA = cv.All(
                 cv.only_on_esp32, _attenuation
             ),
             cv.Optional(CONF_SAMPLES, default=1): cv.int_range(min=1, max=255),
-            cv.Optional(CONF_SAMPLING_MODE, default="avg"): _sampling_mode,
         }
     )
     .extend(cv.polling_component_schema("60s")),
     validate_config,
 )
 
+FINAL_VALIDATE_SCHEMA = final_validate_config
 
 
 async def to_code(config):
     var = cg.new_Pvariable(config[CONF_ID])
@@ -99,15 +112,14 @@ async def to_code(config):
 
     cg.add(var.set_output_raw(config[CONF_RAW]))
     cg.add(var.set_sample_count(config[CONF_SAMPLES]))
-    cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
+    if attenuation := config.get(CONF_ATTENUATION):
+        if attenuation == "auto":
+            cg.add(var.set_autorange(cg.global_ns.true))
+        else:
+            cg.add(var.set_attenuation(attenuation))
+
     if CORE.is_esp32:
-        if attenuation := config.get(CONF_ATTENUATION):
-            if attenuation == "auto":
-                cg.add(var.set_autorange(cg.global_ns.true))
-            else:
-                cg.add(var.set_attenuation(attenuation))
-
         variant = get_esp32_variant()
         pin_num = config[CONF_PIN][CONF_NUMBER]
         if (
@@ -115,10 +127,10 @@ async def to_code(config):
             and pin_num in ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant]
         ):
             chan = ESP32_VARIANT_ADC1_PIN_TO_CHANNEL[variant][pin_num]
-            cg.add(var.set_channel(adc_unit_t.ADC_UNIT_1, chan))
+            cg.add(var.set_channel1(chan))
         elif (
             variant in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL
             and pin_num in ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant]
         ):
             chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
-            cg.add(var.set_channel(adc_unit_t.ADC_UNIT_2, chan))
+            cg.add(var.set_channel2(chan))
@@ -1,6 +1,6 @@
 import esphome.codegen as cg
-from esphome.components import spi
 import esphome.config_validation as cv
+from esphome.components import spi
 from esphome.const import CONF_ID
 
 DEPENDENCIES = ["spi"]
@@ -9,7 +9,7 @@ static const char *const TAG = "adc128s102";
 float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
 
 void ADC128S102::setup() {
-  ESP_LOGCONFIG(TAG, "Running setup");
+  ESP_LOGCONFIG(TAG, "Setting up adc128s102");
   this->spi_setup();
 }
 
@@ -1,9 +1,9 @@
 import esphome.codegen as cg
-from esphome.components import sensor, voltage_sampler
 import esphome.config_validation as cv
-from esphome.const import CONF_CHANNEL, CONF_ID
+from esphome.components import sensor, voltage_sampler
+from esphome.const import CONF_ID, CONF_CHANNEL
 
-from .. import ADC128S102, adc128s102_ns
+from .. import adc128s102_ns, ADC128S102
 
 AUTO_LOAD = ["voltage_sampler"]
 DEPENDENCIES = ["adc128s102"]
@@ -1,15 +1,15 @@
 import esphome.codegen as cg
-from esphome.components import display, light
 import esphome.config_validation as cv
+from esphome.components import display, light
 from esphome.const import (
-    CONF_ADDRESSABLE_LIGHT_ID,
-    CONF_HEIGHT,
     CONF_ID,
     CONF_LAMBDA,
     CONF_PAGES,
-    CONF_PIXEL_MAPPER,
+    CONF_ADDRESSABLE_LIGHT_ID,
-    CONF_UPDATE_INTERVAL,
+    CONF_HEIGHT,
     CONF_WIDTH,
+    CONF_UPDATE_INTERVAL,
+    CONF_PIXEL_MAPPER,
 )
 
 CODEOWNERS = ["@justfalter"]
@@ -177,14 +177,11 @@ void ADE7880::dump_config() {
   LOG_SENSOR(" ", "Power Factor", this->channel_a_->power_factor);
   LOG_SENSOR(" ", "Forward Active Energy", this->channel_a_->forward_active_energy);
   LOG_SENSOR(" ", "Reverse Active Energy", this->channel_a_->reverse_active_energy);
-  ESP_LOGCONFIG(TAG,
-                " Calibration:\n"
-                " Current: %" PRId32 "\n"
-                " Voltage: %" PRId32 "\n"
-                " Power: %" PRId32 "\n"
-                " Phase Angle: %u",
-                this->channel_a_->current_gain_calibration, this->channel_a_->voltage_gain_calibration,
-                this->channel_a_->power_gain_calibration, this->channel_a_->phase_angle_calibration);
+  ESP_LOGCONFIG(TAG, " Calibration:");
+  ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_a_->current_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_a_->voltage_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_a_->power_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_a_->phase_angle_calibration);
   }
 
   if (this->channel_b_ != nullptr) {
@@ -196,14 +193,11 @@ void ADE7880::dump_config() {
   LOG_SENSOR(" ", "Power Factor", this->channel_b_->power_factor);
   LOG_SENSOR(" ", "Forward Active Energy", this->channel_b_->forward_active_energy);
   LOG_SENSOR(" ", "Reverse Active Energy", this->channel_b_->reverse_active_energy);
-  ESP_LOGCONFIG(TAG,
-                " Calibration:\n"
-                " Current: %" PRId32 "\n"
-                " Voltage: %" PRId32 "\n"
-                " Power: %" PRId32 "\n"
-                " Phase Angle: %u",
-                this->channel_b_->current_gain_calibration, this->channel_b_->voltage_gain_calibration,
-                this->channel_b_->power_gain_calibration, this->channel_b_->phase_angle_calibration);
+  ESP_LOGCONFIG(TAG, " Calibration:");
+  ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_b_->current_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_b_->voltage_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_b_->power_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_b_->phase_angle_calibration);
   }
 
   if (this->channel_c_ != nullptr) {
@@ -215,23 +209,18 @@ void ADE7880::dump_config() {
   LOG_SENSOR(" ", "Power Factor", this->channel_c_->power_factor);
   LOG_SENSOR(" ", "Forward Active Energy", this->channel_c_->forward_active_energy);
   LOG_SENSOR(" ", "Reverse Active Energy", this->channel_c_->reverse_active_energy);
-  ESP_LOGCONFIG(TAG,
-                " Calibration:\n"
-                " Current: %" PRId32 "\n"
-                " Voltage: %" PRId32 "\n"
-                " Power: %" PRId32 "\n"
-                " Phase Angle: %u",
-                this->channel_c_->current_gain_calibration, this->channel_c_->voltage_gain_calibration,
-                this->channel_c_->power_gain_calibration, this->channel_c_->phase_angle_calibration);
+  ESP_LOGCONFIG(TAG, " Calibration:");
+  ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_c_->current_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Voltage: %" PRId32, this->channel_c_->voltage_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Power: %" PRId32, this->channel_c_->power_gain_calibration);
+  ESP_LOGCONFIG(TAG, " Phase Angle: %u", this->channel_c_->phase_angle_calibration);
   }
 
   if (this->channel_n_ != nullptr) {
     ESP_LOGCONFIG(TAG, " Neutral:");
     LOG_SENSOR(" ", "Current", this->channel_n_->current);
-    ESP_LOGCONFIG(TAG,
-                  " Calibration:\n"
-                  " Current: %" PRId32,
-                  this->channel_n_->current_gain_calibration);
+    ESP_LOGCONFIG(TAG, " Calibration:");
+    ESP_LOGCONFIG(TAG, " Current: %" PRId32, this->channel_n_->current_gain_calibration);
   }
 
   LOG_I2C_DEVICE(this);
@@ -85,6 +85,8 @@ class ADE7880 : public i2c::I2CDevice, public PollingComponent {
 
   void dump_config() override;
 
+  float get_setup_priority() const override { return setup_priority::DATA; }
+
  protected:
   ADE7880Store store_{};
   InternalGPIOPin *irq0_pin_{nullptr};
@@ -1,7 +1,7 @@
-from esphome import pins
 import esphome.codegen as cg
-from esphome.components import i2c, sensor
 import esphome.config_validation as cv
+from esphome.components import sensor, i2c
+from esphome import pins
 from esphome.const import (
     CONF_ACTIVE_POWER,
     CONF_APPARENT_POWER,
@@ -1,27 +1,27 @@
-from esphome import pins
 import esphome.codegen as cg
-from esphome.components import sensor
 import esphome.config_validation as cv
+from esphome.components import sensor
+from esphome import pins
 from esphome.const import (
-    CONF_FREQUENCY,
     CONF_IRQ_PIN,
     CONF_VOLTAGE,
+    CONF_FREQUENCY,
     CONF_VOLTAGE_GAIN,
-    DEVICE_CLASS_APPARENT_POWER,
     DEVICE_CLASS_CURRENT,
-    DEVICE_CLASS_FREQUENCY,
+    DEVICE_CLASS_APPARENT_POWER,
     DEVICE_CLASS_POWER,
-    DEVICE_CLASS_POWER_FACTOR,
     DEVICE_CLASS_REACTIVE_POWER,
+    DEVICE_CLASS_POWER_FACTOR,
     DEVICE_CLASS_VOLTAGE,
+    DEVICE_CLASS_FREQUENCY,
     STATE_CLASS_MEASUREMENT,
-    UNIT_AMPERE,
-    UNIT_HERTZ,
-    UNIT_PERCENT,
     UNIT_VOLT,
+    UNIT_HERTZ,
+    UNIT_AMPERE,
     UNIT_VOLT_AMPS,
-    UNIT_VOLT_AMPS_REACTIVE,
     UNIT_WATT,
+    UNIT_VOLT_AMPS_REACTIVE,
+    UNIT_PERCENT,
 )
 
 CONF_CURRENT_A = "current_a"
@@ -58,18 +58,15 @@ void ADE7953::dump_config() {
   LOG_SENSOR(" ", "Active Power B Sensor", this->active_power_b_sensor_);
   LOG_SENSOR(" ", "Rective Power A Sensor", this->reactive_power_a_sensor_);
   LOG_SENSOR(" ", "Reactive Power B Sensor", this->reactive_power_b_sensor_);
-  ESP_LOGCONFIG(TAG,
-                " USE_ACC_ENERGY_REGS: %d\n"
-                " PGA_V_8: 0x%X\n"
-                " PGA_IA_8: 0x%X\n"
-                " PGA_IB_8: 0x%X\n"
-                " VGAIN_32: 0x%08jX\n"
-                " AIGAIN_32: 0x%08jX\n"
-                " BIGAIN_32: 0x%08jX\n"
-                " AWGAIN_32: 0x%08jX\n"
-                " BWGAIN_32: 0x%08jX",
-                this->use_acc_energy_regs_, pga_v_, pga_ia_, pga_ib_, (uintmax_t) vgain_, (uintmax_t) aigain_,
-                (uintmax_t) bigain_, (uintmax_t) awgain_, (uintmax_t) bwgain_);
+  ESP_LOGCONFIG(TAG, " USE_ACC_ENERGY_REGS: %d", this->use_acc_energy_regs_);
+  ESP_LOGCONFIG(TAG, " PGA_V_8: 0x%X", pga_v_);
+  ESP_LOGCONFIG(TAG, " PGA_IA_8: 0x%X", pga_ia_);
+  ESP_LOGCONFIG(TAG, " PGA_IB_8: 0x%X", pga_ib_);
+  ESP_LOGCONFIG(TAG, " VGAIN_32: 0x%08jX", (uintmax_t) vgain_);
+  ESP_LOGCONFIG(TAG, " AIGAIN_32: 0x%08jX", (uintmax_t) aigain_);
+  ESP_LOGCONFIG(TAG, " BIGAIN_32: 0x%08jX", (uintmax_t) bigain_);
+  ESP_LOGCONFIG(TAG, " AWGAIN_32: 0x%08jX", (uintmax_t) awgain_);
+  ESP_LOGCONFIG(TAG, " BWGAIN_32: 0x%08jX", (uintmax_t) bwgain_);
 }
 
 #define ADE_PUBLISH_(name, val, factor) \
@@ -1,6 +1,6 @@
 #include "ade7953_i2c.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/helpers.h"
 
 namespace esphome {
 namespace ade7953_i2c {
@@ -1,8 +1,9 @@
 import esphome.codegen as cg
-from esphome.components import ade7953_base, i2c
 import esphome.config_validation as cv
+from esphome.components import i2c, ade7953_base
 from esphome.const import CONF_ID
 
+
 DEPENDENCIES = ["i2c"]
 AUTO_LOAD = ["ade7953_base"]
 
@@ -1,6 +1,6 @@
 #include "ade7953_spi.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/helpers.h"
 
 namespace esphome {
 namespace ade7953_spi {
@@ -1,8 +1,9 @@
 import esphome.codegen as cg
-from esphome.components import ade7953_base, spi
 import esphome.config_validation as cv
+from esphome.components import spi, ade7953_base
 from esphome.const import CONF_ID
 
+
 DEPENDENCIES = ["spi"]
 AUTO_LOAD = ["ade7953_base"]
 
@@ -1,6 +1,6 @@
 import esphome.codegen as cg
-from esphome.components import i2c
 import esphome.config_validation as cv
+from esphome.components import i2c
 from esphome.const import CONF_ID
 
 DEPENDENCIES = ["i2c"]
@ -9,14 +9,18 @@ static const char *const TAG = "ads1115";
|
|||||||
static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
|
static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
|
||||||
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
|
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
|
||||||
|
|
||||||
|
static const uint8_t ADS1115_DATA_RATE_860_SPS = 0b111; // 3300_SPS for ADS1015
|
||||||
|
|
||||||
void ADS1115Component::setup() {
|
void ADS1115Component::setup() {
|
||||||
ESP_LOGCONFIG(TAG, "Running setup");
|
ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
|
||||||
uint16_t value;
|
uint16_t value;
|
||||||
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
|
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
|
||||||
this->mark_failed();
|
this->mark_failed();
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
ESP_LOGCONFIG(TAG, "Configuring ADS1115...");
|
||||||
|
|
||||||
uint16_t config = 0;
|
uint16_t config = 0;
|
||||||
// Clear single-shot bit
|
// Clear single-shot bit
|
||||||
// 0b0xxxxxxxxxxxxxxx
|
// 0b0xxxxxxxxxxxxxxx
|
||||||
@ -39,9 +43,9 @@ void ADS1115Component::setup() {
|
|||||||
config |= 0b0000000100000000;
|
config |= 0b0000000100000000;
|
||||||
}
|
}
|
||||||
|
|
||||||
// Set data rate - 860 samples per second
|
// Set data rate - 860 samples per second (we're in singleshot mode)
|
||||||
// 0bxxxxxxxx100xxxxx
|
// 0bxxxxxxxx100xxxxx
|
||||||
config |= ADS1115_860SPS << 5;
|
config |= ADS1115_DATA_RATE_860_SPS << 5;
|
||||||
|
|
||||||
// Set comparator mode - hysteresis
|
// Set comparator mode - hysteresis
|
||||||
// 0bxxxxxxxxxxx0xxxx
|
// 0bxxxxxxxxxxx0xxxx
|
||||||
@@ -66,14 +70,14 @@ void ADS1115Component::setup() {
   this->prev_config_ = config;
 }
 void ADS1115Component::dump_config() {
-  ESP_LOGCONFIG(TAG, "ADS1115:");
+  ESP_LOGCONFIG(TAG, "Setting up ADS1115...");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with ADS1115 failed!");
   }
 }
 float ADS1115Component::request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain,
-                                            ADS1115Resolution resolution, ADS1115Samplerate samplerate) {
+                                            ADS1115Resolution resolution) {
   uint16_t config = this->prev_config_;
   // Multiplexer
   // 0bxBBBxxxxxxxxxxxx
@@ -85,11 +89,6 @@ float ADS1115Component::request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain,
   config &= 0b1111000111111111;
   config |= (gain & 0b111) << 9;
 
-  // Sample rate
-  // 0bxxxxxxxxBBBxxxxx
-  config &= 0b1111111100011111;
-  config |= (samplerate & 0b111) << 5;
-
   if (!this->continuous_mode_) {
     // Start conversion
     config |= 0b1000000000000000;
@@ -102,54 +101,8 @@ float ADS1115Component::request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain,
   }
   this->prev_config_ = config;
 
-  // Delay calculated as: ceil((1000/SPS)+.5)
-  if (resolution == ADS1015_12_BITS) {
-    switch (samplerate) {
-      case ADS1115_8SPS:
-        delay(9);
-        break;
-      case ADS1115_16SPS:
-        delay(5);
-        break;
-      case ADS1115_32SPS:
-        delay(3);
-        break;
-      case ADS1115_64SPS:
-      case ADS1115_128SPS:
-        delay(2);
-        break;
-      default:
-        delay(1);
-        break;
-    }
-  } else {
-    switch (samplerate) {
-      case ADS1115_8SPS:
-        delay(126);  // NOLINT
-        break;
-      case ADS1115_16SPS:
-        delay(63);  // NOLINT
-        break;
-      case ADS1115_32SPS:
-        delay(32);
-        break;
-      case ADS1115_64SPS:
-        delay(17);
-        break;
-      case ADS1115_128SPS:
-        delay(9);
-        break;
-      case ADS1115_250SPS:
-        delay(5);
-        break;
-      case ADS1115_475SPS:
-        delay(3);
-        break;
-      case ADS1115_860SPS:
-        delay(2);
-        break;
-    }
-  }
+  // about 1.2 ms with 860 samples per second
+  delay(2);
 
   // in continuous mode, conversion will always be running, rely on the delay
   // to ensure conversion is taking place with the correct settings
@@ -33,27 +33,16 @@ enum ADS1115Resolution {
   ADS1015_12_BITS = 12,
 };
 
-enum ADS1115Samplerate {
-  ADS1115_8SPS = 0b000,
-  ADS1115_16SPS = 0b001,
-  ADS1115_32SPS = 0b010,
-  ADS1115_64SPS = 0b011,
-  ADS1115_128SPS = 0b100,
-  ADS1115_250SPS = 0b101,
-  ADS1115_475SPS = 0b110,
-  ADS1115_860SPS = 0b111
-};
-
 class ADS1115Component : public Component, public i2c::I2CDevice {
  public:
   void setup() override;
   void dump_config() override;
   /// HARDWARE_LATE setup priority
+  float get_setup_priority() const override { return setup_priority::DATA; }
   void set_continuous_mode(bool continuous_mode) { continuous_mode_ = continuous_mode; }
 
   /// Helper method to request a measurement from a sensor.
-  float request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain, ADS1115Resolution resolution,
-                            ADS1115Samplerate samplerate);
+  float request_measurement(ADS1115Multiplexer multiplexer, ADS1115Gain gain, ADS1115Resolution resolution);
 
  protected:
   uint16_t prev_config_{0};
@@ -1,18 +1,16 @@
 import esphome.codegen as cg
-from esphome.components import sensor, voltage_sampler
 import esphome.config_validation as cv
+from esphome.components import sensor, voltage_sampler
 from esphome.const import (
     CONF_GAIN,
-    CONF_ID,
     CONF_MULTIPLEXER,
     CONF_RESOLUTION,
-    CONF_SAMPLE_RATE,
     DEVICE_CLASS_VOLTAGE,
     STATE_CLASS_MEASUREMENT,
     UNIT_VOLT,
+    CONF_ID,
 )
-
-from .. import CONF_ADS1115_ID, ADS1115Component, ads1115_ns
+from .. import ads1115_ns, ADS1115Component, CONF_ADS1115_ID
 
 AUTO_LOAD = ["voltage_sampler"]
 DEPENDENCIES = ["ads1115"]
@@ -45,17 +43,6 @@ RESOLUTION = {
     "12_BITS": ADS1115Resolution.ADS1015_12_BITS,
 }
 
-ADS1115Samplerate = ads1115_ns.enum("ADS1115Samplerate")
-SAMPLERATE = {
-    "8": ADS1115Samplerate.ADS1115_8SPS,
-    "16": ADS1115Samplerate.ADS1115_16SPS,
-    "32": ADS1115Samplerate.ADS1115_32SPS,
-    "64": ADS1115Samplerate.ADS1115_64SPS,
-    "128": ADS1115Samplerate.ADS1115_128SPS,
-    "250": ADS1115Samplerate.ADS1115_250SPS,
-    "475": ADS1115Samplerate.ADS1115_475SPS,
-    "860": ADS1115Samplerate.ADS1115_860SPS,
-}
-
 ADS1115Sensor = ads1115_ns.class_(
     "ADS1115Sensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
@@ -77,9 +64,6 @@ CONFIG_SCHEMA = (
             cv.Optional(CONF_RESOLUTION, default="16_BITS"): cv.enum(
                 RESOLUTION, upper=True, space="_"
             ),
-            cv.Optional(CONF_SAMPLE_RATE, default="860"): cv.enum(
-                SAMPLERATE, string=True
-            ),
         }
     )
     .extend(cv.polling_component_schema("60s"))
@@ -95,4 +79,3 @@ async def to_code(config):
     cg.add(var.set_multiplexer(config[CONF_MULTIPLEXER]))
     cg.add(var.set_gain(config[CONF_GAIN]))
     cg.add(var.set_resolution(config[CONF_RESOLUTION]))
-    cg.add(var.set_samplerate(config[CONF_SAMPLE_RATE]))
@@ -8,7 +8,7 @@ namespace ads1115 {
 static const char *const TAG = "ads1115.sensor";
 
 float ADS1115Sensor::sample() {
-  return this->parent_->request_measurement(this->multiplexer_, this->gain_, this->resolution_, this->samplerate_);
+  return this->parent_->request_measurement(this->multiplexer_, this->gain_, this->resolution_);
 }
 
 void ADS1115Sensor::update() {
@@ -24,7 +24,6 @@ void ADS1115Sensor::dump_config() {
   ESP_LOGCONFIG(TAG, " Multiplexer: %u", this->multiplexer_);
   ESP_LOGCONFIG(TAG, " Gain: %u", this->gain_);
   ESP_LOGCONFIG(TAG, " Resolution: %u", this->resolution_);
-  ESP_LOGCONFIG(TAG, " Sample rate: %u", this->samplerate_);
 }
 
 }  // namespace ads1115
@@ -21,7 +21,6 @@ class ADS1115Sensor : public sensor::Sensor,
   void set_multiplexer(ADS1115Multiplexer multiplexer) { this->multiplexer_ = multiplexer; }
   void set_gain(ADS1115Gain gain) { this->gain_ = gain; }
   void set_resolution(ADS1115Resolution resolution) { this->resolution_ = resolution; }
-  void set_samplerate(ADS1115Samplerate samplerate) { this->samplerate_ = samplerate; }
   float sample() override;
 
   void dump_config() override;
@@ -30,7 +29,6 @@ class ADS1115Sensor : public sensor::Sensor,
   ADS1115Multiplexer multiplexer_;
   ADS1115Gain gain_;
   ADS1115Resolution resolution_;
-  ADS1115Samplerate samplerate_;
 };
 
 }  // namespace ads1115
@@ -1,6 +1,6 @@
 import esphome.codegen as cg
-from esphome.components import spi
 import esphome.config_validation as cv
+from esphome.components import spi
 from esphome.const import CONF_ID
 
 CODEOWNERS = ["@solomondg1"]
@@ -1,5 +1,4 @@
 #include "ads1118.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
 
 namespace esphome {
@@ -9,7 +8,7 @@ static const char *const TAG = "ads1118";
 static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
 
 void ADS1118::setup() {
-  ESP_LOGCONFIG(TAG, "Running setup");
+  ESP_LOGCONFIG(TAG, "Setting up ads1118");
   this->spi_setup();
 
   this->config_ = 0;
@@ -34,6 +34,7 @@ class ADS1118 : public Component,
   ADS1118() = default;
   void setup() override;
   void dump_config() override;
+  float get_setup_priority() const override { return setup_priority::DATA; }
   /// Helper method to request a measurement from a sensor.
   float request_measurement(ADS1118Multiplexer multiplexer, ADS1118Gain gain, bool temperature_mode);
 
@@ -1,18 +1,17 @@
 import esphome.codegen as cg
-from esphome.components import sensor, voltage_sampler
 import esphome.config_validation as cv
+from esphome.components import sensor, voltage_sampler
 from esphome.const import (
     CONF_GAIN,
     CONF_MULTIPLEXER,
-    CONF_TYPE,
-    DEVICE_CLASS_TEMPERATURE,
     DEVICE_CLASS_VOLTAGE,
+    DEVICE_CLASS_TEMPERATURE,
     STATE_CLASS_MEASUREMENT,
     UNIT_CELSIUS,
     UNIT_VOLT,
+    CONF_TYPE,
 )
-
-from .. import ADS1118, CONF_ADS1118_ID, ads1118_ns
+from .. import ads1118_ns, ADS1118, CONF_ADS1118_ID
 
 AUTO_LOAD = ["voltage_sampler"]
 DEPENDENCIES = ["ads1118"]
@@ -1,5 +1,4 @@
 #include "ags10.h"
-#include "esphome/core/helpers.h"
 
 #include <cinttypes>
 
@@ -24,7 +23,7 @@ static const uint16_t ZP_CURRENT = 0x0000;
 static const uint16_t ZP_DEFAULT = 0xFFFF;
 
 void AGS10Component::setup() {
-  ESP_LOGCONFIG(TAG, "Running setup");
+  ESP_LOGCONFIG(TAG, "Setting up ags10...");
 
   auto version = this->read_version_();
   if (version) {
@@ -66,7 +65,7 @@ void AGS10Component::dump_config() {
     case NONE:
       break;
     case COMMUNICATION_FAILED:
-      ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+      ESP_LOGE(TAG, "Communication with AGS10 failed!");
       break;
     case CRC_CHECK_FAILED:
       ESP_LOGE(TAG, "The crc check failed");
|
@ -31,6 +31,8 @@ class AGS10Component : public PollingComponent, public i2c::I2CDevice {
|
|||||||
|
|
||||||
void dump_config() override;
|
void dump_config() override;
|
||||||
|
|
||||||
|
float get_setup_priority() const override { return setup_priority::DATA; }
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Modifies target address of AGS10.
|
* Modifies target address of AGS10.
|
||||||
*
|
*
|
||||||
|
@@ -1,21 +1,21 @@
-from esphome import automation
 import esphome.codegen as cg
-from esphome.components import i2c, sensor
+from esphome import automation
 import esphome.config_validation as cv
+from esphome.components import i2c, sensor
 from esphome.const import (
-    CONF_ADDRESS,
     CONF_ID,
-    CONF_MODE,
-    CONF_TVOC,
-    CONF_VALUE,
-    CONF_VERSION,
-    DEVICE_CLASS_VOLATILE_ORGANIC_COMPOUNDS_PARTS,
-    ENTITY_CATEGORY_DIAGNOSTIC,
     ICON_RADIATOR,
     ICON_RESTART,
+    DEVICE_CLASS_VOLATILE_ORGANIC_COMPOUNDS_PARTS,
+    ENTITY_CATEGORY_DIAGNOSTIC,
     STATE_CLASS_MEASUREMENT,
     UNIT_OHM,
     UNIT_PARTS_PER_BILLION,
+    CONF_ADDRESS,
+    CONF_TVOC,
+    CONF_VERSION,
+    CONF_MODE,
+    CONF_VALUE,
 )
 
 CONF_RESISTANCE = "resistance"
@@ -13,9 +13,8 @@
 // results making successive requests; the current implementation makes 3 attempts with a delay of 30ms each time.
 
 #include "aht10.h"
-#include "esphome/core/hal.h"
-#include "esphome/core/helpers.h"
 #include "esphome/core/log.h"
+#include "esphome/core/hal.h"
 
 namespace esphome {
 namespace aht10 {
@@ -35,59 +34,57 @@ static const uint8_t AHT10_INIT_ATTEMPTS = 10;
 
 static const uint8_t AHT10_STATUS_BUSY = 0x80;
 
-static const float AHT10_DIVISOR = 1048576.0f;  // 2^20, used for temperature and humidity calculations
-
 void AHT10Component::setup() {
-  ESP_LOGCONFIG(TAG, "Running setup");
-
   if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
-    ESP_LOGE(TAG, "Reset failed");
+    ESP_LOGE(TAG, "Reset AHT10 failed!");
   }
   delay(AHT10_SOFTRESET_DELAY);
 
   i2c::ErrorCode error_code = i2c::ERROR_INVALID_ARGUMENT;
   switch (this->variant_) {
     case AHT10Variant::AHT20:
+      ESP_LOGCONFIG(TAG, "Setting up AHT20");
       error_code = this->write(AHT20_INITIALIZE_CMD, sizeof(AHT20_INITIALIZE_CMD));
       break;
     case AHT10Variant::AHT10:
+      ESP_LOGCONFIG(TAG, "Setting up AHT10");
       error_code = this->write(AHT10_INITIALIZE_CMD, sizeof(AHT10_INITIALIZE_CMD));
       break;
   }
   if (error_code != i2c::ERROR_OK) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AHT10 failed!");
     this->mark_failed();
     return;
   }
-  uint8_t cal_attempts = 0;
   uint8_t data = AHT10_STATUS_BUSY;
+  int cal_attempts = 0;
   while (data & AHT10_STATUS_BUSY) {
     delay(AHT10_DEFAULT_DELAY);
     if (this->read(&data, 1) != i2c::ERROR_OK) {
-      ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+      ESP_LOGE(TAG, "Communication with AHT10 failed!");
       this->mark_failed();
       return;
     }
     ++cal_attempts;
     if (cal_attempts > AHT10_INIT_ATTEMPTS) {
-      ESP_LOGE(TAG, "Initialization timed out");
+      ESP_LOGE(TAG, "AHT10 initialization timed out!");
      this->mark_failed();
       return;
     }
   }
   if ((data & 0x68) != 0x08) {  // Bit[6:5] = 0b00, NORMAL mode and Bit[3] = 0b1, CALIBRATED
-    ESP_LOGE(TAG, "Initialization failed");
+    ESP_LOGE(TAG, "AHT10 initialization failed!");
     this->mark_failed();
     return;
   }
 
-  ESP_LOGV(TAG, "Initialization complete");
+  ESP_LOGV(TAG, "AHT10 initialization");
 }
 
 void AHT10Component::restart_read_() {
   if (this->read_count_ == AHT10_ATTEMPTS) {
     this->read_count_ = 0;
-    this->status_set_error("Reading timed out");
+    this->status_set_error("Measurements reading timed-out!");
     return;
   }
   this->read_count_++;
@@ -100,24 +97,24 @@ void AHT10Component::read_data_() {
     ESP_LOGD(TAG, "Read attempt %d at %ums", this->read_count_, (unsigned) (millis() - this->start_time_));
   }
   if (this->read(data, 6) != i2c::ERROR_OK) {
-    this->status_set_warning("Read failed, will retry");
+    this->status_set_warning("AHT10 read failed, retrying soon");
     this->restart_read_();
     return;
   }
 
   if ((data[0] & 0x80) == 0x80) {  // Bit[7] = 0b1, device is busy
-    ESP_LOGD(TAG, "Device busy, will retry");
+    ESP_LOGD(TAG, "AHT10 is busy, waiting...");
     this->restart_read_();
     return;
   }
   if (data[1] == 0x0 && data[2] == 0x0 && (data[3] >> 4) == 0x0) {
-    // Invalid humidity (0x0)
+    // Unrealistic humidity (0x0)
     if (this->humidity_sensor_ == nullptr) {
-      ESP_LOGV(TAG, "Invalid humidity (reading not required)");
+      ESP_LOGV(TAG, "ATH10 Unrealistic humidity (0x0), but humidity is not required");
     } else {
-      ESP_LOGD(TAG, "Invalid humidity, retrying");
+      ESP_LOGD(TAG, "ATH10 Unrealistic humidity (0x0), retrying...");
       if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
-        this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
+        this->status_set_warning("Communication with AHT10 failed!");
       }
       this->restart_read_();
       return;
@@ -126,17 +123,22 @@ void AHT10Component::read_data_() {
   if (this->read_count_ > 1) {
     ESP_LOGD(TAG, "Success at %ums", (unsigned) (millis() - this->start_time_));
   }
-  uint32_t raw_temperature = encode_uint24(data[3] & 0xF, data[4], data[5]);
-  uint32_t raw_humidity = encode_uint24(data[1], data[2], data[3]) >> 4;
+  uint32_t raw_temperature = ((data[3] & 0x0F) << 16) | (data[4] << 8) | data[5];
+  uint32_t raw_humidity = ((data[1] << 16) | (data[2] << 8) | data[3]) >> 4;
 
   if (this->temperature_sensor_ != nullptr) {
-    float temperature = ((200.0f * static_cast<float>(raw_temperature)) / AHT10_DIVISOR) - 50.0f;
+    float temperature = ((200.0f * (float) raw_temperature) / 1048576.0f) - 50.0f;
     this->temperature_sensor_->publish_state(temperature);
   }
   if (this->humidity_sensor_ != nullptr) {
-    float humidity = raw_humidity == 0 ? NAN : static_cast<float>(raw_humidity) * 100.0f / AHT10_DIVISOR;
+    float humidity;
+    if (raw_humidity == 0) {  // unrealistic value
+      humidity = NAN;
+    } else {
+      humidity = (float) raw_humidity * 100.0f / 1048576.0f;
+    }
     if (std::isnan(humidity)) {
-      ESP_LOGW(TAG, "Invalid humidity reading (0%%), ");
+      ESP_LOGW(TAG, "Invalid humidity! Sensor reported 0%% Hum");
     }
     this->humidity_sensor_->publish_state(humidity);
   }
@@ -148,7 +150,7 @@ void AHT10Component::update() {
     return;
   this->start_time_ = millis();
   if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
-    this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
+    this->status_set_warning("Communication with AHT10 failed!");
     return;
   }
   this->restart_read_();
@@ -160,7 +162,7 @@ void AHT10Component::dump_config() {
   ESP_LOGCONFIG(TAG, "AHT10:");
   LOG_I2C_DEVICE(this);
   if (this->is_failed()) {
-    ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
+    ESP_LOGE(TAG, "Communication with AHT10 failed!");
   }
   LOG_SENSOR(" ", "Temperature", this->temperature_sensor_);
   LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);
@@ -1,16 +1,16 @@
 import esphome.codegen as cg
-from esphome.components import i2c, sensor
 import esphome.config_validation as cv
+from esphome.components import i2c, sensor
 from esphome.const import (
     CONF_HUMIDITY,
     CONF_ID,
     CONF_TEMPERATURE,
-    CONF_VARIANT,
     DEVICE_CLASS_HUMIDITY,
     DEVICE_CLASS_TEMPERATURE,
     STATE_CLASS_MEASUREMENT,
     UNIT_CELSIUS,
     UNIT_PERCENT,
+    CONF_VARIANT,
 )
 
 DEPENDENCIES = ["i2c"]
@ -17,7 +17,7 @@ static const char *const TAG = "aic3204";
|
|||||||
}
|
}
|
||||||
|
|
||||||
void AIC3204::setup() {
|
void AIC3204::setup() {
|
||||||
ESP_LOGCONFIG(TAG, "Running setup");
|
ESP_LOGCONFIG(TAG, "Setting up AIC3204...");
|
||||||
|
|
||||||
// Set register page to 0
|
// Set register page to 0
|
||||||
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
|
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
|
||||||
@ -113,7 +113,7 @@ void AIC3204::dump_config() {
|
|||||||
LOG_I2C_DEVICE(this);
|
LOG_I2C_DEVICE(this);
|
||||||
|
|
||||||
if (this->is_failed()) {
|
if (this->is_failed()) {
|
||||||
ESP_LOGE(TAG, ESP_LOG_MSG_COMM_FAIL);
|
ESP_LOGE(TAG, "Communication with AIC3204 failed");
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -66,6 +66,7 @@ class AIC3204 : public audio_dac::AudioDac, public Component, public i2c::I2CDev
|
|||||||
public:
|
public:
|
||||||
void setup() override;
|
void setup() override;
|
||||||
void dump_config() override;
|
void dump_config() override;
|
||||||
|
float get_setup_priority() const override { return setup_priority::DATA; }
|
||||||
|
|
||||||
bool set_mute_off() override;
|
bool set_mute_off() override;
|
||||||
bool set_mute_on() override;
|
bool set_mute_on() override;
|
||||||
|
@@ -1,6 +1,6 @@
 import esphome.codegen as cg
-from esphome.components import esp32_ble_tracker
 import esphome.config_validation as cv
+from esphome.components import esp32_ble_tracker
 from esphome.const import CONF_ID
 
 DEPENDENCIES = ["esp32_ble_tracker"]
@@ -1,17 +1,18 @@
 import esphome.codegen as cg
-from esphome.components import ble_client, sensor
 import esphome.config_validation as cv
+from esphome.components import sensor, ble_client
+
 from esphome.const import (
     CONF_BATTERY_VOLTAGE,
     CONF_HUMIDITY,
     CONF_PRESSURE,
     CONF_TEMPERATURE,
     CONF_TVOC,
+    DEVICE_CLASS_VOLTAGE,
     DEVICE_CLASS_HUMIDITY,
     DEVICE_CLASS_PRESSURE,
     DEVICE_CLASS_TEMPERATURE,
     DEVICE_CLASS_VOLATILE_ORGANIC_COMPOUNDS_PARTS,
-    DEVICE_CLASS_VOLTAGE,
     ENTITY_CATEGORY_DIAGNOSTIC,
     STATE_CLASS_MEASUREMENT,
     UNIT_CELSIUS,
@@ -34,7 +35,7 @@ AirthingsWaveBase = airthings_wave_base_ns.class_(
 
 
 BASE_SCHEMA = (
-    cv.Schema(
+    sensor.SENSOR_SCHEMA.extend(
         {
             cv.Optional(CONF_HUMIDITY): sensor.sensor_schema(
                 unit_of_measurement=UNIT_PERCENT,
@@ -1,7 +1,10 @@
 import esphome.codegen as cg
-from esphome.components import airthings_wave_base
 import esphome.config_validation as cv
-from esphome.const import CONF_ID
+from esphome.components import airthings_wave_base
 
+from esphome.const import (
+    CONF_ID,
+)
+
 DEPENDENCIES = airthings_wave_base.DEPENDENCIES
 
@@ -1 +1 @@
-CODEOWNERS = ["@jeromelaban", "@precurse"]
+CODEOWNERS = ["@jeromelaban"]
@@ -73,29 +73,11 @@ void AirthingsWavePlus::dump_config() {
   LOG_SENSOR(" ", "Illuminance", this->illuminance_sensor_);
 }
 
-void AirthingsWavePlus::setup() {
-  const char *service_uuid;
-  const char *characteristic_uuid;
-  const char *access_control_point_characteristic_uuid;
-
-  // Change UUIDs for Wave Radon Gen2
-  switch (this->wave_device_type_) {
-    case WaveDeviceType::WAVE_GEN2:
-      service_uuid = SERVICE_UUID_WAVE_RADON_GEN2;
-      characteristic_uuid = CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
-      access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2;
-      break;
-    default:
-      // Wave Plus
-      service_uuid = SERVICE_UUID;
-      characteristic_uuid = CHARACTERISTIC_UUID;
-      access_control_point_characteristic_uuid = ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID;
-  }
-
-  this->service_uuid_ = espbt::ESPBTUUID::from_raw(service_uuid);
-  this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(characteristic_uuid);
+AirthingsWavePlus::AirthingsWavePlus() {
+  this->service_uuid_ = espbt::ESPBTUUID::from_raw(SERVICE_UUID);
+  this->sensors_data_characteristic_uuid_ = espbt::ESPBTUUID::from_raw(CHARACTERISTIC_UUID);
   this->access_control_point_characteristic_uuid_ =
-      espbt::ESPBTUUID::from_raw(access_control_point_characteristic_uuid);
+      espbt::ESPBTUUID::from_raw(ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID);
 }
 
 }  // namespace airthings_wave_plus
@@ -9,20 +9,13 @@ namespace airthings_wave_plus {
 
 namespace espbt = esphome::esp32_ble_tracker;
 
-enum WaveDeviceType : uint8_t { WAVE_PLUS = 0, WAVE_GEN2 = 1 };
-
 static const char *const SERVICE_UUID = "b42e1c08-ade7-11e4-89d3-123b93f75cba";
 static const char *const CHARACTERISTIC_UUID = "b42e2a68-ade7-11e4-89d3-123b93f75cba";
 static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID = "b42e2d06-ade7-11e4-89d3-123b93f75cba";
 
-static const char *const SERVICE_UUID_WAVE_RADON_GEN2 = "b42e4a8e-ade7-11e4-89d3-123b93f75cba";
-static const char *const CHARACTERISTIC_UUID_WAVE_RADON_GEN2 = "b42e4dcc-ade7-11e4-89d3-123b93f75cba";
-static const char *const ACCESS_CONTROL_POINT_CHARACTERISTIC_UUID_WAVE_RADON_GEN2 =
-    "b42e50d8-ade7-11e4-89d3-123b93f75cba";
-
 class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
  public:
-  void setup() override;
+  AirthingsWavePlus();
 
   void dump_config() override;
 
@@ -30,14 +23,12 @@ class AirthingsWavePlus : public airthings_wave_base::AirthingsWaveBase {
   void set_radon_long_term(sensor::Sensor *radon_long_term) { radon_long_term_sensor_ = radon_long_term; }
   void set_co2(sensor::Sensor *co2) { co2_sensor_ = co2; }
   void set_illuminance(sensor::Sensor *illuminance) { illuminance_sensor_ = illuminance; }
-  void set_device_type(WaveDeviceType wave_device_type) { wave_device_type_ = wave_device_type; }
 
  protected:
   bool is_valid_radon_value_(uint16_t radon);
   bool is_valid_co2_value_(uint16_t co2);
 
   void read_sensors(uint8_t *raw_value, uint16_t value_len) override;
-  WaveDeviceType wave_device_type_{WaveDeviceType::WAVE_PLUS};
 
   sensor::Sensor *radon_sensor_{nullptr};
   sensor::Sensor *radon_long_term_sensor_{nullptr};
|
Some files were not shown because too many files have changed in this diff Show More
Loading…
x
Reference in New Issue
Block a user