# ESPHome AI Collaboration Guide

This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.

## 1. Project Overview & Purpose

* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.

## 2. Core Technologies & Stack

* **Languages:** Python (>=3.11), C++ (gnu++20).
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
    * **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
    * **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Manager(s):** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for the native API), MQTT, HTTP.

## 3. Architectural Patterns

* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.

* **Directory Structure Philosophy:**
    * `/esphome`: Contains the core Python source code for the ESPHome application.
    * `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
    * `/tests`: Contains all unit and integration tests for the Python code.
    * `/docker`: Contains Docker-related files for building and running ESPHome in a container.
    * `/script`: Contains helper scripts for development and maintenance.

* **Core Architectural Components:**
    1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
    2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python-to-C++ code generation, template processing, and build flag management.
    3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
    4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
    5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.

* **Platform Support:**
    1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (Original, C2, C3, C5, C6, H2, P4, S2, S3) with the ESP-IDF framework. The Arduino framework supports only a subset of the variants (Original, C3, S2, S3).
    2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
    3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
    4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.

## 4. Coding Conventions & Style Guide

* **Formatting:**
    * **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
    * **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.

* **Naming Conventions:**
    * **Python:** Follows PEP 8. Use clear, descriptive names in snake_case.
    * **C++:** Follows the Google C++ Style Guide.

* **Component Structure:**
    * **Standard Files:**

      ```
      components/[component_name]/
      ├── __init__.py        # Component configuration schema and code generation
      ├── [component].h      # C++ header file (if needed)
      ├── [component].cpp    # C++ implementation (if needed)
      └── [platform]/        # Platform-specific implementations
          ├── __init__.py    # Platform-specific configuration
          ├── [platform].h   # Platform C++ header
          └── [platform].cpp # Platform C++ implementation
      ```

    * **Component Metadata** (module-level constants in the component's `__init__.py`; see the sketch after this list):
      - `DEPENDENCIES`: List of required components
      - `AUTO_LOAD`: Components to automatically load
      - `CONFLICTS_WITH`: Incompatible components
      - `CODEOWNERS`: GitHub usernames responsible for maintenance
      - `MULTI_CONF`: Whether multiple instances are allowed
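
      A minimal sketch of how these constants typically appear at the top of a component's `__init__.py`; the component name, owner handle, and values here are hypothetical:

      ```python
      # my_component/__init__.py
      CODEOWNERS = ["@your-github-username"]  # hypothetical maintainer handle
      DEPENDENCIES = ["i2c"]                  # required components
      AUTO_LOAD = ["sensor"]                  # components loaded automatically
      CONFLICTS_WITH = ["other_component"]    # hypothetical incompatible component
      MULTI_CONF = True                       # allow multiple instances of this component
      ```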

* **Code Generation & Common Patterns:**
    * **Configuration Schema Pattern:**

      ```python
      import esphome.codegen as cg
      import esphome.config_validation as cv
      from esphome.const import CONF_KEY, CONF_ID

      CONF_PARAM = "param"  # A constant that does not yet exist in esphome/const.py

      my_component_ns = cg.esphome_ns.namespace("my_component")
      MyComponent = my_component_ns.class_("MyComponent", cg.Component)

      CONFIG_SCHEMA = cv.Schema({
          cv.GenerateID(): cv.declare_id(MyComponent),
          cv.Required(CONF_KEY): cv.string,
          cv.Optional(CONF_PARAM, default=42): cv.int_,
      }).extend(cv.COMPONENT_SCHEMA)


      async def to_code(config):
          var = cg.new_Pvariable(config[CONF_ID])
          await cg.register_component(var, config)
          cg.add(var.set_key(config[CONF_KEY]))
          cg.add(var.set_param(config[CONF_PARAM]))
      ```

    * **C++ Class Pattern:**

      ```cpp
      namespace esphome {
      namespace my_component {

      class MyComponent : public Component {
       public:
        void setup() override;
        void loop() override;
        void dump_config() override;

        void set_key(const std::string &key) { this->key_ = key; }
        void set_param(int param) { this->param_ = param; }

       protected:
        std::string key_;
        int param_{0};
      };

      }  // namespace my_component
      }  // namespace esphome
      ```

    * **Common Component Examples:**
      - **Sensor:**

        ```python
        # Assumes the cg/cv imports and the MySensor class declaration from the schema pattern above.
        from esphome.components import sensor

        CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(cv.polling_component_schema("60s"))

        async def to_code(config):
            var = await sensor.new_sensor(config)
            await cg.register_component(var, config)
        ```

      - **Binary Sensor:**

        ```python
        from esphome.components import binary_sensor

        CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })

        async def to_code(config):
            var = await binary_sensor.new_binary_sensor(config)
        ```

      - **Switch:**

        ```python
        from esphome.components import switch

        CONFIG_SCHEMA = switch.switch_schema().extend({ ... })

        async def to_code(config):
            var = await switch.new_switch(config)
        ```

* **Configuration Validation:**
    * **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
    * **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
    * **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `esp32.only_on_variant(...)`, `cv.only_on_esp32`, `cv.only_on_esp8266`, `cv.only_on_rp2040`.
    * **Framework-Specific:** `cv.only_with_framework(...)`, `cv.only_with_arduino`, `cv.only_with_esp_idf`.
    * **Schema Extensions:**

      ```python
      CONFIG_SCHEMA = (
          cv.Schema({ ... })
          .extend(cv.COMPONENT_SCHEMA)
          .extend(uart.UART_DEVICE_SCHEMA)
          .extend(i2c.i2c_device_schema(0x48))
          .extend(spi.spi_device_schema(cs_pin_required=True))
      )
      ```
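
    * **Combining Validators:** A minimal, hedged sketch of how the platform and framework gates above are commonly wrapped around a schema with `cv.All`. It reuses `MyComponent` and `CONF_PARAM` from the schema pattern earlier; the ESP-IDF-only constraint is an assumption for illustration.

      ```python
      import esphome.config_validation as cv

      CONFIG_SCHEMA = cv.All(
          cv.Schema({
              cv.GenerateID(): cv.declare_id(MyComponent),
              cv.Optional(CONF_PARAM, default=42): cv.int_range(min=0, max=100),
          }).extend(cv.COMPONENT_SCHEMA),
          cv.only_on(["esp32", "esp8266"]),  # reject other target platforms at validation time
          cv.only_with_esp_idf,              # assumption: this component needs an ESP-IDF-only API
      )
      ```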

## 5. Key Files & Entrypoints

* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
    * `pyproject.toml`: Defines the Python project metadata and dependencies.
    * `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
    * `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.
* **Static Analysis & Development:**
    * `esphome/core/defines.h`: A comprehensive header containing all `#define` directives that components can add via `cg.add_define()` in Python. This file is used exclusively for development, static analysis tools, and CI testing; it is not used during runtime compilation. When a component adds a new define, it must also be added to this file to ensure proper IDE support and static analysis coverage. The file includes feature flags, build configurations, and platform-specific defines that let static analyzers understand the complete codebase without compiling for a specific platform. A short sketch follows.
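
      A minimal, hedged sketch of that workflow (the define name is hypothetical; it reuses `CONF_ID` and `CONF_PARAM` from the schema pattern above): the component's `to_code()` emits the define for real builds, and the same name is listed in `esphome/core/defines.h` so tooling sees it.

      ```python
      # In the component's __init__.py (hypothetical define name):
      async def to_code(config):
          var = cg.new_Pvariable(config[CONF_ID])
          await cg.register_component(var, config)
          if config[CONF_PARAM] > 0:
              cg.add_define("USE_MY_COMPONENT_FEATURE")  # becomes part of the generated build

      # Mirrored in esphome/core/defines.h (for IDEs, clang-tidy, and CI only):
      #   #define USE_MY_COMPONENT_FEATURE
      ```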

## 6. Development & Testing Workflow

* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
    * **Python:** Run unit tests with `pytest`.
    * **C++:** Use `clang-tidy` for static analysis.
    * **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:

      ```
      tests/
      ├── test_build_components/    # Base test configurations
      └── components/[component]/   # Component-specific tests
      ```

      Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
    * **Testing All Components Together:** To verify that all components can be tested together without ID conflicts or configuration issues, use:

      ```bash
      ./script/test_component_grouping.py -e config --all
      ```

      This tests all components in a single build to catch conflicts that might not appear when testing components individually. Use `-e config` for fast configuration validation, or `-e compile` for full compilation testing.
* **Debugging and Troubleshooting:**
    * **Debug Tools:**
      - `esphome config <file>.yaml` to validate configuration.
      - `esphome compile <file>.yaml` to compile without uploading.
      - Check the Dashboard for real-time logs.
      - Use component-specific debug logging.
    * **Common Issues:**
      - **Import Errors**: Check component dependencies and `PYTHONPATH`.
      - **Validation Errors**: Review configuration schema definitions.
      - **Build Errors**: Check platform compatibility and library versions.
      - **Runtime Errors**: Review generated C++ code and component logic.

## 7. Specific Instructions for AI Collaboration

* **Contribution Workflow (Pull Request Process):**
    1. **Fork & Branch:** Create a new branch in your fork.
    2. **Make Changes:** Adhere to all coding conventions and patterns.
    3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
    4. **Lint:** Run `pre-commit` to ensure the code is compliant.
    5. **Commit:** Commit your changes. There is no strict format for commit messages.
    6. **Pull Request:** Submit a PR against the `dev` branch. The pull request title should be prefixed with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation and examples, and add `CODEOWNERS` entries as needed. Pull requests should always be made with the PULL_REQUEST_TEMPLATE.md template filled out correctly.

* **Documentation Contributions:**
    * Documentation is hosted in the separate `esphome/esphome-docs` repository.
    * The contribution workflow is the same as for the codebase.

* **Best Practices:**
    * **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
    * **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
    * **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.
    * **Embedded Systems Optimization:** ESPHome targets resource-constrained microcontrollers. Be mindful of flash size and RAM usage.

**STL Container Guidelines:**

ESPHome runs on embedded systems with limited resources. Choose containers carefully:

1. **Compile-time-known sizes:** Use `std::array` instead of `std::vector` when the size is known at compile time.

   ```cpp
   // Bad - generates STL realloc code
   std::vector<int> values;

   // Good - no dynamic allocation
   std::array<int, MAX_VALUES> values;
   ```

   Use `cg.add_define("MAX_VALUES", count)` to set the size from the Python configuration (see the sketch after this list).

   **For byte buffers:** Avoid `std::vector<uint8_t>` unless the buffer needs to grow. Use `std::unique_ptr<uint8_t[]>` instead.

   > **Note:** `std::unique_ptr<uint8_t[]>` does **not** provide bounds checking or iterator support like `std::vector<uint8_t>`. Use it only when you do not need these features and want minimal overhead.

   ```cpp
   // Bad - STL overhead for a simple byte buffer
   std::vector<uint8_t> buffer;
   buffer.resize(256);

   // Good - minimal overhead, single allocation
   std::unique_ptr<uint8_t[]> buffer = std::make_unique<uint8_t[]>(256);
   // Or if the size is constant:
   std::array<uint8_t, 256> buffer;
   ```

2. **Compile-time-known fixed sizes with a vector-like API:** Use `StaticVector` from `esphome/core/helpers.h` for fixed-size stack allocation with a `push_back()` interface.

   ```cpp
   // Bad - generates STL realloc code (_M_realloc_insert)
   std::vector<ServiceRecord> services;
   services.reserve(5);  // Still includes reallocation machinery

   // Good - compile-time fixed size, stack allocated, no reallocation machinery
   StaticVector<ServiceRecord, MAX_SERVICES> services;  // Allocates all MAX_SERVICES slots on the stack
   services.push_back(record1);                         // Tracks the count, but all slots are already allocated
   ```

   Use `cg.add_define("MAX_SERVICES", count)` to set the size from the Python configuration (see the sketch after this list).
   Like `std::array`, but with a vector-like API (`push_back()`, `size()`) and no STL reallocation code.

3. **Runtime-known sizes:** Use `FixedVector` from `esphome/core/helpers.h` when the size is only known at runtime initialization.

   ```cpp
   // Bad - generates STL realloc code (_M_realloc_insert)
   std::vector<TxtRecord> txt_records;
   txt_records.reserve(5);  // Still includes reallocation machinery

   // Good - runtime size, single allocation, no reallocation machinery
   FixedVector<TxtRecord> txt_records;
   txt_records.init(record_count);  // Initialize with the exact size at runtime
   ```

   **Benefits:**
   - Eliminates the `_M_realloc_insert` and `_M_default_append` template instantiations (saves 200-500 bytes per instance)
   - Single allocation, no upper bound needed
   - No reallocation overhead
   - Compatible with protobuf code generation when using the `[(fixed_vector) = true]` option

4. **Small datasets (1-16 elements):** Use `std::vector` or `std::array` with simple structs instead of `std::map`/`std::set`/`std::unordered_map`.

   ```cpp
   // Bad - 2KB+ overhead for a red-black tree/hash table
   std::map<std::string, int> small_lookup;
   std::unordered_map<int, std::string> tiny_map;

   // Good - simple struct with linear search (std::vector is fine)
   struct LookupEntry {
     const char *key;
     int value;
   };
   std::vector<LookupEntry> small_lookup = {
       {"key1", 10},
       {"key2", 20},
       {"key3", 30},
   };
   // Or std::array if the size is a compile-time constant:
   // std::array<LookupEntry, 3> small_lookup = {{ ... }};
   ```

   Linear search on small datasets (1-16 elements) is often faster than the hashing/tree overhead, but this depends on lookup frequency and access patterns. For frequent lookups in hot code paths, the O(1) vs O(n) complexity difference may still matter even for small datasets. `std::vector` with simple structs is usually fine; it is the heavy containers (`map`, `set`, `unordered_map`) that should be avoided for small datasets unless profiling shows otherwise.

5. **Detection:** Look for these patterns in the compiler output:
   - Large code sections with STL symbols (vector, map, set)
   - `alloc`, `realloc`, `dealloc` in symbol names
   - `_M_realloc_insert`, `_M_default_append` (vector reallocation)
   - Red-black tree code (`rb_tree`, `_Rb_tree`)
   - Hash table infrastructure (`unordered_map`, `hash`)

**When to optimize:**
- Core components (API, network, logger)
- Widely-used components (mdns, wifi, ble)
- Components causing flash size complaints

**When not to optimize:**
- Single-use niche components
- Code where readability matters more than bytes
- Already using appropriate containers
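
As referenced in items 1 and 2 above, here is a minimal, hedged sketch of setting such a compile-time size from the Python side during code generation. The config key and define name are hypothetical; `cg.add_define` is the codegen call named above.

```python
import esphome.codegen as cg
import esphome.config_validation as cv

CONF_SERVICES = "services"  # hypothetical config key

CONFIG_SCHEMA = cv.Schema({
    cv.Optional(CONF_SERVICES, default=[]): cv.ensure_list(cv.string),
}).extend(cv.COMPONENT_SCHEMA)


async def to_code(config):
    # The C++ side declares StaticVector<ServiceRecord, MAX_SERVICES> (or std::array<..., MAX_VALUES>),
    # so the upper bound must be fixed here, at code-generation time.
    cg.add_define("MAX_SERVICES", len(config[CONF_SERVICES]))
```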

* **State Management:** Use `CORE.data` for component state that needs to persist during configuration generation. Avoid module-level mutable globals.

  **Bad Pattern (Module-Level Globals):**

  ```python
  # Don't do this - state persists between compilation runs
  _component_state = []
  _use_feature = None

  def enable_feature():
      global _use_feature
      _use_feature = True
  ```

  **Good Pattern (CORE.data with Helpers):**

  ```python
  from esphome.core import CORE

  # Keys for CORE.data storage
  COMPONENT_STATE_KEY = "my_component_state"
  USE_FEATURE_KEY = "my_component_use_feature"


  def _get_component_state() -> list:
      """Get component state from CORE.data."""
      return CORE.data.setdefault(COMPONENT_STATE_KEY, [])


  def _get_use_feature() -> bool | None:
      """Get feature flag from CORE.data."""
      return CORE.data.get(USE_FEATURE_KEY)


  def _set_use_feature(value: bool) -> None:
      """Set feature flag in CORE.data."""
      CORE.data[USE_FEATURE_KEY] = value


  def enable_feature():
      _set_use_feature(True)
  ```

  **Why this matters:**
  - Module-level globals persist between compilation runs if the dashboard doesn't fork/exec
  - `CORE.data` automatically clears between runs
  - Typed helper functions provide better IDE support and maintainability
  - Encapsulation makes state management explicit and testable

* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.

* **Dependencies & Build System Integration:**
    * **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and to `pyproject.toml`.
    * **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
    * **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags. A short sketch follows.
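
      A minimal, hedged sketch of these calls inside a component's `to_code()`. The library version and the flag name are illustrative assumptions; `cg.add_library` and `cg.add_build_flag` are the codegen helpers named above, and `CONF_ID` comes from the schema pattern earlier.

      ```python
      import esphome.codegen as cg


      async def to_code(config):
          var = cg.new_Pvariable(config[CONF_ID])
          await cg.register_component(var, config)
          # Hypothetical PlatformIO library dependency and compiler flag for this component:
          cg.add_library("bblanchon/ArduinoJson", "6.18.5")
          cg.add_build_flag("-DMY_COMPONENT_USE_FAST_PATH")
      ```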
70
.clang-tidy
@@ -5,42 +5,31 @@ Checks: >-
|
||||
-altera-*,
|
||||
-android-*,
|
||||
-boost-*,
|
||||
-bugprone-branch-clone,
|
||||
-bugprone-easily-swappable-parameters,
|
||||
-bugprone-implicit-widening-of-multiplication-result,
|
||||
-bugprone-multi-level-implicit-pointer-conversion,
|
||||
-bugprone-narrowing-conversions,
|
||||
-bugprone-signed-char-misuse,
|
||||
-bugprone-switch-missing-default-case,
|
||||
-bugprone-too-small-loop-variable,
|
||||
-cert-dcl50-cpp,
|
||||
-cert-err33-c,
|
||||
-cert-err58-cpp,
|
||||
-cert-oop57-cpp,
|
||||
-cert-str34-c,
|
||||
-clang-analyzer-optin.core.EnumCastOutOfRange,
|
||||
-clang-analyzer-optin.cplusplus.UninitializedObject,
|
||||
-clang-analyzer-osx.*,
|
||||
-clang-diagnostic-delete-abstract-non-virtual-dtor,
|
||||
-clang-diagnostic-delete-non-abstract-non-virtual-dtor,
|
||||
-clang-diagnostic-deprecated-declarations,
|
||||
-clang-diagnostic-ignored-optimization-argument,
|
||||
-clang-diagnostic-missing-field-initializers,
|
||||
-clang-diagnostic-shadow-field,
|
||||
-clang-diagnostic-sign-compare,
|
||||
-clang-diagnostic-unused-variable,
|
||||
-clang-diagnostic-unused-const-variable,
|
||||
-clang-diagnostic-unused-parameter,
|
||||
-clang-diagnostic-vla-cxx-extension,
|
||||
-concurrency-*,
|
||||
-cppcoreguidelines-avoid-c-arrays,
|
||||
-cppcoreguidelines-avoid-const-or-ref-data-members,
|
||||
-cppcoreguidelines-avoid-do-while,
|
||||
-cppcoreguidelines-avoid-goto,
|
||||
-cppcoreguidelines-avoid-magic-numbers,
|
||||
-cppcoreguidelines-init-variables,
|
||||
-cppcoreguidelines-macro-to-enum,
|
||||
-cppcoreguidelines-macro-usage,
|
||||
-cppcoreguidelines-missing-std-forward,
|
||||
-cppcoreguidelines-narrowing-conversions,
|
||||
-cppcoreguidelines-non-private-member-variables-in-classes,
|
||||
-cppcoreguidelines-owning-memory,
|
||||
-cppcoreguidelines-prefer-member-initializer,
|
||||
-cppcoreguidelines-pro-bounds-array-to-pointer-decay,
|
||||
-cppcoreguidelines-pro-bounds-constant-array-index,
|
||||
-cppcoreguidelines-pro-bounds-pointer-arithmetic,
|
||||
@@ -51,10 +40,8 @@ Checks: >-
|
||||
-cppcoreguidelines-pro-type-static-cast-downcast,
|
||||
-cppcoreguidelines-pro-type-union-access,
|
||||
-cppcoreguidelines-pro-type-vararg,
|
||||
-cppcoreguidelines-rvalue-reference-param-not-moved,
|
||||
-cppcoreguidelines-special-member-functions,
|
||||
-cppcoreguidelines-use-default-member-init,
|
||||
-cppcoreguidelines-virtual-class-destructor,
|
||||
-fuchsia-default-arguments,
|
||||
-fuchsia-multiple-inheritance,
|
||||
-fuchsia-overloaded-operator,
|
||||
-fuchsia-statically-constructed-objects,
|
||||
@@ -64,7 +51,6 @@ Checks: >-
|
||||
-google-explicit-constructor,
|
||||
-google-readability-braces-around-statements,
|
||||
-google-readability-casting,
|
||||
-google-readability-namespace-comments,
|
||||
-google-readability-todo,
|
||||
-google-runtime-references,
|
||||
-hicpp-*,
|
||||
@@ -73,33 +59,22 @@ Checks: >-
|
||||
-llvm-include-order,
|
||||
-llvm-qualified-auto,
|
||||
-llvmlibc-*,
|
||||
-misc-const-correctness,
|
||||
-misc-include-cleaner,
|
||||
-misc-no-recursion,
|
||||
-misc-non-private-member-variables-in-classes,
|
||||
-misc-no-recursion,
|
||||
-misc-unused-parameters,
|
||||
-misc-use-anonymous-namespace,
|
||||
-modernize-avoid-bind,
|
||||
-modernize-avoid-c-arrays,
|
||||
-modernize-avoid-bind,
|
||||
-modernize-concat-nested-namespaces,
|
||||
-modernize-macro-to-enum,
|
||||
-modernize-return-braced-init-list,
|
||||
-modernize-type-traits,
|
||||
-modernize-use-auto,
|
||||
-modernize-use-constraints,
|
||||
-modernize-use-default-member-init,
|
||||
-modernize-use-equals-default,
|
||||
-modernize-use-nodiscard,
|
||||
-modernize-use-nullptr,
|
||||
-modernize-use-nodiscard,
|
||||
-modernize-use-nullptr,
|
||||
-modernize-use-trailing-return-type,
|
||||
-modernize-use-nodiscard,
|
||||
-mpi-*,
|
||||
-objc-*,
|
||||
-performance-enum-size,
|
||||
-readability-avoid-nested-conditional-operator,
|
||||
-readability-container-contains,
|
||||
-readability-container-data-pointer,
|
||||
-readability-braces-around-statements,
|
||||
-readability-const-return-type,
|
||||
-readability-convert-member-functions-to-static,
|
||||
-readability-else-after-return,
|
||||
-readability-function-cognitive-complexity,
|
||||
@@ -108,22 +83,23 @@ Checks: >-
|
||||
-readability-magic-numbers,
|
||||
-readability-make-member-function-const,
|
||||
-readability-named-parameter,
|
||||
-readability-redundant-casting,
|
||||
-readability-redundant-inline-specifier,
|
||||
-readability-qualified-auto,
|
||||
-readability-redundant-access-specifiers,
|
||||
-readability-redundant-member-init,
|
||||
-readability-redundant-string-init,
|
||||
-readability-uppercase-literal-suffix,
|
||||
-readability-use-anyofallof,
|
||||
WarningsAsErrors: '*'
|
||||
AnalyzeTemporaryDtors: false
|
||||
FormatStyle: google
|
||||
CheckOptions:
|
||||
- key: google-readability-braces-around-statements.ShortStatementLines
|
||||
value: '1'
|
||||
- key: google-readability-function-size.StatementThreshold
|
||||
value: '800'
|
||||
- key: google-runtime-int.TypeSuffix
|
||||
value: '_t'
|
||||
- key: llvm-namespace-comment.ShortNamespaceLines
|
||||
- key: google-readability-namespace-comments.ShortNamespaceLines
|
||||
value: '10'
|
||||
- key: llvm-namespace-comment.SpacesBeforeComments
|
||||
- key: google-readability-namespace-comments.SpacesBeforeComments
|
||||
value: '2'
|
||||
- key: modernize-loop-convert.MaxCopySize
|
||||
value: '16'
|
||||
@@ -141,8 +117,6 @@ CheckOptions:
|
||||
value: 'make_unique'
|
||||
- key: modernize-make-unique.MakeSmartPtrFunctionHeader
|
||||
value: 'esphome/core/helpers.h'
|
||||
- key: readability-braces-around-statements.ShortStatementLines
|
||||
value: 2
|
||||
- key: readability-identifier-naming.LocalVariableCase
|
||||
value: 'lower_case'
|
||||
- key: readability-identifier-naming.ClassCase
|
||||
@@ -189,11 +163,3 @@ CheckOptions:
|
||||
value: 'lower_case'
|
||||
- key: readability-identifier-naming.VirtualMethodSuffix
|
||||
value: ''
|
||||
- key: readability-qualified-auto.AddConstToQualified
|
||||
value: 0
|
||||
- key: readability-identifier-length.MinimumVariableNameLength
|
||||
value: 0
|
||||
- key: readability-identifier-length.MinimumParameterNameLength
|
||||
value: 0
|
||||
- key: readability-identifier-length.MinimumLoopCounterNameLength
|
||||
value: 0
|
||||
|
@@ -1 +0,0 @@
|
||||
d7693a1e996cacd4a3d1c9a16336799c2a8cc3db02e4e74084151ce964581248
|
@@ -1,5 +1,2 @@
|
||||
[run]
|
||||
omit =
|
||||
esphome/components/*
|
||||
esphome/analyze_memory/*
|
||||
tests/integration/*
|
||||
omit = esphome/components/*
|
||||
|
@@ -1,37 +0,0 @@
|
||||
ARG BUILD_BASE_VERSION=2025.04.0
|
||||
|
||||
|
||||
FROM ghcr.io/esphome/docker-base:debian-${BUILD_BASE_VERSION} AS base
|
||||
|
||||
RUN git config --system --add safe.directory "*"
|
||||
|
||||
RUN apt update \
|
||||
&& apt install -y \
|
||||
protobuf-compiler
|
||||
|
||||
RUN pip install uv
|
||||
|
||||
RUN useradd esphome -m
|
||||
|
||||
USER esphome
|
||||
ENV VIRTUAL_ENV=/home/esphome/.local/esphome-venv
|
||||
RUN uv venv $VIRTUAL_ENV
|
||||
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
|
||||
# Override this set to true in the docker-base image
|
||||
ENV UV_SYSTEM_PYTHON=false
|
||||
|
||||
WORKDIR /tmp
|
||||
|
||||
COPY requirements.txt ./
|
||||
RUN uv pip install -r requirements.txt
|
||||
COPY requirements_dev.txt requirements_test.txt ./
|
||||
RUN uv pip install -r requirements_dev.txt -r requirements_test.txt
|
||||
|
||||
RUN \
|
||||
platformio settings set enable_telemetry No \
|
||||
&& platformio settings set check_platformio_interval 1000000
|
||||
|
||||
COPY script/platformio_install_deps.py platformio.ini ./
|
||||
RUN ./platformio_install_deps.py platformio.ini --libraries --platforms --tools
|
||||
|
||||
WORKDIR /workspaces
|
@@ -1,87 +1,56 @@
|
||||
{
|
||||
"name": "ESPHome Dev",
|
||||
"context": "..",
|
||||
"dockerFile": "Dockerfile",
|
||||
"image": "esphome/esphome-lint:dev",
|
||||
"postCreateCommand": [
|
||||
"script/devcontainer-post-create"
|
||||
],
|
||||
"features": {
|
||||
"ghcr.io/devcontainers/features/github-cli:1": {}
|
||||
},
|
||||
"runArgs": [
|
||||
"--privileged",
|
||||
"-e",
|
||||
"GIT_EDITOR=code --wait"
|
||||
// uncomment and edit the path in order to pass through local USB serial to the container
|
||||
// , "--device=/dev/ttyACM0"
|
||||
"ESPHOME_DASHBOARD_USE_PING=1"
|
||||
],
|
||||
"appPort": 6052,
|
||||
// if you are using avahi in the host device, uncomment these to allow the
|
||||
// devcontainer to find devices via mdns
|
||||
//"mounts": [
|
||||
// "type=bind,source=/dev/bus/usb,target=/dev/bus/usb",
|
||||
// "type=bind,source=/var/run/dbus,target=/var/run/dbus",
|
||||
// "type=bind,source=/var/run/avahi-daemon/socket,target=/var/run/avahi-daemon/socket"
|
||||
//],
|
||||
"customizations": {
|
||||
"vscode": {
|
||||
"extensions": [
|
||||
// python
|
||||
"ms-python.python",
|
||||
"ms-python.pylint",
|
||||
"ms-python.flake8",
|
||||
"charliermarsh.ruff",
|
||||
"visualstudioexptteam.vscodeintellicode",
|
||||
// yaml
|
||||
"redhat.vscode-yaml",
|
||||
// cpp
|
||||
"ms-vscode.cpptools",
|
||||
// editorconfig
|
||||
"editorconfig.editorconfig"
|
||||
],
|
||||
"settings": {
|
||||
"python.languageServer": "Pylance",
|
||||
"python.pythonPath": "/usr/bin/python3",
|
||||
"pylint.args": [
|
||||
"--rcfile=${workspaceFolder}/pyproject.toml"
|
||||
],
|
||||
"flake8.args": [
|
||||
"--config=${workspaceFolder}/.flake8"
|
||||
],
|
||||
"ruff.configuration": "${workspaceFolder}/pyproject.toml",
|
||||
"[python]": {
|
||||
// VS will say "Value is not accepted" before building the devcontainer, but the warning
|
||||
// should go away after build is completed.
|
||||
"editor.defaultFormatter": "charliermarsh.ruff"
|
||||
},
|
||||
"editor.formatOnPaste": false,
|
||||
"editor.formatOnSave": true,
|
||||
"editor.formatOnType": true,
|
||||
"files.trimTrailingWhitespace": true,
|
||||
"terminal.integrated.defaultProfile.linux": "bash",
|
||||
"yaml.customTags": [
|
||||
"!secret scalar",
|
||||
"!lambda scalar",
|
||||
"!extend scalar",
|
||||
"!remove scalar",
|
||||
"!include_dir_named scalar",
|
||||
"!include_dir_list scalar",
|
||||
"!include_dir_merge_list scalar",
|
||||
"!include_dir_merge_named scalar"
|
||||
],
|
||||
"files.exclude": {
|
||||
"**/.git": true,
|
||||
"**/.DS_Store": true,
|
||||
"**/*.pyc": {
|
||||
"when": "$(basename).py"
|
||||
},
|
||||
"**/__pycache__": true
|
||||
},
|
||||
"files.associations": {
|
||||
"**/.vscode/*.json": "jsonc"
|
||||
},
|
||||
"C_Cpp.clang_format_path": "/usr/bin/clang-format-13"
|
||||
}
|
||||
}
|
||||
"extensions": [
|
||||
// python
|
||||
"ms-python.python",
|
||||
"visualstudioexptteam.vscodeintellicode",
|
||||
// yaml
|
||||
"redhat.vscode-yaml",
|
||||
// cpp
|
||||
"ms-vscode.cpptools",
|
||||
// editorconfig
|
||||
"editorconfig.editorconfig",
|
||||
],
|
||||
"settings": {
|
||||
"python.languageServer": "Pylance",
|
||||
"python.pythonPath": "/usr/bin/python3",
|
||||
"python.linting.pylintEnabled": true,
|
||||
"python.linting.enabled": true,
|
||||
"python.formatting.provider": "black",
|
||||
"editor.formatOnPaste": false,
|
||||
"editor.formatOnSave": true,
|
||||
"editor.formatOnType": true,
|
||||
"files.trimTrailingWhitespace": true,
|
||||
"terminal.integrated.defaultProfile.linux": "bash",
|
||||
"yaml.customTags": [
|
||||
"!secret scalar",
|
||||
"!lambda scalar",
|
||||
"!include_dir_named scalar",
|
||||
"!include_dir_list scalar",
|
||||
"!include_dir_merge_list scalar",
|
||||
"!include_dir_merge_named scalar"
|
||||
],
|
||||
"files.exclude": {
|
||||
"**/.git": true,
|
||||
"**/.DS_Store": true,
|
||||
"**/*.pyc": {
|
||||
"when": "$(basename).py"
|
||||
},
|
||||
"**/__pycache__": true
|
||||
},
|
||||
"files.associations": {
|
||||
"**/.vscode/*.json": "jsonc"
|
||||
},
|
||||
"C_Cpp.clang_format_path": "/usr/bin/clang-format-11",
|
||||
}
|
||||
}
|
||||
|
@@ -75,9 +75,6 @@ target/
|
||||
# pyenv
|
||||
.python-version
|
||||
|
||||
# asdf
|
||||
.tool-versions
|
||||
|
||||
# celery beat schedule file
|
||||
celerybeat-schedule
|
||||
|
||||
@@ -114,5 +111,4 @@ config/
|
||||
examples/
|
||||
Dockerfile
|
||||
.git/
|
||||
tests/
|
||||
.*
|
||||
tests/build/
|
||||
|
@@ -25,9 +25,10 @@ indent_size = 2
|
||||
[*.{yaml,yml}]
|
||||
indent_style = space
|
||||
indent_size = 2
|
||||
quote_type = double
|
||||
quote_type = single
|
||||
|
||||
# JSON
|
||||
[*.json]
|
||||
indent_style = space
|
||||
indent_size = 2
|
||||
|
||||
|
1
.gitattributes
vendored
@@ -1,3 +1,2 @@
|
||||
# Normalize line endings to LF in the repository
|
||||
* text eol=lf
|
||||
*.png binary
|
||||
|
3
.github/FUNDING.yml
vendored
Normal file
@@ -0,0 +1,3 @@
|
||||
# These are supported funding model platforms
|
||||
|
||||
custom: https://www.nabucasa.com
|
92
.github/ISSUE_TEMPLATE/bug_report.yml
vendored
@@ -1,92 +0,0 @@
|
||||
name: Report an issue with ESPHome
|
||||
description: Report an issue with ESPHome.
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
This issue form is for reporting bugs only!
|
||||
|
||||
If you have a feature request or enhancement, please [request them here instead][fr].
|
||||
|
||||
[fr]: https://github.com/orgs/esphome/discussions
|
||||
- type: textarea
|
||||
validations:
|
||||
required: true
|
||||
id: problem
|
||||
attributes:
|
||||
label: The problem
|
||||
description: >-
|
||||
Describe the issue you are experiencing here to communicate to the
|
||||
maintainers. Tell us what you were trying to do and what happened.
|
||||
|
||||
Provide a clear and concise description of what the problem is.
|
||||
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
## Environment
|
||||
- type: input
|
||||
id: version
|
||||
validations:
|
||||
required: true
|
||||
attributes:
|
||||
label: Which version of ESPHome has the issue?
|
||||
description: >
|
||||
ESPHome version like 1.19, 2025.6.0 or 2025.XX.X-dev.
|
||||
- type: dropdown
|
||||
validations:
|
||||
required: true
|
||||
id: installation
|
||||
attributes:
|
||||
label: What type of installation are you using?
|
||||
options:
|
||||
- Home Assistant Add-on
|
||||
- Docker
|
||||
- pip
|
||||
- type: dropdown
|
||||
validations:
|
||||
required: true
|
||||
id: platform
|
||||
attributes:
|
||||
label: What platform are you using?
|
||||
options:
|
||||
- ESP8266
|
||||
- ESP32
|
||||
- RP2040
|
||||
- BK72XX
|
||||
- RTL87XX
|
||||
- LN882X
|
||||
- Host
|
||||
- Other
|
||||
- type: input
|
||||
id: component_name
|
||||
attributes:
|
||||
label: Component causing the issue
|
||||
description: >
|
||||
The name of the component or platform. For example, api/i2c or ultrasonic.
|
||||
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
# Details
|
||||
- type: textarea
|
||||
id: config
|
||||
attributes:
|
||||
label: YAML Config
|
||||
description: |
|
||||
Include a complete YAML configuration file demonstrating the problem here. Preferably post the *entire* file - don't make assumptions about what is unimportant. However, if it's a large or complicated config then you will need to reduce it to the smallest possible file *that still demonstrates the problem*. If you don't provide enough information to *easily* reproduce the problem, it's unlikely your bug report will get any attention. Logs do not belong here, attach them below.
|
||||
render: yaml
|
||||
- type: textarea
|
||||
id: logs
|
||||
attributes:
|
||||
label: Anything in the logs that might be useful for us?
|
||||
description: For example, error message, or stack traces. Serial or USB logs are much more useful than WiFi logs.
|
||||
render: txt
|
||||
- type: textarea
|
||||
id: additional
|
||||
attributes:
|
||||
label: Additional information
|
||||
description: >
|
||||
If you have any additional information for us, use the field below.
|
||||
Please note, you can attach screenshots or screen recordings here, by
|
||||
dragging and dropping files in the field below.
|
21
.github/ISSUE_TEMPLATE/config.yml
vendored
@@ -1,21 +1,12 @@
|
||||
---
|
||||
blank_issues_enabled: false
|
||||
contact_links:
|
||||
- name: Report an issue with the ESPHome documentation
|
||||
url: https://github.com/esphome/esphome-docs/issues/new/choose
|
||||
about: Report an issue with the ESPHome documentation.
|
||||
- name: Report an issue with the ESPHome web server
|
||||
url: https://github.com/esphome/esphome-webserver/issues/new/choose
|
||||
about: Report an issue with the ESPHome web server.
|
||||
- name: Report an issue with the ESPHome Builder / Dashboard
|
||||
url: https://github.com/esphome/dashboard/issues/new/choose
|
||||
about: Report an issue with the ESPHome Builder / Dashboard.
|
||||
- name: Report an issue with the ESPHome API client
|
||||
url: https://github.com/esphome/aioesphomeapi/issues/new/choose
|
||||
about: Report an issue with the ESPHome API client.
|
||||
- name: Make a Feature Request
|
||||
url: https://github.com/orgs/esphome/discussions
|
||||
- name: Issue Tracker
|
||||
url: https://github.com/esphome/issues
|
||||
about: Please create bug reports in the dedicated issue tracker.
|
||||
- name: Feature Request Tracker
|
||||
url: https://github.com/esphome/feature-requests
|
||||
about: Please create feature requests in the dedicated feature request tracker.
|
||||
- name: Frequently Asked Question
|
||||
url: https://esphome.io/guides/faq.html
|
||||
about: Please view the FAQ for common questions and what to include in a bug report.
|
||||
|
||||
|
25
.github/PULL_REQUEST_TEMPLATE.md
vendored
@@ -1,34 +1,31 @@
|
||||
# What does this implement/fix?
|
||||
# What does this implement/fix?
|
||||
|
||||
<!-- Quick description and explanation of changes -->
|
||||
Quick description and explanation of changes
|
||||
|
||||
## Types of changes
|
||||
|
||||
- [ ] Bugfix (non-breaking change which fixes an issue)
|
||||
- [ ] New feature (non-breaking change which adds functionality)
|
||||
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
|
||||
- [ ] Code quality improvements to existing code or addition of tests
|
||||
- [ ] Other
|
||||
|
||||
**Related issue or feature (if applicable):**
|
||||
**Related issue or feature (if applicable):** fixes <link to issue>
|
||||
|
||||
- fixes <link to issue>
|
||||
|
||||
**Pull request in [esphome-docs](https://github.com/esphome/esphome-docs) with documentation (if applicable):**
|
||||
|
||||
- esphome/esphome-docs#<esphome-docs PR number goes here>
|
||||
**Pull request in [esphome-docs](https://github.com/esphome/esphome-docs) with documentation (if applicable):** esphome/esphome-docs#<esphome-docs PR number goes here>
|
||||
|
||||
## Test Environment
|
||||
|
||||
- [ ] ESP32
|
||||
- [ ] ESP32 IDF
|
||||
- [ ] ESP8266
|
||||
- [ ] RP2040
|
||||
- [ ] BK72xx
|
||||
- [ ] RTL87xx
|
||||
- [ ] nRF52840
|
||||
|
||||
## Example entry for `config.yaml`:
|
||||
<!--
|
||||
Supplying a configuration snippet makes it easier for a maintainer to test
|
||||
your PR. Furthermore, for new integrations, it gives an impression of how
|
||||
the configuration would look.
|
||||
Note: Remove this section if this PR does not have an example entry.
|
||||
-->
|
||||
|
||||
```yaml
|
||||
# Example config.yaml
|
||||
@@ -38,6 +35,6 @@
|
||||
## Checklist:
|
||||
- [ ] The code change is tested and works locally.
|
||||
- [ ] Tests have been added to verify that the new code works (under `tests/` folder).
|
||||
|
||||
|
||||
If user exposed functionality or configuration variables are added/changed:
|
||||
- [ ] Documentation added/updated in [esphome-docs](https://github.com/esphome/esphome-docs).
|
||||
|
98
.github/actions/build-image/action.yaml
vendored
@@ -1,98 +0,0 @@
|
||||
name: Build Image
|
||||
inputs:
|
||||
target:
|
||||
description: "Target to build"
|
||||
required: true
|
||||
example: "docker"
|
||||
build_type:
|
||||
description: "Build type"
|
||||
required: true
|
||||
example: "docker"
|
||||
suffix:
|
||||
description: "Suffix to add to tags"
|
||||
required: true
|
||||
version:
|
||||
description: "Version to build"
|
||||
required: true
|
||||
example: "2023.12.0"
|
||||
base_os:
|
||||
description: "Base OS to use"
|
||||
required: false
|
||||
default: "debian"
|
||||
example: "debian"
|
||||
runs:
|
||||
using: "composite"
|
||||
steps:
|
||||
- name: Generate short tags
|
||||
id: tags
|
||||
shell: bash
|
||||
run: |
|
||||
output=$(docker/generate_tags.py \
|
||||
--tag "${{ inputs.version }}" \
|
||||
--suffix "${{ inputs.suffix }}")
|
||||
echo $output
|
||||
for l in $output; do
|
||||
echo $l >> $GITHUB_OUTPUT
|
||||
done
|
||||
|
||||
# set cache-to only if dev branch
|
||||
- id: cache-to
|
||||
shell: bash
|
||||
run: |-
|
||||
if [[ "${{ github.ref }}" == "refs/heads/dev" ]]; then
|
||||
echo "value=type=gha,mode=max" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "value=" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
|
||||
- name: Build and push to ghcr by digest
|
||||
id: build-ghcr
|
||||
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
|
||||
env:
|
||||
DOCKER_BUILD_SUMMARY: false
|
||||
DOCKER_BUILD_RECORD_UPLOAD: false
|
||||
with:
|
||||
context: .
|
||||
file: ./docker/Dockerfile
|
||||
target: ${{ inputs.target }}
|
||||
cache-from: type=gha
|
||||
cache-to: ${{ steps.cache-to.outputs.value }}
|
||||
build-args: |
|
||||
BUILD_TYPE=${{ inputs.build_type }}
|
||||
BUILD_VERSION=${{ inputs.version }}
|
||||
BUILD_OS=${{ inputs.base_os }}
|
||||
outputs: |
|
||||
type=image,name=ghcr.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true
|
||||
|
||||
- name: Export ghcr digests
|
||||
shell: bash
|
||||
run: |
|
||||
mkdir -p /tmp/digests/${{ inputs.build_type }}/ghcr
|
||||
digest="${{ steps.build-ghcr.outputs.digest }}"
|
||||
touch "/tmp/digests/${{ inputs.build_type }}/ghcr/${digest#sha256:}"
|
||||
|
||||
- name: Build and push to dockerhub by digest
|
||||
id: build-dockerhub
|
||||
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
|
||||
env:
|
||||
DOCKER_BUILD_SUMMARY: false
|
||||
DOCKER_BUILD_RECORD_UPLOAD: false
|
||||
with:
|
||||
context: .
|
||||
file: ./docker/Dockerfile
|
||||
target: ${{ inputs.target }}
|
||||
cache-from: type=gha
|
||||
cache-to: ${{ steps.cache-to.outputs.value }}
|
||||
build-args: |
|
||||
BUILD_TYPE=${{ inputs.build_type }}
|
||||
BUILD_VERSION=${{ inputs.version }}
|
||||
BUILD_OS=${{ inputs.base_os }}
|
||||
outputs: |
|
||||
type=image,name=docker.io/${{ steps.tags.outputs.image_name }},push-by-digest=true,name-canonical=true,push=true
|
||||
|
||||
- name: Export dockerhub digests
|
||||
shell: bash
|
||||
run: |
|
||||
mkdir -p /tmp/digests/${{ inputs.build_type }}/dockerhub
|
||||
digest="${{ steps.build-dockerhub.outputs.digest }}"
|
||||
touch "/tmp/digests/${{ inputs.build_type }}/dockerhub/${digest#sha256:}"
|
47
.github/actions/restore-python/action.yml
vendored
@@ -1,47 +0,0 @@
|
||||
name: Restore Python
|
||||
inputs:
|
||||
python-version:
|
||||
description: Python version to restore
|
||||
required: true
|
||||
type: string
|
||||
cache-key:
|
||||
description: Cache key to use
|
||||
required: true
|
||||
type: string
|
||||
outputs:
|
||||
python-version:
|
||||
description: Python version restored
|
||||
value: ${{ steps.python.outputs.python-version }}
|
||||
runs:
|
||||
using: "composite"
|
||||
steps:
|
||||
- name: Set up Python ${{ inputs.python-version }}
|
||||
id: python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
with:
|
||||
python-version: ${{ inputs.python-version }}
|
||||
- name: Restore Python virtual environment
|
||||
id: cache-venv
|
||||
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: venv
|
||||
# yamllint disable-line rule:line-length
|
||||
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ inputs.cache-key }}
|
||||
- name: Create Python virtual environment
|
||||
if: steps.cache-venv.outputs.cache-hit != 'true' && runner.os != 'Windows'
|
||||
shell: bash
|
||||
run: |
|
||||
python -m venv venv
|
||||
source venv/bin/activate
|
||||
python --version
|
||||
pip install -r requirements.txt -r requirements_test.txt
|
||||
pip install -e .
|
||||
- name: Create Python virtual environment
|
||||
if: steps.cache-venv.outputs.cache-hit != 'true' && runner.os == 'Windows'
|
||||
shell: bash
|
||||
run: |
|
||||
python -m venv venv
|
||||
source ./venv/Scripts/activate
|
||||
python --version
|
||||
pip install -r requirements.txt -r requirements_test.txt
|
||||
pip install -e .
|
1
.github/copilot-instructions.md
vendored
@@ -1 +0,0 @@
|
||||
../.ai/instructions.md
|
35
.github/dependabot.yml
vendored
@@ -1,40 +1,9 @@
|
||||
---
|
||||
version: 2
|
||||
updates:
|
||||
- package-ecosystem: pip
|
||||
- package-ecosystem: "pip"
|
||||
directory: "/"
|
||||
schedule:
|
||||
interval: daily
|
||||
interval: "daily"
|
||||
ignore:
|
||||
# Hypothesis is only used for testing and is updated quite often
|
||||
- dependency-name: hypothesis
|
||||
- package-ecosystem: github-actions
|
||||
labels:
|
||||
- "dependencies"
|
||||
- "github-actions"
|
||||
directory: "/"
|
||||
schedule:
|
||||
interval: daily
|
||||
open-pull-requests-limit: 10
|
||||
groups:
|
||||
docker-actions:
|
||||
applies-to: version-updates
|
||||
patterns:
|
||||
- "docker/login-action"
|
||||
- "docker/setup-buildx-action"
|
||||
- package-ecosystem: github-actions
|
||||
labels:
|
||||
- "dependencies"
|
||||
- "github-actions"
|
||||
directory: "/.github/actions/build-image"
|
||||
schedule:
|
||||
interval: daily
|
||||
open-pull-requests-limit: 10
|
||||
- package-ecosystem: github-actions
|
||||
labels:
|
||||
- "dependencies"
|
||||
- "github-actions"
|
||||
directory: "/.github/actions/restore-python"
|
||||
schedule:
|
||||
interval: daily
|
||||
open-pull-requests-limit: 10
|
||||
|
662
.github/workflows/auto-label-pr.yml
vendored
@@ -1,662 +0,0 @@
|
||||
name: Auto Label PR
|
||||
|
||||
on:
|
||||
# Runs only on pull_request_target due to having access to a App token.
|
||||
# This means PRs from forks will not be able to alter this workflow to get the tokens
|
||||
pull_request_target:
|
||||
types: [labeled, opened, reopened, synchronize, edited]
|
||||
|
||||
permissions:
|
||||
pull-requests: write
|
||||
contents: read
|
||||
|
||||
env:
|
||||
SMALL_PR_THRESHOLD: 30
|
||||
MAX_LABELS: 15
|
||||
TOO_BIG_THRESHOLD: 1000
|
||||
COMPONENT_LABEL_THRESHOLD: 10
|
||||
|
||||
jobs:
|
||||
label:
|
||||
runs-on: ubuntu-latest
|
||||
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
|
||||
- name: Generate a token
|
||||
id: generate-token
|
||||
uses: actions/create-github-app-token@67018539274d69449ef7c02e8e71183d1719ab42 # v2
|
||||
with:
|
||||
app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
|
||||
private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}
|
||||
|
||||
- name: Auto Label PR
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
github-token: ${{ steps.generate-token.outputs.token }}
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
|
||||
// Constants
|
||||
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
|
||||
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
|
||||
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
|
||||
const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
|
||||
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
|
||||
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
|
||||
const TOO_BIG_MARKER = '<!-- too-big-request -->';
|
||||
|
||||
const MANAGED_LABELS = [
|
||||
'new-component',
|
||||
'new-platform',
|
||||
'new-target-platform',
|
||||
'merging-to-release',
|
||||
'merging-to-beta',
|
||||
'core',
|
||||
'small-pr',
|
||||
'dashboard',
|
||||
'github-actions',
|
||||
'by-code-owner',
|
||||
'has-tests',
|
||||
'needs-tests',
|
||||
'needs-docs',
|
||||
'needs-codeowners',
|
||||
'too-big',
|
||||
'labeller-recheck',
|
||||
'bugfix',
|
||||
'new-feature',
|
||||
'breaking-change',
|
||||
'code-quality'
|
||||
];
|
||||
|
||||
const DOCS_PR_PATTERNS = [
|
||||
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
|
||||
/esphome\/esphome-docs#\d+/
|
||||
];
|
||||
|
||||
// Global state
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
|
||||
// Get current labels and PR data
|
||||
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number
|
||||
});
|
||||
const currentLabels = currentLabelsData.map(label => label.name);
|
||||
const managedLabels = currentLabels.filter(label =>
|
||||
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
|
||||
);
|
||||
|
||||
// Check for mega-PR early - if present, skip most automatic labeling
|
||||
const isMegaPR = currentLabels.includes('mega-pr');
|
||||
|
||||
// Get all PR files with automatic pagination
|
||||
const prFiles = await github.paginate(
|
||||
github.rest.pulls.listFiles,
|
||||
{
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
}
|
||||
);
|
||||
|
||||
// Calculate data from PR files
|
||||
const changedFiles = prFiles.map(file => file.filename);
|
||||
const totalAdditions = prFiles.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const totalDeletions = prFiles.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
const totalChanges = totalAdditions + totalDeletions;
|
||||
|
||||
console.log('Current labels:', currentLabels.join(', '));
|
||||
console.log('Changed files:', changedFiles.length);
|
||||
console.log('Total changes:', totalChanges);
|
||||
if (isMegaPR) {
|
||||
console.log('Mega-PR detected - applying limited labeling logic');
|
||||
}
|
||||
|
||||
// Fetch API data
|
||||
async function fetchApiData() {
|
||||
try {
|
||||
const response = await fetch('https://data.esphome.io/components.json');
|
||||
const componentsData = await response.json();
|
||||
return {
|
||||
targetPlatforms: componentsData.target_platforms || [],
|
||||
platformComponents: componentsData.platform_components || []
|
||||
};
|
||||
} catch (error) {
|
||||
console.log('Failed to fetch components data from API:', error.message);
|
||||
return { targetPlatforms: [], platformComponents: [] };
|
||||
}
|
||||
}
|
||||
|
||||
// Strategy: Merge branch detection
|
||||
async function detectMergeBranch() {
|
||||
const labels = new Set();
|
||||
const baseRef = context.payload.pull_request.base.ref;
|
||||
|
||||
if (baseRef === 'release') {
|
||||
labels.add('merging-to-release');
|
||||
} else if (baseRef === 'beta') {
|
||||
labels.add('merging-to-beta');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Component and platform labeling
|
||||
async function detectComponentPlatforms(apiData) {
|
||||
const labels = new Set();
|
||||
const componentRegex = /^esphome\/components\/([^\/]+)\//;
|
||||
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
|
||||
|
||||
for (const file of changedFiles) {
|
||||
const componentMatch = file.match(componentRegex);
|
||||
if (componentMatch) {
|
||||
labels.add(`component: ${componentMatch[1]}`);
|
||||
}
|
||||
|
||||
const platformMatch = file.match(targetPlatformRegex);
|
||||
if (platformMatch) {
|
||||
labels.add(`platform: ${platformMatch[1]}`);
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: New component detection
|
||||
async function detectNewComponents() {
  const labels = new Set();
  const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);

  for (const file of addedFiles) {
    const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
    if (componentMatch) {
      try {
        const content = fs.readFileSync(file, 'utf8');
        if (content.includes('IS_TARGET_PLATFORM = True')) {
          labels.add('new-target-platform');
        }
      } catch (error) {
        console.log(`Failed to read content of ${file}:`, error.message);
      }
      labels.add('new-component');
    }
  }

  return labels;
}

// Strategy: New platform detection
async function detectNewPlatforms(apiData) {
  const labels = new Set();
  const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);

  for (const file of addedFiles) {
    const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
    if (platformFileMatch) {
      const [, component, platform] = platformFileMatch;
      if (apiData.platformComponents.includes(platform)) {
        labels.add('new-platform');
      }
    }

    const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
    if (platformDirMatch) {
      const [, component, platform] = platformDirMatch;
      if (apiData.platformComponents.includes(platform)) {
        labels.add('new-platform');
      }
    }
  }

  return labels;
}

// Strategy: Core files detection
async function detectCoreChanges() {
  const labels = new Set();
  const coreFiles = changedFiles.filter(file =>
    file.startsWith('esphome/core/') ||
    (file.startsWith('esphome/') && file.split('/').length === 2)
  );

  if (coreFiles.length > 0) {
    labels.add('core');
  }

  return labels;
}

// Strategy: PR size detection
async function detectPRSize() {
  const labels = new Set();

  if (totalChanges <= SMALL_PR_THRESHOLD) {
    labels.add('small-pr');
    return labels;
  }

  const testAdditions = prFiles
    .filter(file => file.filename.startsWith('tests/'))
    .reduce((sum, file) => sum + (file.additions || 0), 0);
  const testDeletions = prFiles
    .filter(file => file.filename.startsWith('tests/'))
    .reduce((sum, file) => sum + (file.deletions || 0), 0);

  const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);

  // Don't add too-big if mega-pr label is already present
  if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
    labels.add('too-big');
  }

  return labels;
}

// Strategy: Dashboard changes
async function detectDashboardChanges() {
  const labels = new Set();
  const dashboardFiles = changedFiles.filter(file =>
    file.startsWith('esphome/dashboard/') ||
    file.startsWith('esphome/components/dashboard_import/')
  );

  if (dashboardFiles.length > 0) {
    labels.add('dashboard');
  }

  return labels;
}

// Strategy: GitHub Actions changes
async function detectGitHubActionsChanges() {
  const labels = new Set();
  const githubActionsFiles = changedFiles.filter(file =>
    file.startsWith('.github/workflows/')
  );

  if (githubActionsFiles.length > 0) {
    labels.add('github-actions');
  }

  return labels;
}

// Strategy: Code owner detection
async function detectCodeOwner() {
  const labels = new Set();

  try {
    const { data: codeownersFile } = await github.rest.repos.getContent({
      owner,
      repo,
      path: 'CODEOWNERS',
    });

    const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
    const prAuthor = context.payload.pull_request.user.login;

    const codeownersLines = codeownersContent.split('\n')
      .map(line => line.trim())
      .filter(line => line && !line.startsWith('#'));

    const codeownersRegexes = codeownersLines.map(line => {
      const parts = line.split(/\s+/);
      const pattern = parts[0];
      const owners = parts.slice(1);

      let regex;
      if (pattern.endsWith('*')) {
        const dir = pattern.slice(0, -1);
        regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
      } else if (pattern.includes('*')) {
        // First escape all regex special chars except *, then replace * with .*
        const regexPattern = pattern
          .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
          .replace(/\*/g, '.*');
        regex = new RegExp(`^${regexPattern}$`);
      } else {
        regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
      }

      return { regex, owners };
    });

    for (const file of changedFiles) {
      for (const { regex, owners } of codeownersRegexes) {
        if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
          labels.add('by-code-owner');
          return labels;
        }
      }
    }
  } catch (error) {
    console.log('Failed to read or parse CODEOWNERS file:', error.message);
  }

  return labels;
}

// Strategy: Test detection
async function detectTests() {
  const labels = new Set();
  const testFiles = changedFiles.filter(file => file.startsWith('tests/'));

  if (testFiles.length > 0) {
    labels.add('has-tests');
  }

  return labels;
}

// Strategy: PR Template Checkbox detection
async function detectPRTemplateCheckboxes() {
  const labels = new Set();
  const prBody = context.payload.pull_request.body || '';

  console.log('Checking PR template checkboxes...');

  // Check for checked checkboxes in the "Types of changes" section
  const checkboxPatterns = [
    { pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
    { pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
    { pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
    { pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
  ];

  for (const { pattern, label } of checkboxPatterns) {
    if (pattern.test(prBody)) {
      console.log(`Found checked checkbox for: ${label}`);
      labels.add(label);
    }
  }

  return labels;
}

// Strategy: Requirements detection
async function detectRequirements(allLabels) {
  const labels = new Set();

  // Check for missing tests
  if ((allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) && !allLabels.has('has-tests')) {
    labels.add('needs-tests');
  }

  // Check for missing docs
  if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
    const prBody = context.payload.pull_request.body || '';
    const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));

    if (!hasDocsLink) {
      labels.add('needs-docs');
    }
  }

  // Check for missing CODEOWNERS
  if (allLabels.has('new-component')) {
    const codeownersModified = prFiles.some(file =>
      file.filename === 'CODEOWNERS' &&
      (file.status === 'modified' || file.status === 'added') &&
      (file.additions || 0) > 0
    );

    if (!codeownersModified) {
      labels.add('needs-codeowners');
    }
  }

  return labels;
}

// Generate review messages
function generateReviewMessages(finalLabels) {
  const messages = [];
  const prAuthor = context.payload.pull_request.user.login;

  // Too big message
  if (finalLabels.includes('too-big')) {
    const testAdditions = prFiles
      .filter(file => file.filename.startsWith('tests/'))
      .reduce((sum, file) => sum + (file.additions || 0), 0);
    const testDeletions = prFiles
      .filter(file => file.filename.startsWith('tests/'))
      .reduce((sum, file) => sum + (file.deletions || 0), 0);
    const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);

    const tooManyLabels = finalLabels.length > MAX_LABELS;
    const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;

    let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;

    if (tooManyLabels && tooManyChanges) {
      message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
    } else if (tooManyLabels) {
      message += `This PR affects ${finalLabels.length} different components/areas.`;
    } else {
      message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
    }

    message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
    message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;

    messages.push(message);
  }

  // CODEOWNERS message
  if (finalLabels.includes('needs-codeowners')) {
    const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
      `Hey there @${prAuthor},\n` +
      `Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
      `This way we can notify you if a bug report for this integration is reported.\n\n` +
      `In \`__init__.py\` of the integration, please add:\n\n` +
      `\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
      `And run \`script/build_codeowners.py\``;

    messages.push(message);
  }

  return messages;
}

// Handle reviews
async function handleReviews(finalLabels) {
  const reviewMessages = generateReviewMessages(finalLabels);
  const hasReviewableLabels = finalLabels.some(label =>
    ['too-big', 'needs-codeowners'].includes(label)
  );

  const { data: reviews } = await github.rest.pulls.listReviews({
    owner,
    repo,
    pull_number: pr_number
  });

  const botReviews = reviews.filter(review =>
    review.user.type === 'Bot' &&
    review.state === 'CHANGES_REQUESTED' &&
    review.body && review.body.includes(BOT_COMMENT_MARKER)
  );

  if (hasReviewableLabels) {
    const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;

    if (botReviews.length > 0) {
      // Update existing review
      await github.rest.pulls.updateReview({
        owner,
        repo,
        pull_number: pr_number,
        review_id: botReviews[0].id,
        body: reviewBody
      });
      console.log('Updated existing bot review');
    } else {
      // Create new review
      await github.rest.pulls.createReview({
        owner,
        repo,
        pull_number: pr_number,
        body: reviewBody,
        event: 'REQUEST_CHANGES'
      });
      console.log('Created new bot review');
    }
  } else if (botReviews.length > 0) {
    // Dismiss existing reviews
    for (const review of botReviews) {
      try {
        await github.rest.pulls.dismissReview({
          owner,
          repo,
          pull_number: pr_number,
          review_id: review.id,
          message: 'Review dismissed: All requirements have been met'
        });
        console.log(`Dismissed bot review ${review.id}`);
      } catch (error) {
        console.log(`Failed to dismiss review ${review.id}:`, error.message);
      }
    }
  }
}

// Main execution
const apiData = await fetchApiData();
const baseRef = context.payload.pull_request.base.ref;

// Early exit for non-dev branches
if (baseRef !== 'dev') {
  const branchLabels = await detectMergeBranch();
  const finalLabels = Array.from(branchLabels);

  console.log('Computed labels (merge branch only):', finalLabels.join(', '));

  // Apply labels
  if (finalLabels.length > 0) {
    await github.rest.issues.addLabels({
      owner,
      repo,
      issue_number: pr_number,
      labels: finalLabels
    });
  }

  // Remove old managed labels
  const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
  for (const label of labelsToRemove) {
    try {
      await github.rest.issues.removeLabel({
        owner,
        repo,
        issue_number: pr_number,
        name: label
      });
    } catch (error) {
      console.log(`Failed to remove label ${label}:`, error.message);
    }
  }

  return;
}

// Run all strategies
const [
  branchLabels,
  componentLabels,
  newComponentLabels,
  newPlatformLabels,
  coreLabels,
  sizeLabels,
  dashboardLabels,
  actionsLabels,
  codeOwnerLabels,
  testLabels,
  checkboxLabels
] = await Promise.all([
  detectMergeBranch(),
  detectComponentPlatforms(apiData),
  detectNewComponents(),
  detectNewPlatforms(apiData),
  detectCoreChanges(),
  detectPRSize(),
  detectDashboardChanges(),
  detectGitHubActionsChanges(),
  detectCodeOwner(),
  detectTests(),
  detectPRTemplateCheckboxes()
]);

// Combine all labels
const allLabels = new Set([
  ...branchLabels,
  ...componentLabels,
  ...newComponentLabels,
  ...newPlatformLabels,
  ...coreLabels,
  ...sizeLabels,
  ...dashboardLabels,
  ...actionsLabels,
  ...codeOwnerLabels,
  ...testLabels,
  ...checkboxLabels
]);

// Detect requirements based on all other labels
const requirementLabels = await detectRequirements(allLabels);
for (const label of requirementLabels) {
  allLabels.add(label);
}

let finalLabels = Array.from(allLabels);

// For mega-PRs, exclude component labels if there are too many
if (isMegaPR) {
  const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
  if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
    finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
    console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
  }
}

// Handle too many labels (only for non-mega PRs)
const tooManyLabels = finalLabels.length > MAX_LABELS;

if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
  finalLabels = ['too-big'];
}

console.log('Computed labels:', finalLabels.join(', '));

// Handle reviews
await handleReviews(finalLabels);

// Apply labels
if (finalLabels.length > 0) {
  console.log(`Adding labels: ${finalLabels.join(', ')}`);
  await github.rest.issues.addLabels({
    owner,
    repo,
    issue_number: pr_number,
    labels: finalLabels
  });
}

// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
for (const label of labelsToRemove) {
  console.log(`Removing label: ${label}`);
  try {
    await github.rest.issues.removeLabel({
      owner,
      repo,
      issue_number: pr_number,
      name: label
    });
  } catch (error) {
    console.log(`Failed to remove label ${label}:`, error.message);
  }
}
.github/workflows/ci-api-proto.yml (vendored, 91 changed lines)
@@ -1,91 +0,0 @@
name: API Proto CI

on:
  pull_request:
    paths:
      - "esphome/components/api/api.proto"
      - "esphome/components/api/api_pb2.cpp"
      - "esphome/components/api/api_pb2.h"
      - "esphome/components/api/api_pb2_service.cpp"
      - "esphome/components/api/api_pb2_service.h"
      - "script/api_protobuf/api_protobuf.py"
      - ".github/workflows/ci-api-proto.yml"

permissions:
  contents: read
  pull-requests: write

jobs:
  check:
    name: Check generated files
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
      - name: Set up Python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        with:
          python-version: "3.11"

      - name: Install apt dependencies
        run: |
          sudo apt update
          sudo apt-cache show protobuf-compiler
          sudo apt install -y protobuf-compiler
          protoc --version
      - name: Install python dependencies
        run: pip install aioesphomeapi -c requirements.txt -r requirements_dev.txt
      - name: Generate files
        run: script/api_protobuf/api_protobuf.py
      - name: Check for changes
        run: |
          if ! git diff --quiet; then
            echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
            echo "You have altered the generated proto files but they do not match what is expected." | tee -a $GITHUB_STEP_SUMMARY
            echo "Please run 'script/api_protobuf/api_protobuf.py' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY
            exit 1
          fi
      - if: failure()
        name: Review PR
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        with:
          script: |
            await github.rest.pulls.createReview({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              event: 'REQUEST_CHANGES',
              body: 'You have altered the generated proto files but they do not match what is expected.\nPlease run "script/api_protobuf/api_protobuf.py" and commit the changes.'
            })
      - if: failure()
        name: Show changes
        run: git diff
      - if: failure()
        name: Archive artifacts
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: generated-proto-files
          path: |
            esphome/components/api/api_pb2.*
            esphome/components/api/api_pb2_service.*
      - if: success()
        name: Dismiss review
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        with:
          script: |
            let reviews = await github.rest.pulls.listReviews({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo
            });
            for (let review of reviews.data) {
              if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
                await github.rest.pulls.dismissReview({
                  pull_number: context.issue.number,
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  review_id: review.id,
                  message: 'Files now match the expected proto files.'
                });
              }
            }
.github/workflows/ci-clang-tidy-hash.yml (vendored, 76 changed lines)
@@ -1,76 +0,0 @@
name: Clang-tidy Hash CI

on:
  pull_request:
    paths:
      - ".clang-tidy"
      - "platformio.ini"
      - "requirements_dev.txt"
      - "sdkconfig.defaults"
      - ".clang-tidy.hash"
      - "script/clang_tidy_hash.py"
      - ".github/workflows/ci-clang-tidy-hash.yml"

permissions:
  contents: read
  pull-requests: write

jobs:
  verify-hash:
    name: Verify clang-tidy hash
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0

      - name: Set up Python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        with:
          python-version: "3.11"

      - name: Verify hash
        run: |
          python script/clang_tidy_hash.py --verify

      - if: failure()
        name: Show hash details
        run: |
          python script/clang_tidy_hash.py
          echo "## Job Failed" | tee -a $GITHUB_STEP_SUMMARY
          echo "You have modified clang-tidy configuration but have not updated the hash." | tee -a $GITHUB_STEP_SUMMARY
          echo "Please run 'script/clang_tidy_hash.py --update' and commit the changes." | tee -a $GITHUB_STEP_SUMMARY

      - if: failure()
        name: Request changes
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        with:
          script: |
            await github.rest.pulls.createReview({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              event: 'REQUEST_CHANGES',
              body: 'You have modified clang-tidy configuration but have not updated the hash.\nPlease run `script/clang_tidy_hash.py --update` and commit the changes.'
            })

      - if: success()
        name: Dismiss review
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
        with:
          script: |
            let reviews = await github.rest.pulls.listReviews({
              pull_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo
            });
            for (let review of reviews.data) {
              if (review.user.login === 'github-actions[bot]' && review.state === 'CHANGES_REQUESTED') {
                await github.rest.pulls.dismissReview({
                  pull_number: context.issue.number,
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  review_id: review.id,
                  message: 'Clang-tidy hash now matches configuration.'
                });
              }
            }
.github/workflows/ci-docker.yml (vendored, 71 changed lines)
@@ -1,64 +1,53 @@
---
name: CI for docker images

# Only run when docker paths change

on:
  push:
    branches: [dev, beta, release]
    paths:
      - "docker/**"
      - ".github/workflows/ci-docker.yml"
      - "requirements*.txt"
      - "platformio.ini"
      - "script/platformio_install_deps.py"
      - 'docker/**'
      - '.github/workflows/**'
      - 'requirements*.txt'
      - 'platformio.ini'

  pull_request:
    paths:
      - "docker/**"
      - ".github/workflows/ci-docker.yml"
      - "requirements*.txt"
      - "platformio.ini"
      - "script/platformio_install_deps.py"
      - 'docker/**'
      - '.github/workflows/**'
      - 'requirements*.txt'
      - 'platformio.ini'

permissions:
  contents: read
  packages: read

concurrency:
  # yamllint disable-line rule:line-length
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  check-docker:
    name: Build docker containers
    runs-on: ${{ matrix.os }}
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        os: ["ubuntu-24.04", "ubuntu-24.04-arm"]
        build_type:
          - "ha-addon"
          - "docker"
          # - "lint"
        arch: [amd64, armv7, aarch64]
        build_type: ["ha-addon", "docker", "lint"]
    steps:
      - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
      - name: Set up Python
        uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
        with:
          python-version: "3.11"
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.9'
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v1

      - name: Set TAG
        run: |
          echo "TAG=check" >> $GITHUB_ENV
      - name: Set TAG
        run: |
          echo "TAG=check" >> $GITHUB_ENV

      - name: Run build
        run: |
          docker/build.py \
            --tag "${TAG}" \
            --arch "${{ matrix.os == 'ubuntu-24.04-arm' && 'aarch64' || 'amd64' }}" \
            --build-type "${{ matrix.build_type }}" \
            build
      - name: Run build
        run: |
          docker/build.py \
            --tag "${TAG}" \
            --arch "${{ matrix.arch }}" \
            --build-type "${{ matrix.build_type }}" \
            build
.github/workflows/ci.yml (vendored, 881 changed lines)
@@ -1,4 +1,5 @@
|
||||
---
|
||||
# THESE JOBS ARE COPIED IN release.yml and release-dev.yml
|
||||
# PLEASE ALSO UPDATE THOSE FILES WHEN CHANGING LINES HERE
|
||||
name: CI
|
||||
|
||||
on:
|
||||
@@ -6,835 +7,153 @@ on:
|
||||
branches: [dev, beta, release]
|
||||
|
||||
pull_request:
|
||||
paths:
|
||||
- "**"
|
||||
- "!.github/workflows/*.yml"
|
||||
- "!.github/actions/build-image/*"
|
||||
- ".github/workflows/ci.yml"
|
||||
- "!.yamllint"
|
||||
- "!.github/dependabot.yml"
|
||||
- "!docker/**"
|
||||
merge_group:
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
|
||||
env:
|
||||
DEFAULT_PYTHON: "3.11"
|
||||
PYUPGRADE_TARGET: "--py311-plus"
|
||||
|
||||
concurrency:
|
||||
# yamllint disable-line rule:line-length
|
||||
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
common:
|
||||
name: Create common environment
|
||||
runs-on: ubuntu-24.04
|
||||
outputs:
|
||||
cache-key: ${{ steps.cache-key.outputs.key }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Generate cache-key
|
||||
id: cache-key
|
||||
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
|
||||
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
|
||||
id: python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
- name: Restore Python virtual environment
|
||||
id: cache-venv
|
||||
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: venv
|
||||
# yamllint disable-line rule:line-length
|
||||
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ steps.cache-key.outputs.key }}
|
||||
- name: Create Python virtual environment
|
||||
if: steps.cache-venv.outputs.cache-hit != 'true'
|
||||
run: |
|
||||
python -m venv venv
|
||||
. venv/bin/activate
|
||||
python --version
|
||||
pip install -r requirements.txt -r requirements_test.txt pre-commit
|
||||
pip install -e .
|
||||
|
||||
pylint:
|
||||
name: Check pylint
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: needs.determine-jobs.outputs.python-linters == 'true'
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Run pylint
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
pylint -f parseable --persistent=n esphome
|
||||
- name: Suggested changes
|
||||
run: script/ci-suggest-changes
|
||||
if: always()
|
||||
|
||||
ci-custom:
|
||||
name: Run script/ci-custom
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Register matcher
|
||||
run: echo "::add-matcher::.github/workflows/matchers/ci-custom.json"
|
||||
- name: Run script/ci-custom
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
script/ci-custom.py
|
||||
script/build_codeowners.py --check
|
||||
script/build_language_schema.py --check
|
||||
script/generate-esp32-boards.py --check
|
||||
|
||||
pytest:
|
||||
name: Run pytest
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
python-version:
|
||||
- "3.11"
|
||||
- "3.14"
|
||||
os:
|
||||
- ubuntu-latest
|
||||
- macOS-latest
|
||||
- windows-latest
|
||||
exclude:
|
||||
# Minimize CI resource usage
|
||||
# by only running the Python version
|
||||
# version used for docker images on Windows and macOS
|
||||
- python-version: "3.14"
|
||||
os: windows-latest
|
||||
- python-version: "3.14"
|
||||
os: macOS-latest
|
||||
runs-on: ${{ matrix.os }}
|
||||
needs:
|
||||
- common
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
id: restore-python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ matrix.python-version }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Register matcher
|
||||
run: echo "::add-matcher::.github/workflows/matchers/pytest.json"
|
||||
- name: Run pytest
|
||||
if: matrix.os == 'windows-latest'
|
||||
run: |
|
||||
. ./venv/Scripts/activate.ps1
|
||||
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
|
||||
- name: Run pytest
|
||||
if: matrix.os == 'ubuntu-latest' || matrix.os == 'macOS-latest'
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
|
||||
- name: Upload coverage to Codecov
|
||||
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
|
||||
with:
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
- name: Save Python virtual environment cache
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: venv
|
||||
key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
|
||||
|
||||
determine-jobs:
|
||||
name: Determine which jobs to run
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
outputs:
|
||||
integration-tests: ${{ steps.determine.outputs.integration-tests }}
|
||||
clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
|
||||
python-linters: ${{ steps.determine.outputs.python-linters }}
|
||||
changed-components: ${{ steps.determine.outputs.changed-components }}
|
||||
changed-components-with-tests: ${{ steps.determine.outputs.changed-components-with-tests }}
|
||||
directly-changed-components-with-tests: ${{ steps.determine.outputs.directly-changed-components-with-tests }}
|
||||
component-test-count: ${{ steps.determine.outputs.component-test-count }}
|
||||
memory_impact: ${{ steps.determine.outputs.memory-impact }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
with:
|
||||
# Fetch enough history to find the merge base
|
||||
fetch-depth: 2
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Determine which tests to run
|
||||
id: determine
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
output=$(python script/determine-jobs.py)
|
||||
echo "Test determination output:"
|
||||
echo "$output" | jq
|
||||
|
||||
# Extract individual fields
|
||||
echo "integration-tests=$(echo "$output" | jq -r '.integration_tests')" >> $GITHUB_OUTPUT
|
||||
echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
|
||||
echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
|
||||
echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
|
||||
echo "changed-components-with-tests=$(echo "$output" | jq -c '.changed_components_with_tests')" >> $GITHUB_OUTPUT
|
||||
echo "directly-changed-components-with-tests=$(echo "$output" | jq -c '.directly_changed_components_with_tests')" >> $GITHUB_OUTPUT
|
||||
echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
|
||||
echo "memory-impact=$(echo "$output" | jq -c '.memory_impact')" >> $GITHUB_OUTPUT
|
||||
|
||||
integration-tests:
|
||||
name: Run integration tests
|
||||
runs-on: ubuntu-latest
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: needs.determine-jobs.outputs.integration-tests == 'true'
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Set up Python 3.13
|
||||
id: python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
with:
|
||||
python-version: "3.13"
|
||||
- name: Restore Python virtual environment
|
||||
id: cache-venv
|
||||
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: venv
|
||||
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
|
||||
- name: Create Python virtual environment
|
||||
if: steps.cache-venv.outputs.cache-hit != 'true'
|
||||
run: |
|
||||
python -m venv venv
|
||||
. venv/bin/activate
|
||||
python --version
|
||||
pip install -r requirements.txt -r requirements_test.txt
|
||||
pip install -e .
|
||||
- name: Register matcher
|
||||
run: echo "::add-matcher::.github/workflows/matchers/pytest.json"
|
||||
- name: Run integration tests
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
pytest -vv --no-cov --tb=native -n auto tests/integration/
|
||||
|
||||
clang-tidy:
|
||||
ci:
|
||||
name: ${{ matrix.name }}
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: needs.determine-jobs.outputs.clang-tidy == 'true'
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
runs-on: ubuntu-latest
|
||||
strategy:
|
||||
fail-fast: false
|
||||
max-parallel: 2
|
||||
matrix:
|
||||
include:
|
||||
- id: ci-custom
|
||||
name: Run script/ci-custom
|
||||
- id: lint-python
|
||||
name: Run script/lint-python
|
||||
- id: test
|
||||
file: tests/test1.yaml
|
||||
name: Test tests/test1.yaml
|
||||
pio_cache_key: test1
|
||||
- id: test
|
||||
file: tests/test2.yaml
|
||||
name: Test tests/test2.yaml
|
||||
pio_cache_key: test2
|
||||
- id: test
|
||||
file: tests/test3.yaml
|
||||
name: Test tests/test3.yaml
|
||||
pio_cache_key: test1
|
||||
- id: test
|
||||
file: tests/test4.yaml
|
||||
name: Test tests/test4.yaml
|
||||
pio_cache_key: test4
|
||||
- id: test
|
||||
file: tests/test5.yaml
|
||||
name: Test tests/test5.yaml
|
||||
pio_cache_key: test5
|
||||
- id: pytest
|
||||
name: Run pytest
|
||||
- id: clang-format
|
||||
name: Run script/clang-format
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP8266
|
||||
options: --environment esp8266-arduino-tidy --grep USE_ESP8266
|
||||
options: --environment esp8266-tidy --grep USE_ESP8266
|
||||
pio_cache_key: tidyesp8266
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP32 Arduino 1/4
|
||||
options: --environment esp32-arduino-tidy --split-num 4 --split-at 1
|
||||
name: Run script/clang-tidy for ESP32 1/4
|
||||
options: --environment esp32-tidy --split-num 4 --split-at 1
|
||||
pio_cache_key: tidyesp32
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP32 Arduino 2/4
|
||||
options: --environment esp32-arduino-tidy --split-num 4 --split-at 2
|
||||
name: Run script/clang-tidy for ESP32 2/4
|
||||
options: --environment esp32-tidy --split-num 4 --split-at 2
|
||||
pio_cache_key: tidyesp32
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP32 Arduino 3/4
|
||||
options: --environment esp32-arduino-tidy --split-num 4 --split-at 3
|
||||
name: Run script/clang-tidy for ESP32 3/4
|
||||
options: --environment esp32-tidy --split-num 4 --split-at 3
|
||||
pio_cache_key: tidyesp32
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP32 Arduino 4/4
|
||||
options: --environment esp32-arduino-tidy --split-num 4 --split-at 4
|
||||
name: Run script/clang-tidy for ESP32 4/4
|
||||
options: --environment esp32-tidy --split-num 4 --split-at 4
|
||||
pio_cache_key: tidyesp32
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ESP32 IDF
|
||||
name: Run script/clang-tidy for ESP32 esp-idf
|
||||
options: --environment esp32-idf-tidy --grep USE_ESP_IDF
|
||||
pio_cache_key: tidyesp32-idf
|
||||
- id: clang-tidy
|
||||
name: Run script/clang-tidy for ZEPHYR
|
||||
options: --environment nrf52-tidy --grep USE_ZEPHYR --grep USE_NRF52
|
||||
pio_cache_key: tidy-zephyr
|
||||
ignore_errors: false
|
||||
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- uses: actions/checkout@v2
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
id: python
|
||||
with:
|
||||
# Need history for HEAD~1 to work for checking changed files
|
||||
fetch-depth: 2
|
||||
python-version: '3.7'
|
||||
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
- name: Cache pip modules
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
path: ~/.cache/pip
|
||||
key: pip-${{ steps.python.outputs.python-version }}-${{ hashFiles('requirements*.txt') }}
|
||||
restore-keys: |
|
||||
pip-${{ steps.python.outputs.python-version }}-
|
||||
|
||||
- name: Set up python environment
|
||||
run: |
|
||||
pip3 install -r requirements.txt -r requirements_optional.txt -r requirements_test.txt
|
||||
pip3 install -e .
|
||||
|
||||
# Use per check platformio cache because checks use different parts
|
||||
- name: Cache platformio
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
|
||||
if: matrix.id == 'test' || matrix.id == 'clang-tidy'
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref != 'refs/heads/dev'
|
||||
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
|
||||
- name: Install clang tools
|
||||
run: |
|
||||
sudo apt-get install \
|
||||
clang-format-11 \
|
||||
clang-tidy-11
|
||||
if: matrix.id == 'clang-tidy' || matrix.id == 'clang-format'
|
||||
|
||||
- name: Register problem matchers
|
||||
run: |
|
||||
echo "::add-matcher::.github/workflows/matchers/ci-custom.json"
|
||||
echo "::add-matcher::.github/workflows/matchers/lint-python.json"
|
||||
echo "::add-matcher::.github/workflows/matchers/python.json"
|
||||
echo "::add-matcher::.github/workflows/matchers/pytest.json"
|
||||
echo "::add-matcher::.github/workflows/matchers/gcc.json"
|
||||
echo "::add-matcher::.github/workflows/matchers/clang-tidy.json"
|
||||
|
||||
- name: Run 'pio run --list-targets -e esp32-idf-tidy'
|
||||
if: matrix.name == 'Run script/clang-tidy for ESP32 IDF'
|
||||
- name: Lint Custom
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
mkdir -p .temp
|
||||
pio run --list-targets -e esp32-idf-tidy
|
||||
script/ci-custom.py
|
||||
script/build_codeowners.py --check
|
||||
if: matrix.id == 'ci-custom'
|
||||
|
||||
- name: Check if full clang-tidy scan needed
|
||||
id: check_full_scan
|
||||
- name: Lint Python
|
||||
run: script/lint-python
|
||||
if: matrix.id == 'lint-python'
|
||||
|
||||
- run: esphome compile ${{ matrix.file }}
|
||||
if: matrix.id == 'test'
|
||||
env:
|
||||
# Also cache libdeps, store them in a ~/.platformio subfolder
|
||||
PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps
|
||||
|
||||
- name: Run pytest
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
if python script/clang_tidy_hash.py --check; then
|
||||
echo "full_scan=true" >> $GITHUB_OUTPUT
|
||||
echo "reason=hash_changed" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "full_scan=false" >> $GITHUB_OUTPUT
|
||||
echo "reason=normal" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
pytest -vv --tb=native tests
|
||||
if: matrix.id == 'pytest'
|
||||
|
||||
# Also run git-diff-index so that the step is marked as failed on formatting errors,
|
||||
# since clang-format doesn't do anything but change files if -i is passed.
|
||||
- name: Run clang-format
|
||||
run: |
|
||||
script/clang-format -i
|
||||
git diff-index --quiet HEAD --
|
||||
if: matrix.id == 'clang-format'
|
||||
|
||||
- name: Run clang-tidy
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
if [ "${{ steps.check_full_scan.outputs.full_scan }}" = "true" ]; then
|
||||
echo "Running FULL clang-tidy scan (hash changed)"
|
||||
script/clang-tidy --all-headers --fix ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
|
||||
else
|
||||
echo "Running clang-tidy on changed files only"
|
||||
script/clang-tidy --all-headers --fix --changed ${{ matrix.options }} ${{ matrix.ignore_errors && '|| true' || '' }}
|
||||
fi
|
||||
script/clang-tidy --all-headers --fix ${{ matrix.options }}
|
||||
if: matrix.id == 'clang-tidy'
|
||||
env:
|
||||
# Also cache libdeps, store them in a ~/.platformio subfolder
|
||||
PLATFORMIO_LIBDEPS_DIR: ~/.platformio/libdeps
|
||||
|
||||
- name: Suggested changes
|
||||
run: script/ci-suggest-changes ${{ matrix.ignore_errors && '|| true' || '' }}
|
||||
# yamllint disable-line rule:line-length
|
||||
if: always()
|
||||
|
||||
test-build-components-splitter:
|
||||
name: Split components for intelligent grouping (40 weighted per batch)
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0
|
||||
outputs:
|
||||
matrix: ${{ steps.split.outputs.components }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Split components intelligently based on bus configurations
|
||||
id: split
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
|
||||
# Use intelligent splitter that groups components with same bus configs
|
||||
components='${{ needs.determine-jobs.outputs.changed-components-with-tests }}'
|
||||
|
||||
# Only isolate directly changed components when targeting dev branch
|
||||
# For beta/release branches, group everything for faster CI
|
||||
if [[ "${{ github.base_ref }}" == beta* ]] || [[ "${{ github.base_ref }}" == release* ]]; then
|
||||
directly_changed='[]'
|
||||
echo "Target branch: ${{ github.base_ref }} - grouping all components"
|
||||
else
|
||||
directly_changed='${{ needs.determine-jobs.outputs.directly-changed-components-with-tests }}'
|
||||
echo "Target branch: ${{ github.base_ref }} - isolating directly changed components"
|
||||
fi
|
||||
|
||||
echo "Splitting components intelligently..."
|
||||
output=$(python3 script/split_components_for_ci.py --components "$components" --directly-changed "$directly_changed" --batch-size 40 --output github)
|
||||
|
||||
echo "$output" >> $GITHUB_OUTPUT
|
||||
|
||||
test-build-components-split:
|
||||
name: Test components batch (${{ matrix.components }})
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
- test-build-components-splitter
|
||||
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) > 0
|
||||
strategy:
|
||||
fail-fast: false
|
||||
max-parallel: ${{ (startsWith(github.base_ref, 'beta') || startsWith(github.base_ref, 'release')) && 8 || 4 }}
|
||||
matrix:
|
||||
components: ${{ fromJson(needs.test-build-components-splitter.outputs.matrix) }}
|
||||
steps:
|
||||
- name: Show disk space
|
||||
run: |
|
||||
echo "Available disk space:"
|
||||
df -h
|
||||
|
||||
- name: List components
|
||||
run: echo ${{ matrix.components }}
|
||||
|
||||
- name: Cache apt packages
|
||||
uses: awalsh128/cache-apt-pkgs-action@acb598e5ddbc6f68a970c5da0688d2f3a9f04d05 # v1.5.3
|
||||
with:
|
||||
packages: libsdl2-dev
|
||||
version: 1.0
|
||||
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Validate and compile components with intelligent grouping
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
|
||||
# Check if /mnt has more free space than / before bind mounting
|
||||
# Extract available space in KB for comparison
|
||||
root_avail=$(df -k / | awk 'NR==2 {print $4}')
|
||||
mnt_avail=$(df -k /mnt 2>/dev/null | awk 'NR==2 {print $4}')
|
||||
|
||||
echo "Available space: / has ${root_avail}KB, /mnt has ${mnt_avail}KB"
|
||||
|
||||
# Only use /mnt if it has more space than /
|
||||
if [ -n "$mnt_avail" ] && [ "$mnt_avail" -gt "$root_avail" ]; then
|
||||
echo "Using /mnt for build files (more space available)"
|
||||
# Bind mount PlatformIO directory to /mnt (tools, packages, build cache all go there)
|
||||
sudo mkdir -p /mnt/platformio
|
||||
sudo chown $USER:$USER /mnt/platformio
|
||||
mkdir -p ~/.platformio
|
||||
sudo mount --bind /mnt/platformio ~/.platformio
|
||||
|
||||
# Bind mount test build directory to /mnt
|
||||
sudo mkdir -p /mnt/test_build_components_build
|
||||
sudo chown $USER:$USER /mnt/test_build_components_build
|
||||
mkdir -p tests/test_build_components/build
|
||||
sudo mount --bind /mnt/test_build_components_build tests/test_build_components/build
|
||||
else
|
||||
echo "Using / for build files (more space available than /mnt or /mnt unavailable)"
|
||||
fi
|
||||
|
||||
# Convert space-separated components to comma-separated for Python script
|
||||
components_csv=$(echo "${{ matrix.components }}" | tr ' ' ',')
|
||||
|
||||
# Only isolate directly changed components when targeting dev branch
|
||||
# For beta/release branches, group everything for faster CI
|
||||
#
|
||||
# WHY ISOLATE DIRECTLY CHANGED COMPONENTS?
|
||||
# - Isolated tests run WITHOUT --testing-mode, enabling full validation
|
||||
# - This catches pin conflicts and other issues in directly changed code
|
||||
# - Grouped tests use --testing-mode to allow config merging (disables some checks)
|
||||
# - Dependencies are safe to group since they weren't modified in this PR
|
||||
if [[ "${{ github.base_ref }}" == beta* ]] || [[ "${{ github.base_ref }}" == release* ]]; then
|
||||
directly_changed_csv=""
|
||||
echo "Testing components: $components_csv"
|
||||
echo "Target branch: ${{ github.base_ref }} - grouping all components"
|
||||
else
|
||||
directly_changed_csv=$(echo '${{ needs.determine-jobs.outputs.directly-changed-components-with-tests }}' | jq -r 'join(",")')
|
||||
echo "Testing components: $components_csv"
|
||||
echo "Target branch: ${{ github.base_ref }} - isolating directly changed components: $directly_changed_csv"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Show disk space before validation (after bind mounts setup)
|
||||
echo "Disk space before config validation:"
|
||||
df -h
|
||||
echo ""
|
||||
|
||||
# Run config validation with grouping and isolation
|
||||
python3 script/test_build_components.py -e config -c "$components_csv" -f --isolate "$directly_changed_csv"
|
||||
|
||||
echo ""
|
||||
echo "Config validation passed! Starting compilation..."
|
||||
echo ""
|
||||
|
||||
# Show disk space before compilation
|
||||
echo "Disk space before compilation:"
|
||||
df -h
|
||||
echo ""
|
||||
|
||||
# Run compilation with grouping and isolation
|
||||
python3 script/test_build_components.py -e compile -c "$components_csv" -f --isolate "$directly_changed_csv"
|
||||
|
||||
pre-commit-ci-lite:
|
||||
name: pre-commit.ci lite
|
||||
runs-on: ubuntu-latest
|
||||
needs:
|
||||
- common
|
||||
if: github.event_name == 'pull_request' && !startsWith(github.base_ref, 'beta') && !startsWith(github.base_ref, 'release')
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- uses: esphome/action@43cd1109c09c544d97196f7730ee5b2e0cc6d81e # v3.0.1 fork with pinned actions/cache
|
||||
env:
|
||||
SKIP: pylint,clang-tidy-hash
|
||||
- uses: pre-commit-ci/lite-action@5d6cc0eb514c891a40562a58a8e71576c5c7fb43 # v1.1.0
|
||||
if: always()
|
||||
|
||||
memory-impact-target-branch:
|
||||
name: Build target branch for memory impact
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.memory_impact).should_run == 'true'
|
||||
outputs:
|
||||
ram_usage: ${{ steps.extract.outputs.ram_usage }}
|
||||
flash_usage: ${{ steps.extract.outputs.flash_usage }}
|
||||
cache_hit: ${{ steps.cache-memory-analysis.outputs.cache-hit }}
|
||||
skip: ${{ steps.check-script.outputs.skip }}
|
||||
steps:
|
||||
- name: Check out target branch
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
with:
|
||||
ref: ${{ github.base_ref }}
|
||||
|
||||
# Check if memory impact extraction script exists on target branch
|
||||
# If not, skip the analysis (this handles older branches that don't have the feature)
|
||||
- name: Check for memory impact script
|
||||
id: check-script
|
||||
run: |
|
||||
if [ -f "script/ci_memory_impact_extract.py" ]; then
|
||||
echo "skip=false" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "skip=true" >> $GITHUB_OUTPUT
|
||||
echo "::warning::ci_memory_impact_extract.py not found on target branch, skipping memory impact analysis"
|
||||
fi
|
||||
|
||||
# All remaining steps only run if script exists
|
||||
- name: Generate cache key
|
||||
id: cache-key
|
||||
if: steps.check-script.outputs.skip != 'true'
|
||||
run: |
|
||||
# Get the commit SHA of the target branch
|
||||
target_sha=$(git rev-parse HEAD)
|
||||
|
||||
# Hash the build infrastructure files (all files that affect build/analysis)
|
||||
infra_hash=$(cat \
|
||||
script/test_build_components.py \
|
||||
script/ci_memory_impact_extract.py \
|
||||
script/analyze_component_buses.py \
|
||||
script/merge_component_configs.py \
|
||||
script/ci_helpers.py \
|
||||
.github/workflows/ci.yml \
|
||||
| sha256sum | cut -d' ' -f1)
|
||||
|
||||
# Get platform and components from job inputs
|
||||
platform="${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}"
|
||||
components='${{ toJSON(fromJSON(needs.determine-jobs.outputs.memory_impact).components) }}'
|
||||
components_hash=$(echo "$components" | sha256sum | cut -d' ' -f1)
|
||||
|
||||
# Combine into cache key
|
||||
cache_key="memory-analysis-target-${target_sha}-${infra_hash}-${platform}-${components_hash}"
|
||||
echo "cache-key=${cache_key}" >> $GITHUB_OUTPUT
|
||||
echo "Cache key: ${cache_key}"
|
||||
|
||||
- name: Restore cached memory analysis
|
||||
id: cache-memory-analysis
|
||||
if: steps.check-script.outputs.skip != 'true'
|
||||
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: memory-analysis-target.json
|
||||
key: ${{ steps.cache-key.outputs.cache-key }}
|
||||
|
||||
- name: Cache status
|
||||
if: steps.check-script.outputs.skip != 'true'
|
||||
run: |
|
||||
if [ "${{ steps.cache-memory-analysis.outputs.cache-hit }}" == "true" ]; then
|
||||
echo "✓ Cache hit! Using cached memory analysis results."
|
||||
echo " Skipping build step to save time."
|
||||
else
|
||||
echo "✗ Cache miss. Will build and analyze memory usage."
|
||||
fi
|
||||
|
||||
- name: Restore Python
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true'
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
|
||||
- name: Cache platformio
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true'
|
||||
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-memory-${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}-${{ hashFiles('platformio.ini') }}
|
||||
|
||||
- name: Build, compile, and analyze memory
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true'
|
||||
id: build
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
components='${{ toJSON(fromJSON(needs.determine-jobs.outputs.memory_impact).components) }}'
|
||||
platform="${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}"
|
||||
|
||||
echo "Building with test_build_components.py for $platform with components:"
|
||||
echo "$components" | jq -r '.[]' | sed 's/^/ - /'
|
||||
|
||||
# Use test_build_components.py which handles grouping automatically
|
||||
# Pass components as comma-separated list
|
||||
component_list=$(echo "$components" | jq -r 'join(",")')
|
||||
|
||||
echo "Compiling with test_build_components.py..."
|
||||
|
||||
# Run build and extract memory with auto-detection of build directory for detailed analysis
|
||||
# Use tee to show output in CI while also piping to extraction script
|
||||
python script/test_build_components.py \
|
||||
-e compile \
|
||||
-c "$component_list" \
|
||||
-t "$platform" 2>&1 | \
|
||||
tee /dev/stderr | \
|
||||
python script/ci_memory_impact_extract.py \
|
||||
--output-env \
|
||||
--output-json memory-analysis-target.json
|
||||
|
||||
- name: Save memory analysis to cache
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true' && steps.build.outcome == 'success'
|
||||
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: memory-analysis-target.json
|
||||
key: ${{ steps.cache-key.outputs.cache-key }}
|
||||
|
||||
- name: Extract memory usage for outputs
|
||||
id: extract
|
||||
if: steps.check-script.outputs.skip != 'true'
|
||||
run: |
|
||||
if [ -f memory-analysis-target.json ]; then
|
||||
ram=$(jq -r '.ram_bytes' memory-analysis-target.json)
|
||||
flash=$(jq -r '.flash_bytes' memory-analysis-target.json)
|
||||
echo "ram_usage=${ram}" >> $GITHUB_OUTPUT
|
||||
echo "flash_usage=${flash}" >> $GITHUB_OUTPUT
|
||||
echo "RAM: ${ram} bytes, Flash: ${flash} bytes"
|
||||
else
|
||||
echo "Error: memory-analysis-target.json not found"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Upload memory analysis JSON
|
||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
||||
with:
|
||||
name: memory-analysis-target
|
||||
path: memory-analysis-target.json
|
||||
if-no-files-found: warn
|
||||
retention-days: 1
|
||||
|
||||
memory-impact-pr-branch:
|
||||
name: Build PR branch for memory impact
|
||||
runs-on: ubuntu-24.04
|
||||
needs:
|
||||
- common
|
||||
- determine-jobs
|
||||
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.memory_impact).should_run == 'true'
|
||||
outputs:
|
||||
ram_usage: ${{ steps.extract.outputs.ram_usage }}
|
||||
flash_usage: ${{ steps.extract.outputs.flash_usage }}
|
||||
steps:
|
||||
- name: Check out PR branch
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Cache platformio
|
||||
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-memory-${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}-${{ hashFiles('platformio.ini') }}
|
||||
- name: Build, compile, and analyze memory
|
||||
id: extract
|
||||
run: |
|
||||
. venv/bin/activate
|
||||
components='${{ toJSON(fromJSON(needs.determine-jobs.outputs.memory_impact).components) }}'
|
||||
platform="${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}"
|
||||
|
||||
echo "Building with test_build_components.py for $platform with components:"
|
||||
echo "$components" | jq -r '.[]' | sed 's/^/ - /'
|
||||
|
||||
# Use test_build_components.py which handles grouping automatically
|
||||
# Pass components as comma-separated list
|
||||
component_list=$(echo "$components" | jq -r 'join(",")')
|
||||
|
||||
echo "Compiling with test_build_components.py..."
|
||||
|
||||
# Run build and extract memory with auto-detection of build directory for detailed analysis
|
||||
# Use tee to show output in CI while also piping to extraction script
|
||||
python script/test_build_components.py \
|
||||
-e compile \
|
||||
-c "$component_list" \
|
||||
-t "$platform" 2>&1 | \
|
||||
tee /dev/stderr | \
|
||||
python script/ci_memory_impact_extract.py \
|
||||
--output-env \
|
||||
--output-json memory-analysis-pr.json
|
||||
- name: Upload memory analysis JSON
|
||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
||||
with:
|
||||
name: memory-analysis-pr
|
||||
path: memory-analysis-pr.json
|
||||
if-no-files-found: warn
|
||||
retention-days: 1

  memory-impact-comment:
    name: Comment memory impact
    runs-on: ubuntu-24.04
    needs:
      - common
      - determine-jobs
      - memory-impact-target-branch
      - memory-impact-pr-branch
    if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.memory_impact).should_run == 'true' && needs.memory-impact-target-branch.outputs.skip != 'true'
    permissions:
      contents: read
      pull-requests: write
    steps:
      - name: Check out code
        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
      - name: Restore Python
        uses: ./.github/actions/restore-python
        with:
          python-version: ${{ env.DEFAULT_PYTHON }}
          cache-key: ${{ needs.common.outputs.cache-key }}
      - name: Download target analysis JSON
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: memory-analysis-target
          path: ./memory-analysis
        continue-on-error: true
      - name: Download PR analysis JSON
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: memory-analysis-pr
          path: ./memory-analysis
        continue-on-error: true
      - name: Post or update PR comment
        env:
          GH_TOKEN: ${{ github.token }}
          COMPONENTS: ${{ toJSON(fromJSON(needs.determine-jobs.outputs.memory_impact).components) }}
          PLATFORM: ${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}
          TARGET_RAM: ${{ needs.memory-impact-target-branch.outputs.ram_usage }}
          TARGET_FLASH: ${{ needs.memory-impact-target-branch.outputs.flash_usage }}
          PR_RAM: ${{ needs.memory-impact-pr-branch.outputs.ram_usage }}
          PR_FLASH: ${{ needs.memory-impact-pr-branch.outputs.flash_usage }}
          TARGET_CACHE_HIT: ${{ needs.memory-impact-target-branch.outputs.cache_hit }}
        run: |
          . venv/bin/activate

          # Check if analysis JSON files exist
          target_json_arg=""
          pr_json_arg=""

          if [ -f ./memory-analysis/memory-analysis-target.json ]; then
            echo "Found target analysis JSON"
            target_json_arg="--target-json ./memory-analysis/memory-analysis-target.json"
          else
            echo "No target analysis JSON found"
          fi

          if [ -f ./memory-analysis/memory-analysis-pr.json ]; then
            echo "Found PR analysis JSON"
            pr_json_arg="--pr-json ./memory-analysis/memory-analysis-pr.json"
          else
            echo "No PR analysis JSON found"
          fi

          # Add cache flag if target was cached
          cache_flag=""
          if [ "$TARGET_CACHE_HIT" == "true" ]; then
            cache_flag="--target-cache-hit"
          fi

          python script/ci_memory_impact_comment.py \
            --pr-number "${{ github.event.pull_request.number }}" \
            --components "$COMPONENTS" \
            --platform "$PLATFORM" \
            --target-ram "$TARGET_RAM" \
            --target-flash "$TARGET_FLASH" \
            --pr-ram "$PR_RAM" \
            --pr-flash "$PR_FLASH" \
            $target_json_arg \
            $pr_json_arg \
            $cache_flag
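`script/ci_memory_impact_comment.py` is likewise not shown in this diff; only its command-line flags are. Purely as an illustration of the arithmetic such a script has to do with the byte counts passed via `--target-ram`, `--pr-ram`, etc., here is a rough sketch; the hidden marker and table layout are assumptions, not the real output format:

```python
"""Hypothetical sketch of the delta math behind a memory-impact PR comment."""


def format_row(label: str, target: int, pr: int) -> str:
    # Compute absolute and relative change between target branch and PR builds.
    delta = pr - target
    pct = (delta / target * 100.0) if target else 0.0
    sign = "+" if delta >= 0 else ""
    return f"| {label} | {target} B | {pr} B | {sign}{delta} B ({sign}{pct:.2f}%) |"


def build_comment(target_ram: int, pr_ram: int, target_flash: int, pr_flash: int) -> str:
    # A hidden marker (as the other bot workflows in this repo use) would let a
    # later run find and update the same comment instead of posting a new one.
    marker = "<!-- memory-impact-bot -->"  # hypothetical marker
    lines = [
        marker,
        "| Metric | Target | PR | Delta |",
        "| --- | --- | --- | --- |",
        format_row("RAM", target_ram, pr_ram),
        format_row("Flash", target_flash, pr_flash),
    ]
    return "\n".join(lines)


# Example numbers, made up for illustration only.
print(build_comment(44284, 44412, 512433, 513021))
```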

  ci-status:
    name: CI Status
    runs-on: ubuntu-24.04
    needs:
      - common
      - ci-custom
      - pylint
      - pytest
      - integration-tests
      - clang-tidy
      - determine-jobs
      - test-build-components-splitter
      - test-build-components-split
      - pre-commit-ci-lite
      - memory-impact-target-branch
      - memory-impact-pr-branch
      - memory-impact-comment
    if: always()
    steps:
      - name: Success
        if: ${{ !(contains(needs.*.result, 'failure')) }}
        run: exit 0
      - name: Failure
        if: ${{ contains(needs.*.result, 'failure') }}
        env:
          JSON_DOC: ${{ toJSON(needs) }}
        run: |
          echo $JSON_DOC | jq
          exit 1
        run: script/ci-suggest-changes
        if: always() && (matrix.id == 'clang-tidy' || matrix.id == 'clang-format')
324
.github/workflows/codeowner-review-request.yml
vendored
@@ -1,324 +0,0 @@
|
||||
# This workflow automatically requests reviews from codeowners when:
|
||||
# 1. A PR is opened, reopened, or synchronized (updated)
|
||||
# 2. A PR is marked as ready for review
|
||||
#
|
||||
# It reads the CODEOWNERS file and matches all changed files in the PR against
|
||||
# the codeowner patterns, then requests reviews from the appropriate owners
|
||||
# while avoiding duplicate requests for users who have already been requested
|
||||
# or have already reviewed the PR.
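The pattern matching described in this header is implemented in JavaScript further down (see the `globToRegex` helper in the step below). Purely as an illustration of the idea, the same glob-to-regex conversion in Python might look like the sketch below; the example CODEOWNERS entry and changed file are hypothetical:

```python
"""Illustrative Python sketch of CODEOWNERS glob matching; not the workflow's code."""
import re


def glob_to_regex(pattern: str) -> re.Pattern:
    # Escape regex metacharacters, then translate the glob wildcards:
    # '**' -> any path, '*' -> anything except '/', '?' -> a single character.
    escaped = re.escape(pattern)
    escaped = escaped.replace(r"\*\*", ".*").replace(r"\*", "[^/]*").replace(r"\?", ".")
    return re.compile(f"^{escaped}$")


# Hypothetical CODEOWNERS entry and changed file for demonstration.
owners_for = {"esphome/components/api/*": ["@esphome/core"]}
changed = "esphome/components/api/api_server.cpp"

for pattern, owners in owners_for.items():
    if glob_to_regex(pattern).match(changed):
        print(pattern, "->", owners)
```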
|
||||
|
||||
name: Request Codeowner Reviews
|
||||
|
||||
on:
|
||||
# Needs to be pull_request_target to get write permissions
|
||||
pull_request_target:
|
||||
types: [opened, reopened, synchronize, ready_for_review]
|
||||
|
||||
permissions:
|
||||
pull-requests: write
|
||||
contents: read
|
||||
|
||||
jobs:
|
||||
request-codeowner-reviews:
|
||||
name: Run
|
||||
if: ${{ !github.event.pull_request.draft }}
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Request reviews from component codeowners
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
script: |
|
||||
const owner = context.repo.owner;
|
||||
const repo = context.repo.repo;
|
||||
const pr_number = context.payload.pull_request.number;
|
||||
|
||||
console.log(`Processing PR #${pr_number} for codeowner review requests`);
|
||||
|
||||
// Hidden marker to identify bot comments from this workflow
|
||||
const BOT_COMMENT_MARKER = '<!-- codeowner-review-request-bot -->';
|
||||
|
||||
try {
|
||||
// Get the list of changed files in this PR
|
||||
const { data: files } = await github.rest.pulls.listFiles({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
});
|
||||
|
||||
const changedFiles = files.map(file => file.filename);
|
||||
console.log(`Found ${changedFiles.length} changed files`);
|
||||
|
||||
if (changedFiles.length === 0) {
|
||||
console.log('No changed files found, skipping codeowner review requests');
|
||||
return;
|
||||
}
|
||||
|
||||
// Fetch CODEOWNERS file from root
|
||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
||||
owner,
|
||||
repo,
|
||||
path: 'CODEOWNERS',
|
||||
ref: context.payload.pull_request.base.sha
|
||||
});
|
||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
||||
|
||||
// Parse CODEOWNERS file to extract all patterns and their owners
|
||||
const codeownersLines = codeownersContent.split('\n')
|
||||
.map(line => line.trim())
|
||||
.filter(line => line && !line.startsWith('#'));
|
||||
|
||||
const codeownersPatterns = [];
|
||||
|
||||
// Convert CODEOWNERS pattern to regex (robust glob handling)
|
||||
function globToRegex(pattern) {
|
||||
// Escape regex special characters except for glob wildcards
|
||||
let regexStr = pattern
|
||||
.replace(/([.+^=!:${}()|[\]\\])/g, '\\$1') // escape regex chars
|
||||
.replace(/\*\*/g, '.*') // globstar
|
||||
.replace(/\*/g, '[^/]*') // single star
|
||||
.replace(/\?/g, '.'); // question mark
|
||||
return new RegExp('^' + regexStr + '$');
|
||||
}
|
||||
|
||||
// Helper function to create comment body
|
||||
function createCommentBody(reviewersList, teamsList, matchedFileCount, isSuccessful = true) {
|
||||
const reviewerMentions = reviewersList.map(r => `@${r}`);
|
||||
const teamMentions = teamsList.map(t => `@${owner}/${t}`);
|
||||
const allMentions = [...reviewerMentions, ...teamMentions].join(', ');
|
||||
|
||||
if (isSuccessful) {
|
||||
return `${BOT_COMMENT_MARKER}\n👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
|
||||
} else {
|
||||
return `${BOT_COMMENT_MARKER}\n👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
|
||||
}
|
||||
}
|
||||
|
||||
for (const line of codeownersLines) {
|
||||
const parts = line.split(/\s+/);
|
||||
if (parts.length < 2) continue;
|
||||
|
||||
const pattern = parts[0];
|
||||
const owners = parts.slice(1);
|
||||
|
||||
// Use robust glob-to-regex conversion
|
||||
const regex = globToRegex(pattern);
|
||||
codeownersPatterns.push({ pattern, regex, owners });
|
||||
}
|
||||
|
||||
console.log(`Parsed ${codeownersPatterns.length} codeowner patterns`);
|
||||
|
||||
// Match changed files against CODEOWNERS patterns
|
||||
const matchedOwners = new Set();
|
||||
const matchedTeams = new Set();
|
||||
const fileMatches = new Map(); // Track which files matched which patterns
|
||||
|
||||
for (const file of changedFiles) {
|
||||
for (const { pattern, regex, owners } of codeownersPatterns) {
|
||||
if (regex.test(file)) {
|
||||
console.log(`File '${file}' matches pattern '${pattern}' with owners: ${owners.join(', ')}`);
|
||||
|
||||
if (!fileMatches.has(file)) {
|
||||
fileMatches.set(file, []);
|
||||
}
|
||||
fileMatches.get(file).push({ pattern, owners });
|
||||
|
||||
// Add owners to the appropriate set (remove @ prefix)
|
||||
for (const owner of owners) {
|
||||
const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
|
||||
if (cleanOwner.includes('/')) {
|
||||
// Team mention (org/team-name)
|
||||
const teamName = cleanOwner.split('/')[1];
|
||||
matchedTeams.add(teamName);
|
||||
} else {
|
||||
// Individual user
|
||||
matchedOwners.add(cleanOwner);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (matchedOwners.size === 0 && matchedTeams.size === 0) {
|
||||
console.log('No codeowners found for any changed files');
|
||||
return;
|
||||
}
|
||||
|
||||
// Remove the PR author from reviewers
|
||||
const prAuthor = context.payload.pull_request.user.login;
|
||||
matchedOwners.delete(prAuthor);
|
||||
|
||||
// Get current reviewers to avoid duplicate requests (but still mention them)
|
||||
const { data: prData } = await github.rest.pulls.get({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
});
|
||||
|
||||
const currentReviewers = new Set();
|
||||
const currentTeams = new Set();
|
||||
|
||||
if (prData.requested_reviewers) {
|
||||
prData.requested_reviewers.forEach(reviewer => {
|
||||
currentReviewers.add(reviewer.login);
|
||||
});
|
||||
}
|
||||
|
||||
if (prData.requested_teams) {
|
||||
prData.requested_teams.forEach(team => {
|
||||
currentTeams.add(team.slug);
|
||||
});
|
||||
}
|
||||
|
||||
// Check for completed reviews to avoid re-requesting users who have already reviewed
|
||||
const { data: reviews } = await github.rest.pulls.listReviews({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
});
|
||||
|
||||
const reviewedUsers = new Set();
|
||||
reviews.forEach(review => {
|
||||
reviewedUsers.add(review.user.login);
|
||||
});
|
||||
|
||||
// Check for previous comments from this workflow to avoid duplicate pings
|
||||
const comments = await github.paginate(
|
||||
github.rest.issues.listComments,
|
||||
{
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number
|
||||
}
|
||||
);
|
||||
|
||||
const previouslyPingedUsers = new Set();
|
||||
const previouslyPingedTeams = new Set();
|
||||
|
||||
// Look for comments from github-actions bot that contain our bot marker
|
||||
const workflowComments = comments.filter(comment =>
|
||||
comment.user.type === 'Bot' &&
|
||||
comment.body.includes(BOT_COMMENT_MARKER)
|
||||
);
|
||||
|
||||
// Extract previously mentioned users and teams from workflow comments
|
||||
for (const comment of workflowComments) {
|
||||
// Match @username patterns (not team mentions)
|
||||
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
|
||||
userMentions.forEach(mention => {
|
||||
const username = mention.slice(1); // remove @
|
||||
previouslyPingedUsers.add(username);
|
||||
});
|
||||
|
||||
// Match @org/team patterns
|
||||
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/([a-zA-Z0-9_.-]+)/g) || [];
|
||||
teamMentions.forEach(mention => {
|
||||
const teamName = mention.split('/')[1];
|
||||
previouslyPingedTeams.add(teamName);
|
||||
});
|
||||
}
|
||||
|
||||
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams`);
|
||||
|
||||
// Remove users who have already been pinged in previous workflow comments
|
||||
previouslyPingedUsers.forEach(user => {
|
||||
matchedOwners.delete(user);
|
||||
});
|
||||
|
||||
previouslyPingedTeams.forEach(team => {
|
||||
matchedTeams.delete(team);
|
||||
});
|
||||
|
||||
// Remove only users who have already submitted reviews (not just requested reviewers)
|
||||
reviewedUsers.forEach(reviewer => {
|
||||
matchedOwners.delete(reviewer);
|
||||
});
|
||||
|
||||
// For teams, we'll still remove already requested teams to avoid API errors
|
||||
currentTeams.forEach(team => {
|
||||
matchedTeams.delete(team);
|
||||
});
|
||||
|
||||
const reviewersList = Array.from(matchedOwners);
|
||||
const teamsList = Array.from(matchedTeams);
|
||||
|
||||
if (reviewersList.length === 0 && teamsList.length === 0) {
|
||||
console.log('No eligible reviewers found (all may already be requested, reviewed, or previously pinged)');
|
||||
return;
|
||||
}
|
||||
|
||||
const totalReviewers = reviewersList.length + teamsList.length;
|
||||
console.log(`Requesting reviews from ${reviewersList.length} users and ${teamsList.length} teams for ${fileMatches.size} matched files`);
|
||||
|
||||
// Request reviews
|
||||
try {
|
||||
const requestParams = {
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
};
|
||||
|
||||
// Filter out users who are already requested reviewers for the API call
|
||||
const newReviewers = reviewersList.filter(reviewer => !currentReviewers.has(reviewer));
|
||||
const newTeams = teamsList.filter(team => !currentTeams.has(team));
|
||||
|
||||
if (newReviewers.length > 0) {
|
||||
requestParams.reviewers = newReviewers;
|
||||
}
|
||||
|
||||
if (newTeams.length > 0) {
|
||||
requestParams.team_reviewers = newTeams;
|
||||
}
|
||||
|
||||
// Only make the API call if there are new reviewers to request
|
||||
if (newReviewers.length > 0 || newTeams.length > 0) {
|
||||
await github.rest.pulls.requestReviewers(requestParams);
|
||||
console.log(`Successfully requested reviews from ${newReviewers.length} new users and ${newTeams.length} new teams`);
|
||||
} else {
|
||||
console.log('All codeowners are already requested reviewers or have reviewed');
|
||||
}
|
||||
|
||||
// Only add a comment if there are new codeowners to mention (not previously pinged)
|
||||
if (reviewersList.length > 0 || teamsList.length > 0) {
|
||||
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);
|
||||
|
||||
await github.rest.issues.createComment({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number,
|
||||
body: commentBody
|
||||
});
|
||||
console.log(`Added comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
|
||||
} else {
|
||||
console.log('No new codeowners to mention in comment (all previously pinged)');
|
||||
}
|
||||
} catch (error) {
|
||||
if (error.status === 422) {
|
||||
console.log('Some reviewers may already be requested or unavailable:', error.message);
|
||||
|
||||
// Only try to add a comment if there are new codeowners to mention
|
||||
if (reviewersList.length > 0 || teamsList.length > 0) {
|
||||
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);
|
||||
|
||||
try {
|
||||
await github.rest.issues.createComment({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number,
|
||||
body: commentBody
|
||||
});
|
||||
console.log(`Added fallback comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
|
||||
} catch (commentError) {
|
||||
console.log('Failed to add comment:', commentError.message);
|
||||
}
|
||||
} else {
|
||||
console.log('No new codeowners to mention in fallback comment');
|
||||
}
|
||||
} else {
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.log('Failed to process codeowner review requests:', error.message);
|
||||
console.error(error);
|
||||
}
|
91
.github/workflows/codeql.yml
vendored
@@ -1,91 +0,0 @@
|
||||
# For most projects, this workflow file will not need changing; you simply need
|
||||
# to commit it to your repository.
|
||||
#
|
||||
# You may wish to alter this file to override the set of languages analyzed,
|
||||
# or to provide custom queries or build logic.
|
||||
#
|
||||
# ******** NOTE ********
|
||||
# We have attempted to detect the languages in your repository. Please check
|
||||
# the `language` matrix defined below to confirm you have the correct set of
|
||||
# supported CodeQL languages.
|
||||
#
|
||||
name: "CodeQL Advanced"
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
schedule:
|
||||
- cron: "30 18 * * 4"
|
||||
|
||||
jobs:
|
||||
analyze:
|
||||
name: Analyze (${{ matrix.language }})
|
||||
# Runner size impacts CodeQL analysis time. To learn more, please see:
|
||||
# - https://gh.io/recommended-hardware-resources-for-running-codeql
|
||||
# - https://gh.io/supported-runners-and-hardware-resources
|
||||
# - https://gh.io/using-larger-runners (GitHub.com only)
|
||||
# Consider using larger runners or machines with greater resources for possible analysis time improvements.
|
||||
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
|
||||
permissions:
|
||||
# required for all workflows
|
||||
security-events: write
|
||||
|
||||
# required to fetch internal or private CodeQL packs
|
||||
packages: read
|
||||
|
||||
# only required for workflows in private repositories
|
||||
actions: read
|
||||
contents: read
|
||||
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
include:
|
||||
# - language: c-cpp
|
||||
# build-mode: autobuild
|
||||
- language: python
|
||||
build-mode: none
|
||||
# CodeQL supports the following values keywords for 'language': 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift'
|
||||
# Use `c-cpp` to analyze code written in C, C++ or both
|
||||
# Use 'java-kotlin' to analyze code written in Java, Kotlin or both
|
||||
# Use 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
|
||||
# To learn more about changing the languages that are analyzed or customizing the build mode for your analysis,
|
||||
# see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning.
|
||||
# If you are analyzing a compiled language, you can modify the 'build-mode' for that language to customize how
|
||||
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
|
||||
# Initializes the CodeQL tools for scanning.
|
||||
- name: Initialize CodeQL
|
||||
uses: github/codeql-action/init@16140ae1a102900babc80a33c44059580f687047 # v4.30.9
|
||||
with:
|
||||
languages: ${{ matrix.language }}
|
||||
build-mode: ${{ matrix.build-mode }}
|
||||
# If you wish to specify custom queries, you can do so here or in a config file.
|
||||
# By default, queries listed here will override any specified in a config file.
|
||||
# Prefix the list here with "+" to use these queries and those in the config file.
|
||||
|
||||
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
|
||||
# queries: security-extended,security-and-quality
|
||||
|
||||
# If the analyze step fails for one of the languages you are analyzing with
|
||||
# "We were unable to automatically build your code", modify the matrix above
|
||||
# to set the build mode to "manual" for that language. Then modify this step
|
||||
# to build your code.
|
||||
# ℹ️ Command-line programs to run using the OS shell.
|
||||
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
|
||||
- if: matrix.build-mode == 'manual'
|
||||
shell: bash
|
||||
run: |
|
||||
echo 'If you are using a "manual" build mode for one or more of the' \
|
||||
'languages you are analyzing, replace this with the commands to build' \
|
||||
'your code, for example:'
|
||||
echo ' make bootstrap'
|
||||
echo ' make release'
|
||||
exit 1
|
||||
|
||||
- name: Perform CodeQL Analysis
|
||||
uses: github/codeql-action/analyze@16140ae1a102900babc80a33c44059580f687047 # v4.30.9
|
||||
with:
|
||||
category: "/language:${{matrix.language}}"
|
157
.github/workflows/external-component-bot.yml
vendored
@@ -1,157 +0,0 @@
|
||||
name: Add External Component Comment
|
||||
|
||||
on:
|
||||
pull_request_target:
|
||||
types: [opened, synchronize]
|
||||
|
||||
permissions:
|
||||
contents: read # Needed to fetch PR details
|
||||
issues: write # Needed to create and update comments (PR comments are managed via the issues REST API)
|
||||
pull-requests: write # also needed?
|
||||
|
||||
jobs:
|
||||
external-comment:
|
||||
name: External component comment
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Add external component comment
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
github-token: ${{ secrets.GITHUB_TOKEN }}
|
||||
script: |
|
||||
// Generate external component usage instructions
|
||||
function generateExternalComponentInstructions(prNumber, componentNames, owner, repo) {
|
||||
let source;
|
||||
if (owner === 'esphome' && repo === 'esphome')
|
||||
source = `github://pr#${prNumber}`;
|
||||
else
|
||||
source = `github://${owner}/${repo}@pull/${prNumber}/head`;
|
||||
return `To use the changes from this PR as an external component, add the following to your ESPHome configuration YAML file:
|
||||
|
||||
\`\`\`yaml
|
||||
external_components:
|
||||
- source: ${source}
|
||||
components: [${componentNames.join(', ')}]
|
||||
refresh: 1h
|
||||
\`\`\``;
|
||||
}
|
||||
|
||||
// Generate repo clone instructions
|
||||
function generateRepoInstructions(prNumber, owner, repo, branch) {
|
||||
return `To use the changes in this PR:
|
||||
|
||||
\`\`\`bash
|
||||
# Clone the repository:
|
||||
git clone https://github.com/${owner}/${repo}
|
||||
cd ${repo}
|
||||
|
||||
# Checkout the PR branch:
|
||||
git fetch origin pull/${prNumber}/head:${branch}
|
||||
git checkout ${branch}
|
||||
|
||||
# Install the development version:
|
||||
script/setup
|
||||
|
||||
# Activate the development version:
|
||||
source venv/bin/activate
|
||||
\`\`\`
|
||||
|
||||
Now you can run \`esphome\` as usual to test the changes in this PR.
|
||||
`;
|
||||
}
|
||||
|
||||
async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
|
||||
const commentMarker = "<!-- This comment was generated automatically by the external-component-bot workflow. -->";
|
||||
const legacyCommentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
|
||||
let commentBody;
|
||||
if (esphomeChanges.length === 1) {
|
||||
commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
|
||||
} else {
|
||||
commentBody = generateRepoInstructions(prNumber, owner, repo, context.payload.pull_request.head.ref);
|
||||
}
|
||||
commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;
|
||||
|
||||
// Check for existing bot comment
|
||||
const comments = await github.paginate(
|
||||
github.rest.issues.listComments,
|
||||
{
|
||||
owner: owner,
|
||||
repo: repo,
|
||||
issue_number: prNumber,
|
||||
per_page: 100,
|
||||
}
|
||||
);
|
||||
|
||||
const sorted = comments.sort((a, b) => new Date(b.updated_at) - new Date(a.updated_at));
|
||||
|
||||
const botComment = sorted.find(comment =>
|
||||
(
|
||||
comment.body.includes(commentMarker) ||
|
||||
comment.body.includes(legacyCommentMarker)
|
||||
) && comment.user.type === "Bot"
|
||||
);
|
||||
|
||||
if (botComment && botComment.body === commentBody) {
|
||||
// No changes in the comment, do nothing
|
||||
return;
|
||||
}
|
||||
|
||||
if (botComment) {
|
||||
// Update existing comment
|
||||
await github.rest.issues.updateComment({
|
||||
owner: owner,
|
||||
repo: repo,
|
||||
comment_id: botComment.id,
|
||||
body: commentBody,
|
||||
});
|
||||
} else {
|
||||
// Create new comment
|
||||
await github.rest.issues.createComment({
|
||||
owner: owner,
|
||||
repo: repo,
|
||||
issue_number: prNumber,
|
||||
body: commentBody,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
async function getEsphomeAndComponentChanges(github, owner, repo, prNumber) {
|
||||
const changedFiles = await github.rest.pulls.listFiles({
|
||||
owner: owner,
|
||||
repo: repo,
|
||||
pull_number: prNumber,
|
||||
});
|
||||
|
||||
const esphomeChanges = changedFiles.data
|
||||
.filter(file => file.filename !== "esphome/core/defines.h" && file.filename.startsWith('esphome/'))
|
||||
.map(file => {
|
||||
const match = file.filename.match(/esphome\/([^/]+)/);
|
||||
return match ? match[1] : null;
|
||||
})
|
||||
.filter(it => it !== null);
|
||||
|
||||
if (esphomeChanges.length === 0) {
|
||||
return {esphomeChanges: [], componentChanges: []};
|
||||
}
|
||||
|
||||
const uniqueEsphomeChanges = [...new Set(esphomeChanges)];
|
||||
const componentChanges = changedFiles.data
|
||||
.filter(file => file.filename.startsWith('esphome/components/'))
|
||||
.map(file => {
|
||||
const match = file.filename.match(/esphome\/components\/([^/]+)\//);
|
||||
return match ? match[1] : null;
|
||||
})
|
||||
.filter(it => it !== null);
|
||||
|
||||
return {esphomeChanges: uniqueEsphomeChanges, componentChanges: [...new Set(componentChanges)]};
|
||||
}
|
||||
|
||||
// Start of main code.
|
||||
|
||||
const prNumber = context.payload.pull_request.number;
|
||||
const {owner, repo} = context.repo;
|
||||
|
||||
const {esphomeChanges, componentChanges} = await getEsphomeAndComponentChanges(github, owner, repo, prNumber);
|
||||
if (componentChanges.length !== 0) {
|
||||
await createComment(github, owner, repo, prNumber, esphomeChanges, componentChanges);
|
||||
}
|
163
.github/workflows/issue-codeowner-notify.yml
vendored
@@ -1,163 +0,0 @@
|
||||
# This workflow automatically notifies codeowners when an issue is labeled with component labels.
|
||||
# It reads the CODEOWNERS file to find the maintainers for the labeled components
|
||||
# and posts a comment mentioning them to ensure they're aware of the issue.
|
||||
|
||||
name: Notify Issue Codeowners
|
||||
|
||||
on:
|
||||
issues:
|
||||
types: [labeled]
|
||||
|
||||
permissions:
|
||||
issues: write
|
||||
contents: read
|
||||
|
||||
jobs:
|
||||
notify-codeowners:
|
||||
name: Run
|
||||
if: ${{ startsWith(github.event.label.name, format('component{0} ', ':')) }}
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Notify codeowners for component issues
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
script: |
|
||||
const owner = context.repo.owner;
|
||||
const repo = context.repo.repo;
|
||||
const issue_number = context.payload.issue.number;
|
||||
const labelName = context.payload.label.name;
|
||||
|
||||
console.log(`Processing issue #${issue_number} with label: ${labelName}`);
|
||||
|
||||
// Hidden marker to identify bot comments from this workflow
|
||||
const BOT_COMMENT_MARKER = '<!-- issue-codeowner-notify-bot -->';
|
||||
|
||||
// Extract component name from label
|
||||
const componentName = labelName.replace('component: ', '');
|
||||
console.log(`Component: ${componentName}`);
|
||||
|
||||
try {
|
||||
// Fetch CODEOWNERS file from root
|
||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
||||
owner,
|
||||
repo,
|
||||
path: 'CODEOWNERS'
|
||||
});
|
||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
||||
|
||||
// Parse CODEOWNERS file to extract component mappings
|
||||
const codeownersLines = codeownersContent.split('\n')
|
||||
.map(line => line.trim())
|
||||
.filter(line => line && !line.startsWith('#'));
|
||||
|
||||
let componentOwners = null;
|
||||
|
||||
for (const line of codeownersLines) {
|
||||
const parts = line.split(/\s+/);
|
||||
if (parts.length < 2) continue;
|
||||
|
||||
const pattern = parts[0];
|
||||
const owners = parts.slice(1);
|
||||
|
||||
// Look for component patterns: esphome/components/{component}/*
|
||||
const componentMatch = pattern.match(/^esphome\/components\/([^\/]+)\/\*$/);
|
||||
if (componentMatch && componentMatch[1] === componentName) {
|
||||
componentOwners = owners;
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!componentOwners) {
|
||||
console.log(`No codeowners found for component: ${componentName}`);
|
||||
return;
|
||||
}
|
||||
|
||||
console.log(`Found codeowners for '${componentName}': ${componentOwners.join(', ')}`);
|
||||
|
||||
// Separate users and teams
|
||||
const userOwners = [];
|
||||
const teamOwners = [];
|
||||
|
||||
for (const owner of componentOwners) {
|
||||
const cleanOwner = owner.startsWith('@') ? owner.slice(1) : owner;
|
||||
if (cleanOwner.includes('/')) {
|
||||
// Team mention (org/team-name)
|
||||
teamOwners.push(`@${cleanOwner}`);
|
||||
} else {
|
||||
// Individual user
|
||||
userOwners.push(`@${cleanOwner}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Remove issue author from mentions to avoid self-notification
|
||||
const issueAuthor = context.payload.issue.user.login;
|
||||
const filteredUserOwners = userOwners.filter(mention =>
|
||||
mention !== `@${issueAuthor}`
|
||||
);
|
||||
|
||||
// Check for previous comments from this workflow to avoid duplicate pings
|
||||
const comments = await github.paginate(
|
||||
github.rest.issues.listComments,
|
||||
{
|
||||
owner,
|
||||
repo,
|
||||
issue_number: issue_number
|
||||
}
|
||||
);
|
||||
|
||||
const previouslyPingedUsers = new Set();
|
||||
const previouslyPingedTeams = new Set();
|
||||
|
||||
// Look for comments from github-actions bot that contain codeowner pings for this component
|
||||
const workflowComments = comments.filter(comment =>
|
||||
comment.user.type === 'Bot' &&
|
||||
comment.body.includes(BOT_COMMENT_MARKER) &&
|
||||
comment.body.includes(`component: ${componentName}`)
|
||||
);
|
||||
|
||||
// Extract previously mentioned users and teams from workflow comments
|
||||
for (const comment of workflowComments) {
|
||||
// Match @username patterns (not team mentions)
|
||||
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
|
||||
userMentions.forEach(mention => {
|
||||
previouslyPingedUsers.add(mention); // Keep @ prefix for easy comparison
|
||||
});
|
||||
|
||||
// Match @org/team patterns
|
||||
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+/g) || [];
|
||||
teamMentions.forEach(mention => {
|
||||
previouslyPingedTeams.add(mention);
|
||||
});
|
||||
}
|
||||
|
||||
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams for component ${componentName}`);
|
||||
|
||||
// Remove previously pinged users and teams
|
||||
const newUserOwners = filteredUserOwners.filter(mention => !previouslyPingedUsers.has(mention));
|
||||
const newTeamOwners = teamOwners.filter(mention => !previouslyPingedTeams.has(mention));
|
||||
|
||||
const allMentions = [...newUserOwners, ...newTeamOwners];
|
||||
|
||||
if (allMentions.length === 0) {
|
||||
console.log('No new codeowners to notify (all previously pinged or issue author is the only codeowner)');
|
||||
return;
|
||||
}
|
||||
|
||||
// Create comment body
|
||||
const mentionString = allMentions.join(', ');
|
||||
const commentBody = `${BOT_COMMENT_MARKER}\n👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;
|
||||
|
||||
// Post comment
|
||||
await github.rest.issues.createComment({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: issue_number,
|
||||
body: commentBody
|
||||
});
|
||||
|
||||
console.log(`Successfully notified new codeowners: ${mentionString}`);
|
||||
|
||||
} catch (error) {
|
||||
console.log('Failed to process codeowner notifications:', error.message);
|
||||
console.error(error);
|
||||
}
|
24
.github/workflows/lock.yml
vendored
@@ -1,11 +1,27 @@
|
||||
---
|
||||
name: Lock closed issues and PRs
|
||||
name: Lock
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: "30 0 * * *" # Run daily at 00:30 UTC
|
||||
- cron: '30 0 * * *'
|
||||
workflow_dispatch:
|
||||
|
||||
permissions:
|
||||
issues: write
|
||||
pull-requests: write
|
||||
|
||||
concurrency:
|
||||
group: lock
|
||||
|
||||
jobs:
|
||||
lock:
|
||||
uses: esphome/workflows/.github/workflows/lock.yml@main
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: dessant/lock-threads@v3
|
||||
with:
|
||||
pr-inactive-days: "1"
|
||||
pr-lock-reason: ""
|
||||
exclude-any-pr-labels: keep-open
|
||||
|
||||
issue-inactive-days: "7"
|
||||
issue-lock-reason: ""
|
||||
exclude-any-issue-labels: keep-open
|
||||
|
2
.github/workflows/matchers/ci-custom.json
vendored
@@ -4,7 +4,7 @@
|
||||
"owner": "ci-custom",
|
||||
"pattern": [
|
||||
{
|
||||
"regexp": "^(.*):(\\d+):(\\d+):\\s+lint:\\s+(.*)$",
|
||||
"regexp": "^ERROR (.*):(\\d+):(\\d+) - (.*)$",
|
||||
"file": 1,
|
||||
"line": 2,
|
||||
"column": 3,
|
||||
|
2
.github/workflows/matchers/gcc.json
vendored
@@ -5,7 +5,7 @@
|
||||
"severity": "error",
|
||||
"pattern": [
|
||||
{
|
||||
"regexp": "^src/(.*):(\\d+):(\\d+):\\s+(?:fatal\\s+)?(warning|error):\\s+(.*)$",
|
||||
"regexp": "^(.*):(\\d+):(\\d+):\\s+(?:fatal\\s+)?(warning|error):\\s+(.*)$",
|
||||
"file": 1,
|
||||
"line": 2,
|
||||
"column": 3,
|
||||
|
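A quick, illustrative check that the relaxed problem-matcher regexp above (the new one without the old `src/` prefix) captures file, line, column, severity and message from a typical compiler diagnostic; the sample diagnostic line is made up:

```python
"""Demonstration only; mirrors the gcc.json problem-matcher regexp shown above."""
import re

pattern = re.compile(r"^(.*):(\d+):(\d+):\s+(?:fatal\s+)?(warning|error):\s+(.*)$")
line = "esphome/core/component.cpp:123:10: error: 'foo' was not declared in this scope"

match = pattern.match(line)
assert match is not None
# ('esphome/core/component.cpp', '123', '10', 'error', "'foo' was not declared in this scope")
print(match.groups())
```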
15
.github/workflows/matchers/lint-python.json
vendored
@@ -1,22 +1,11 @@
|
||||
{
|
||||
"problemMatcher": [
|
||||
{
|
||||
"owner": "ruff",
|
||||
"severity": "error",
|
||||
"pattern": [
|
||||
{
|
||||
"regexp": "^(.*): (Please format this file with the ruff formatter)",
|
||||
"file": 1,
|
||||
"message": 2
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"owner": "flake8",
|
||||
"severity": "error",
|
||||
"pattern": [
|
||||
{
|
||||
"regexp": "^(.*):(\\d+): ([EFCDNW]\\d{3}.*)$",
|
||||
"regexp": "^(.*):(\\d+) - ([EFCDNW]\\d{3}.*)$",
|
||||
"file": 1,
|
||||
"line": 2,
|
||||
"message": 3
|
||||
@@ -28,7 +17,7 @@
|
||||
"severity": "error",
|
||||
"pattern": [
|
||||
{
|
||||
"regexp": "^(.*):(\\d+): (\\[[EFCRW]\\d{4}\\(.*\\),.*\\].*)$",
|
||||
"regexp": "^(.*):(\\d+) - (\\[[EFCRW]\\d{4}\\(.*\\),.*\\].*)$",
|
||||
"file": 1,
|
||||
"line": 2,
|
||||
"message": 3
|
||||
|
279
.github/workflows/release.yml
vendored
@@ -1,4 +1,3 @@
|
||||
---
|
||||
name: Publish Release
|
||||
|
||||
on:
|
||||
@@ -17,245 +16,139 @@ jobs:
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
tag: ${{ steps.tag.outputs.tag }}
|
||||
branch_build: ${{ steps.tag.outputs.branch_build }}
|
||||
deploy_env: ${{ steps.tag.outputs.deploy_env }}
|
||||
steps:
|
||||
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- uses: actions/checkout@v2
|
||||
- name: Get tag
|
||||
id: tag
|
||||
# yamllint disable rule:line-length
|
||||
run: |
|
||||
if [[ "${{ github.event_name }}" = "release" ]]; then
|
||||
TAG="${{ github.event.release.tag_name}}"
|
||||
BRANCH_BUILD="false"
|
||||
if [[ "${{ github.event.release.prerelease }}" = "true" ]]; then
|
||||
ENVIRONMENT="beta"
|
||||
else
|
||||
ENVIRONMENT="production"
|
||||
fi
|
||||
if [[ "$GITHUB_EVENT_NAME" = "release" ]]; then
|
||||
TAG="${GITHUB_REF#refs/tags/}"
|
||||
else
|
||||
TAG=$(cat esphome/const.py | sed -n -E "s/^__version__\s+=\s+\"(.+)\"$/\1/p")
|
||||
today="$(date --utc '+%Y%m%d')"
|
||||
TAG="${TAG}${today}"
|
||||
BRANCH=${GITHUB_REF#refs/heads/}
|
||||
if [[ "$BRANCH" != "dev" ]]; then
|
||||
TAG="${TAG}-${BRANCH}"
|
||||
BRANCH_BUILD="true"
|
||||
ENVIRONMENT=""
|
||||
else
|
||||
BRANCH_BUILD="false"
|
||||
ENVIRONMENT="dev"
|
||||
fi
|
||||
fi
|
||||
echo "tag=${TAG}" >> $GITHUB_OUTPUT
|
||||
echo "branch_build=${BRANCH_BUILD}" >> $GITHUB_OUTPUT
|
||||
echo "deploy_env=${ENVIRONMENT}" >> $GITHUB_OUTPUT
|
||||
# yamllint enable rule:line-length
|
||||
echo "::set-output name=tag::${TAG}"
|
||||
|
||||
deploy-pypi:
|
||||
name: Build and publish to PyPi
|
||||
if: github.repository == 'esphome/esphome' && github.event_name == 'release'
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
contents: read
|
||||
id-token: write
|
||||
steps:
|
||||
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- uses: actions/checkout@v2
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
uses: actions/setup-python@v1
|
||||
with:
|
||||
python-version: "3.x"
|
||||
python-version: '3.x'
|
||||
- name: Set up python environment
|
||||
run: |
|
||||
script/setup
|
||||
pip install setuptools wheel twine
|
||||
- name: Build
|
||||
run: |-
|
||||
pip3 install build
|
||||
python3 -m build
|
||||
- name: Publish
|
||||
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
|
||||
with:
|
||||
skip-existing: true
|
||||
run: python setup.py sdist bdist_wheel
|
||||
- name: Upload
|
||||
env:
|
||||
TWINE_USERNAME: __token__
|
||||
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
|
||||
run: twine upload dist/*
|
||||
|
||||
deploy-docker:
|
||||
name: Build ESPHome ${{ matrix.platform.arch }}
|
||||
name: Build and publish docker containers
|
||||
if: github.repository == 'esphome/esphome'
|
||||
permissions:
|
||||
contents: read
|
||||
packages: write
|
||||
runs-on: ${{ matrix.platform.os }}
|
||||
runs-on: ubuntu-latest
|
||||
needs: [init]
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
platform:
|
||||
- arch: amd64
|
||||
os: "ubuntu-24.04"
|
||||
- arch: arm64
|
||||
os: "ubuntu-24.04-arm"
|
||||
|
||||
arch: [amd64, armv7, aarch64]
|
||||
build_type: ["ha-addon", "docker", "lint"]
|
||||
steps:
|
||||
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
with:
|
||||
python-version: "3.11"
|
||||
- uses: actions/checkout@v2
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
with:
|
||||
python-version: '3.9'
|
||||
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v1
|
||||
- name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v1
|
||||
|
||||
- name: Log in to docker hub
|
||||
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
|
||||
with:
|
||||
username: ${{ secrets.DOCKER_USER }}
|
||||
password: ${{ secrets.DOCKER_PASSWORD }}
|
||||
- name: Log in to the GitHub container registry
|
||||
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
|
||||
with:
|
||||
- name: Log in to docker hub
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
username: ${{ secrets.DOCKER_USER }}
|
||||
password: ${{ secrets.DOCKER_PASSWORD }}
|
||||
- name: Log in to the GitHub container registry
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
|
||||
- name: Build docker
|
||||
uses: ./.github/actions/build-image
|
||||
with:
|
||||
target: final
|
||||
build_type: docker
|
||||
suffix: ""
|
||||
version: ${{ needs.init.outputs.tag }}
|
||||
- name: Build and push
|
||||
run: |
|
||||
docker/build.py \
|
||||
--tag "${{ needs.init.outputs.tag }}" \
|
||||
--arch "${{ matrix.arch }}" \
|
||||
--build-type "${{ matrix.build_type }}" \
|
||||
build \
|
||||
--push
|
||||
|
||||
- name: Build ha-addon
|
||||
uses: ./.github/actions/build-image
|
||||
with:
|
||||
target: final
|
||||
build_type: ha-addon
|
||||
suffix: "hassio"
|
||||
version: ${{ needs.init.outputs.tag }}
|
||||
|
||||
# - name: Build lint
|
||||
# uses: ./.github/actions/build-image
|
||||
# with:
|
||||
# target: lint
|
||||
# build_type: lint
|
||||
# suffix: lint
|
||||
# version: ${{ needs.init.outputs.tag }}
|
||||
|
||||
- name: Upload digests
|
||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
||||
with:
|
||||
name: digests-${{ matrix.platform.arch }}
|
||||
path: /tmp/digests
|
||||
retention-days: 1
|
||||
|
||||
deploy-manifest:
|
||||
name: Publish ESPHome ${{ matrix.image.build_type }} to ${{ matrix.registry }}
|
||||
runs-on: ubuntu-latest
|
||||
needs:
|
||||
- init
|
||||
- deploy-docker
|
||||
deploy-docker-manifest:
|
||||
if: github.repository == 'esphome/esphome'
|
||||
permissions:
|
||||
contents: read
|
||||
packages: write
|
||||
runs-on: ubuntu-latest
|
||||
needs: [init, deploy-docker]
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
image:
|
||||
- build_type: "docker"
|
||||
suffix: ""
|
||||
- build_type: "ha-addon"
|
||||
suffix: "hassio"
|
||||
# - build_type: "lint"
|
||||
# suffix: "lint"
|
||||
registry:
|
||||
- ghcr
|
||||
- dockerhub
|
||||
build_type: ["ha-addon", "docker", "lint"]
|
||||
steps:
|
||||
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
- uses: actions/checkout@v2
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v2
|
||||
with:
|
||||
python-version: '3.9'
|
||||
- name: Enable experimental manifest support
|
||||
run: |
|
||||
mkdir -p ~/.docker
|
||||
echo "{\"experimental\": \"enabled\"}" > ~/.docker/config.json
|
||||
|
||||
- name: Download digests
|
||||
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
|
||||
with:
|
||||
pattern: digests-*
|
||||
path: /tmp/digests
|
||||
merge-multiple: true
|
||||
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
|
||||
|
||||
- name: Log in to docker hub
|
||||
if: matrix.registry == 'dockerhub'
|
||||
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
|
||||
with:
|
||||
username: ${{ secrets.DOCKER_USER }}
|
||||
password: ${{ secrets.DOCKER_PASSWORD }}
|
||||
- name: Log in to the GitHub container registry
|
||||
if: matrix.registry == 'ghcr'
|
||||
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
|
||||
with:
|
||||
- name: Log in to docker hub
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
username: ${{ secrets.DOCKER_USER }}
|
||||
password: ${{ secrets.DOCKER_PASSWORD }}
|
||||
- name: Log in to the GitHub container registry
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
|
||||
- name: Generate short tags
|
||||
id: tags
|
||||
run: |
|
||||
output=$(docker/generate_tags.py \
|
||||
--tag "${{ needs.init.outputs.tag }}" \
|
||||
--suffix "${{ matrix.image.suffix }}" \
|
||||
--registry "${{ matrix.registry }}")
|
||||
echo $output
|
||||
for l in $output; do
|
||||
echo $l >> $GITHUB_OUTPUT
|
||||
done
|
||||
- name: Run manifest
|
||||
run: |
|
||||
docker/build.py \
|
||||
--tag "${{ needs.init.outputs.tag }}" \
|
||||
--build-type "${{ matrix.build_type }}" \
|
||||
manifest
|
||||
|
||||
- name: Create manifest list and push
|
||||
working-directory: /tmp/digests/${{ matrix.image.build_type }}/${{ matrix.registry }}
|
||||
run: |
|
||||
docker buildx imagetools create $(jq -Rcnr 'inputs | . / "," | map("-t " + .) | join(" ")' <<< "${{ steps.tags.outputs.tags}}") \
|
||||
$(printf '${{ steps.tags.outputs.image }}@sha256:%s ' *)
|
||||
|
||||
deploy-ha-addon-repo:
|
||||
if: github.repository == 'esphome/esphome' && needs.init.outputs.branch_build == 'false'
|
||||
deploy-hassio-repo:
|
||||
if: github.repository == 'esphome/esphome' && github.event_name == 'release'
|
||||
runs-on: ubuntu-latest
|
||||
needs:
|
||||
- init
|
||||
- deploy-manifest
|
||||
needs: [deploy-docker]
|
||||
steps:
|
||||
- name: Trigger Workflow
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
github-token: ${{ secrets.DEPLOY_HA_ADDON_REPO_TOKEN }}
|
||||
script: |
|
||||
let description = "ESPHome";
|
||||
if (context.eventName == "release") {
|
||||
description = ${{ toJSON(github.event.release.body) }};
|
||||
}
|
||||
github.rest.actions.createWorkflowDispatch({
|
||||
owner: "esphome",
|
||||
repo: "home-assistant-addon",
|
||||
workflow_id: "bump-version.yml",
|
||||
ref: "main",
|
||||
inputs: {
|
||||
version: "${{ needs.init.outputs.tag }}",
|
||||
content: description
|
||||
}
|
||||
})
|
||||
|
||||
deploy-esphome-schema:
|
||||
if: github.repository == 'esphome/esphome' && needs.init.outputs.branch_build == 'false'
|
||||
runs-on: ubuntu-latest
|
||||
needs: [init]
|
||||
environment: ${{ needs.init.outputs.deploy_env }}
|
||||
steps:
|
||||
- name: Trigger Workflow
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
github-token: ${{ secrets.DEPLOY_ESPHOME_SCHEMA_REPO_TOKEN }}
|
||||
script: |
|
||||
github.rest.actions.createWorkflowDispatch({
|
||||
owner: "esphome",
|
||||
repo: "esphome-schema",
|
||||
workflow_id: "generate-schemas.yml",
|
||||
ref: "main",
|
||||
inputs: {
|
||||
version: "${{ needs.init.outputs.tag }}",
|
||||
}
|
||||
})
|
||||
- env:
|
||||
TOKEN: ${{ secrets.DEPLOY_HASSIO_TOKEN }}
|
||||
run: |
|
||||
TAG="${GITHUB_REF#refs/tags/}"
|
||||
curl \
|
||||
-u ":$TOKEN" \
|
||||
-X POST \
|
||||
-H "Accept: application/vnd.github.v3+json" \
|
||||
https://api.github.com/repos/esphome/hassio/actions/workflows/bump-version.yml/dispatches \
|
||||
-d "{\"ref\":\"main\",\"inputs\":{\"version\":\"$TAG\"}}"
|
||||
|
56
.github/workflows/stale.yml
vendored
@@ -1,9 +1,8 @@
|
||||
---
|
||||
name: Stale
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: "30 0 * * *"
|
||||
- cron: '30 0 * * *'
|
||||
workflow_dispatch:
|
||||
|
||||
permissions:
|
||||
@@ -15,52 +14,35 @@ concurrency:
|
||||
|
||||
jobs:
|
||||
stale:
|
||||
if: github.repository_owner == 'esphome'
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Stale
|
||||
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
|
||||
- uses: actions/stale@v4
|
||||
with:
|
||||
debug-only: ${{ github.ref != 'refs/heads/dev' }} # Dry-run when not run on dev branch
|
||||
remove-stale-when-updated: true
|
||||
operations-per-run: 400
|
||||
|
||||
# The 90 day stale policy for PRs
|
||||
# - PRs
|
||||
# - No PRs marked as "not-stale"
|
||||
# - No Issues (see below)
|
||||
days-before-pr-stale: 90
|
||||
days-before-pr-close: 7
|
||||
days-before-issue-stale: -1
|
||||
days-before-issue-close: -1
|
||||
remove-stale-when-updated: true
|
||||
stale-pr-label: "stale"
|
||||
exempt-pr-labels: "not-stale"
|
||||
exempt-pr-labels: "no-stale"
|
||||
stale-pr-message: >
|
||||
There hasn't been any activity on this pull request recently. This
|
||||
pull request has been automatically marked as stale because of that
|
||||
and will be closed if no further activity occurs within 7 days.
|
||||
Thank you for your contributions.
|
||||
|
||||
If you are the author of this PR, please leave a comment if you want
|
||||
to keep it open. Also, please rebase your PR onto the latest dev
|
||||
branch to ensure that it's up to date with the latest changes.
|
||||
|
||||
Thank you for your contribution!
|
||||
|
||||
# The 90 day stale policy for Issues
|
||||
# - Issues
|
||||
# - No Issues marked as "not-stale"
|
||||
# - No PRs (see above)
|
||||
days-before-issue-stale: 90
|
||||
days-before-issue-close: 7
|
||||
# Use stale to automatically close issues with a reference to the issue tracker
|
||||
close-issues:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/stale@v4
|
||||
with:
|
||||
days-before-pr-stale: -1
|
||||
days-before-pr-close: -1
|
||||
days-before-issue-stale: 1
|
||||
days-before-issue-close: 1
|
||||
remove-stale-when-updated: true
|
||||
stale-issue-label: "stale"
|
||||
exempt-issue-labels: "not-stale"
|
||||
stale-issue-message: >
|
||||
There hasn't been any activity on this issue recently. Due to the
|
||||
high number of incoming GitHub notifications, we have to clean some
|
||||
of the old issues, as many of them have already been resolved with
|
||||
the latest updates.
|
||||
|
||||
Please make sure to update to the latest ESPHome version and
|
||||
check if that solves the issue. Let us know if that works for you by
|
||||
adding a comment 👍
|
||||
|
||||
This issue has now been marked as stale and will be closed if no
|
||||
further activity occurs. Thank you for your contributions.
|
||||
https://github.com/esphome/esphome/issues/430
|
||||
|
30
.github/workflows/status-check-labels.yml
vendored
@@ -1,30 +0,0 @@
|
||||
name: Status check labels
|
||||
|
||||
on:
|
||||
pull_request:
|
||||
types: [labeled, unlabeled]
|
||||
|
||||
jobs:
|
||||
check:
|
||||
name: Check ${{ matrix.label }}
|
||||
runs-on: ubuntu-latest
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
label:
|
||||
- needs-docs
|
||||
- merge-after-release
|
||||
steps:
|
||||
- name: Check for ${{ matrix.label }} label
|
||||
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
|
||||
with:
|
||||
script: |
|
||||
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
issue_number: context.issue.number
|
||||
});
|
||||
const hasLabel = labels.find(label => label.name === '${{ matrix.label }}');
|
||||
if (hasLabel) {
|
||||
core.setFailed('Pull request cannot be merged, it is labeled as ${{ matrix.label }}');
|
||||
}
|
53
.github/workflows/sync-device-classes.yml
vendored
@@ -1,53 +0,0 @@
|
||||
---
|
||||
name: Synchronise Device Classes from Home Assistant
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
schedule:
|
||||
- cron: "45 6 * * *"
|
||||
|
||||
jobs:
|
||||
sync:
|
||||
name: Sync Device Classes
|
||||
runs-on: ubuntu-latest
|
||||
if: github.repository == 'esphome/esphome'
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
|
||||
- name: Checkout Home Assistant
|
||||
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
|
||||
with:
|
||||
repository: home-assistant/core
|
||||
path: lib/home-assistant
|
||||
|
||||
- name: Setup Python
|
||||
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
|
||||
with:
|
||||
python-version: 3.13
|
||||
|
||||
- name: Install Home Assistant
|
||||
run: |
|
||||
python -m pip install --upgrade pip
|
||||
pip install -e lib/home-assistant
|
||||
pip install -r requirements_test.txt pre-commit
|
||||
|
||||
- name: Sync
|
||||
run: |
|
||||
python ./script/sync-device_class.py
|
||||
|
||||
- name: Run pre-commit hooks
|
||||
run: |
|
||||
python script/run-in-env.py pre-commit run --all-files
|
||||
|
||||
- name: Commit changes
|
||||
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
|
||||
with:
|
||||
commit-message: "Synchronise Device Classes from Home Assistant"
|
||||
committer: esphomebot <esphome@openhomefoundation.org>
|
||||
author: esphomebot <esphome@openhomefoundation.org>
|
||||
branch: sync/device-classes
|
||||
delete-branch: true
|
||||
title: "Synchronise Device Classes from Home Assistant"
|
||||
body-path: .github/PULL_REQUEST_TEMPLATE.md
|
||||
token: ${{ secrets.DEVICE_CLASS_SYNC_TOKEN }}
|
17
.gitignore
vendored
@@ -13,12 +13,6 @@ __pycache__/
|
||||
# Intellij Idea
|
||||
.idea
|
||||
|
||||
# Eclipse
|
||||
.project
|
||||
.cproject
|
||||
.pydevproject
|
||||
.settings/
|
||||
|
||||
# Vim
|
||||
*.swp
|
||||
|
||||
@@ -75,9 +69,6 @@ cov.xml
|
||||
# pyenv
|
||||
.python-version
|
||||
|
||||
# asdf
|
||||
.tool-versions
|
||||
|
||||
# Environments
|
||||
.env
|
||||
.venv
|
||||
@@ -86,7 +77,6 @@ venv/
|
||||
ENV/
|
||||
env.bak/
|
||||
venv.bak/
|
||||
venv-*/
|
||||
|
||||
# mypy
|
||||
.mypy_cache/
|
||||
@@ -137,10 +127,3 @@ tests/.esphome/
|
||||
|
||||
sdkconfig.*
|
||||
!sdkconfig.defaults
|
||||
|
||||
.tests/
|
||||
|
||||
/components
|
||||
/managed_components
|
||||
|
||||
api-docs/
|
||||
|
6
.gitpod.yml
Normal file
@@ -0,0 +1,6 @@
ports:
  - port: 6052
    onOpen: open-preview
tasks:
  - before: pyenv local $(pyenv version | grep '^3\.' | cut -d ' ' -f 1) && script/setup
    command: python -m esphome config dashboard
.pre-commit-config.yaml
@@ -1,67 +1,27 @@
|
||||
---
|
||||
# See https://pre-commit.com for more information
|
||||
# See https://pre-commit.com/hooks.html for more hooks
|
||||
|
||||
ci:
|
||||
autoupdate_commit_msg: 'pre-commit: autoupdate'
|
||||
autoupdate_schedule: off # Disabled until ruff versions are synced between deps and pre-commit
|
||||
# Skip hooks that have issues in pre-commit CI environment
|
||||
skip: [pylint, clang-tidy-hash]
|
||||
|
||||
repos:
|
||||
- repo: https://github.com/astral-sh/ruff-pre-commit
|
||||
# Ruff version.
|
||||
rev: v0.14.1
|
||||
- repo: https://github.com/ambv/black
|
||||
rev: 20.8b1
|
||||
hooks:
|
||||
# Run the linter.
|
||||
- id: ruff
|
||||
args: [--fix]
|
||||
# Run the formatter.
|
||||
- id: ruff-format
|
||||
- repo: https://github.com/PyCQA/flake8
|
||||
rev: 7.3.0
|
||||
- id: black
|
||||
args:
|
||||
- --safe
|
||||
- --quiet
|
||||
files: ^((esphome|script|tests)/.+)?[^/]+\.py$
|
||||
- repo: https://gitlab.com/pycqa/flake8
|
||||
rev: 3.8.4
|
||||
hooks:
|
||||
- id: flake8
|
||||
additional_dependencies:
|
||||
- flake8-docstrings==1.7.0
|
||||
- flake8-docstrings==1.5.0
|
||||
- pydocstyle==5.1.1
|
||||
files: ^(esphome|tests)/.+\.py$
|
||||
- repo: https://github.com/pre-commit/pre-commit-hooks
|
||||
rev: v5.0.0
|
||||
rev: v3.4.0
|
||||
hooks:
|
||||
- id: no-commit-to-branch
|
||||
args:
|
||||
- --branch=dev
|
||||
- --branch=release
|
||||
- --branch=beta
|
||||
- id: end-of-file-fixer
|
||||
- id: trailing-whitespace
|
||||
- repo: https://github.com/asottile/pyupgrade
|
||||
rev: v3.20.0
|
||||
hooks:
|
||||
- id: pyupgrade
|
||||
args: [--py311-plus]
|
||||
- repo: https://github.com/adrienverge/yamllint.git
|
||||
rev: v1.37.1
|
||||
hooks:
|
||||
- id: yamllint
|
||||
exclude: ^(\.clang-format|\.clang-tidy)$
|
||||
- repo: https://github.com/pre-commit/mirrors-clang-format
|
||||
rev: v13.0.1
|
||||
hooks:
|
||||
- id: clang-format
|
||||
types_or: [c, c++]
|
||||
- repo: local
|
||||
hooks:
|
||||
- id: pylint
|
||||
name: pylint
|
||||
entry: python3 script/run-in-env.py pylint
|
||||
language: system
|
||||
types: [python]
|
||||
- id: clang-tidy-hash
|
||||
name: Update clang-tidy hash
|
||||
entry: python script/clang_tidy_hash.py --update-if-changed
|
||||
language: python
|
||||
files: ^(\.clang-tidy|platformio\.ini|requirements_dev\.txt)$
|
||||
pass_filenames: false
|
||||
additional_dependencies: []
|
||||
|
33
.vscode/tasks.json
vendored
@@ -2,24 +2,15 @@
|
||||
"version": "2.0.0",
|
||||
"tasks": [
|
||||
{
|
||||
"label": "Run Dashboard",
|
||||
"label": "run",
|
||||
"type": "shell",
|
||||
"command": "${command:python.interpreterPath}",
|
||||
"args": [
|
||||
"-m",
|
||||
"esphome",
|
||||
"dashboard",
|
||||
"config/"
|
||||
],
|
||||
"command": "python3 -m esphome dashboard config/",
|
||||
"problemMatcher": []
|
||||
},
|
||||
{
|
||||
"label": "clang-tidy",
|
||||
"type": "shell",
|
||||
"command": "${command:python.interpreterPath}",
|
||||
"args": [
|
||||
"./script/clang-tidy"
|
||||
],
|
||||
"command": "./script/clang-tidy",
|
||||
"problemMatcher": [
|
||||
{
|
||||
"owner": "clang-tidy",
|
||||
@@ -36,24 +27,6 @@
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"label": "Generate proto files",
|
||||
"type": "shell",
|
||||
"command": "${command:python.interpreterPath}",
|
||||
"args": [
|
||||
"./script/api_protobuf/api_protobuf.py"
|
||||
],
|
||||
"group": {
|
||||
"kind": "build",
|
||||
"isDefault": true
|
||||
},
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"close": true,
|
||||
"panel": "new"
|
||||
},
|
||||
"problemMatcher": []
|
||||
}
|
||||
]
|
||||
}
|
||||
|
19
.yamllint
@@ -1,19 +0,0 @@
---
extends: default

ignore-from-file: .gitignore

rules:
  document-start: disable
  empty-lines:
    level: error
    max: 1
    max-start: 0
    max-end: 1
  indentation:
    level: error
    spaces: 2
    indent-sequences: true
    check-multi-line-strings: false
  line-length: disable
  truthy: disable
430  CODEOWNERS
@@ -6,434 +6,146 @@
# the integration's code owner is automatically notified.

# Core Code
pyproject.toml @esphome/core
setup.py @esphome/core
esphome/*.py @esphome/core
esphome/core/* @esphome/core
.github/** @esphome/core

# Integrations
esphome/components/a01nyub/* @MrSuicideParrot
esphome/components/a02yyuw/* @TH-Braemer
esphome/components/absolute_humidity/* @DAVe3283
esphome/components/ac_dimmer/* @glmnet
esphome/components/adc/* @esphome/core
esphome/components/adc128s102/* @DeerMaximum
esphome/components/addressable_light/* @justfalter
esphome/components/ade7880/* @kpfleming
esphome/components/ade7953/* @angelnu
esphome/components/ade7953_i2c/* @angelnu
esphome/components/ade7953_spi/* @angelnu
esphome/components/ads1118/* @solomondg1
esphome/components/ags10/* @mak-42
esphome/components/aic3204/* @kbx81
esphome/components/airthings_ble/* @jeromelaban
esphome/components/airthings_wave_base/* @jeromelaban @kpfleming @ncareau
esphome/components/airthings_wave_mini/* @ncareau
esphome/components/airthings_wave_plus/* @jeromelaban @precurse
esphome/components/alarm_control_panel/* @grahambrown11 @hwstar
esphome/components/alpha3/* @jan-hofmeier
esphome/components/am2315c/* @swoboda1337
esphome/components/airthings_wave_plus/* @jeromelaban
esphome/components/am43/* @buxtronix
esphome/components/am43/cover/* @buxtronix
esphome/components/am43/sensor/* @buxtronix
esphome/components/analog_threshold/* @ianchi
esphome/components/animation/* @syndlex
esphome/components/anova/* @buxtronix
esphome/components/apds9306/* @aodrenah
esphome/components/api/* @esphome/core
esphome/components/as5600/* @ammmze
esphome/components/as5600/sensor/* @ammmze
esphome/components/as7341/* @mrgnr
esphome/components/async_tcp/* @esphome/core
esphome/components/at581x/* @X-Ryl669
esphome/components/api/* @OttoWinter
esphome/components/async_tcp/* @OttoWinter
esphome/components/atc_mithermometer/* @ahpohl
esphome/components/atm90e26/* @danieltwagner
esphome/components/atm90e32/* @circuitsetup @descipher
esphome/components/audio/* @kahrendt
esphome/components/audio_adc/* @kbx81
esphome/components/audio_dac/* @kbx81
esphome/components/axs15231/* @clydebarrow
esphome/components/b_parasite/* @rbaron
esphome/components/ballu/* @bazuchan
esphome/components/bang_bang/* @OttoWinter
esphome/components/bedjet/* @jhansche
esphome/components/bedjet/climate/* @jhansche
esphome/components/bedjet/fan/* @jhansche
esphome/components/bedjet/sensor/* @javawizard @jhansche
esphome/components/beken_spi_led_strip/* @Mat931
esphome/components/bh1750/* @OttoWinter
esphome/components/bh1900nux/* @B48D81EFCC
esphome/components/binary_sensor/* @esphome/core
esphome/components/bk72xx/* @kuba2k2
esphome/components/bl0906/* @athom-tech @jesserockz @tarontop
esphome/components/bl0939/* @ziceva
esphome/components/bl0940/* @dan-s-github @tobias-
esphome/components/bl0942/* @dbuezas @dwmw2
esphome/components/ble_client/* @buxtronix @clydebarrow
esphome/components/bluetooth_proxy/* @bdraco @jesserockz
esphome/components/bme280_base/* @esphome/core
esphome/components/bme280_spi/* @apbodrov
esphome/components/ble_client/* @buxtronix
esphome/components/bme680_bsec/* @trvrnrth
esphome/components/bme68x_bsec2/* @kbx81 @neffs
esphome/components/bme68x_bsec2_i2c/* @kbx81 @neffs
esphome/components/bmi160/* @flaviut
esphome/components/bmp280_base/* @ademuri
esphome/components/bmp280_i2c/* @ademuri
esphome/components/bmp280_spi/* @ademuri
esphome/components/bmp3xx/* @latonita
esphome/components/bmp3xx_base/* @latonita @martgras
esphome/components/bmp3xx_i2c/* @latonita
esphome/components/bmp3xx_spi/* @latonita
esphome/components/bmp581/* @kahrendt
esphome/components/bp1658cj/* @Cossid
esphome/components/bp5758d/* @Cossid
esphome/components/button/* @esphome/core
esphome/components/bytebuffer/* @clydebarrow
esphome/components/camera/* @bdraco @DT-art1
esphome/components/camera_encoder/* @DT-art1
esphome/components/canbus/* @danielschramm @mvturnho
esphome/components/cap1188/* @mreditor97
esphome/components/captive_portal/* @esphome/core
esphome/components/cap1188/* @MrEditor97
esphome/components/captive_portal/* @OttoWinter
esphome/components/ccs811/* @habbie
esphome/components/cd74hc4067/* @asoehlke
esphome/components/ch422g/* @clydebarrow @jesterret
esphome/components/chsc6x/* @kkosik20
esphome/components/climate/* @esphome/core
esphome/components/climate_ir/* @glmnet
esphome/components/cm1106/* @andrewjswan
esphome/components/color_temperature/* @jesserockz
esphome/components/combination/* @Cat-Ion @kahrendt
esphome/components/const/* @esphome/core
esphome/components/coolix/* @glmnet
esphome/components/copy/* @OttoWinter
esphome/components/cover/* @esphome/core
esphome/components/cs5460a/* @balrog-kun
esphome/components/cse7761/* @berfenger
esphome/components/cst226/* @clydebarrow
esphome/components/cst816/* @clydebarrow
esphome/components/ct_clamp/* @jesserockz
esphome/components/current_based/* @djwmarcx
esphome/components/dac7678/* @NickB1
esphome/components/daikin_arc/* @MagicBear
esphome/components/daikin_brc/* @hagak
esphome/components/dallas_temp/* @ssieb
esphome/components/daly_bms/* @s1lvi0
esphome/components/dashboard_import/* @esphome/core
esphome/components/datetime/* @jesserockz @rfdarter
esphome/components/debug/* @esphome/core
esphome/components/delonghi/* @grob6000
esphome/components/debug/* @OttoWinter
esphome/components/dfplayer/* @glmnet
esphome/components/dfrobot_sen0395/* @niklasweber
esphome/components/dht/* @OttoWinter
esphome/components/display_menu_base/* @numo68
esphome/components/dps310/* @kbx81
esphome/components/ds1307/* @badbadc0ffee
esphome/components/ds2484/* @mrk-its
esphome/components/dsmr/* @glmnet @zuidwijk
esphome/components/duty_time/* @dudanov
esphome/components/ee895/* @Stock-M
esphome/components/ektf2232/touchscreen/* @jesserockz
esphome/components/emc2101/* @ellull
esphome/components/emmeti/* @E440QF
esphome/components/ens160/* @latonita
esphome/components/ens160_base/* @latonita @vincentscode
esphome/components/ens160_i2c/* @latonita
esphome/components/ens160_spi/* @latonita
esphome/components/ens210/* @itn3rd77
esphome/components/epaper_spi/* @esphome/core
esphome/components/es7210/* @kahrendt
esphome/components/es7243e/* @kbx81
esphome/components/es8156/* @kbx81
esphome/components/es8311/* @kahrendt @kroimon
esphome/components/es8388/* @P4uLT
esphome/components/esp32/* @esphome/core
esphome/components/esp32_ble/* @bdraco @jesserockz @Rapsssito
esphome/components/esp32_ble_client/* @bdraco @jesserockz
esphome/components/esp32_ble_server/* @clydebarrow @jesserockz @Rapsssito
esphome/components/esp32_ble_tracker/* @bdraco
esphome/components/esp32_ble/* @jesserockz
esphome/components/esp32_ble_server/* @jesserockz
esphome/components/esp32_camera_web_server/* @ayufan
esphome/components/esp32_can/* @Sympatron
esphome/components/esp32_hosted/* @swoboda1337
esphome/components/esp32_improv/* @jesserockz
esphome/components/esp32_rmt/* @jesserockz
esphome/components/esp32_rmt_led_strip/* @jesserockz
esphome/components/esp8266/* @esphome/core
esphome/components/esp_ldo/* @clydebarrow
esphome/components/espnow/* @jesserockz
esphome/components/ethernet_info/* @gtjadsonsantos
esphome/components/event/* @nohat
esphome/components/exposure_notifications/* @OttoWinter
esphome/components/ezo/* @ssieb
esphome/components/ezo_pmp/* @carlos-sarmiento
esphome/components/factory_reset/* @anatoly-savchenkov
esphome/components/fastled_base/* @OttoWinter
esphome/components/feedback/* @ianchi
esphome/components/fingerprint_grow/* @alexborro @loongyh @OnFreund
esphome/components/font/* @clydebarrow @esphome/core
esphome/components/fs3000/* @kahrendt
esphome/components/ft5x06/* @clydebarrow
esphome/components/ft63x6/* @gpambrozio
esphome/components/gcja5/* @gcormier
esphome/components/gdk101/* @Szewcson
esphome/components/gl_r01_i2c/* @pkejval
esphome/components/fingerprint_grow/* @OnFreund @loongyh
esphome/components/globals/* @esphome/core
esphome/components/gp2y1010au0f/* @zry98
esphome/components/gp8403/* @jesserockz
esphome/components/gpio/* @esphome/core
esphome/components/gpio/one_wire/* @ssieb
esphome/components/gps/* @coogle @ximex
esphome/components/gps/* @coogle
esphome/components/graph/* @synco
esphome/components/graphical_display_menu/* @MrMDavidson
esphome/components/gree/* @orestismers
esphome/components/grove_gas_mc_v2/* @YorkshireIoT
esphome/components/grove_tb6612fng/* @max246
esphome/components/growatt_solar/* @leeuwte
esphome/components/gt911/* @clydebarrow @jesserockz
esphome/components/haier/* @paveldn
esphome/components/haier/binary_sensor/* @paveldn
esphome/components/haier/button/* @paveldn
esphome/components/haier/sensor/* @paveldn
esphome/components/haier/switch/* @paveldn
esphome/components/haier/text_sensor/* @paveldn
esphome/components/havells_solar/* @sourabhjaiswal
esphome/components/hbridge/fan/* @WeekendWarrior
esphome/components/hbridge/light/* @DotNetDann
esphome/components/hbridge/switch/* @dwmw2
esphome/components/he60r/* @clydebarrow
esphome/components/heatpumpir/* @rob-deutsch
esphome/components/hitachi_ac424/* @sourabhjaiswal
esphome/components/hm3301/* @freekode
esphome/components/hmac_md5/* @dwmw2
esphome/components/homeassistant/* @esphome/core @OttoWinter
esphome/components/homeassistant/number/* @landonr
esphome/components/homeassistant/switch/* @Links2004
esphome/components/honeywell_hih_i2c/* @Benichou34
esphome/components/honeywellabp/* @RubyBailey
esphome/components/honeywellabp2_i2c/* @jpfaff
esphome/components/host/* @clydebarrow @esphome/core
esphome/components/host/time/* @clydebarrow
esphome/components/homeassistant/* @OttoWinter
esphome/components/hrxl_maxsonar_wr/* @netmikey
esphome/components/hte501/* @Stock-M
esphome/components/http_request/ota/* @oarcher
esphome/components/http_request/update/* @jesserockz
esphome/components/htu31d/* @betterengineering
esphome/components/hydreon_rgxx/* @functionpointer
esphome/components/hyt271/* @Philippe12
esphome/components/i2c/* @esphome/core
esphome/components/i2c_device/* @gabest11
esphome/components/i2s_audio/* @jesserockz
esphome/components/i2s_audio/media_player/* @jesserockz
esphome/components/i2s_audio/microphone/* @jesserockz
esphome/components/i2s_audio/speaker/* @jesserockz @kahrendt
esphome/components/iaqcore/* @yozik04
esphome/components/ili9xxx/* @clydebarrow @nielsnl68
esphome/components/improv_base/* @esphome/core
esphome/components/improv_serial/* @esphome/core
esphome/components/ina226/* @latonita @Sergio303
esphome/components/ina260/* @mreditor97
esphome/components/ina2xx_base/* @latonita
esphome/components/ina2xx_i2c/* @latonita
esphome/components/ina2xx_spi/* @latonita
esphome/components/inkbird_ibsth1_mini/* @fkirill
esphome/components/inkplate/* @jesserockz @JosipKuci
esphome/components/inkplate6/* @jesserockz
esphome/components/integration/* @OttoWinter
esphome/components/internal_temperature/* @Mat931
esphome/components/interval/* @esphome/core
esphome/components/jsn_sr04t/* @Mafus1
esphome/components/json/* @esphome/core
esphome/components/kamstrup_kmp/* @cfeenstra1024
esphome/components/key_collector/* @ssieb
esphome/components/key_provider/* @ssieb
esphome/components/kuntze/* @ssieb
esphome/components/lc709203f/* @ilikecake
esphome/components/lcd_menu/* @numo68
esphome/components/ld2410/* @regevbr @sebcaps
esphome/components/ld2412/* @Rihan9
esphome/components/ld2420/* @descipher
esphome/components/ld2450/* @hareeshmu
esphome/components/ld24xx/* @kbx81
esphome/components/json/* @OttoWinter
esphome/components/ledc/* @OttoWinter
esphome/components/libretiny/* @kuba2k2
esphome/components/libretiny_pwm/* @kuba2k2
esphome/components/light/* @esphome/core
esphome/components/lightwaverf/* @max246
esphome/components/lilygo_t5_47/touchscreen/* @jesserockz
esphome/components/lm75b/* @beormund
esphome/components/ln882x/* @lamauny
esphome/components/lock/* @esphome/core
esphome/components/logger/* @esphome/core
esphome/components/logger/select/* @clydebarrow
esphome/components/lps22/* @nagisa
esphome/components/ltr390/* @latonita @sjtrny
esphome/components/ltr501/* @latonita
esphome/components/ltr_als_ps/* @latonita
esphome/components/lvgl/* @clydebarrow
esphome/components/m5stack_8angle/* @rnauber
esphome/components/mapping/* @clydebarrow
esphome/components/matrix_keypad/* @ssieb
esphome/components/max17043/* @blacknell
esphome/components/max31865/* @DAVe3283
esphome/components/max44009/* @berfenger
esphome/components/max6956/* @looping40
esphome/components/ltr390/* @sjtrny
esphome/components/max7219digit/* @rspaargaren
esphome/components/max9611/* @mckaymatthew
esphome/components/mcp23008/* @jesserockz
esphome/components/mcp23017/* @jesserockz
esphome/components/mcp23s08/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s17/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s08/* @SenexCrenshaw @jesserockz
esphome/components/mcp23s17/* @SenexCrenshaw @jesserockz
esphome/components/mcp23x08_base/* @jesserockz
esphome/components/mcp23x17_base/* @jesserockz
esphome/components/mcp23xxx_base/* @jesserockz
esphome/components/mcp2515/* @danielschramm @mvturnho
esphome/components/mcp3204/* @rsumner
esphome/components/mcp4461/* @p1ngb4ck
esphome/components/mcp4728/* @berfenger
esphome/components/mcp47a1/* @jesserockz
esphome/components/mcp9600/* @mreditor97
esphome/components/mcp9808/* @k7hpn
esphome/components/md5/* @esphome/core
esphome/components/mdns/* @esphome/core
esphome/components/media_player/* @jesserockz
esphome/components/micro_wake_word/* @jesserockz @kahrendt
esphome/components/micronova/* @jorre05
esphome/components/microphone/* @jesserockz @kahrendt
esphome/components/mics_4514/* @jesserockz
esphome/components/midea/* @dudanov
esphome/components/midea_ir/* @dudanov
esphome/components/mipi_dsi/* @clydebarrow
esphome/components/mipi_rgb/* @clydebarrow
esphome/components/mipi_spi/* @clydebarrow
esphome/components/mitsubishi/* @RubyBailey
esphome/components/mixer/speaker/* @kahrendt
esphome/components/mlx90393/* @functionpointer
esphome/components/mlx90614/* @jesserockz
esphome/components/mmc5603/* @benhoff
esphome/components/mmc5983/* @agoode
esphome/components/modbus_controller/* @martgras
esphome/components/modbus_controller/binary_sensor/* @martgras
esphome/components/modbus_controller/number/* @martgras
esphome/components/modbus_controller/output/* @martgras
esphome/components/modbus_controller/select/* @martgras @stegm
esphome/components/modbus_controller/sensor/* @martgras
esphome/components/modbus_controller/switch/* @martgras
esphome/components/modbus_controller/text_sensor/* @martgras
esphome/components/mopeka_ble/* @Fabian-Schmidt @spbrogan
esphome/components/mopeka_pro_check/* @spbrogan
esphome/components/mopeka_std_check/* @Fabian-Schmidt
esphome/components/mpl3115a2/* @kbickar
esphome/components/mpu6886/* @fabaff
esphome/components/ms8607/* @e28eta
esphome/components/msa3xx/* @latonita
esphome/components/nau7802/* @cujomalainey
esphome/components/network/* @esphome/core
esphome/components/nextion/* @edwardtfn @senexcrenshaw
esphome/components/nextion/* @senexcrenshaw
esphome/components/nextion/binary_sensor/* @senexcrenshaw
esphome/components/nextion/sensor/* @senexcrenshaw
esphome/components/nextion/switch/* @senexcrenshaw
esphome/components/nextion/text_sensor/* @senexcrenshaw
esphome/components/nfc/* @jesserockz @kbx81
esphome/components/noblex/* @AGalfra
esphome/components/npi19/* @bakerkj
esphome/components/nrf52/* @tomaszduda23
esphome/components/nfc/* @jesserockz
esphome/components/number/* @esphome/core
esphome/components/one_wire/* @ssieb
esphome/components/online_image/* @clydebarrow @guillempages
esphome/components/opentherm/* @olegtarasov
esphome/components/openthread/* @mrene
esphome/components/opt3001/* @ccutrer
esphome/components/ota/* @esphome/core
esphome/components/output/* @esphome/core
esphome/components/packet_transport/* @clydebarrow
esphome/components/pca6416a/* @Mat931
esphome/components/pca9554/* @bdraco @clydebarrow @hwstar
esphome/components/pcf85063/* @brogon
esphome/components/pcf8563/* @KoenBreeman
esphome/components/pi4ioe5v6408/* @jesserockz
esphome/components/pid/* @OttoWinter
esphome/components/pipsolar/* @andreashergert1984
esphome/components/pm1006/* @habbie
esphome/components/pm2005/* @andrewjswan
esphome/components/pmsa003i/* @sjtrny
esphome/components/pmsx003/* @ximex
esphome/components/pmwcs3/* @SeByDocKy
esphome/components/pn532/* @jesserockz @OttoWinter
esphome/components/pn532_i2c/* @jesserockz @OttoWinter
esphome/components/pn532_spi/* @jesserockz @OttoWinter
esphome/components/pn7150/* @jesserockz @kbx81
esphome/components/pn7150_i2c/* @jesserockz @kbx81
esphome/components/pn7160/* @jesserockz @kbx81
esphome/components/pn7160_i2c/* @jesserockz @kbx81
esphome/components/pn7160_spi/* @jesserockz @kbx81
esphome/components/pn532/* @OttoWinter @jesserockz
esphome/components/pn532_i2c/* @OttoWinter @jesserockz
esphome/components/pn532_spi/* @OttoWinter @jesserockz
esphome/components/power_supply/* @esphome/core
esphome/components/preferences/* @esphome/core
esphome/components/psram/* @esphome/core
esphome/components/pulse_meter/* @cstaahl @stevebaxter @TrentHouliston
esphome/components/pulse_meter/* @stevebaxter
esphome/components/pvvx_mithermometer/* @pasiz
esphome/components/pylontech/* @functionpointer
esphome/components/qmp6988/* @andrewpc
esphome/components/qr_code/* @wjtje
esphome/components/qspi_dbi/* @clydebarrow
esphome/components/qwiic_pir/* @kahrendt
esphome/components/radon_eye_ble/* @jeffeb3
esphome/components/radon_eye_rd200/* @jeffeb3
esphome/components/rc522/* @glmnet
esphome/components/rc522_i2c/* @glmnet
esphome/components/rc522_spi/* @glmnet
esphome/components/resampler/speaker/* @kahrendt
esphome/components/restart/* @esphome/core
esphome/components/rf_bridge/* @jesserockz
esphome/components/rgbct/* @jesserockz
esphome/components/rp2040/* @jesserockz
esphome/components/rp2040_pio_led_strip/* @Papa-DMan
esphome/components/rp2040_pwm/* @jesserockz
esphome/components/rpi_dpi_rgb/* @clydebarrow
esphome/components/rtl87xx/* @kuba2k2
esphome/components/rtttl/* @glmnet
esphome/components/runtime_stats/* @bdraco
esphome/components/safe_mode/* @jsuanet @kbx81 @paulmonigatti
esphome/components/scd4x/* @martgras @sjtrny
esphome/components/safe_mode/* @paulmonigatti
esphome/components/scd4x/* @sjtrny
esphome/components/script/* @esphome/core
esphome/components/sdl/* @bdm310 @clydebarrow
esphome/components/sdm_meter/* @jesserockz @polyfaces
esphome/components/sdp3x/* @Azimath
esphome/components/seeed_mr24hpc1/* @limengdu
esphome/components/seeed_mr60bha2/* @limengdu
esphome/components/seeed_mr60fda2/* @limengdu
esphome/components/selec_meter/* @sourabhjaiswal
esphome/components/select/* @esphome/core
esphome/components/sen0321/* @notjj
esphome/components/sen21231/* @shreyaskarnik
esphome/components/sen5x/* @martgras
esphome/components/sensirion_common/* @martgras
esphome/components/sensor/* @esphome/core
esphome/components/sfa30/* @ghsensdev
esphome/components/sgp40/* @SenexCrenshaw
esphome/components/sgp4x/* @martgras @SenexCrenshaw
esphome/components/sha256/* @esphome/core
esphome/components/shelly_dimmer/* @edge90 @rnauber
esphome/components/sht3xd/* @mrtoy-me
esphome/components/sht4x/* @sjtrny
esphome/components/shutdown/* @esphome/core @jsuanet
esphome/components/sigma_delta_output/* @Cat-Ion
esphome/components/shutdown/* @esphome/core
esphome/components/sim800l/* @glmnet
esphome/components/sm10bit_base/* @Cossid
esphome/components/sm2135/* @BoukeHaarsma23 @dd32 @matika77
esphome/components/sm2235/* @Cossid
esphome/components/sm2335/* @Cossid
esphome/components/sml/* @alengwenus
esphome/components/smt100/* @piechade
esphome/components/sn74hc165/* @jesserockz
esphome/components/sm2135/* @BoukeHaarsma23
esphome/components/socket/* @esphome/core
esphome/components/sonoff_d1/* @anatoly-savchenkov
esphome/components/sound_level/* @kahrendt
esphome/components/speaker/* @jesserockz @kahrendt
esphome/components/speaker/media_player/* @kahrendt @synesthesiam
esphome/components/spi/* @clydebarrow @esphome/core
esphome/components/spi_device/* @clydebarrow
esphome/components/spi_led_strip/* @clydebarrow
esphome/components/split_buffer/* @jesserockz
esphome/components/sprinkler/* @kbx81
esphome/components/sps30/* @martgras
esphome/components/spi/* @esphome/core
esphome/components/ssd1322_base/* @kbx81
esphome/components/ssd1322_spi/* @kbx81
esphome/components/ssd1325_base/* @kbx81
@@ -445,112 +157,34 @@ esphome/components/ssd1331_base/* @kbx81
esphome/components/ssd1331_spi/* @kbx81
esphome/components/ssd1351_base/* @kbx81
esphome/components/ssd1351_spi/* @kbx81
esphome/components/st7567_base/* @latonita
esphome/components/st7567_i2c/* @latonita
esphome/components/st7567_spi/* @latonita
esphome/components/st7701s/* @clydebarrow
esphome/components/st7735/* @SenexCrenshaw
esphome/components/st7789v/* @kbx81
esphome/components/st7920/* @marsjan155
esphome/components/statsd/* @Links2004
esphome/components/substitutions/* @esphome/core
esphome/components/sun/* @OttoWinter
esphome/components/sun_gtil2/* @Mat931
esphome/components/switch/* @esphome/core
esphome/components/switch/binary_sensor/* @ssieb
esphome/components/sx126x/* @swoboda1337
esphome/components/sx127x/* @swoboda1337
esphome/components/syslog/* @clydebarrow
esphome/components/t6615/* @tylermenezes
esphome/components/tc74/* @sethgirvan
esphome/components/tca9548a/* @andreashergert1984
esphome/components/tca9555/* @mobrembski
esphome/components/tcl112/* @glmnet
esphome/components/tee501/* @Stock-M
esphome/components/teleinfo/* @0hax
esphome/components/tem3200/* @bakerkj
esphome/components/template/alarm_control_panel/* @grahambrown11 @hwstar
esphome/components/template/datetime/* @rfdarter
esphome/components/template/event/* @nohat
esphome/components/template/fan/* @ssieb
esphome/components/text/* @mauritskorse
esphome/components/thermostat/* @kbx81
esphome/components/time/* @esphome/core
esphome/components/time/* @OttoWinter
esphome/components/tlc5947/* @rnauber
esphome/components/tlc5971/* @IJIJI
esphome/components/tm1621/* @Philippe12
esphome/components/tm1637/* @glmnet
esphome/components/tm1638/* @skykingjwc
esphome/components/tm1651/* @mrtoy-me
esphome/components/tmp102/* @timsavage
esphome/components/tmp1075/* @sybrenstuvel
esphome/components/tmp117/* @Azimath
esphome/components/tof10120/* @wstrzalka
esphome/components/tormatic/* @ti-mo
esphome/components/toshiba/* @kbx81
esphome/components/touchscreen/* @jesserockz @nielsnl68
esphome/components/tsl2591/* @wjcarpenter
esphome/components/tt21100/* @kroimon
esphome/components/tuya/binary_sensor/* @jesserockz
esphome/components/tuya/climate/* @jesserockz
esphome/components/tuya/number/* @frankiboy1
esphome/components/tuya/select/* @bearpawmaxim
esphome/components/tuya/sensor/* @jesserockz
esphome/components/tuya/switch/* @jesserockz
esphome/components/tuya/text_sensor/* @dentra
esphome/components/uart/* @esphome/core
esphome/components/uart/button/* @ssieb
esphome/components/uart/packet_transport/* @clydebarrow
esphome/components/udp/* @clydebarrow
esphome/components/ufire_ec/* @pvizeli
esphome/components/ufire_ise/* @pvizeli
esphome/components/ultrasonic/* @OttoWinter
esphome/components/update/* @jesserockz
esphome/components/uponor_smatrix/* @kroimon
esphome/components/usb_host/* @clydebarrow
esphome/components/usb_uart/* @clydebarrow
esphome/components/valve/* @esphome/core
esphome/components/vbus/* @ssieb
esphome/components/veml3235/* @kbx81
esphome/components/veml7700/* @latonita
esphome/components/version/* @esphome/core
esphome/components/voice_assistant/* @jesserockz @kahrendt
esphome/components/wake_on_lan/* @clydebarrow @willwill2will54
esphome/components/watchdog/* @oarcher
esphome/components/waveshare_epaper/* @clydebarrow
esphome/components/web_server/ota/* @esphome/core
esphome/components/web_server_base/* @esphome/core
esphome/components/web_server_idf/* @dentra
esphome/components/weikai/* @DrCoolZic
esphome/components/weikai_i2c/* @DrCoolZic
esphome/components/weikai_spi/* @DrCoolZic
esphome/components/web_server_base/* @OttoWinter
esphome/components/whirlpool/* @glmnet
esphome/components/whynter/* @aeonsablaze
esphome/components/wiegand/* @ssieb
esphome/components/wireguard/* @droscy @lhoracek @thomas0bernard
esphome/components/wk2132_i2c/* @DrCoolZic
esphome/components/wk2132_spi/* @DrCoolZic
esphome/components/wk2168_i2c/* @DrCoolZic
esphome/components/wk2168_spi/* @DrCoolZic
esphome/components/wk2204_i2c/* @DrCoolZic
esphome/components/wk2204_spi/* @DrCoolZic
esphome/components/wk2212_i2c/* @DrCoolZic
esphome/components/wk2212_spi/* @DrCoolZic
esphome/components/wl_134/* @hobbypunk90
esphome/components/wts01/* @alepee
esphome/components/x9c/* @EtienneMD
esphome/components/xgzp68xx/* @gcormier
esphome/components/xiaomi_hhccjcy10/* @fariouche
esphome/components/xiaomi_lywsd02mmc/* @juanluss31
esphome/components/xiaomi_lywsd03mmc/* @ahpohl
esphome/components/xiaomi_mhoc303/* @drug123
esphome/components/xiaomi_mhoc401/* @vevsvevs
esphome/components/xiaomi_rtcgq02lm/* @jesserockz
esphome/components/xiaomi_xmwsdj04mmc/* @medusalix
esphome/components/xl9535/* @mreditor97
esphome/components/xpt2046/touchscreen/* @nielsnl68 @numo68
esphome/components/xxtea/* @clydebarrow
esphome/components/zephyr/* @tomaszduda23
esphome/components/zhlt01/* @cfeenstra1024
esphome/components/zio_ultrasonic/* @kahrendt
esphome/components/zwave_proxy/* @kbx81
esphome/components/xpt2046/* @numo68
@@ -34,7 +34,7 @@ This Code of Conduct applies both within project spaces and in public spaces whe

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at esphome@openhomefoundation.org. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at esphome@nabucasa.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
@@ -1,14 +1,14 @@
# Contributing to ESPHome [](https://discord.gg/KhAMKrd) [](https://GitHub.com/esphome/esphome/releases/)
# Contributing to ESPHome

We welcome contributions to the ESPHome suite of code and documentation!
For a detailed guide, please see https://esphome.io/guides/contributing.html#contributing-to-esphome

Please read our [contributing guide](https://esphome.io/guides/contributing.html) if you wish to contribute to the
project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).
Things to note when contributing:

**See also:**

[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)

---

[](https://www.openhomefoundation.org/)
- Please test your changes :)
- If a new feature is added or an existing user-facing feature is changed, you should also
update the [docs](https://github.com/esphome/esphome-docs). See [contributing to esphome-docs](https://esphome.io/guides/contributing.html#contributing-to-esphomedocs)
for more information.
- Please also update the tests in the `tests/` folder. You can do so by just adding a line in one of the YAML files
which checks if your new feature compiles correctly.
- Sometimes I will let pull requests linger because I'm not 100% sure about them. Please feel free to ping
me after some time.
@@ -1,6 +1,7 @@
include LICENSE
include README.md
include requirements.txt
recursive-include esphome *.cpp *.h *.tcc *.c
recursive-include esphome *.py.script
include esphome/dashboard/templates/*.html
recursive-include esphome/dashboard/static *.ico *.js *.css *.woff* LICENSE
recursive-include esphome *.cpp *.h *.tcc
recursive-include esphome LICENSE.txt
15  README.md
@@ -1,16 +1,9 @@
# ESPHome [](https://discord.gg/KhAMKrd) [](https://GitHub.com/esphome/esphome/releases/)

<a href="https://esphome.io/">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://esphome.io/_static/logo-text-on-dark.svg", alt="ESPHome Logo">
<img src="https://esphome.io/_static/logo-text-on-light.svg" alt="ESPHome Logo">
</picture>
</a>
[](https://esphome.io/)

---
**Documentation:** https://esphome.io/

[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
For issues, please go to [the issue tracker](https://github.com/esphome/issues/issues).

---

[](https://www.openhomefoundation.org/)
For feature requests, please see [feature requests](https://github.com/esphome/feature-requests/issues).
@@ -1,57 +1,77 @@
ARG BUILD_VERSION=dev
ARG BUILD_OS=alpine
ARG BUILD_BASE_VERSION=2025.04.0
ARG BUILD_TYPE=docker
# Build these with the build.py script
# Example:
# python3 docker/build.py --tag dev --arch amd64 --build-type docker build

FROM ghcr.io/esphome/docker-base:${BUILD_OS}-${BUILD_BASE_VERSION} AS base-source-docker
FROM ghcr.io/esphome/docker-base:${BUILD_OS}-ha-addon-${BUILD_BASE_VERSION} AS base-source-ha-addon
# One of "docker", "hassio"
ARG BASEIMGTYPE=docker

ARG BUILD_TYPE
FROM base-source-${BUILD_TYPE} AS base
FROM ghcr.io/hassio-addons/debian-base/amd64:5.1.1 AS base-hassio-amd64
FROM ghcr.io/hassio-addons/debian-base/aarch64:5.1.1 AS base-hassio-arm64
FROM ghcr.io/hassio-addons/debian-base/armv7:5.1.1 AS base-hassio-armv7
FROM debian:bullseye-20211011-slim AS base-docker-amd64
FROM debian:bullseye-20211011-slim AS base-docker-arm64
FROM debian:bullseye-20211011-slim AS base-docker-armv7

RUN git config --system --add safe.directory "*"

ENV PIP_DISABLE_PIP_VERSION_CHECK=1

RUN pip install --no-cache-dir -U pip uv==0.6.14

COPY requirements.txt /
# Use TARGETARCH/TARGETVARIANT defined by docker
# https://docs.docker.com/engine/reference/builder/#automatic-platform-args-in-the-global-scope
FROM base-${BASEIMGTYPE}-${TARGETARCH}${TARGETVARIANT} AS base

RUN \
uv pip install --no-cache-dir \
-r /requirements.txt
apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
python3=3.9.2-3 \
python3-pip=20.3.4-4 \
python3-setuptools=52.0.0-4 \
python3-pil=8.1.2+dfsg-0.3 \
python3-cryptography=3.3.2-1 \
iputils-ping=3:20210202-1 \
git=1:2.30.2-1 \
curl=7.74.0-1.3+b1 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*

ENV \
# Fix click python3 lang warning https://click.palletsprojects.com/en/7.x/python3/
LANG=C.UTF-8 LC_ALL=C.UTF-8 \
# Store globally installed pio libs in /piolibs
PLATFORMIO_GLOBALLIB_DIR=/piolibs

RUN \
platformio settings set enable_telemetry No \
# Ubuntu python3-pip is missing wheel
pip3 install --no-cache-dir \
wheel==0.36.2 \
platformio==5.2.2 \
# Change some platformio settings
&& platformio settings set enable_telemetry No \
&& platformio settings set check_libraries_interval 1000000 \
&& platformio settings set check_platformio_interval 1000000 \
&& platformio settings set check_platforms_interval 1000000 \
&& mkdir -p /piolibs

COPY script/platformio_install_deps.py platformio.ini /
RUN /platformio_install_deps.py /platformio.ini --libraries

ARG BUILD_VERSION

LABEL \
org.opencontainers.image.authors="The ESPHome Authors" \
org.opencontainers.image.title="ESPHome" \
org.opencontainers.image.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
org.opencontainers.image.url="https://esphome.io/" \
org.opencontainers.image.documentation="https://esphome.io/" \
org.opencontainers.image.source="https://github.com/esphome/esphome" \
org.opencontainers.image.licenses="ESPHome" \
org.opencontainers.image.version=${BUILD_VERSION}


# ======================= docker-type image =======================
FROM base AS base-docker
FROM base AS docker

# First install requirements to leverage caching when requirements don't change
COPY requirements.txt requirements_optional.txt docker/platformio_install_deps.py platformio.ini /
RUN \
pip3 install --no-cache-dir -r /requirements.txt -r /requirements_optional.txt \
&& /platformio_install_deps.py /platformio.ini

# Copy esphome and install
COPY . /esphome
RUN pip3 install --no-cache-dir -e /esphome

# Settings for dashboard
ENV USERNAME="" PASSWORD=""

# Expose the dashboard to Docker
EXPOSE 6052

# Run healthcheck (heartbeat)
HEALTHCHECK --interval=30s --timeout=30s \
CMD curl --fail http://localhost:6052/version -A "HealthCheck" || exit 1

COPY docker/docker_entrypoint.sh /entrypoint.sh

# The directory the user should mount their configuration files to
@@ -64,23 +84,73 @@ ENTRYPOINT ["/entrypoint.sh"]
CMD ["dashboard", "/config"]


# ======================= ha-addon-type image =======================
FROM base AS base-ha-addon


# ======================= hassio-type image =======================
FROM base AS hassio

RUN \
apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
nginx=1.18.0-6.1 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*

ARG BUILD_VERSION=dev

# Copy root filesystem
COPY docker/ha-addon-rootfs/ /
COPY docker/hassio-rootfs/ /

ARG BUILD_VERSION
# First install requirements to leverage caching when requirements don't change
COPY requirements.txt requirements_optional.txt docker/platformio_install_deps.py platformio.ini /
RUN \
pip3 install --no-cache-dir -r /requirements.txt -r /requirements_optional.txt \
&& /platformio_install_deps.py /platformio.ini

# Copy esphome and install
COPY . /esphome
RUN pip3 install --no-cache-dir -e /esphome

# Labels
LABEL \
io.hass.name="ESPHome" \
io.hass.description="ESPHome is a system to configure your microcontrollers by simple yet powerful configuration files and control them remotely through Home Automation systems" \
io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
io.hass.type="addon" \
io.hass.version="${BUILD_VERSION}"
# io.hass.arch is inherited from addon-debian-base

ARG BUILD_TYPE
FROM base-${BUILD_TYPE} AS final

# Copy esphome and install
COPY . /esphome
RUN uv pip install --no-cache-dir -e /esphome


# ======================= lint-type image =======================
FROM base AS lint

ENV \
PLATFORMIO_CORE_DIR=/esphome/.temp/platformio

RUN \
apt-get update \
# Use pinned versions so that we get updates with build caching
&& apt-get install -y --no-install-recommends \
clang-format-11=1:11.0.1-2 \
clang-tidy-11=1:11.0.1-2 \
patch=2.7.6-7 \
software-properties-common=0.96.20.2-2.1 \
nano=5.4-2 \
build-essential=12.9 \
python3-dev=3.9.2-3 \
&& rm -rf \
/tmp/* \
/var/{cache,log}/* \
/var/lib/apt/lists/*

COPY requirements.txt requirements_optional.txt requirements_test.txt docker/platformio_install_deps.py platformio.ini /
RUN \
pip3 install --no-cache-dir -r /requirements.txt -r /requirements_optional.txt -r /requirements_test.txt \
&& /platformio_install_deps.py /platformio.ini

VOLUME ["/esphome"]
WORKDIR /esphome
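Tying the Dockerfile targets together, the docker-type image exposes the dashboard on port 6052 and expects the user configuration under /config. A minimal usage sketch, assuming a locally built or pulled esphome/esphome image (image name and tag are illustrative):

    # run the dashboard, mounting a local config directory into the container
    docker run --rm -p 6052:6052 \
      -v "$PWD/config":/config \
      esphome/esphome:dev dashboard /config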
114
docker/build.py
114
docker/build.py
@@ -1,60 +1,45 @@
|
||||
#!/usr/bin/env python3
|
||||
import argparse
|
||||
from dataclasses import dataclass
|
||||
import re
|
||||
import shlex
|
||||
import subprocess
|
||||
import argparse
|
||||
from platform import machine
|
||||
import shlex
|
||||
import re
|
||||
import sys
|
||||
|
||||
CHANNEL_DEV = "dev"
|
||||
CHANNEL_BETA = "beta"
|
||||
CHANNEL_RELEASE = "release"
|
||||
|
||||
CHANNEL_DEV = 'dev'
|
||||
CHANNEL_BETA = 'beta'
|
||||
CHANNEL_RELEASE = 'release'
|
||||
CHANNELS = [CHANNEL_DEV, CHANNEL_BETA, CHANNEL_RELEASE]
|
||||
|
||||
ARCH_AMD64 = "amd64"
|
||||
ARCH_AARCH64 = "aarch64"
|
||||
ARCHS = [ARCH_AMD64, ARCH_AARCH64]
|
||||
ARCH_AMD64 = 'amd64'
|
||||
ARCH_ARMV7 = 'armv7'
|
||||
ARCH_AARCH64 = 'aarch64'
|
||||
ARCHS = [ARCH_AMD64, ARCH_ARMV7, ARCH_AARCH64]
|
||||
|
||||
TYPE_DOCKER = "docker"
|
||||
TYPE_HA_ADDON = "ha-addon"
|
||||
TYPE_LINT = "lint"
|
||||
TYPE_DOCKER = 'docker'
|
||||
TYPE_HA_ADDON = 'ha-addon'
|
||||
TYPE_LINT = 'lint'
|
||||
TYPES = [TYPE_DOCKER, TYPE_HA_ADDON, TYPE_LINT]
|
||||
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
"--tag",
|
||||
type=str,
|
||||
required=True,
|
||||
help="The main docker tag to push to. If a version number also adds latest and/or beta tag",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--arch", choices=ARCHS, required=False, help="The architecture to build for"
|
||||
)
|
||||
parser.add_argument(
|
||||
"--build-type", choices=TYPES, required=True, help="The type of build to run"
|
||||
)
|
||||
parser.add_argument(
|
||||
"--dry-run", action="store_true", help="Don't run any commands, just print them"
|
||||
)
|
||||
subparsers = parser.add_subparsers(
|
||||
help="Action to perform", dest="command", required=True
|
||||
)
|
||||
parser.add_argument("--tag", type=str, required=True, help="The main docker tag to push to. If a version number also adds latest and/or beta tag")
|
||||
parser.add_argument("--arch", choices=ARCHS, required=False, help="The architecture to build for")
|
||||
parser.add_argument("--build-type", choices=TYPES, required=True, help="The type of build to run")
|
||||
parser.add_argument("--dry-run", action="store_true", help="Don't run any commands, just print them")
|
||||
subparsers = parser.add_subparsers(help="Action to perform", dest="command", required=True)
|
||||
build_parser = subparsers.add_parser("build", help="Build the image")
|
||||
build_parser.add_argument("--push", help="Also push the images", action="store_true")
|
||||
build_parser.add_argument(
|
||||
"--load", help="Load the docker image locally", action="store_true"
|
||||
)
|
||||
manifest_parser = subparsers.add_parser(
|
||||
"manifest", help="Create a manifest from already pushed images"
|
||||
)
|
||||
manifest_parser = subparsers.add_parser("manifest", help="Create a manifest from already pushed images")
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class DockerParams:
|
||||
build_to: str
|
||||
manifest_to: str
|
||||
build_type: str
|
||||
baseimgtype: str
|
||||
platform: str
|
||||
target: str
|
||||
|
||||
@@ -63,22 +48,28 @@ class DockerParams:
|
||||
prefix = {
|
||||
TYPE_DOCKER: "esphome/esphome",
|
||||
TYPE_HA_ADDON: "esphome/esphome-hassio",
|
||||
TYPE_LINT: "esphome/esphome-lint",
|
||||
TYPE_LINT: "esphome/esphome-lint"
|
||||
}[build_type]
|
||||
build_to = f"{prefix}-{arch}"
|
||||
baseimgtype = {
|
||||
TYPE_DOCKER: "docker",
|
||||
TYPE_HA_ADDON: "hassio",
|
||||
TYPE_LINT: "docker",
|
||||
}[build_type]
|
||||
platform = {
|
||||
ARCH_AMD64: "linux/amd64",
|
||||
ARCH_ARMV7: "linux/arm/v7",
|
||||
ARCH_AARCH64: "linux/arm64",
|
||||
}[arch]
|
||||
target = {
|
||||
TYPE_DOCKER: "final",
|
||||
TYPE_HA_ADDON: "final",
|
||||
TYPE_DOCKER: "docker",
|
||||
TYPE_HA_ADDON: "hassio",
|
||||
TYPE_LINT: "lint",
|
||||
}[build_type]
|
||||
return cls(
|
||||
build_to=build_to,
|
||||
manifest_to=prefix,
|
||||
build_type=build_type,
|
||||
baseimgtype=baseimgtype,
|
||||
platform=platform,
|
||||
target=target,
|
||||
)
|
||||
@@ -90,18 +81,16 @@ def main():
|
||||
def run_command(*cmd, ignore_error: bool = False):
|
||||
print(f"$ {shlex.join(list(cmd))}")
|
||||
if not args.dry_run:
|
||||
rc = subprocess.call(list(cmd), close_fds=False)
|
||||
rc = subprocess.call(list(cmd))
|
||||
if rc != 0 and not ignore_error:
|
||||
print("Command failed")
|
||||
sys.exit(1)
|
||||
|
||||
# detect channel from tag
|
||||
match = re.match(r"^(\d+\.\d+)(?:\.\d+)?(b\d+)?$", args.tag)
|
||||
major_minor_version = None
|
||||
match = re.match(r'^\d+\.\d+(?:\.\d+)?(b\d+)?$', args.tag)
|
||||
if match is None:
|
||||
channel = CHANNEL_DEV
|
||||
elif match.group(2) is None:
|
||||
major_minor_version = match.group(1)
|
||||
elif match.group(1) is None:
|
||||
channel = CHANNEL_RELEASE
|
||||
else:
|
||||
channel = CHANNEL_BETA
|
||||
@@ -116,11 +105,6 @@ def main():
|
||||
tags_to_push.append("beta")
|
||||
tags_to_push.append("latest")
|
||||
|
||||
# Compatibility with HA tags
|
||||
if major_minor_version:
|
||||
tags_to_push.append("stable")
|
||||
tags_to_push.append(major_minor_version)
|
||||
|
||||
if args.command == "build":
|
||||
# 1. pull cache image
|
||||
params = DockerParams.for_type_arch(args.build_type, args.arch)
|
||||
@@ -136,28 +120,18 @@ def main():
|
||||
|
||||
# 3. build
|
||||
cmd = [
|
||||
"docker",
|
||||
"buildx",
|
||||
"build",
|
||||
"--build-arg",
|
||||
f"BUILD_TYPE={params.build_type}",
|
||||
"--build-arg",
|
||||
f"BUILD_VERSION={args.tag}",
|
||||
"--cache-from",
|
||||
f"type=registry,ref={cache_img}",
|
||||
"--file",
|
||||
"docker/Dockerfile",
|
||||
"--platform",
|
||||
params.platform,
|
||||
"--target",
|
||||
params.target,
|
||||
"docker", "buildx", "build",
|
||||
"--build-arg", f"BASEIMGTYPE={params.baseimgtype}",
|
||||
"--build-arg", f"BUILD_VERSION={args.tag}",
|
||||
"--cache-from", f"type=registry,ref={cache_img}",
|
||||
"--file", "docker/Dockerfile",
|
||||
"--platform", params.platform,
|
||||
"--target", params.target,
|
||||
]
|
||||
for img in imgs:
|
||||
cmd += ["--tag", img]
|
||||
if args.push:
|
||||
cmd += ["--push", "--cache-to", f"type=registry,ref={cache_img},mode=max"]
|
||||
if args.load:
|
||||
cmd += ["--load"]
|
||||
|
||||
run_command(*cmd, ".")
|
||||
elif args.command == "manifest":
|
||||
@@ -176,7 +150,9 @@ def main():
|
||||
run_command(*cmd)
|
||||
# 2. Push manifests
|
||||
for target in targets:
|
||||
run_command("docker", "manifest", "push", target)
|
||||
run_command(
|
||||
"docker", "manifest", "push", target
|
||||
)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
@@ -1,4 +1,4 @@
|
||||
#!/usr/bin/env bash
|
||||
#!/bin/bash
|
||||
|
||||
# If /cache is mounted, use that as PIO's coredir
|
||||
# otherwise use path in /config (so that PIO packages aren't downloaded on each compile)
|
||||
@@ -21,10 +21,4 @@ export PLATFORMIO_PLATFORMS_DIR="${pio_cache_base}/platforms"
|
||||
export PLATFORMIO_PACKAGES_DIR="${pio_cache_base}/packages"
|
||||
export PLATFORMIO_CACHE_DIR="${pio_cache_base}/cache"
|
||||
|
||||
# If /build is mounted, use that as the build path
|
||||
# otherwise use path in /config (so that builds aren't lost on container restart)
|
||||
if [[ -d /build ]]; then
|
||||
export ESPHOME_BUILD_PATH=/build
|
||||
fi
|
||||
|
||||
exec esphome "$@"
|
||||
|
@@ -1,92 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
import argparse
|
||||
import re
|
||||
|
||||
CHANNEL_DEV = "dev"
|
||||
CHANNEL_BETA = "beta"
|
||||
CHANNEL_RELEASE = "release"
|
||||
|
||||
GHCR = "ghcr"
|
||||
DOCKERHUB = "dockerhub"
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
"--tag",
|
||||
type=str,
|
||||
required=True,
|
||||
help="The main docker tag to push to. If a version number also adds latest and/or beta tag",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--suffix",
|
||||
type=str,
|
||||
required=True,
|
||||
help="The suffix of the tag.",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--registry",
|
||||
type=str,
|
||||
choices=[GHCR, DOCKERHUB],
|
||||
required=False,
|
||||
action="append",
|
||||
help="The registry to build tags for.",
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
args = parser.parse_args()
|
||||
|
||||
# detect channel from tag
|
||||
match = re.match(r"^(\d+\.\d+)(?:\.\d+)(?:(b\d+)|(-dev\d+))?$", args.tag)
|
||||
major_minor_version = None
|
||||
if match is None: # eg 2023.12.0-dev20231109-testbranch
|
||||
channel = None # Ran with custom tag for a branch etc
|
||||
elif match.group(3) is not None: # eg 2023.12.0-dev20231109
|
||||
channel = CHANNEL_DEV
|
||||
elif match.group(2) is not None: # eg 2023.12.0b1
|
||||
channel = CHANNEL_BETA
|
||||
else: # eg 2023.12.0
|
||||
major_minor_version = match.group(1)
|
||||
channel = CHANNEL_RELEASE
|
||||
|
||||
tags_to_push = [args.tag]
|
||||
if channel == CHANNEL_DEV:
|
||||
tags_to_push.append("dev")
|
||||
elif channel == CHANNEL_BETA:
|
||||
tags_to_push.append("beta")
|
||||
elif channel == CHANNEL_RELEASE:
|
||||
# Additionally push to beta
|
||||
tags_to_push.append("beta")
|
||||
tags_to_push.append("latest")
|
||||
|
||||
if major_minor_version:
|
||||
tags_to_push.append("stable")
|
||||
tags_to_push.append(major_minor_version)
|
||||
|
||||
suffix = f"-{args.suffix}" if args.suffix else ""
|
||||
|
||||
image_name = f"esphome/esphome{suffix}"
|
||||
|
||||
print(f"channel={channel}")
|
||||
|
||||
if args.registry is None:
|
||||
args.registry = [GHCR, DOCKERHUB]
|
||||
elif len(args.registry) == 1:
|
||||
if GHCR in args.registry:
|
||||
print(f"image=ghcr.io/{image_name}")
|
||||
if DOCKERHUB in args.registry:
|
||||
print(f"image=docker.io/{image_name}")
|
||||
|
||||
print(f"image_name={image_name}")
|
||||
|
||||
full_tags = []
|
||||
|
||||
for tag in tags_to_push:
|
||||
if GHCR in args.registry:
|
||||
full_tags += [f"ghcr.io/{image_name}:{tag}"]
|
||||
if DOCKERHUB in args.registry:
|
||||
full_tags += [f"docker.io/{image_name}:{tag}"]
|
||||
print(f"tags={','.join(full_tags)}")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
@@ -1,47 +0,0 @@
|
||||
#!/usr/bin/with-contenv bashio
|
||||
# ==============================================================================
|
||||
# This file installs the user ESPHome fork if specified.
|
||||
# The fork must be up to date with the latest ESPHome dev branch
|
||||
# and have no conflicts.
|
||||
# This config option only exists in the ESPHome Dev add-on.
|
||||
# ==============================================================================
|
||||
|
||||
declare esphome_fork
|
||||
|
||||
if bashio::config.has_value 'esphome_fork'; then
|
||||
esphome_fork=$(bashio::config 'esphome_fork')
|
||||
# format: [username][/repository]:ref
|
||||
if [[ "$esphome_fork" =~ ^(([^/]+)(/([^:]+))?:)?([^:/]+)$ ]]; then
|
||||
username="${BASH_REMATCH[2]:-esphome}"
|
||||
repository="${BASH_REMATCH[4]:-esphome}"
|
||||
ref="${BASH_REMATCH[5]}"
|
||||
else
|
||||
bashio::exit.nok "Invalid esphome_fork format: $esphome_fork"
|
||||
fi
|
||||
full_url="https://github.com/${username}/${repository}/archive/${ref}.tar.gz"
|
||||
bashio::log.info "Checking forked ESPHome"
|
||||
dev_version=$(python3 -c "from esphome.const import __version__; print(__version__)")
|
||||
bashio::log.info "Downloading ESPHome from fork '${esphome_fork}' (${full_url})..."
|
||||
curl -L -o /tmp/esphome.tar.gz "${full_url}" -qq ||
|
||||
bashio::exit.nok "Failed downloading ESPHome fork."
|
||||
bashio::log.info "Installing ESPHome from fork '${esphome_fork}' (${full_url})..."
|
||||
rm -rf /esphome || bashio::exit.nok "Failed to remove ESPHome."
|
||||
mkdir /esphome
|
||||
tar -zxf /tmp/esphome.tar.gz -C /esphome --strip-components=1 ||
|
||||
bashio::exit.nok "Failed installing ESPHome from fork."
|
||||
pip install -U -e /esphome || bashio::exit.nok "Failed installing ESPHome from fork."
|
||||
rm -f /tmp/esphome.tar.gz
|
||||
fork_version=$(python3 -c "from esphome.const import __version__; print(__version__)")
|
||||
|
||||
if [[ "$fork_version" != "$dev_version" ]]; then
|
||||
bashio::log.error "############################"
|
||||
bashio::log.error "Uninstalled fork as version does not match"
|
||||
bashio::log.error "Update (or ask the author to update) the branch"
|
||||
bashio::log.error "This is important as the dev addon and the dev ESPHome"
|
||||
bashio::log.error "branch can have changes that are not compatible with old forks"
|
||||
bashio::log.error "and get reported as bugs which we cannot solve easily."
|
||||
bashio::log.error "############################"
|
||||
bashio::exit.nok
|
||||
fi
|
||||
bashio::log.info "Installed ESPHome from fork '${esphome_fork}' (${full_url})..."
|
||||
fi
|
@@ -1,8 +0,0 @@
|
||||
ssl_protocols TLSv1.2 TLSv1.3;
|
||||
ssl_prefer_server_ciphers off;
|
||||
ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384;
|
||||
ssl_session_timeout 10m;
|
||||
ssl_session_cache shared:SSL:10m;
|
||||
ssl_session_tickets off;
|
||||
ssl_stapling on;
|
||||
ssl_stapling_verify on;
|
@@ -1,3 +0,0 @@
|
||||
upstream esphome {
|
||||
server unix:/var/run/esphome.sock;
|
||||
}
|
@@ -1 +0,0 @@
|
||||
Without requirements or design, programming is the art of adding bugs to an empty text file. (Louis Srygley)
|
@@ -1,32 +0,0 @@
|
||||
#!/command/with-contenv bashio
|
||||
# shellcheck shell=bash
|
||||
# ==============================================================================
|
||||
# Home Assistant Add-on: ESPHome
|
||||
# Sends discovery information to Home Assistant.
|
||||
# ==============================================================================
|
||||
declare config
|
||||
declare port
|
||||
|
||||
# We only disable it when disabled explicitly
|
||||
if bashio::config.false 'home_assistant_dashboard_integration';
|
||||
then
|
||||
bashio::log.info "Home Assistant discovery is disabled for this add-on."
|
||||
bashio::exit.ok
|
||||
fi
|
||||
|
||||
port=$(bashio::addon.ingress_port)
|
||||
|
||||
# Wait for NGINX to become available
|
||||
bashio::net.wait_for "${port}" "127.0.0.1" 300
|
||||
|
||||
config=$(\
|
||||
bashio::var.json \
|
||||
host "127.0.0.1" \
|
||||
port "^${port}" \
|
||||
)
|
||||
|
||||
if bashio::discovery "esphome" "${config}" > /dev/null; then
|
||||
bashio::log.info "Successfully send discovery information to Home Assistant."
|
||||
else
|
||||
bashio::log.error "Discovery message to Home Assistant failed!"
|
||||
fi
|
@@ -1 +0,0 @@
|
||||
oneshot
|
@@ -1 +0,0 @@
|
||||
/etc/s6-overlay/s6-rc.d/discovery/run
|
@@ -1,26 +0,0 @@
#!/command/with-contenv bashio
# shellcheck shell=bash
# ==============================================================================
# Home Assistant Community Add-on: ESPHome
# Take down the S6 supervision tree when ESPHome dashboard fails
# ==============================================================================
declare exit_code
readonly exit_code_container=$(</run/s6-linux-init-container-results/exitcode)
readonly exit_code_service="${1}"
readonly exit_code_signal="${2}"

bashio::log.info \
    "Service ESPHome dashboard exited with code ${exit_code_service}" \
    "(by signal ${exit_code_signal})"

if [[ "${exit_code_service}" -eq 256 ]]; then
    if [[ "${exit_code_container}" -eq 0 ]]; then
        echo $((128 + $exit_code_signal)) > /run/s6-linux-init-container-results/exitcode
    fi
    [[ "${exit_code_signal}" -eq 15 ]] && exec /run/s6/basedir/bin/halt
elif [[ "${exit_code_service}" -ne 0 ]]; then
    if [[ "${exit_code_container}" -eq 0 ]]; then
        echo "${exit_code_service}" > /run/s6-linux-init-container-results/exitcode
    fi
    exec /run/s6/basedir/bin/halt
fi
@@ -1 +0,0 @@
longrun
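The finish scripts in this diff share one convention: an S6 exit status of 256 means the service was killed by a signal, in which case the container exit code becomes 128 plus the signal number, and SIGTERM (15) additionally halts the supervision tree. A minimal sketch of that mapping (hypothetical helper, assuming the stored container exit code is still 0):

def container_exit_code(exit_code_service: int, exit_code_signal: int) -> int:
    # 256 is S6 shorthand for "terminated by a signal"
    if exit_code_service == 256:
        return 128 + exit_code_signal
    return exit_code_service

assert container_exit_code(256, 15) == 143  # SIGTERM; the script then execs halt
assert container_exit_code(1, 0) == 1       # an ordinary failure is propagated as-is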
@@ -1,27 +0,0 @@
#!/command/with-contenv bashio
# shellcheck shell=bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Configures NGINX for use with ESPHome
# ==============================================================================
mkdir -p /var/log/nginx

# Generate Ingress configuration
bashio::var.json \
    interface "$(bashio::addon.ip_address)" \
    port "^$(bashio::addon.ingress_port)" \
    | tempio \
        -template /etc/nginx/templates/ingress.gtpl \
        -out /etc/nginx/servers/ingress.conf

# Generate direct access configuration, if enabled.
if bashio::var.has_value "$(bashio::addon.port 6052)"; then
    bashio::config.require.ssl
    bashio::var.json \
        certfile "$(bashio::config 'certfile')" \
        keyfile "$(bashio::config 'keyfile')" \
        ssl "^$(bashio::config 'ssl')" \
        | tempio \
            -template /etc/nginx/templates/direct.gtpl \
            -out /etc/nginx/servers/direct.conf
fi
@@ -1 +0,0 @@
oneshot
@@ -1 +0,0 @@
/etc/s6-overlay/s6-rc.d/init-nginx/run
@@ -1,25 +0,0 @@
#!/command/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Take down the S6 supervision tree when NGINX fails
# ==============================================================================
declare exit_code
readonly exit_code_container=$(</run/s6-linux-init-container-results/exitcode)
readonly exit_code_service="${1}"
readonly exit_code_signal="${2}"

bashio::log.info \
    "Service NGINX exited with code ${exit_code_service}" \
    "(by signal ${exit_code_signal})"

if [[ "${exit_code_service}" -eq 256 ]]; then
    if [[ "${exit_code_container}" -eq 0 ]]; then
        echo $((128 + $exit_code_signal)) > /run/s6-linux-init-container-results/exitcode
    fi
    [[ "${exit_code_signal}" -eq 15 ]] && exec /run/s6/basedir/bin/halt
elif [[ "${exit_code_service}" -ne 0 ]]; then
    if [[ "${exit_code_container}" -eq 0 ]]; then
        echo "${exit_code_service}" > /run/s6-linux-init-container-results/exitcode
    fi
    exec /run/s6/basedir/bin/halt
fi
@@ -1 +0,0 @@
longrun
docker/hassio-rootfs/etc/cont-init.d/10-requirements.sh (new executable file, 41 lines)
@@ -0,0 +1,41 @@
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# This file checks if all user configuration requirements are met
# ==============================================================================

# Check SSL requirements, if enabled
if bashio::config.true 'ssl'; then
    if ! bashio::config.has_value 'certfile'; then
        bashio::fatal 'SSL is enabled, but no certfile was specified.'
        bashio::exit.nok
    fi

    if ! bashio::config.has_value 'keyfile'; then
        bashio::fatal 'SSL is enabled, but no keyfile was specified.'
        bashio::exit.nok
    fi

    certfile="/ssl/$(bashio::config 'certfile')"
    keyfile="/ssl/$(bashio::config 'keyfile')"

    if ! bashio::fs.file_exists "${certfile}"; then
        if ! bashio::fs.file_exists "${keyfile}"; then
            # Both files are missing, let's print a friendlier error message
            bashio::log.fatal 'You enabled encrypted connections using the "ssl": true option.'
            bashio::log.fatal "However, the SSL files '${certfile}' and '${keyfile}'"
            bashio::log.fatal "were not found. If you're using Hass.io on your local network and don't want"
            bashio::log.fatal 'to encrypt connections to the ESPHome dashboard, you can manually disable'
            bashio::log.fatal 'SSL by setting "ssl" to false.'
            bashio::exit.nok
        fi
        bashio::log.fatal "The configured certfile '${certfile}' was not found."
        bashio::exit.nok
    fi

    if ! bashio::fs.file_exists "/ssl/$(bashio::config 'keyfile')"; then
        bashio::log.fatal "The configured keyfile '${keyfile}' was not found."
        bashio::exit.nok
    fi
fi
docker/hassio-rootfs/etc/cont-init.d/20-nginx.sh (new executable file, 34 lines)
@@ -0,0 +1,34 @@
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Configures NGINX for use with ESPHome
# ==============================================================================

declare certfile
declare keyfile
declare direct_port
declare ingress_interface
declare ingress_port

mkdir -p /var/log/nginx

direct_port=$(bashio::addon.port 6052)
if bashio::var.has_value "${direct_port}"; then
    if bashio::config.true 'ssl'; then
        certfile=$(bashio::config 'certfile')
        keyfile=$(bashio::config 'keyfile')

        mv /etc/nginx/servers/direct-ssl.disabled /etc/nginx/servers/direct.conf
        sed -i "s/%%certfile%%/${certfile}/g" /etc/nginx/servers/direct.conf
        sed -i "s/%%keyfile%%/${keyfile}/g" /etc/nginx/servers/direct.conf
    else
        mv /etc/nginx/servers/direct.disabled /etc/nginx/servers/direct.conf
    fi

    sed -i "s/%%port%%/${direct_port}/g" /etc/nginx/servers/direct.conf
fi

ingress_port=$(bashio::addon.ingress_port)
ingress_interface=$(bashio::addon.ip_address)
sed -i "s/%%port%%/${ingress_port}/g" /etc/nginx/servers/ingress.conf
sed -i "s/%%interface%%/${ingress_interface}/g" /etc/nginx/servers/ingress.conf
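A minimal sketch (hypothetical values) of the %%placeholder%% substitution this cont-init script performs with sed on the nginx server templates:

# Same effect as `sed -i "s/%%port%%/${ingress_port}/g" ...`, shown in Python.
template = "listen %%interface%%:%%port%% default_server;"
values = {"interface": "172.30.33.1", "port": "62052"}  # hypothetical ingress values
for key, value in values.items():
    template = template.replace(f"%%{key}%%", value)
print(template)  # listen 172.30.33.1:62052 default_server;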
docker/hassio-rootfs/etc/cont-init.d/30-dirs.sh (new file, 9 lines)
@@ -0,0 +1,9 @@
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# This file creates all directories used by esphome
# ==============================================================================

pio_cache_base=/data/cache/platformio

mkdir -p "${pio_cache_base}"
@@ -1,9 +1,9 @@
proxy_http_version 1.1;
proxy_ignore_client_abort off;
proxy_read_timeout 86400s;
proxy_redirect off;
proxy_send_timeout 86400s;
proxy_max_temp_file_size 0;

proxy_set_header Accept-Encoding "";
proxy_set_header Connection $connection_upgrade;
@@ -1,7 +1,5 @@
root /dev/null;
server_name $hostname;

client_max_body_size 512m;

add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
docker/hassio-rootfs/etc/nginx/includes/ssl_params.conf (new file, 9 lines)
@@ -0,0 +1,9 @@
ssl_protocols TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:DHE-RSA-AES256-SHA;
ssl_ecdh_curve secp384r1;
ssl_session_timeout 10m;
ssl_session_cache shared:SSL:10m;
ssl_session_tickets off;
ssl_stapling on;
ssl_stapling_verify on;
@@ -2,6 +2,7 @@ daemon off;
user root;
pid /var/run/nginx.pid;
worker_processes 1;
# Hass.io addon log
error_log /proc/1/fd/1 error;
events {
    worker_connections 1024;
@@ -9,22 +10,24 @@ events {

http {
    include /etc/nginx/includes/mime.types;

    access_log off;
    default_type application/octet-stream;
    gzip on;
    keepalive_timeout 65;
    sendfile on;
    server_tokens off;

    tcp_nodelay on;
    tcp_nopush on;
    access_log stdout;
    default_type application/octet-stream;
    gzip on;
    keepalive_timeout 65;
    sendfile on;
    server_tokens off;

    map $http_upgrade $connection_upgrade {
        default upgrade;
        '' close;
    }

    include /etc/nginx/includes/upstream.conf;
    # Use Hass.io supervisor as resolver
    resolver 172.30.32.2;

    upstream esphome {
        server unix:/var/run/esphome.sock;
    }

    include /etc/nginx/servers/*.conf;
}
@@ -1,26 +1,20 @@
server {
    {{ if not .ssl }}
    listen 6052 default_server;
    {{ else }}
    listen 6052 default_server ssl http2;
    {{ end }}
    listen %%port%% default_server ssl http2;

    include /etc/nginx/includes/server_params.conf;
    include /etc/nginx/includes/proxy_params.conf;

    {{ if .ssl }}
    include /etc/nginx/includes/ssl_params.conf;

    ssl_certificate /ssl/{{ .certfile }};
    ssl_certificate_key /ssl/{{ .keyfile }};
    ssl on;
    ssl_certificate /ssl/%%certfile%%;
    ssl_certificate_key /ssl/%%keyfile%%;

    # Clear Hass.io Ingress header
    proxy_set_header X-Hassio-Ingress "";

    # Redirect http requests to https on the same port.
    # https://rageagainstshell.com/2016/11/redirect-http-to-https-on-the-same-port-in-nginx/
    error_page 497 https://$http_host$request_uri;
    {{ end }}

    # Clear Home Assistant Ingress header
    proxy_set_header X-HA-Ingress "";

    location / {
        proxy_pass http://esphome;
docker/hassio-rootfs/etc/nginx/servers/direct.disabled (new file, 12 lines)
@@ -0,0 +1,12 @@
server {
    listen %%port%% default_server;

    include /etc/nginx/includes/server_params.conf;
    include /etc/nginx/includes/proxy_params.conf;
    # Clear Hass.io Ingress header
    proxy_set_header X-Hassio-Ingress "";

    location / {
        proxy_pass http://esphome;
    }
}
@@ -1,16 +1,14 @@
server {
    listen 127.0.0.1:{{ .port }} default_server;
    listen {{ .interface }}:{{ .port }} default_server;
    listen %%interface%%:%%port%% default_server;

    include /etc/nginx/includes/server_params.conf;
    include /etc/nginx/includes/proxy_params.conf;

    # Set Home Assistant Ingress header
    proxy_set_header X-HA-Ingress "YES";
    # Set Hass.io Ingress header
    proxy_set_header X-Hassio-Ingress "YES";

    location / {
        # Only allow from Hass.io supervisor
        allow 172.30.32.2;
        allow 127.0.0.1;
        deny all;

        proxy_pass http://esphome;
docker/hassio-rootfs/etc/services.d/esphome/finish (new executable file, 9 lines)
@@ -0,0 +1,9 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Take down the S6 supervision tree when ESPHome fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services
@@ -1,19 +1,10 @@
#!/command/with-contenv bashio
# shellcheck shell=bash
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Runs the ESPHome dashboard
# ==============================================================================
readonly pio_cache_base=/data/cache/platformio

export ESPHOME_IS_HA_ADDON=true
export PLATFORMIO_GLOBALLIB_DIR=/piolibs

# we can't set core_dir, because the settings file is stored in `core_dir/appstate.json`
# setting `core_dir` would therefore prevent pio from accessing
export PLATFORMIO_PLATFORMS_DIR="${pio_cache_base}/platforms"
export PLATFORMIO_PACKAGES_DIR="${pio_cache_base}/packages"
export PLATFORMIO_CACHE_DIR="${pio_cache_base}/cache"
export ESPHOME_IS_HASSIO=true

if bashio::config.true 'leave_front_door_open'; then
    export DISABLE_HA_AUTHENTICATION=true
@@ -23,31 +14,22 @@ if bashio::config.true 'streamer_mode'; then
    export ESPHOME_STREAMER_MODE=true
fi

if bashio::config.true 'status_use_ping'; then
    export ESPHOME_DASHBOARD_USE_PING=true
fi

if bashio::config.has_value 'relative_url'; then
    export ESPHOME_DASHBOARD_RELATIVE_URL=$(bashio::config 'relative_url')
fi

if bashio::config.has_value 'default_compile_process_limit'; then
    export ESPHOME_DEFAULT_COMPILE_PROCESS_LIMIT=$(bashio::config 'default_compile_process_limit')
else
    if grep -q 'Raspberry Pi 3' /proc/cpuinfo; then
        export ESPHOME_DEFAULT_COMPILE_PROCESS_LIMIT=1
    fi
fi
pio_cache_base=/data/cache/platformio
# we can't set core_dir, because the settings file is stored in `core_dir/appstate.json`
# setting `core_dir` would therefore prevent pio from accessing
export PLATFORMIO_PLATFORMS_DIR="${pio_cache_base}/platforms"
export PLATFORMIO_PACKAGES_DIR="${pio_cache_base}/packages"
export PLATFORMIO_CACHE_DIR="${pio_cache_base}/cache"

mkdir -p "${pio_cache_base}"

mkdir -p /config/esphome

if bashio::fs.directory_exists '/config/esphome/.esphome'; then
    bashio::log.info "Migrating old .esphome directory..."
    if bashio::fs.file_exists '/config/esphome/.esphome/esphome.json'; then
        mv /config/esphome/.esphome/esphome.json /data/esphome.json
    fi
    mkdir -p "/data/storage"
    mv /config/esphome/.esphome/*.json /data/storage/ || true
    rm -rf /config/esphome/.esphome
fi
export PLATFORMIO_GLOBALLIB_DIR=/piolibs

bashio::log.info "Starting ESPHome dashboard..."
exec esphome dashboard /config/esphome --socket /var/run/esphome.sock --ha-addon
exec esphome dashboard /config/esphome --socket /var/run/esphome.sock --hassio
docker/hassio-rootfs/etc/services.d/nginx/finish (new executable file, 9 lines)
@@ -0,0 +1,9 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Take down the S6 supervision tree when NGINX fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services
@@ -1,11 +1,10 @@
#!/command/with-contenv bashio
# shellcheck shell=bash
#!/usr/bin/with-contenv bashio
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Runs the NGINX proxy
# ==============================================================================

bashio::log.info "Waiting for ESPHome dashboard to come up..."
bashio::log.info "Waiting for dashboard to come up..."

while [[ ! -S /var/run/esphome.sock ]]; do
    sleep 0.5
docker/platformio_install_deps.py (new executable file, 30 lines)
@@ -0,0 +1,30 @@
#!/usr/bin/env python3
# This script is used in the docker containers to preinstall
# all platformio libraries in the global storage

import configparser
import subprocess
import sys

config = configparser.ConfigParser(inline_comment_prefixes=(';', ))
config.read(sys.argv[1])

libs = []
# Extract from every lib_deps key in all sections
for section in config.sections():
    conf = config[section]
    if "lib_deps" not in conf:
        continue
    for lib_dep in conf["lib_deps"].splitlines():
        if not lib_dep:
            # Empty line or comment
            continue
        if lib_dep.startswith("${"):
            # Extending from another section
            continue
        if "@" not in lib_dep:
            # No version pinned, this is an internal lib
            continue
        libs.append(lib_dep)

subprocess.check_call(['platformio', 'lib', '-g', 'install', *libs])
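To illustrate which lib_deps entries the filter above keeps: only lines that carry a pinned version (an "@" in the spec) are installed, while empty lines, ${...} cross-references, and unpinned internal libraries are skipped. A small sketch with a made-up lib_deps value (the version pin is invented for the example):

lib_deps = """
ESPAsyncWebServer-esphome@3.1.0
${common.lib_deps}
some_internal_lib
"""
kept = [
    dep
    for dep in lib_deps.splitlines()
    if dep and not dep.startswith("${") and "@" in dep
]
print(kept)  # ['ESPAsyncWebServer-esphome@3.1.0'] is what gets passed to platformio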
esphome/__main__.py (1060 lines). File diff suppressed because it is too large.
@@ -1,142 +0,0 @@
"""Address cache for DNS and mDNS lookups."""

from __future__ import annotations

import logging
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from collections.abc import Iterable

_LOGGER = logging.getLogger(__name__)


def normalize_hostname(hostname: str) -> str:
    """Normalize hostname for cache lookups.

    Removes trailing dots and converts to lowercase.
    """
    return hostname.rstrip(".").lower()


class AddressCache:
    """Cache for DNS and mDNS address lookups.

    This cache stores pre-resolved addresses from command-line arguments
    to avoid slow DNS/mDNS lookups during builds.
    """

    def __init__(
        self,
        mdns_cache: dict[str, list[str]] | None = None,
        dns_cache: dict[str, list[str]] | None = None,
    ) -> None:
        """Initialize the address cache.

        Args:
            mdns_cache: Pre-populated mDNS addresses (hostname -> IPs)
            dns_cache: Pre-populated DNS addresses (hostname -> IPs)
        """
        self.mdns_cache = mdns_cache or {}
        self.dns_cache = dns_cache or {}

    def _get_cached_addresses(
        self, hostname: str, cache: dict[str, list[str]], cache_type: str
    ) -> list[str] | None:
        """Get cached addresses from a specific cache.

        Args:
            hostname: The hostname to look up
            cache: The cache dictionary to check
            cache_type: Type of cache for logging ("mDNS" or "DNS")

        Returns:
            List of IP addresses if found in cache, None otherwise
        """
        normalized = normalize_hostname(hostname)
        if addresses := cache.get(normalized):
            _LOGGER.debug("Using %s cache for %s: %s", cache_type, hostname, addresses)
            return addresses
        return None

    def get_mdns_addresses(self, hostname: str) -> list[str] | None:
        """Get cached mDNS addresses for a hostname.

        Args:
            hostname: The hostname to look up (should end with .local)

        Returns:
            List of IP addresses if found in cache, None otherwise
        """
        return self._get_cached_addresses(hostname, self.mdns_cache, "mDNS")

    def get_dns_addresses(self, hostname: str) -> list[str] | None:
        """Get cached DNS addresses for a hostname.

        Args:
            hostname: The hostname to look up

        Returns:
            List of IP addresses if found in cache, None otherwise
        """
        return self._get_cached_addresses(hostname, self.dns_cache, "DNS")

    def get_addresses(self, hostname: str) -> list[str] | None:
        """Get cached addresses for a hostname.

        Checks mDNS cache for .local domains, DNS cache otherwise.

        Args:
            hostname: The hostname to look up

        Returns:
            List of IP addresses if found in cache, None otherwise
        """
        normalized = normalize_hostname(hostname)
        if normalized.endswith(".local"):
            return self.get_mdns_addresses(hostname)
        return self.get_dns_addresses(hostname)

    def has_cache(self) -> bool:
        """Check if any cache entries exist."""
        return bool(self.mdns_cache or self.dns_cache)

    @classmethod
    def from_cli_args(
        cls, mdns_args: Iterable[str], dns_args: Iterable[str]
    ) -> AddressCache:
        """Create cache from command-line arguments.

        Args:
            mdns_args: List of mDNS cache entries like ['host=ip1,ip2']
            dns_args: List of DNS cache entries like ['host=ip1,ip2']

        Returns:
            Configured AddressCache instance
        """
        mdns_cache = cls._parse_cache_args(mdns_args)
        dns_cache = cls._parse_cache_args(dns_args)
        return cls(mdns_cache=mdns_cache, dns_cache=dns_cache)

    @staticmethod
    def _parse_cache_args(cache_args: Iterable[str]) -> dict[str, list[str]]:
        """Parse cache arguments into a dictionary.

        Args:
            cache_args: List of cache mappings like ['host1=ip1,ip2', 'host2=ip3']

        Returns:
            Dictionary mapping normalized hostnames to list of IP addresses
        """
        cache: dict[str, list[str]] = {}
        for arg in cache_args:
            if "=" not in arg:
                _LOGGER.warning(
                    "Invalid cache format: %s (expected 'hostname=ip1,ip2')", arg
                )
                continue
            hostname, ips = arg.split("=", 1)
            # Normalize hostname for consistent lookups
            normalized = normalize_hostname(hostname)
            cache[normalized] = [ip.strip() for ip in ips.split(",")]
        return cache
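Per the docstrings above, the cache is built from hostname=ip1,ip2 style command-line arguments, and names ending in .local are routed to the mDNS cache. A minimal usage sketch (hypothetical host names and addresses; the module is removed by this diff, so this is purely illustrative):

cache = AddressCache.from_cli_args(
    mdns_args=["my-device.local=192.168.1.23"],
    dns_args=["printer.example.com=10.0.0.5"],
)
assert cache.get_addresses("MY-DEVICE.local.") == ["192.168.1.23"]  # trailing dot and case are normalized
assert cache.get_addresses("printer.example.com") == ["10.0.0.5"]
assert cache.get_addresses("unknown.local") is None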
@@ -1,502 +0,0 @@
|
||||
"""Memory usage analyzer for ESPHome compiled binaries."""
|
||||
|
||||
from collections import defaultdict
|
||||
from dataclasses import dataclass, field
|
||||
import logging
|
||||
from pathlib import Path
|
||||
import re
|
||||
import subprocess
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from .const import (
|
||||
CORE_SUBCATEGORY_PATTERNS,
|
||||
DEMANGLED_PATTERNS,
|
||||
ESPHOME_COMPONENT_PATTERN,
|
||||
SECTION_TO_ATTR,
|
||||
SYMBOL_PATTERNS,
|
||||
)
|
||||
from .helpers import (
|
||||
get_component_class_patterns,
|
||||
get_esphome_components,
|
||||
map_section_name,
|
||||
parse_symbol_line,
|
||||
)
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from esphome.platformio_api import IDEData
|
||||
|
||||
_LOGGER = logging.getLogger(__name__)
|
||||
|
||||
# GCC global constructor/destructor prefix annotations
|
||||
_GCC_PREFIX_ANNOTATIONS = {
|
||||
"_GLOBAL__sub_I_": "global constructor for",
|
||||
"_GLOBAL__sub_D_": "global destructor for",
|
||||
}
|
||||
|
||||
# GCC optimization suffix pattern (e.g., $isra$0, $part$1, $constprop$2)
|
||||
_GCC_OPTIMIZATION_SUFFIX_PATTERN = re.compile(r"(\$(?:isra|part|constprop)\$\d+)")
|
||||
|
||||
# C++ runtime patterns for categorization
|
||||
_CPP_RUNTIME_PATTERNS = frozenset(["vtable", "typeinfo", "thunk"])
|
||||
|
||||
# libc printf/scanf family base names (used to detect variants like _printf_r, vfprintf, etc.)
|
||||
_LIBC_PRINTF_SCANF_FAMILY = frozenset(["printf", "fprintf", "sprintf", "scanf"])
|
||||
|
||||
# Regex pattern for parsing readelf section headers
|
||||
# Format: [ #] name type addr off size
|
||||
_READELF_SECTION_PATTERN = re.compile(
|
||||
r"\s*\[\s*\d+\]\s+([\.\w]+)\s+\w+\s+[\da-fA-F]+\s+[\da-fA-F]+\s+([\da-fA-F]+)"
|
||||
)
|
||||
|
||||
# Component category prefixes
|
||||
_COMPONENT_PREFIX_ESPHOME = "[esphome]"
|
||||
_COMPONENT_PREFIX_EXTERNAL = "[external]"
|
||||
_COMPONENT_CORE = f"{_COMPONENT_PREFIX_ESPHOME}core"
|
||||
_COMPONENT_API = f"{_COMPONENT_PREFIX_ESPHOME}api"
|
||||
|
||||
# C++ namespace prefixes
|
||||
_NAMESPACE_ESPHOME = "esphome::"
|
||||
_NAMESPACE_STD = "std::"
|
||||
|
||||
# Type alias for symbol information: (symbol_name, size, component)
|
||||
SymbolInfoType = tuple[str, int, str]
|
||||
|
||||
|
||||
@dataclass
|
||||
class MemorySection:
|
||||
"""Represents a memory section with its symbols."""
|
||||
|
||||
name: str
|
||||
symbols: list[SymbolInfoType] = field(default_factory=list)
|
||||
total_size: int = 0
|
||||
|
||||
|
||||
@dataclass
|
||||
class ComponentMemory:
|
||||
"""Tracks memory usage for a component."""
|
||||
|
||||
name: str
|
||||
text_size: int = 0 # Code in flash
|
||||
rodata_size: int = 0 # Read-only data in flash
|
||||
data_size: int = 0 # Initialized data (flash + ram)
|
||||
bss_size: int = 0 # Uninitialized data (ram only)
|
||||
symbol_count: int = 0
|
||||
|
||||
@property
|
||||
def flash_total(self) -> int:
|
||||
"""Total flash usage (text + rodata + data)."""
|
||||
return self.text_size + self.rodata_size + self.data_size
|
||||
|
||||
@property
|
||||
def ram_total(self) -> int:
|
||||
"""Total RAM usage (data + bss)."""
|
||||
return self.data_size + self.bss_size
|
||||
|
||||
|
||||
class MemoryAnalyzer:
|
||||
"""Analyzes memory usage from ELF files."""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
elf_path: str,
|
||||
objdump_path: str | None = None,
|
||||
readelf_path: str | None = None,
|
||||
external_components: set[str] | None = None,
|
||||
idedata: "IDEData | None" = None,
|
||||
) -> None:
|
||||
"""Initialize memory analyzer.
|
||||
|
||||
Args:
|
||||
elf_path: Path to ELF file to analyze
|
||||
objdump_path: Path to objdump binary (auto-detected from idedata if not provided)
|
||||
readelf_path: Path to readelf binary (auto-detected from idedata if not provided)
|
||||
external_components: Set of external component names
|
||||
idedata: Optional PlatformIO IDEData object to auto-detect toolchain paths
|
||||
"""
|
||||
self.elf_path = Path(elf_path)
|
||||
if not self.elf_path.exists():
|
||||
raise FileNotFoundError(f"ELF file not found: {elf_path}")
|
||||
|
||||
# Auto-detect toolchain paths from idedata if not provided
|
||||
if idedata is not None and (objdump_path is None or readelf_path is None):
|
||||
objdump_path = objdump_path or idedata.objdump_path
|
||||
readelf_path = readelf_path or idedata.readelf_path
|
||||
_LOGGER.debug("Using toolchain paths from PlatformIO idedata")
|
||||
|
||||
self.objdump_path = objdump_path or "objdump"
|
||||
self.readelf_path = readelf_path or "readelf"
|
||||
self.external_components = external_components or set()
|
||||
|
||||
self.sections: dict[str, MemorySection] = {}
|
||||
self.components: dict[str, ComponentMemory] = defaultdict(
|
||||
lambda: ComponentMemory("")
|
||||
)
|
||||
self._demangle_cache: dict[str, str] = {}
|
||||
self._uncategorized_symbols: list[tuple[str, str, int]] = []
|
||||
self._esphome_core_symbols: list[
|
||||
tuple[str, str, int]
|
||||
] = [] # Track core symbols
|
||||
self._component_symbols: dict[str, list[tuple[str, str, int]]] = defaultdict(
|
||||
list
|
||||
) # Track symbols for all components
|
||||
|
||||
def analyze(self) -> dict[str, ComponentMemory]:
|
||||
"""Analyze the ELF file and return component memory usage."""
|
||||
self._parse_sections()
|
||||
self._parse_symbols()
|
||||
self._categorize_symbols()
|
||||
return dict(self.components)
|
||||
|
||||
def _parse_sections(self) -> None:
|
||||
"""Parse section headers from ELF file."""
|
||||
result = subprocess.run(
|
||||
[self.readelf_path, "-S", str(self.elf_path)],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True,
|
||||
)
|
||||
|
||||
# Parse section headers
|
||||
for line in result.stdout.splitlines():
|
||||
# Look for section entries
|
||||
if not (match := _READELF_SECTION_PATTERN.match(line)):
|
||||
continue
|
||||
|
||||
section_name = match.group(1)
|
||||
size_hex = match.group(2)
|
||||
size = int(size_hex, 16)
|
||||
|
||||
# Map to standard section name
|
||||
mapped_section = map_section_name(section_name)
|
||||
if not mapped_section:
|
||||
continue
|
||||
|
||||
if mapped_section not in self.sections:
|
||||
self.sections[mapped_section] = MemorySection(mapped_section)
|
||||
self.sections[mapped_section].total_size += size
|
||||
|
||||
def _parse_symbols(self) -> None:
|
||||
"""Parse symbols from ELF file."""
|
||||
result = subprocess.run(
|
||||
[self.objdump_path, "-t", str(self.elf_path)],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True,
|
||||
)
|
||||
|
||||
# Track seen addresses to avoid duplicates
|
||||
seen_addresses: set[str] = set()
|
||||
|
||||
for line in result.stdout.splitlines():
|
||||
if not (symbol_info := parse_symbol_line(line)):
|
||||
continue
|
||||
|
||||
section, name, size, address = symbol_info
|
||||
|
||||
# Skip duplicate symbols at the same address (e.g., C1/C2 constructors)
|
||||
if address in seen_addresses or section not in self.sections:
|
||||
continue
|
||||
|
||||
self.sections[section].symbols.append((name, size, ""))
|
||||
seen_addresses.add(address)
|
||||
|
||||
def _categorize_symbols(self) -> None:
|
||||
"""Categorize symbols by component."""
|
||||
# First, collect all unique symbol names for batch demangling
|
||||
all_symbols = {
|
||||
symbol_name
|
||||
for section in self.sections.values()
|
||||
for symbol_name, _, _ in section.symbols
|
||||
}
|
||||
|
||||
# Batch demangle all symbols at once
|
||||
self._batch_demangle_symbols(list(all_symbols))
|
||||
|
||||
# Now categorize with cached demangled names
|
||||
for section_name, section in self.sections.items():
|
||||
for symbol_name, size, _ in section.symbols:
|
||||
component = self._identify_component(symbol_name)
|
||||
|
||||
if component not in self.components:
|
||||
self.components[component] = ComponentMemory(component)
|
||||
|
||||
comp_mem = self.components[component]
|
||||
comp_mem.symbol_count += 1
|
||||
|
||||
# Update the appropriate size attribute based on section
|
||||
if attr_name := SECTION_TO_ATTR.get(section_name):
|
||||
setattr(comp_mem, attr_name, getattr(comp_mem, attr_name) + size)
|
||||
|
||||
# Track uncategorized symbols
|
||||
if component == "other" and size > 0:
|
||||
demangled = self._demangle_symbol(symbol_name)
|
||||
self._uncategorized_symbols.append((symbol_name, demangled, size))
|
||||
|
||||
# Track ESPHome core symbols for detailed analysis
|
||||
if component == _COMPONENT_CORE and size > 0:
|
||||
demangled = self._demangle_symbol(symbol_name)
|
||||
self._esphome_core_symbols.append((symbol_name, demangled, size))
|
||||
|
||||
# Track all component symbols for detailed analysis
|
||||
if size > 0:
|
||||
demangled = self._demangle_symbol(symbol_name)
|
||||
self._component_symbols[component].append(
|
||||
(symbol_name, demangled, size)
|
||||
)
|
||||
|
||||
def _identify_component(self, symbol_name: str) -> str:
|
||||
"""Identify which component a symbol belongs to."""
|
||||
# Demangle C++ names if needed
|
||||
demangled = self._demangle_symbol(symbol_name)
|
||||
|
||||
# Check for special component classes first (before namespace pattern)
|
||||
# This handles cases like esphome::ESPHomeOTAComponent which should map to ota
|
||||
if _NAMESPACE_ESPHOME in demangled:
|
||||
# Check for special component classes that include component name in the class
|
||||
# For example: esphome::ESPHomeOTAComponent -> ota component
|
||||
for component_name in get_esphome_components():
|
||||
patterns = get_component_class_patterns(component_name)
|
||||
if any(pattern in demangled for pattern in patterns):
|
||||
return f"{_COMPONENT_PREFIX_ESPHOME}{component_name}"
|
||||
|
||||
# Check for ESPHome component namespaces
|
||||
match = ESPHOME_COMPONENT_PATTERN.search(demangled)
|
||||
if match:
|
||||
component_name = match.group(1)
|
||||
# Strip trailing underscore if present (e.g., switch_ -> switch)
|
||||
component_name = component_name.rstrip("_")
|
||||
|
||||
# Check if this is an actual component in the components directory
|
||||
if component_name in get_esphome_components():
|
||||
return f"{_COMPONENT_PREFIX_ESPHOME}{component_name}"
|
||||
# Check if this is a known external component from the config
|
||||
if component_name in self.external_components:
|
||||
return f"{_COMPONENT_PREFIX_EXTERNAL}{component_name}"
|
||||
# Everything else in esphome:: namespace is core
|
||||
return _COMPONENT_CORE
|
||||
|
||||
# Check for esphome core namespace (no component namespace)
|
||||
if _NAMESPACE_ESPHOME in demangled:
|
||||
# If no component match found, it's core
|
||||
return _COMPONENT_CORE
|
||||
|
||||
# Check against symbol patterns
|
||||
for component, patterns in SYMBOL_PATTERNS.items():
|
||||
if any(pattern in symbol_name for pattern in patterns):
|
||||
return component
|
||||
|
||||
# Check against demangled patterns
|
||||
for component, patterns in DEMANGLED_PATTERNS.items():
|
||||
if any(pattern in demangled for pattern in patterns):
|
||||
return component
|
||||
|
||||
# Special cases that need more complex logic
|
||||
|
||||
# Check if spi_flash vs spi_driver
|
||||
if "spi_" in symbol_name or "SPI" in symbol_name:
|
||||
return "spi_flash" if "spi_flash" in symbol_name else "spi_driver"
|
||||
|
||||
# libc special printf variants
|
||||
if (
|
||||
symbol_name.startswith("_")
|
||||
and symbol_name[1:].replace("_r", "").replace("v", "").replace("s", "")
|
||||
in _LIBC_PRINTF_SCANF_FAMILY
|
||||
):
|
||||
return "libc"
|
||||
|
||||
# Track uncategorized symbols for analysis
|
||||
return "other"
|
||||
|
||||
def _batch_demangle_symbols(self, symbols: list[str]) -> None:
|
||||
"""Batch demangle C++ symbol names for efficiency."""
|
||||
if not symbols:
|
||||
return
|
||||
|
||||
# Try to find the appropriate c++filt for the platform
|
||||
cppfilt_cmd = "c++filt"
|
||||
|
||||
_LOGGER.info("Demangling %d symbols", len(symbols))
|
||||
_LOGGER.debug("objdump_path = %s", self.objdump_path)
|
||||
|
||||
# Check if we have a toolchain-specific c++filt
|
||||
if self.objdump_path and self.objdump_path != "objdump":
|
||||
# Replace objdump with c++filt in the path
|
||||
potential_cppfilt = self.objdump_path.replace("objdump", "c++filt")
|
||||
_LOGGER.info("Checking for toolchain c++filt at: %s", potential_cppfilt)
|
||||
if Path(potential_cppfilt).exists():
|
||||
cppfilt_cmd = potential_cppfilt
|
||||
_LOGGER.info("✓ Using toolchain c++filt: %s", cppfilt_cmd)
|
||||
else:
|
||||
_LOGGER.info(
|
||||
"✗ Toolchain c++filt not found at %s, using system c++filt",
|
||||
potential_cppfilt,
|
||||
)
|
||||
else:
|
||||
_LOGGER.info("✗ Using system c++filt (objdump_path=%s)", self.objdump_path)
|
||||
|
||||
# Strip GCC optimization suffixes and prefixes before demangling
|
||||
# Suffixes like $isra$0, $part$0, $constprop$0 confuse c++filt
|
||||
# Prefixes like _GLOBAL__sub_I_ need to be removed and tracked
|
||||
symbols_stripped: list[str] = []
|
||||
symbols_prefixes: list[str] = [] # Track removed prefixes
|
||||
for symbol in symbols:
|
||||
# Remove GCC optimization markers
|
||||
stripped = _GCC_OPTIMIZATION_SUFFIX_PATTERN.sub("", symbol)
|
||||
|
||||
# Handle GCC global constructor/initializer prefixes
|
||||
# _GLOBAL__sub_I_<mangled> -> extract <mangled> for demangling
|
||||
prefix = ""
|
||||
for gcc_prefix in _GCC_PREFIX_ANNOTATIONS:
|
||||
if stripped.startswith(gcc_prefix):
|
||||
prefix = gcc_prefix
|
||||
stripped = stripped[len(prefix) :]
|
||||
break
|
||||
|
||||
symbols_stripped.append(stripped)
|
||||
symbols_prefixes.append(prefix)
|
||||
|
||||
try:
|
||||
# Send all symbols to c++filt at once
|
||||
result = subprocess.run(
|
||||
[cppfilt_cmd],
|
||||
input="\n".join(symbols_stripped),
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=False,
|
||||
)
|
||||
except (subprocess.SubprocessError, OSError, UnicodeDecodeError) as e:
|
||||
# On error, cache originals
|
||||
_LOGGER.warning("Failed to batch demangle symbols: %s", e)
|
||||
for symbol in symbols:
|
||||
self._demangle_cache[symbol] = symbol
|
||||
return
|
||||
|
||||
if result.returncode != 0:
|
||||
_LOGGER.warning(
|
||||
"c++filt exited with code %d: %s",
|
||||
result.returncode,
|
||||
result.stderr[:200] if result.stderr else "(no error output)",
|
||||
)
|
||||
# Cache originals on failure
|
||||
for symbol in symbols:
|
||||
self._demangle_cache[symbol] = symbol
|
||||
return
|
||||
|
||||
# Process demangled output
|
||||
self._process_demangled_output(
|
||||
symbols, symbols_stripped, symbols_prefixes, result.stdout, cppfilt_cmd
|
||||
)
|
||||
|
||||
def _process_demangled_output(
|
||||
self,
|
||||
symbols: list[str],
|
||||
symbols_stripped: list[str],
|
||||
symbols_prefixes: list[str],
|
||||
demangled_output: str,
|
||||
cppfilt_cmd: str,
|
||||
) -> None:
|
||||
"""Process demangled symbol output and populate cache.
|
||||
|
||||
Args:
|
||||
symbols: Original symbol names
|
||||
symbols_stripped: Stripped symbol names sent to c++filt
|
||||
symbols_prefixes: Removed prefixes to restore
|
||||
demangled_output: Output from c++filt
|
||||
cppfilt_cmd: Path to c++filt command (for logging)
|
||||
"""
|
||||
demangled_lines = demangled_output.strip().split("\n")
|
||||
failed_count = 0
|
||||
|
||||
for original, stripped, prefix, demangled in zip(
|
||||
symbols, symbols_stripped, symbols_prefixes, demangled_lines
|
||||
):
|
||||
# Add back any prefix that was removed
|
||||
demangled = self._restore_symbol_prefix(prefix, stripped, demangled)
|
||||
|
||||
# If we stripped a suffix, add it back to the demangled name for clarity
|
||||
if original != stripped and not prefix:
|
||||
demangled = self._restore_symbol_suffix(original, demangled)
|
||||
|
||||
self._demangle_cache[original] = demangled
|
||||
|
||||
# Log symbols that failed to demangle (stayed the same as stripped version)
|
||||
if stripped == demangled and stripped.startswith("_Z"):
|
||||
failed_count += 1
|
||||
if failed_count <= 5: # Only log first 5 failures
|
||||
_LOGGER.warning("Failed to demangle: %s", original)
|
||||
|
||||
if failed_count == 0:
|
||||
_LOGGER.info("Successfully demangled all %d symbols", len(symbols))
|
||||
return
|
||||
|
||||
_LOGGER.warning(
|
||||
"Failed to demangle %d/%d symbols using %s",
|
||||
failed_count,
|
||||
len(symbols),
|
||||
cppfilt_cmd,
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def _restore_symbol_prefix(prefix: str, stripped: str, demangled: str) -> str:
|
||||
"""Restore prefix that was removed before demangling.
|
||||
|
||||
Args:
|
||||
prefix: Prefix that was removed (e.g., "_GLOBAL__sub_I_")
|
||||
stripped: Stripped symbol name
|
||||
demangled: Demangled symbol name
|
||||
|
||||
Returns:
|
||||
Demangled name with prefix restored/annotated
|
||||
"""
|
||||
if not prefix:
|
||||
return demangled
|
||||
|
||||
# Successfully demangled - add descriptive prefix
|
||||
if demangled != stripped and (
|
||||
annotation := _GCC_PREFIX_ANNOTATIONS.get(prefix)
|
||||
):
|
||||
return f"[{annotation}: {demangled}]"
|
||||
|
||||
# Failed to demangle - restore original prefix
|
||||
return prefix + demangled
|
||||
|
||||
@staticmethod
|
||||
def _restore_symbol_suffix(original: str, demangled: str) -> str:
|
||||
"""Restore GCC optimization suffix that was removed before demangling.
|
||||
|
||||
Args:
|
||||
original: Original symbol name with suffix
|
||||
demangled: Demangled symbol name without suffix
|
||||
|
||||
Returns:
|
||||
Demangled name with suffix annotation
|
||||
"""
|
||||
if suffix_match := _GCC_OPTIMIZATION_SUFFIX_PATTERN.search(original):
|
||||
return f"{demangled} [{suffix_match.group(1)}]"
|
||||
return demangled
|
||||
|
||||
def _demangle_symbol(self, symbol: str) -> str:
|
||||
"""Get demangled C++ symbol name from cache."""
|
||||
return self._demangle_cache.get(symbol, symbol)
|
||||
|
||||
def _categorize_esphome_core_symbol(self, demangled: str) -> str:
|
||||
"""Categorize ESPHome core symbols into subcategories."""
|
||||
# Special patterns that need to be checked separately
|
||||
if any(pattern in demangled for pattern in _CPP_RUNTIME_PATTERNS):
|
||||
return "C++ Runtime (vtables/RTTI)"
|
||||
|
||||
if demangled.startswith(_NAMESPACE_STD):
|
||||
return "C++ STL"
|
||||
|
||||
# Check against patterns from const.py
|
||||
for category, patterns in CORE_SUBCATEGORY_PATTERNS.items():
|
||||
if any(pattern in demangled for pattern in patterns):
|
||||
return category
|
||||
|
||||
return "Other Core"
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
from .cli import main
|
||||
|
||||
main()
|
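Based on the constructor and analyze() shown above, a minimal usage sketch for the (now removed) MemoryAnalyzer; the ELF path and the external component name are placeholders:

analyzer = MemoryAnalyzer(
    ".esphome/build/my-device/.pioenvs/my-device/firmware.elf",  # placeholder path
    external_components={"my_external_component"},               # hypothetical
)
components = analyzer.analyze()
for name, mem in sorted(components.items(), key=lambda kv: kv[1].flash_total, reverse=True)[:5]:
    print(f"{name}: flash={mem.flash_total:,} B, ram={mem.ram_total:,} B")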
@@ -1,6 +0,0 @@
"""Main entry point for running the memory analyzer as a module."""

from .cli import main

if __name__ == "__main__":
    main()
@@ -1,408 +0,0 @@
|
||||
"""CLI interface for memory analysis with report generation."""
|
||||
|
||||
from collections import defaultdict
|
||||
import sys
|
||||
|
||||
from . import (
|
||||
_COMPONENT_API,
|
||||
_COMPONENT_CORE,
|
||||
_COMPONENT_PREFIX_ESPHOME,
|
||||
_COMPONENT_PREFIX_EXTERNAL,
|
||||
MemoryAnalyzer,
|
||||
)
|
||||
|
||||
|
||||
class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
"""Memory analyzer with CLI-specific report generation."""
|
||||
|
||||
# Column width constants
|
||||
COL_COMPONENT: int = 29
|
||||
COL_FLASH_TEXT: int = 14
|
||||
COL_FLASH_DATA: int = 14
|
||||
COL_RAM_DATA: int = 12
|
||||
COL_RAM_BSS: int = 12
|
||||
COL_TOTAL_FLASH: int = 15
|
||||
COL_TOTAL_RAM: int = 12
|
||||
COL_SEPARATOR: int = 3 # " | "
|
||||
|
||||
# Core analysis column widths
|
||||
COL_CORE_SUBCATEGORY: int = 30
|
||||
COL_CORE_SIZE: int = 12
|
||||
COL_CORE_COUNT: int = 6
|
||||
COL_CORE_PERCENT: int = 10
|
||||
|
||||
# Calculate table width once at class level
|
||||
TABLE_WIDTH: int = (
|
||||
COL_COMPONENT
|
||||
+ COL_SEPARATOR
|
||||
+ COL_FLASH_TEXT
|
||||
+ COL_SEPARATOR
|
||||
+ COL_FLASH_DATA
|
||||
+ COL_SEPARATOR
|
||||
+ COL_RAM_DATA
|
||||
+ COL_SEPARATOR
|
||||
+ COL_RAM_BSS
|
||||
+ COL_SEPARATOR
|
||||
+ COL_TOTAL_FLASH
|
||||
+ COL_SEPARATOR
|
||||
+ COL_TOTAL_RAM
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def _make_separator_line(*widths: int) -> str:
|
||||
"""Create a separator line with given column widths.
|
||||
|
||||
Args:
|
||||
widths: Column widths to create separators for
|
||||
|
||||
Returns:
|
||||
Separator line like "----+---------+-----"
|
||||
"""
|
||||
return "-+-".join("-" * width for width in widths)
|
||||
|
||||
# Pre-computed separator lines
|
||||
MAIN_TABLE_SEPARATOR: str = _make_separator_line(
|
||||
COL_COMPONENT,
|
||||
COL_FLASH_TEXT,
|
||||
COL_FLASH_DATA,
|
||||
COL_RAM_DATA,
|
||||
COL_RAM_BSS,
|
||||
COL_TOTAL_FLASH,
|
||||
COL_TOTAL_RAM,
|
||||
)
|
||||
|
||||
CORE_TABLE_SEPARATOR: str = _make_separator_line(
|
||||
COL_CORE_SUBCATEGORY,
|
||||
COL_CORE_SIZE,
|
||||
COL_CORE_COUNT,
|
||||
COL_CORE_PERCENT,
|
||||
)
|
||||
|
||||
def generate_report(self, detailed: bool = False) -> str:
|
||||
"""Generate a formatted memory report."""
|
||||
components = sorted(
|
||||
self.components.items(), key=lambda x: x[1].flash_total, reverse=True
|
||||
)
|
||||
|
||||
# Calculate totals
|
||||
total_flash = sum(c.flash_total for _, c in components)
|
||||
total_ram = sum(c.ram_total for _, c in components)
|
||||
|
||||
# Build report
|
||||
lines: list[str] = []
|
||||
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append("Component Memory Analysis".center(self.TABLE_WIDTH))
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append("")
|
||||
|
||||
# Main table - fixed column widths
|
||||
lines.append(
|
||||
f"{'Component':<{self.COL_COMPONENT}} | {'Flash (text)':>{self.COL_FLASH_TEXT}} | {'Flash (data)':>{self.COL_FLASH_DATA}} | {'RAM (data)':>{self.COL_RAM_DATA}} | {'RAM (bss)':>{self.COL_RAM_BSS}} | {'Total Flash':>{self.COL_TOTAL_FLASH}} | {'Total RAM':>{self.COL_TOTAL_RAM}}"
|
||||
)
|
||||
lines.append(self.MAIN_TABLE_SEPARATOR)
|
||||
|
||||
for name, mem in components:
|
||||
if mem.flash_total > 0 or mem.ram_total > 0:
|
||||
flash_rodata = mem.rodata_size + mem.data_size
|
||||
lines.append(
|
||||
f"{name:<{self.COL_COMPONENT}} | {mem.text_size:>{self.COL_FLASH_TEXT - 2},} B | {flash_rodata:>{self.COL_FLASH_DATA - 2},} B | "
|
||||
f"{mem.data_size:>{self.COL_RAM_DATA - 2},} B | {mem.bss_size:>{self.COL_RAM_BSS - 2},} B | "
|
||||
f"{mem.flash_total:>{self.COL_TOTAL_FLASH - 2},} B | {mem.ram_total:>{self.COL_TOTAL_RAM - 2},} B"
|
||||
)
|
||||
|
||||
lines.append(self.MAIN_TABLE_SEPARATOR)
|
||||
lines.append(
|
||||
f"{'TOTAL':<{self.COL_COMPONENT}} | {' ':>{self.COL_FLASH_TEXT}} | {' ':>{self.COL_FLASH_DATA}} | "
|
||||
f"{' ':>{self.COL_RAM_DATA}} | {' ':>{self.COL_RAM_BSS}} | "
|
||||
f"{total_flash:>{self.COL_TOTAL_FLASH - 2},} B | {total_ram:>{self.COL_TOTAL_RAM - 2},} B"
|
||||
)
|
||||
|
||||
# Top consumers
|
||||
lines.append("")
|
||||
lines.append("Top Flash Consumers:")
|
||||
for i, (name, mem) in enumerate(components[:25]):
|
||||
if mem.flash_total > 0:
|
||||
percentage = (
|
||||
(mem.flash_total / total_flash * 100) if total_flash > 0 else 0
|
||||
)
|
||||
lines.append(
|
||||
f"{i + 1}. {name} ({mem.flash_total:,} B) - {percentage:.1f}% of analyzed flash"
|
||||
)
|
||||
|
||||
lines.append("")
|
||||
lines.append("Top RAM Consumers:")
|
||||
ram_components = sorted(components, key=lambda x: x[1].ram_total, reverse=True)
|
||||
for i, (name, mem) in enumerate(ram_components[:25]):
|
||||
if mem.ram_total > 0:
|
||||
percentage = (mem.ram_total / total_ram * 100) if total_ram > 0 else 0
|
||||
lines.append(
|
||||
f"{i + 1}. {name} ({mem.ram_total:,} B) - {percentage:.1f}% of analyzed RAM"
|
||||
)
|
||||
|
||||
lines.append("")
|
||||
lines.append(
|
||||
"Note: This analysis covers symbols in the ELF file. Some runtime allocations may not be included."
|
||||
)
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
|
||||
# Add ESPHome core detailed analysis if there are core symbols
|
||||
if self._esphome_core_symbols:
|
||||
lines.append("")
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append(
|
||||
f"{_COMPONENT_CORE} Detailed Analysis".center(self.TABLE_WIDTH)
|
||||
)
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append("")
|
||||
|
||||
# Group core symbols by subcategory
|
||||
core_subcategories: dict[str, list[tuple[str, str, int]]] = defaultdict(
|
||||
list
|
||||
)
|
||||
|
||||
for symbol, demangled, size in self._esphome_core_symbols:
|
||||
# Categorize based on demangled name patterns
|
||||
subcategory = self._categorize_esphome_core_symbol(demangled)
|
||||
core_subcategories[subcategory].append((symbol, demangled, size))
|
||||
|
||||
# Sort subcategories by total size
|
||||
sorted_subcategories = sorted(
|
||||
[
|
||||
(name, symbols, sum(s[2] for s in symbols))
|
||||
for name, symbols in core_subcategories.items()
|
||||
],
|
||||
key=lambda x: x[2],
|
||||
reverse=True,
|
||||
)
|
||||
|
||||
lines.append(
|
||||
f"{'Subcategory':<{self.COL_CORE_SUBCATEGORY}} | {'Size':>{self.COL_CORE_SIZE}} | "
|
||||
f"{'Count':>{self.COL_CORE_COUNT}} | {'% of Core':>{self.COL_CORE_PERCENT}}"
|
||||
)
|
||||
lines.append(self.CORE_TABLE_SEPARATOR)
|
||||
|
||||
core_total = sum(size for _, _, size in self._esphome_core_symbols)
|
||||
|
||||
for subcategory, symbols, total_size in sorted_subcategories:
|
||||
percentage = (total_size / core_total * 100) if core_total > 0 else 0
|
||||
lines.append(
|
||||
f"{subcategory:<{self.COL_CORE_SUBCATEGORY}} | {total_size:>{self.COL_CORE_SIZE - 2},} B | "
|
||||
f"{len(symbols):>{self.COL_CORE_COUNT}} | {percentage:>{self.COL_CORE_PERCENT - 1}.1f}%"
|
||||
)
|
||||
|
||||
# Top 15 largest core symbols
|
||||
lines.append("")
|
||||
lines.append(f"Top 15 Largest {_COMPONENT_CORE} Symbols:")
|
||||
sorted_core_symbols = sorted(
|
||||
self._esphome_core_symbols, key=lambda x: x[2], reverse=True
|
||||
)
|
||||
|
||||
for i, (symbol, demangled, size) in enumerate(sorted_core_symbols[:15]):
|
||||
lines.append(f"{i + 1}. {demangled} ({size:,} B)")
|
||||
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
|
||||
# Add detailed analysis for top ESPHome and external components
|
||||
esphome_components = [
|
||||
(name, mem)
|
||||
for name, mem in components
|
||||
if name.startswith(_COMPONENT_PREFIX_ESPHOME) and name != _COMPONENT_CORE
|
||||
]
|
||||
external_components = [
|
||||
(name, mem)
|
||||
for name, mem in components
|
||||
if name.startswith(_COMPONENT_PREFIX_EXTERNAL)
|
||||
]
|
||||
|
||||
top_esphome_components = sorted(
|
||||
esphome_components, key=lambda x: x[1].flash_total, reverse=True
|
||||
)[:30]
|
||||
|
||||
# Include all external components (they're usually important)
|
||||
top_external_components = sorted(
|
||||
external_components, key=lambda x: x[1].flash_total, reverse=True
|
||||
)
|
||||
|
||||
# Check if API component exists and ensure it's included
|
||||
api_component = None
|
||||
for name, mem in components:
|
||||
if name == _COMPONENT_API:
|
||||
api_component = (name, mem)
|
||||
break
|
||||
|
||||
# Combine all components to analyze: top ESPHome + all external + API if not already included
|
||||
components_to_analyze = list(top_esphome_components) + list(
|
||||
top_external_components
|
||||
)
|
||||
if api_component and api_component not in components_to_analyze:
|
||||
components_to_analyze.append(api_component)
|
||||
|
||||
if components_to_analyze:
|
||||
for comp_name, comp_mem in components_to_analyze:
|
||||
if not (comp_symbols := self._component_symbols.get(comp_name, [])):
|
||||
continue
|
||||
lines.append("")
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append(f"{comp_name} Detailed Analysis".center(self.TABLE_WIDTH))
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
lines.append("")
|
||||
|
||||
# Sort symbols by size
|
||||
sorted_symbols = sorted(comp_symbols, key=lambda x: x[2], reverse=True)
|
||||
|
||||
lines.append(f"Total symbols: {len(sorted_symbols)}")
|
||||
lines.append(f"Total size: {comp_mem.flash_total:,} B")
|
||||
lines.append("")
|
||||
|
||||
# Show all symbols > 100 bytes for better visibility
|
||||
large_symbols = [
|
||||
(sym, dem, size) for sym, dem, size in sorted_symbols if size > 100
|
||||
]
|
||||
|
||||
lines.append(
|
||||
f"{comp_name} Symbols > 100 B ({len(large_symbols)} symbols):"
|
||||
)
|
||||
for i, (symbol, demangled, size) in enumerate(large_symbols):
|
||||
lines.append(f"{i + 1}. {demangled} ({size:,} B)")
|
||||
|
||||
lines.append("=" * self.TABLE_WIDTH)
|
||||
|
||||
return "\n".join(lines)
|
||||
|
||||
def dump_uncategorized_symbols(self, output_file: str | None = None) -> None:
|
||||
"""Dump uncategorized symbols for analysis."""
|
||||
# Sort by size descending
|
||||
sorted_symbols = sorted(
|
||||
self._uncategorized_symbols, key=lambda x: x[2], reverse=True
|
||||
)
|
||||
|
||||
lines = ["Uncategorized Symbols Analysis", "=" * 80]
|
||||
lines.append(f"Total uncategorized symbols: {len(sorted_symbols)}")
|
||||
lines.append(
|
||||
f"Total uncategorized size: {sum(s[2] for s in sorted_symbols):,} bytes"
|
||||
)
|
||||
lines.append("")
|
||||
lines.append(f"{'Size':>10} | {'Symbol':<60} | Demangled")
|
||||
lines.append("-" * 10 + "-+-" + "-" * 60 + "-+-" + "-" * 40)
|
||||
|
||||
for symbol, demangled, size in sorted_symbols[:100]: # Top 100
|
||||
demangled_display = (
|
||||
demangled[:100] if symbol != demangled else "[not demangled]"
|
||||
)
|
||||
lines.append(f"{size:>10,} | {symbol[:60]:<60} | {demangled_display}")
|
||||
|
||||
if len(sorted_symbols) > 100:
|
||||
lines.append(f"\n... and {len(sorted_symbols) - 100} more symbols")
|
||||
|
||||
content = "\n".join(lines)
|
||||
|
||||
if output_file:
|
||||
with open(output_file, "w", encoding="utf-8") as f:
|
||||
f.write(content)
|
||||
else:
|
||||
print(content)
|
||||
|
||||
|
||||
def analyze_elf(
|
||||
elf_path: str,
|
||||
objdump_path: str | None = None,
|
||||
readelf_path: str | None = None,
|
||||
detailed: bool = False,
|
||||
external_components: set[str] | None = None,
|
||||
) -> str:
|
||||
"""Analyze an ELF file and return a memory report."""
|
||||
analyzer = MemoryAnalyzerCLI(
|
||||
elf_path, objdump_path, readelf_path, external_components
|
||||
)
|
||||
analyzer.analyze()
|
||||
return analyzer.generate_report(detailed)
|
||||
|
||||
|
||||
def main():
|
||||
"""CLI entrypoint for memory analysis."""
|
||||
if len(sys.argv) < 2:
|
||||
print("Usage: python -m esphome.analyze_memory <build_directory>")
|
||||
print("\nAnalyze memory usage from an ESPHome build directory.")
|
||||
print("The build directory should contain firmware.elf and idedata will be")
|
||||
print("loaded from ~/.esphome/.internal/idedata/<device>.json")
|
||||
print("\nExamples:")
|
||||
print(" python -m esphome.analyze_memory ~/.esphome/build/my-device")
|
||||
print(" python -m esphome.analyze_memory .esphome/build/my-device")
|
||||
print(" python -m esphome.analyze_memory my-device # Short form")
|
||||
sys.exit(1)
|
||||
|
||||
build_dir = sys.argv[1]
|
||||
|
||||
# Load build directory
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
||||
from esphome.platformio_api import IDEData
|
||||
|
||||
build_path = Path(build_dir)
|
||||
|
||||
# If no path separator in name, assume it's a device name
|
||||
if "/" not in build_dir and not build_path.is_dir():
|
||||
# Try current directory first
|
||||
cwd_path = Path.cwd() / ".esphome" / "build" / build_dir
|
||||
if cwd_path.is_dir():
|
||||
build_path = cwd_path
|
||||
print(f"Using build directory: {build_path}", file=sys.stderr)
|
||||
else:
|
||||
# Fall back to home directory
|
||||
build_path = Path.home() / ".esphome" / "build" / build_dir
|
||||
print(f"Using build directory: {build_path}", file=sys.stderr)
|
||||
|
||||
if not build_path.is_dir():
|
||||
print(f"Error: {build_path} is not a directory", file=sys.stderr)
|
||||
sys.exit(1)
|
||||
|
||||
# Find firmware.elf
|
||||
elf_file = None
|
||||
for elf_candidate in [
|
||||
build_path / "firmware.elf",
|
||||
build_path / ".pioenvs" / build_path.name / "firmware.elf",
|
||||
]:
|
||||
if elf_candidate.exists():
|
||||
elf_file = str(elf_candidate)
|
||||
break
|
||||
|
||||
if not elf_file:
|
||||
print(f"Error: firmware.elf not found in {build_dir}", file=sys.stderr)
|
||||
sys.exit(1)
|
||||
|
||||
# Find idedata.json - check current directory first, then home
|
||||
device_name = build_path.name
|
||||
idedata_candidates = [
|
||||
Path.cwd() / ".esphome" / "idedata" / f"{device_name}.json",
|
||||
Path.home() / ".esphome" / "idedata" / f"{device_name}.json",
|
||||
]
|
||||
|
||||
idedata = None
|
||||
for idedata_path in idedata_candidates:
|
||||
if not idedata_path.exists():
|
||||
continue
|
||||
try:
|
||||
with open(idedata_path, encoding="utf-8") as f:
|
||||
raw_data = json.load(f)
|
||||
idedata = IDEData(raw_data)
|
||||
print(f"Loaded idedata from: {idedata_path}", file=sys.stderr)
|
||||
break
|
||||
except (json.JSONDecodeError, OSError) as e:
|
||||
print(f"Warning: Failed to load idedata: {e}", file=sys.stderr)
|
||||
|
||||
if not idedata:
|
||||
print(
|
||||
f"Warning: idedata not found (searched {idedata_candidates[0]} and {idedata_candidates[1]})",
|
||||
file=sys.stderr,
|
||||
)
|
||||
|
||||
analyzer = MemoryAnalyzerCLI(elf_file, idedata=idedata)
|
||||
analyzer.analyze()
|
||||
report = analyzer.generate_report()
|
||||
print(report)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
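The module-level helper above can also be called programmatically; a minimal sketch with a placeholder ELF path (objdump and readelf fall back to the binaries on $PATH when not given):

report = analyze_elf(
    "firmware.elf",          # placeholder; normally the build's firmware.elf
    detailed=True,
    external_components=None,
)
print(report)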
@@ -1,903 +0,0 @@
"""Constants for memory analysis symbol pattern matching."""

import re

# Pattern to extract ESPHome component namespaces dynamically
ESPHOME_COMPONENT_PATTERN = re.compile(r"esphome::([a-zA-Z0-9_]+)::")

# Section mapping for ELF file sections
# Maps standard section names to their various platform-specific variants
SECTION_MAPPING = {
    ".text": frozenset([".text", ".iram"]),
    ".rodata": frozenset([".rodata"]),
    ".data": frozenset([".data", ".dram"]),
    ".bss": frozenset([".bss"]),
}

# Section to ComponentMemory attribute mapping
# Maps section names to the attribute name in ComponentMemory dataclass
SECTION_TO_ATTR = {
    ".text": "text_size",
    ".rodata": "rodata_size",
    ".data": "data_size",
    ".bss": "bss_size",
}
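As an aside, a small sketch of how the two tables above are meant to work together, using the constants just defined (ComponentMemory itself lives elsewhere in the analyzer and is only assumed here):

# Hypothetical usage sketch - not part of const.py.
match = ESPHOME_COMPONENT_PATTERN.search("esphome::wifi::WiFiComponent::loop()")
component = match.group(1) if match else None  # -> "wifi"
attr = SECTION_TO_ATTR[".text"]                # -> "text_size" (the per-component counter to increment)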

# Component identification rules
# Symbol patterns: patterns found in raw symbol names
SYMBOL_PATTERNS = {
"freertos": [
|
||||
"vTask",
|
||||
"xTask",
|
||||
"xQueue",
|
||||
"pvPort",
|
||||
"vPort",
|
||||
"uxTask",
|
||||
"pcTask",
|
||||
"prvTimerTask",
|
||||
"prvAddNewTaskToReadyList",
|
||||
"pxReadyTasksLists",
|
||||
"prvAddCurrentTaskToDelayedList",
|
||||
"xEventGroupWaitBits",
|
||||
"xRingbufferSendFromISR",
|
||||
"prvSendItemDoneNoSplit",
|
||||
"prvReceiveGeneric",
|
||||
"prvSendAcquireGeneric",
|
||||
"prvCopyItemAllowSplit",
|
||||
"xEventGroup",
|
||||
"xRingbuffer",
|
||||
"prvSend",
|
||||
"prvReceive",
|
||||
"prvCopy",
|
||||
"xPort",
|
||||
"ulTaskGenericNotifyTake",
|
||||
"prvIdleTask",
|
||||
"prvInitialiseNewTask",
|
||||
"prvIsYieldRequiredSMP",
|
||||
"prvGetItemByteBuf",
|
||||
"prvInitializeNewRingbuffer",
|
||||
"prvAcquireItemNoSplit",
|
||||
"prvNotifyQueueSetContainer",
|
||||
"ucStaticTimerQueueStorage",
|
||||
"eTaskGetState",
|
||||
"main_task",
|
||||
"do_system_init_fn",
|
||||
"xSemaphoreCreateGenericWithCaps",
|
||||
"vListInsert",
|
||||
"uxListRemove",
|
||||
"vRingbufferReturnItem",
|
||||
"vRingbufferReturnItemFromISR",
|
||||
"prvCheckItemFitsByteBuffer",
|
||||
"prvGetCurMaxSizeAllowSplit",
|
||||
"tick_hook",
|
||||
"sys_sem_new",
|
||||
"sys_arch_mbox_fetch",
|
||||
"sys_arch_sem_wait",
|
||||
"prvDeleteTCB",
|
||||
"vQueueDeleteWithCaps",
|
||||
"vRingbufferDeleteWithCaps",
|
||||
"vSemaphoreDeleteWithCaps",
|
||||
"prvCheckItemAvail",
|
||||
"prvCheckTaskCanBeScheduledSMP",
|
||||
"prvGetCurMaxSizeNoSplit",
|
||||
"prvResetNextTaskUnblockTime",
|
||||
"prvReturnItemByteBuf",
|
||||
"vApplicationStackOverflowHook",
|
||||
"vApplicationGetIdleTaskMemory",
|
||||
"sys_init",
|
||||
"sys_mbox_new",
|
||||
"sys_arch_mbox_tryfetch",
|
||||
],
|
||||
"xtensa": ["xt_", "_xt_", "xPortEnterCriticalTimeout"],
|
||||
"heap": ["heap_", "multi_heap"],
|
||||
"spi_flash": ["spi_flash"],
|
||||
"rtc": ["rtc_", "rtcio_ll_"],
|
||||
"gpio_driver": ["gpio_", "pins"],
|
||||
"uart_driver": ["uart", "_uart", "UART"],
|
||||
"timer": ["timer_", "esp_timer"],
|
||||
"peripherals": ["periph_", "periman"],
|
||||
"network_stack": [
|
||||
"vj_compress",
|
||||
"raw_sendto",
|
||||
"raw_input",
|
||||
"etharp_",
|
||||
"icmp_input",
|
||||
"socket_ipv6",
|
||||
"ip_napt",
|
||||
"socket_ipv4_multicast",
|
||||
"socket_ipv6_multicast",
|
||||
"netconn_",
|
||||
"recv_raw",
|
||||
"accept_function",
|
||||
"netconn_recv_data",
|
||||
"netconn_accept",
|
||||
"netconn_write_vectors_partly",
|
||||
"netconn_drain",
|
||||
"raw_connect",
|
||||
"raw_bind",
|
||||
"icmp_send_response",
|
||||
"sockets",
|
||||
"icmp_dest_unreach",
|
||||
"inet_chksum_pseudo",
|
||||
"alloc_socket",
|
||||
"done_socket",
|
||||
"set_global_fd_sets",
|
||||
"inet_chksum_pbuf",
|
||||
"tryget_socket_unconn_locked",
|
||||
"tryget_socket_unconn",
|
||||
"cs_create_ctrl_sock",
|
||||
"netbuf_alloc",
|
||||
],
|
||||
"ipv6_stack": ["nd6_", "ip6_", "mld6_", "icmp6_", "icmp6_input"],
|
||||
"wifi_stack": [
|
||||
"ieee80211",
|
||||
"hostap",
|
||||
"sta_",
|
||||
"ap_",
|
||||
"scan_",
|
||||
"wifi_",
|
||||
"wpa_",
|
||||
"wps_",
|
||||
"esp_wifi",
|
||||
"cnx_",
|
||||
"wpa3_",
|
||||
"sae_",
|
||||
"wDev_",
|
||||
"ic_",
|
||||
"mac_",
|
||||
"esf_buf",
|
||||
"gWpaSm",
|
||||
"sm_WPA",
|
||||
"eapol_",
|
||||
"owe_",
|
||||
"wifiLowLevelInit",
|
||||
"s_do_mapping",
|
||||
"gScanStruct",
|
||||
"ppSearchTxframe",
|
||||
"ppMapWaitTxq",
|
||||
"ppFillAMPDUBar",
|
||||
"ppCheckTxConnTrafficIdle",
|
||||
"ppCalTkipMic",
|
||||
],
|
||||
"bluetooth": ["bt_", "ble_", "l2c_", "gatt_", "gap_", "hci_", "BT_init"],
|
||||
"wifi_bt_coex": ["coex"],
|
||||
"bluetooth_rom": ["r_ble", "r_lld", "r_llc", "r_llm"],
|
||||
"bluedroid_bt": [
|
||||
"bluedroid",
|
||||
"btc_",
|
||||
"bta_",
|
||||
"btm_",
|
||||
"btu_",
|
||||
"BTM_",
|
||||
"GATT",
|
||||
"L2CA_",
|
||||
"smp_",
|
||||
"gatts_",
|
||||
"attp_",
|
||||
"l2cu_",
|
||||
"l2cb",
|
||||
"smp_cb",
|
||||
"BTA_GATTC_",
|
||||
"SMP_",
|
||||
"BTU_",
|
||||
"BTA_Dm",
|
||||
"GAP_Ble",
|
||||
"BT_tx_if",
|
||||
"host_recv_pkt_cb",
|
||||
"saved_local_oob_data",
|
||||
"string_to_bdaddr",
|
||||
"string_is_bdaddr",
|
||||
"CalConnectParamTimeout",
|
||||
"transmit_fragment",
|
||||
"transmit_data",
|
||||
"event_command_ready",
|
||||
"read_command_complete_header",
|
||||
"parse_read_local_extended_features_response",
|
||||
"parse_read_local_version_info_response",
|
||||
"should_request_high",
|
||||
"btdm_wakeup_request",
|
||||
"BTA_SetAttributeValue",
|
||||
"BTA_EnableBluetooth",
|
||||
"transmit_command_futured",
|
||||
"transmit_command",
|
||||
"get_waiting_command",
|
||||
"make_command",
|
||||
"transmit_downward",
|
||||
"host_recv_adv_packet",
|
||||
"copy_extra_byte_in_db",
|
||||
"parse_read_local_supported_commands_response",
|
||||
],
|
||||
"crypto_math": [
|
||||
"ecp_",
|
||||
"bignum_",
|
||||
"mpi_",
|
||||
"sswu",
|
||||
"modp",
|
||||
"dragonfly_",
|
||||
"gcm_mult",
|
||||
"__multiply",
|
||||
"quorem",
|
||||
"__mdiff",
|
||||
"__lshift",
|
||||
"__mprec_tens",
|
||||
"ECC_",
|
||||
"multiprecision_",
|
||||
"mix_sub_columns",
|
||||
"sbox",
|
||||
"gfm2_sbox",
|
||||
"gfm3_sbox",
|
||||
"curve_p256",
|
||||
"curve",
|
||||
"p_256_init_curve",
|
||||
"shift_sub_rows",
|
||||
"rshift",
|
||||
],
|
||||
"hw_crypto": ["esp_aes", "esp_sha", "esp_rsa", "esp_bignum", "esp_mpi"],
|
||||
"libc": [
|
||||
"printf",
|
||||
"scanf",
|
||||
"malloc",
|
||||
"free",
|
||||
"memcpy",
|
||||
"memset",
|
||||
"strcpy",
|
||||
"strlen",
|
||||
"_dtoa",
|
||||
"_fopen",
|
||||
"__sfvwrite_r",
|
||||
"qsort",
|
||||
"__sf",
|
||||
"__sflush_r",
|
||||
"__srefill_r",
|
||||
"_impure_data",
|
||||
"_reclaim_reent",
|
||||
"_open_r",
|
||||
"strncpy",
|
||||
"_strtod_l",
|
||||
"__gethex",
|
||||
"__hexnan",
|
||||
"_setenv_r",
|
||||
"_tzset_unlocked_r",
|
||||
"__tzcalc_limits",
|
||||
"select",
|
||||
"scalbnf",
|
||||
"strtof",
|
||||
"strtof_l",
|
||||
"__d2b",
|
||||
"__b2d",
|
||||
"__s2b",
|
||||
"_Balloc",
|
||||
"__multadd",
|
||||
"__lo0bits",
|
||||
"__atexit0",
|
||||
"__smakebuf_r",
|
||||
"__swhatbuf_r",
|
||||
"_sungetc_r",
|
||||
"_close_r",
|
||||
"_link_r",
|
||||
"_unsetenv_r",
|
||||
"_rename_r",
|
||||
"__month_lengths",
|
||||
"tzinfo",
|
||||
"__ratio",
|
||||
"__hi0bits",
|
||||
"__ulp",
|
||||
"__any_on",
|
||||
"__copybits",
|
||||
"L_shift",
|
||||
"_fcntl_r",
|
||||
"_lseek_r",
|
||||
"_read_r",
|
||||
"_write_r",
|
||||
"_unlink_r",
|
||||
"_fstat_r",
|
||||
"access",
|
||||
"fsync",
|
||||
"tcsetattr",
|
||||
"tcgetattr",
|
||||
"tcflush",
|
||||
"tcdrain",
|
||||
"__ssrefill_r",
|
||||
"_stat_r",
|
||||
"__hexdig_fun",
|
||||
"__mcmp",
|
||||
"_fwalk_sglue",
|
||||
"__fpclassifyf",
|
||||
"_setlocale_r",
|
||||
"_mbrtowc_r",
|
||||
"fcntl",
|
||||
"__match",
|
||||
"_lock_close",
|
||||
"__c$",
|
||||
"__func__$",
|
||||
"__FUNCTION__$",
|
||||
"DAYS_IN_MONTH",
|
||||
"_DAYS_BEFORE_MONTH",
|
||||
"CSWTCH$",
|
||||
"dst$",
|
||||
"sulp",
|
||||
],
|
||||
"string_ops": ["strcmp", "strncmp", "strchr", "strstr", "strtok", "strdup"],
|
||||
"memory_alloc": ["malloc", "calloc", "realloc", "free", "_sbrk"],
|
||||
"file_io": [
|
||||
"fread",
|
||||
"fwrite",
|
||||
"fopen",
|
||||
"fclose",
|
||||
"fseek",
|
||||
"ftell",
|
||||
"fflush",
|
||||
"s_fd_table",
|
||||
],
|
||||
"string_formatting": [
|
||||
"snprintf",
|
||||
"vsnprintf",
|
||||
"sprintf",
|
||||
"vsprintf",
|
||||
"sscanf",
|
||||
"vsscanf",
|
||||
],
|
||||
"cpp_anonymous": ["_GLOBAL__N_", "n$"],
|
||||
"cpp_runtime": ["__cxx", "_ZN", "_ZL", "_ZSt", "__gxx_personality", "_Z16"],
|
||||
"exception_handling": ["__cxa_", "_Unwind_", "__gcc_personality", "uw_frame_state"],
|
||||
"static_init": ["_GLOBAL__sub_I_"],
|
||||
"mdns_lib": ["mdns"],
|
||||
"phy_radio": [
|
||||
"phy_",
|
||||
"rf_",
|
||||
"chip_",
|
||||
"register_chipv7",
|
||||
"pbus_",
|
||||
"bb_",
|
||||
"fe_",
|
||||
"rfcal_",
|
||||
"ram_rfcal",
|
||||
"tx_pwctrl",
|
||||
"rx_chan",
|
||||
"set_rx_gain",
|
||||
"set_chan",
|
||||
"agc_reg",
|
||||
"ram_txiq",
|
||||
"ram_txdc",
|
||||
"ram_gen_rx_gain",
|
||||
"rx_11b_opt",
|
||||
"set_rx_sense",
|
||||
"set_rx_gain_cal",
|
||||
"set_chan_dig_gain",
|
||||
"tx_pwctrl_init_cal",
|
||||
"rfcal_txiq",
|
||||
"set_tx_gain_table",
|
||||
"correct_rfpll_offset",
|
||||
"pll_correct_dcap",
|
||||
"txiq_cal_init",
|
||||
"pwdet_sar",
|
||||
"pwdet_sar2_init",
|
||||
"ram_iq_est_enable",
|
||||
"ram_rfpll_set_freq",
|
||||
"ant_wifirx_cfg",
|
||||
"ant_btrx_cfg",
|
||||
"force_txrxoff",
|
||||
"force_txrx_off",
|
||||
"tx_paon_set",
|
||||
"opt_11b_resart",
|
||||
"rfpll_1p2_opt",
|
||||
"ram_dc_iq_est",
|
||||
"ram_start_tx_tone",
|
||||
"ram_en_pwdet",
|
||||
"ram_cbw2040_cfg",
|
||||
"rxdc_est_min",
|
||||
"i2cmst_reg_init",
|
||||
"temprature_sens_read",
|
||||
"ram_restart_cal",
|
||||
"ram_write_gain_mem",
|
||||
"ram_wait_rfpll_cal_end",
|
||||
"txcal_debuge_mode",
|
||||
"ant_wifitx_cfg",
|
||||
"reg_init_begin",
|
||||
],
|
||||
"wifi_phy_pp": ["pp_", "ppT", "ppR", "ppP", "ppInstall", "ppCalTxAMPDULength"],
|
||||
"wifi_lmac": ["lmac"],
|
||||
"wifi_device": ["wdev", "wDev_"],
|
||||
"power_mgmt": [
|
||||
"pm_",
|
||||
"sleep",
|
||||
"rtc_sleep",
|
||||
"light_sleep",
|
||||
"deep_sleep",
|
||||
"power_down",
|
||||
"g_pm",
|
||||
],
|
||||
"memory_mgmt": [
|
||||
"mem_",
|
||||
"memory_",
|
||||
"tlsf_",
|
||||
"memp_",
|
||||
"pbuf_",
|
||||
"pbuf_alloc",
|
||||
"pbuf_copy_partial_pbuf",
|
||||
],
|
||||
"hal_layer": ["hal_"],
|
||||
"clock_mgmt": [
|
||||
"clk_",
|
||||
"clock_",
|
||||
"rtc_clk",
|
||||
"apb_",
|
||||
"cpu_freq",
|
||||
"setCpuFrequencyMhz",
|
||||
],
|
||||
"cache_mgmt": ["cache"],
|
||||
"flash_ops": ["flash", "image_load"],
|
||||
"interrupt_handlers": [
|
||||
"isr",
|
||||
"interrupt",
|
||||
"intr_",
|
||||
"exc_",
|
||||
"exception",
|
||||
"port_IntStack",
|
||||
],
|
||||
"wrapper_functions": ["_wrapper"],
|
||||
"error_handling": ["panic", "abort", "assert", "error_", "fault"],
|
||||
"authentication": ["auth"],
|
||||
"ppp_protocol": ["ppp", "ipcp_", "lcp_", "chap_", "LcpEchoCheck"],
|
||||
"dhcp": ["dhcp", "handle_dhcp"],
|
||||
"ethernet_phy": [
|
||||
"emac_",
|
||||
"eth_phy_",
|
||||
"phy_tlk110",
|
||||
"phy_lan87",
|
||||
"phy_ip101",
|
||||
"phy_rtl",
|
||||
"phy_dp83",
|
||||
"phy_ksz",
|
||||
"lan87xx_",
|
||||
"rtl8201_",
|
||||
"ip101_",
|
||||
"ksz80xx_",
|
||||
"jl1101_",
|
||||
"dp83848_",
|
||||
"eth_on_state_changed",
|
||||
],
|
||||
"threading": ["pthread_", "thread_", "_task_"],
|
||||
"pthread": ["pthread"],
|
||||
"synchronization": ["mutex", "semaphore", "spinlock", "portMUX"],
|
||||
"math_lib": [
|
||||
"sin",
|
||||
"cos",
|
||||
"tan",
|
||||
"sqrt",
|
||||
"pow",
|
||||
"exp",
|
||||
"log",
|
||||
"atan",
|
||||
"asin",
|
||||
"acos",
|
||||
"floor",
|
||||
"ceil",
|
||||
"fabs",
|
||||
"round",
|
||||
],
|
||||
"random": ["rand", "random", "rng_", "prng"],
|
||||
"time_lib": [
|
||||
"time",
|
||||
"clock",
|
||||
"gettimeofday",
|
||||
"settimeofday",
|
||||
"localtime",
|
||||
"gmtime",
|
||||
"mktime",
|
||||
"strftime",
|
||||
],
|
||||
"console_io": ["console_", "uart_tx", "uart_rx", "puts", "putchar", "getchar"],
|
||||
"rom_functions": ["r_", "rom_"],
|
||||
"compiler_runtime": [
|
||||
"__divdi3",
|
||||
"__udivdi3",
|
||||
"__moddi3",
|
||||
"__muldi3",
|
||||
"__ashldi3",
|
||||
"__ashrdi3",
|
||||
"__lshrdi3",
|
||||
"__cmpdi2",
|
||||
"__fixdfdi",
|
||||
"__floatdidf",
|
||||
],
|
||||
"libgcc": ["libgcc", "_divdi3", "_udivdi3"],
|
||||
"boot_startup": ["boot", "start_cpu", "call_start", "startup", "bootloader"],
|
||||
"bootloader": ["bootloader_", "esp_bootloader"],
|
||||
"app_framework": ["app_", "initArduino", "setup", "loop", "Update"],
|
||||
"weak_symbols": ["__weak_"],
|
||||
"compiler_builtins": ["__builtin_"],
|
||||
"vfs": ["vfs_", "VFS"],
|
||||
"esp32_sdk": ["esp32_", "esp32c", "esp32s"],
|
||||
"usb": ["usb_", "USB", "cdc_", "CDC"],
|
||||
"i2c_driver": ["i2c_", "I2C"],
|
||||
"i2s_driver": ["i2s_", "I2S"],
|
||||
"spi_driver": ["spi_", "SPI"],
|
||||
"adc_driver": ["adc_", "ADC"],
|
||||
"dac_driver": ["dac_", "DAC"],
|
||||
"touch_driver": ["touch_", "TOUCH"],
|
||||
"pwm_driver": ["pwm_", "PWM", "ledc_", "LEDC"],
|
||||
"rmt_driver": ["rmt_", "RMT"],
|
||||
"pcnt_driver": ["pcnt_", "PCNT"],
|
||||
"can_driver": ["can_", "CAN", "twai_", "TWAI"],
|
||||
"sdmmc_driver": ["sdmmc_", "SDMMC", "sdcard", "sd_card"],
|
||||
"temp_sensor": ["temp_sensor", "tsens_"],
|
||||
"watchdog": ["wdt_", "WDT", "watchdog"],
|
||||
"brownout": ["brownout", "bod_"],
|
||||
"ulp": ["ulp_", "ULP"],
|
||||
"psram": ["psram", "PSRAM", "spiram", "SPIRAM"],
|
||||
"efuse": ["efuse", "EFUSE"],
|
||||
"partition": ["partition", "esp_partition"],
|
||||
"esp_event": ["esp_event", "event_loop", "event_callback"],
|
||||
"esp_console": ["esp_console", "console_"],
|
||||
"chip_specific": ["chip_", "esp_chip"],
|
||||
"esp_system_utils": ["esp_system", "esp_hw", "esp_clk", "esp_sleep"],
|
||||
"ipc": ["esp_ipc", "ipc_"],
|
||||
"wifi_config": [
|
||||
"g_cnxMgr",
|
||||
"gChmCxt",
|
||||
"g_ic",
|
||||
"TxRxCxt",
|
||||
"s_dp",
|
||||
"s_ni",
|
||||
"s_reg_dump",
|
||||
"packet$",
|
||||
"d_mult_table",
|
||||
"K",
|
||||
"fcstab",
|
||||
],
|
||||
"smartconfig": ["sc_ack_send"],
|
||||
"rc_calibration": ["rc_cal", "rcUpdate"],
|
||||
"noise_floor": ["noise_check"],
|
||||
"rf_calibration": [
|
||||
"set_rx_sense",
|
||||
"set_rx_gain_cal",
|
||||
"set_chan_dig_gain",
|
||||
"tx_pwctrl_init_cal",
|
||||
"rfcal_txiq",
|
||||
"set_tx_gain_table",
|
||||
"correct_rfpll_offset",
|
||||
"pll_correct_dcap",
|
||||
"txiq_cal_init",
|
||||
"pwdet_sar",
|
||||
"rx_11b_opt",
|
||||
],
|
||||
"wifi_crypto": [
|
||||
"pk_use_ecparams",
|
||||
"process_segments",
|
||||
"ccmp_",
|
||||
"rc4_",
|
||||
"aria_",
|
||||
"mgf_mask",
|
||||
"dh_group",
|
||||
"ccmp_aad_nonce",
|
||||
"ccmp_encrypt",
|
||||
"rc4_skip",
|
||||
"aria_sb1",
|
||||
"aria_sb2",
|
||||
"aria_is1",
|
||||
"aria_is2",
|
||||
"aria_sl",
|
||||
"aria_a",
|
||||
],
|
||||
"radio_control": ["fsm_input", "fsm_sconfreq"],
|
||||
"pbuf": [
|
||||
"pbuf_",
|
||||
],
|
||||
"event_group": ["xEventGroup"],
|
||||
"ringbuffer": ["xRingbuffer", "prvSend", "prvReceive", "prvCopy"],
|
||||
"provisioning": ["prov_", "prov_stop_and_notify"],
|
||||
"scan": ["gScanStruct"],
|
||||
"port": ["xPort"],
|
||||
"elf_loader": [
|
||||
"elf_add",
|
||||
"elf_add_note",
|
||||
"elf_add_segment",
|
||||
"process_image",
|
||||
"read_encoded",
|
||||
"read_encoded_value",
|
||||
"read_encoded_value_with_base",
|
||||
"process_image_header",
|
||||
],
|
||||
"socket_api": [
|
||||
"sockets",
|
||||
"netconn_",
|
||||
"accept_function",
|
||||
"recv_raw",
|
||||
"socket_ipv4_multicast",
|
||||
"socket_ipv6_multicast",
|
||||
],
|
||||
"igmp": ["igmp_", "igmp_send", "igmp_input"],
|
||||
"icmp6": ["icmp6_"],
|
||||
"arp": ["arp_table"],
|
||||
"ampdu": [
|
||||
"ampdu_",
|
||||
"rcAmpdu",
|
||||
"trc_onAmpduOp",
|
||||
"rcAmpduLowerRate",
|
||||
"ampdu_dispatch_upto",
|
||||
],
|
||||
"ieee802_11": ["ieee802_11_", "ieee802_11_parse_elems"],
|
||||
"rate_control": ["rssi_margin", "rcGetSched", "get_rate_fcc_index"],
|
||||
"nan": ["nan_dp_", "nan_dp_post_tx", "nan_dp_delete_peer"],
|
||||
"channel_mgmt": ["chm_init", "chm_set_current_channel"],
|
||||
"trace": ["trc_init", "trc_onAmpduOp"],
|
||||
"country_code": ["country_info", "country_info_24ghz"],
|
||||
"multicore": ["do_multicore_settings"],
|
||||
"Update_lib": ["Update"],
|
||||
"stdio": [
|
||||
"__sf",
|
||||
"__sflush_r",
|
||||
"__srefill_r",
|
||||
"_impure_data",
|
||||
"_reclaim_reent",
|
||||
"_open_r",
|
||||
],
|
||||
"strncpy_ops": ["strncpy"],
|
||||
"math_internal": ["__mdiff", "__lshift", "__mprec_tens", "quorem"],
|
||||
"character_class": ["__chclass"],
|
||||
"camellia": ["camellia_", "camellia_feistel"],
|
||||
"crypto_tables": ["FSb", "FSb2", "FSb3", "FSb4"],
|
||||
"event_buffer": ["g_eb_list_desc", "eb_space"],
|
||||
"base_node": ["base_node_", "base_node_add_handler"],
|
||||
"file_descriptor": ["s_fd_table"],
|
||||
"tx_delay": ["tx_delay_cfg"],
|
||||
"deinit": ["deinit_functions"],
|
||||
"lcp_echo": ["LcpEchoCheck"],
|
||||
"raw_api": ["raw_bind", "raw_connect"],
|
||||
"checksum": ["process_checksum"],
|
||||
"entry_management": ["add_entry"],
|
||||
"esp_ota": ["esp_ota", "ota_", "read_otadata"],
|
||||
"http_server": [
|
||||
"httpd_",
|
||||
"parse_url_char",
|
||||
"cb_headers_complete",
|
||||
"delete_entry",
|
||||
"validate_structure",
|
||||
"config_save",
|
||||
"config_new",
|
||||
"verify_url",
|
||||
"cb_url",
|
||||
],
|
||||
"misc_system": [
|
||||
"alarm_cbs",
|
||||
"start_up",
|
||||
"tokens",
|
||||
"unhex",
|
||||
"osi_funcs_ro",
|
||||
"enum_function",
|
||||
"fragment_and_dispatch",
|
||||
"alarm_set",
|
||||
"osi_alarm_new",
|
||||
"config_set_string",
|
||||
"config_update_newest_section",
|
||||
"config_remove_key",
|
||||
"method_strings",
|
||||
"interop_match",
|
||||
"interop_database",
|
||||
"__state_table",
|
||||
"__action_table",
|
||||
"s_stub_table",
|
||||
"s_context",
|
||||
"s_mmu_ctx",
|
||||
"s_get_bus_mask",
|
||||
"hli_queue_put",
|
||||
"list_remove",
|
||||
"list_delete",
|
||||
"lock_acquire_generic",
|
||||
"is_vect_desc_usable",
|
||||
"io_mode_str",
|
||||
"__c$20233",
|
||||
"interface",
|
||||
"read_id_core",
|
||||
"subscribe_idle",
|
||||
"unsubscribe_idle",
|
||||
"s_clkout_handle",
|
||||
"lock_release_generic",
|
||||
"config_set_int",
|
||||
"config_get_int",
|
||||
"config_get_string",
|
||||
"config_has_key",
|
||||
"config_remove_section",
|
||||
"osi_alarm_init",
|
||||
"osi_alarm_deinit",
|
||||
"fixed_queue_enqueue",
|
||||
"fixed_queue_dequeue",
|
||||
"fixed_queue_new",
|
||||
"fixed_pkt_queue_enqueue",
|
||||
"fixed_pkt_queue_new",
|
||||
"list_append",
|
||||
"list_prepend",
|
||||
"list_insert_after",
|
||||
"list_contains",
|
||||
"list_get_node",
|
||||
"hash_function_blob",
|
||||
"cb_no_body",
|
||||
"cb_on_body",
|
||||
"profile_tab",
|
||||
"get_arg",
|
||||
"trim",
|
||||
"buf$",
|
||||
"process_appended_hash_and_sig$constprop$0",
|
||||
"uuidType",
|
||||
"allocate_svc_db_buf",
|
||||
"_hostname_is_ours",
|
||||
"s_hli_handlers",
|
||||
"tick_cb",
|
||||
"idle_cb",
|
||||
"input",
|
||||
"entry_find",
|
||||
"section_find",
|
||||
"find_bucket_entry_",
|
||||
"config_has_section",
|
||||
"hli_queue_create",
|
||||
"hli_queue_get",
|
||||
"hli_c_handler",
|
||||
"future_ready",
|
||||
"future_await",
|
||||
"future_new",
|
||||
"pkt_queue_enqueue",
|
||||
"pkt_queue_dequeue",
|
||||
"pkt_queue_cleanup",
|
||||
"pkt_queue_create",
|
||||
"pkt_queue_destroy",
|
||||
"fixed_pkt_queue_dequeue",
|
||||
"osi_alarm_cancel",
|
||||
"osi_alarm_is_active",
|
||||
"osi_sem_take",
|
||||
"osi_event_create",
|
||||
"osi_event_bind",
|
||||
"alarm_cb_handler",
|
||||
"list_foreach",
|
||||
"list_back",
|
||||
"list_front",
|
||||
"list_clear",
|
||||
"fixed_queue_try_peek_first",
|
||||
"translate_path",
|
||||
"get_idx",
|
||||
"find_key",
|
||||
"init",
|
||||
"end",
|
||||
"start",
|
||||
"set_read_value",
|
||||
"copy_address_list",
|
||||
"copy_and_key",
|
||||
"sdk_cfg_opts",
|
||||
"leftshift_onebit",
|
||||
"config_section_end",
|
||||
"config_section_begin",
|
||||
"find_entry_and_check_all_reset",
|
||||
"image_validate",
|
||||
"xPendingReadyList",
|
||||
"vListInitialise",
|
||||
"lock_init_generic",
|
||||
"ant_bttx_cfg",
|
||||
"ant_dft_cfg",
|
||||
"cs_send_to_ctrl_sock",
|
||||
"config_llc_util_funcs_reset",
|
||||
"make_set_adv_report_flow_control",
|
||||
"make_set_event_mask",
|
||||
"raw_new",
|
||||
"raw_remove",
|
||||
"BTE_InitStack",
|
||||
"parse_read_local_supported_features_response",
|
||||
"__math_invalidf",
|
||||
"tinytens",
|
||||
"__mprec_tinytens",
|
||||
"__mprec_bigtens",
|
||||
"vRingbufferDelete",
|
||||
"vRingbufferDeleteWithCaps",
|
||||
"vRingbufferReturnItem",
|
||||
"vRingbufferReturnItemFromISR",
|
||||
"get_acl_data_size_ble",
|
||||
"get_features_ble",
|
||||
"get_features_classic",
|
||||
"get_acl_packet_size_ble",
|
||||
"get_acl_packet_size_classic",
|
||||
"supports_extended_inquiry_response",
|
||||
"supports_rssi_with_inquiry_results",
|
||||
"supports_interlaced_inquiry_scan",
|
||||
"supports_reading_remote_extended_features",
|
||||
],
|
||||
"bluetooth_ll": [
|
||||
"lld_pdu_",
|
||||
"ld_acl_",
|
||||
"lld_stop_ind_handler",
|
||||
"lld_evt_winsize_change",
|
||||
"config_lld_evt_funcs_reset",
|
||||
"config_lld_funcs_reset",
|
||||
"config_llm_funcs_reset",
|
||||
"llm_set_long_adv_data",
|
||||
"lld_retry_tx_prog",
|
||||
"llc_link_sup_to_ind_handler",
|
||||
"config_llc_funcs_reset",
|
||||
"lld_evt_rxwin_compute",
|
||||
"config_btdm_funcs_reset",
|
||||
"config_ea_funcs_reset",
|
||||
"llc_defalut_state_tab_reset",
|
||||
"config_rwip_funcs_reset",
|
||||
"ke_lmp_rx_flooding_detect",
|
||||
],
|
||||
}

# Demangled patterns: patterns found in demangled C++ names
DEMANGLED_PATTERNS = {
    "gpio_driver": ["GPIO"],
    "uart_driver": ["UART"],
    "network_stack": [
        "lwip",
        "tcp",
        "udp",
        "ip4",
        "ip6",
        "dhcp",
        "dns",
        "netif",
        "ethernet",
        "ppp",
        "slip",
    ],
    "wifi_stack": ["NetworkInterface"],
    "nimble_bt": [
        "nimble",
        "NimBLE",
        "ble_hs",
        "ble_gap",
        "ble_gatt",
        "ble_att",
        "ble_l2cap",
        "ble_sm",
    ],
    "crypto": ["mbedtls", "crypto", "sha", "aes", "rsa", "ecc", "tls", "ssl"],
    "cpp_stdlib": ["std::", "__gnu_cxx::", "__cxxabiv"],
    "static_init": ["__static_initialization"],
    "rtti": ["__type_info", "__class_type_info"],
    "web_server_lib": ["AsyncWebServer", "AsyncWebHandler", "WebServer"],
    "async_tcp": ["AsyncClient", "AsyncServer"],
    "mdns_lib": ["mdns"],
    "json_lib": [
        "ArduinoJson",
        "JsonDocument",
        "JsonArray",
        "JsonObject",
        "deserialize",
        "serialize",
    ],
    "http_lib": ["HTTP", "http_", "Request", "Response", "Uri", "WebSocket"],
    "logging": ["log", "Log", "print", "Print", "diag_"],
    "authentication": ["checkDigestAuthentication"],
    "libgcc": ["libgcc"],
    "esp_system": ["esp_", "ESP"],
    "arduino": ["arduino"],
    "nvs": ["nvs_", "_ZTVN3nvs", "nvs::"],
    "filesystem": ["spiffs", "vfs"],
    "libc": ["newlib"],
}

# Patterns for categorizing ESPHome core symbols into subcategories
CORE_SUBCATEGORY_PATTERNS = {
    "Component Framework": ["Component"],
    "Application Core": ["Application"],
    "Scheduler": ["Scheduler"],
    "Component Iterator": ["ComponentIterator"],
    "Helper Functions": ["Helpers", "helpers"],
    "Preferences/Storage": ["Preferences", "ESPPreferences"],
    "I/O Utilities": ["HighFrequencyLoopRequester"],
    "String Utilities": ["str_"],
    "Bit Utilities": ["reverse_bits"],
    "Data Conversion": ["convert_"],
    "Network Utilities": ["network", "IPAddress"],
    "API Protocol": ["api::"],
    "WiFi Manager": ["wifi::"],
    "MQTT Client": ["mqtt::"],
    "Logger": ["logger::"],
    "OTA Updates": ["ota::"],
    "Web Server": ["web_server::"],
    "Time Management": ["time::"],
    "Sensor Framework": ["sensor::"],
    "Binary Sensor": ["binary_sensor::"],
    "Switch Framework": ["switch_::"],
    "Light Framework": ["light::"],
    "Climate Framework": ["climate::"],
    "Cover Framework": ["cover::"],
}
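Taken together, these tables support a simple substring-based classification of symbols. The sketch below is illustrative only; the real matching order and precedence rules live in the analyzer module, not in const.py:

# Illustrative sketch (assumed helper, not part of the module).
def classify_symbol(raw_name: str, demangled_name: str) -> str:
    for category, patterns in SYMBOL_PATTERNS.items():
        if any(p in raw_name for p in patterns):
            return category
    for category, patterns in DEMANGLED_PATTERNS.items():
        if any(p in demangled_name for p in patterns):
            return category
    return "other"

classify_symbol("vTaskDelay", "vTaskDelay")  # -> "freertos" ("vTask" is a freertos pattern)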
@@ -1,121 +0,0 @@
"""Helper functions for memory analysis."""

from functools import cache
from pathlib import Path

from .const import SECTION_MAPPING

# Import namespace constant from parent module
# Note: This would create a circular import if done at module level,
# so we'll define it locally here as well
_NAMESPACE_ESPHOME = "esphome::"


# Get the list of actual ESPHome components by scanning the components directory
@cache
def get_esphome_components():
    """Get set of actual ESPHome components from the components directory."""
    # Find the components directory relative to this file
    # Go up two levels from analyze_memory/helpers.py to esphome/
    current_dir = Path(__file__).parent.parent
    components_dir = current_dir / "components"

    if not components_dir.exists() or not components_dir.is_dir():
        return frozenset()

    return frozenset(
        item.name
        for item in components_dir.iterdir()
        if item.is_dir()
        and not item.name.startswith(".")
        and not item.name.startswith("__")
    )


@cache
def get_component_class_patterns(component_name: str) -> list[str]:
    """Generate component class name patterns for symbol matching.

    Args:
        component_name: The component name (e.g., "ota", "wifi", "api")

    Returns:
        List of pattern strings to match against demangled symbols
    """
    component_upper = component_name.upper()
    component_camel = component_name.replace("_", "").title()
    return [
        f"{_NAMESPACE_ESPHOME}{component_upper}Component",  # e.g., esphome::OTAComponent
        f"{_NAMESPACE_ESPHOME}ESPHome{component_upper}Component",  # e.g., esphome::ESPHomeOTAComponent
        f"{_NAMESPACE_ESPHOME}{component_camel}Component",  # e.g., esphome::OtaComponent
        f"{_NAMESPACE_ESPHOME}ESPHome{component_camel}Component",  # e.g., esphome::ESPHomeOtaComponent
    ]


def map_section_name(raw_section: str) -> str | None:
    """Map raw section name to standard section.

    Args:
        raw_section: Raw section name from ELF file (e.g., ".iram0.text", ".rodata.str1.1")

    Returns:
        Standard section name (".text", ".rodata", ".data", ".bss") or None
    """
    for standard_section, patterns in SECTION_MAPPING.items():
        if any(pattern in raw_section for pattern in patterns):
            return standard_section
    return None
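A quick sanity check of the two lookup helpers above, using only values already given in their docstrings and inline comments:

get_component_class_patterns("ota")
# -> ["esphome::OTAComponent", "esphome::ESPHomeOTAComponent",
#     "esphome::OtaComponent", "esphome::ESPHomeOtaComponent"]

map_section_name(".iram0.text")     # -> ".text"
map_section_name(".rodata.str1.1")  # -> ".rodata"
map_section_name(".debug_info")     # -> None (not a tracked section)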


def parse_symbol_line(line: str) -> tuple[str, str, int, str] | None:
    """Parse a single symbol line from objdump output.

    Args:
        line: Line from objdump -t output

    Returns:
        Tuple of (section, name, size, address) or None if not a valid symbol.
        Format: address l/g w/d F/O section size name
        Example: 40084870 l F .iram0.text 00000000 _xt_user_exc
    """
    parts = line.split()
    if len(parts) < 5:
        return None

    try:
        # Validate and extract address
        address = parts[0]
        int(address, 16)
    except ValueError:
        return None

    # Look for F (function) or O (object) flag
    if "F" not in parts and "O" not in parts:
        return None

    # Find section, size, and name
    for i, part in enumerate(parts):
        if not part.startswith("."):
            continue

        section = map_section_name(part)
        if not section:
            break

        # Need at least size field after section
        if i + 1 >= len(parts):
            break

        try:
            size = int(parts[i + 1], 16)
        except ValueError:
            break

        # Need symbol name and non-zero size
        if i + 2 >= len(parts) or size == 0:
            break

        name = " ".join(parts[i + 2 :])
        return (section, name, size, address)

    return None
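For example, fed a typical line of `objdump -t` output (the symbol name below is illustrative), the parser returns a (section, name, size, address) tuple; note that the zero-sized `_xt_user_exc` line quoted in the docstring is deliberately rejected, because zero-sized symbols are skipped:

parse_symbol_line("40084870 g     F .iram0.text 0000012c xt_handler")
# -> (".text", "xt_handler", 300, "40084870")

parse_symbol_line("40084870 l     F .iram0.text 00000000 _xt_user_exc")
# -> None (size == 0)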
@@ -1,48 +1,33 @@
import esphome.codegen as cg
import esphome.config_validation as cv
from esphome.const import (
    CONF_ALL,
    CONF_ANY,
    CONF_AUTOMATION_ID,
    CONF_CONDITION,
    CONF_COUNT,
    CONF_ELSE,
    CONF_ID,
    CONF_THEN,
    CONF_TIME,
    CONF_TIMEOUT,
    CONF_TRIGGER_ID,
    CONF_TYPE_ID,
    CONF_UPDATE_INTERVAL,
    CONF_TIME,
)
from esphome.core import ID
from esphome.cpp_generator import MockObj, MockObjClass, TemplateArgsType
from esphome.schema_extractors import SCHEMA_EXTRACT, schema_extractor
from esphome.types import ConfigType
from esphome.jsonschema import jschema_extractor
from esphome.util import Registry


def maybe_simple_id(*validators):
    """Allow a raw ID to be specified in place of a config block.
    If the value that's being validated is a dictionary, it's passed as-is to the specified validators. Otherwise, it's
    wrapped in a dict that looks like ``{"id": <value>}``, and that dict is then handed off to the specified validators.
    """
    return maybe_conf(CONF_ID, *validators)


def maybe_conf(conf, *validators):
    """Allow a raw value to be specified in place of a config block.
    If the value that's being validated is a dictionary, it's passed as-is to the specified validators. Otherwise, it's
    wrapped in a dict that looks like ``{<conf>: <value>}``, and that dict is then handed off to the specified
    validators.
    (This is a general case of ``maybe_simple_id`` that allows the wrapping key to be something other than ``id``.)
    """
    validator = cv.All(*validators)

    @schema_extractor("maybe")
    @jschema_extractor("maybe")
    def validate(value):
        if value == SCHEMA_EXTRACT:
            return (validator, conf)
        # pylint: disable=comparison-with-callable
        if value == jschema_extractor:
            return validator

        if isinstance(value, dict):
            return validator(value)
@@ -52,11 +37,11 @@ def maybe_conf(conf, *validators):
    return validate
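For readers new to these helpers, a small sketch of the behaviour the docstrings describe; the key and schema below are hypothetical, and the wrapping branch itself falls outside the hunk shown here:

# Hypothetical example - not part of this diff.
validate = maybe_conf("interval", cv.Schema({cv.Required("interval"): cv.string}))
validate("5min")                # non-dict input is wrapped to {"interval": "5min"} before validation
validate({"interval": "5min"})  # dict input is passed to the validators unchanged

# maybe_simple_id() is the same helper hard-wired to the "id" key:
# maybe_simple_id(...)("my_component") validates {"id": "my_component"}.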
|
||||
|
||||
|
||||
def register_action(name: str, action_type: MockObjClass, schema: cv.Schema):
|
||||
def register_action(name, action_type, schema):
|
||||
return ACTION_REGISTRY.register(name, action_type, schema)
|
||||
|
||||
|
||||
def register_condition(name: str, condition_type: MockObjClass, schema: cv.Schema):
|
||||
def register_condition(name, condition_type, schema):
|
||||
return CONDITION_REGISTRY.register(name, condition_type, schema)
|
||||
|
||||
|
||||
@@ -78,13 +63,6 @@ def validate_potentially_and_condition(value):
|
||||
return validate_condition(value)
|
||||
|
||||
|
||||
def validate_potentially_or_condition(value):
|
||||
if isinstance(value, list):
|
||||
with cv.remove_prepend_path(["or"]):
|
||||
return validate_condition({"or": value})
|
||||
return validate_condition(value)
|
||||
|
||||
|
||||
DelayAction = cg.esphome_ns.class_("DelayAction", Action, cg.Component)
|
||||
LambdaAction = cg.esphome_ns.class_("LambdaAction", Action)
|
||||
IfAction = cg.esphome_ns.class_("IfAction", Action)
|
||||
@@ -92,8 +70,6 @@ WhileAction = cg.esphome_ns.class_("WhileAction", Action)
|
||||
RepeatAction = cg.esphome_ns.class_("RepeatAction", Action)
|
||||
WaitUntilAction = cg.esphome_ns.class_("WaitUntilAction", Action, cg.Component)
|
||||
UpdateComponentAction = cg.esphome_ns.class_("UpdateComponentAction", Action)
|
||||
SuspendComponentAction = cg.esphome_ns.class_("SuspendComponentAction", Action)
|
||||
ResumeComponentAction = cg.esphome_ns.class_("ResumeComponentAction", Action)
|
||||
Automation = cg.esphome_ns.class_("Automation")
|
||||
|
||||
LambdaCondition = cg.esphome_ns.class_("LambdaCondition", Condition)
|
||||
@@ -135,9 +111,11 @@ def validate_automation(extra_schema=None, extra_validators=None, single=False):
|
||||
# This should only happen with invalid configs, but let's have a nice error message.
|
||||
return [schema(value)]
|
||||
|
||||
@schema_extractor("automation")
|
||||
@jschema_extractor("automation")
|
||||
def validator(value):
|
||||
if value == SCHEMA_EXTRACT:
|
||||
# hack to get the schema
|
||||
# pylint: disable=comparison-with-callable
|
||||
if value == jschema_extractor:
|
||||
return schema
|
||||
|
||||
value = validator_(value)
|
||||
@@ -163,82 +141,28 @@ AUTOMATION_SCHEMA = cv.Schema(
|
||||
AndCondition = cg.esphome_ns.class_("AndCondition", Condition)
|
||||
OrCondition = cg.esphome_ns.class_("OrCondition", Condition)
|
||||
NotCondition = cg.esphome_ns.class_("NotCondition", Condition)
|
||||
XorCondition = cg.esphome_ns.class_("XorCondition", Condition)
|
||||
|
||||
|
||||
@register_condition("and", AndCondition, validate_condition_list)
|
||||
async def and_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def and_condition_to_code(config, condition_id, template_arg, args):
|
||||
conditions = await build_condition_list(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, conditions)
|
||||
|
||||
|
||||
@register_condition("or", OrCondition, validate_condition_list)
|
||||
async def or_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
conditions = await build_condition_list(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, conditions)
|
||||
|
||||
|
||||
@register_condition("all", AndCondition, validate_condition_list)
|
||||
async def all_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
conditions = await build_condition_list(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, conditions)
|
||||
|
||||
|
||||
@register_condition("any", OrCondition, validate_condition_list)
|
||||
async def any_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def or_condition_to_code(config, condition_id, template_arg, args):
|
||||
conditions = await build_condition_list(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, conditions)
|
||||
|
||||
|
||||
@register_condition("not", NotCondition, validate_potentially_and_condition)
|
||||
async def not_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def not_condition_to_code(config, condition_id, template_arg, args):
|
||||
condition = await build_condition(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, condition)
|
||||
|
||||
|
||||
@register_condition("xor", XorCondition, validate_condition_list)
|
||||
async def xor_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
conditions = await build_condition_list(config, template_arg, args)
|
||||
return cg.new_Pvariable(condition_id, template_arg, conditions)
|
||||
|
||||
|
||||
@register_condition("lambda", LambdaCondition, cv.returning_lambda)
|
||||
async def lambda_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def lambda_condition_to_code(config, condition_id, template_arg, args):
|
||||
lambda_ = await cg.process_lambda(config, args, return_type=bool)
|
||||
return cg.new_Pvariable(condition_id, template_arg, lambda_)
|
||||
|
||||
@@ -255,12 +179,7 @@ async def lambda_condition_to_code(
|
||||
}
|
||||
).extend(cv.COMPONENT_SCHEMA),
|
||||
)
|
||||
async def for_condition_to_code(
|
||||
config: ConfigType,
|
||||
condition_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def for_condition_to_code(config, condition_id, template_arg, args):
|
||||
condition = await build_condition(
|
||||
config[CONF_CONDITION], cg.TemplateArguments(), []
|
||||
)
|
||||
@@ -274,12 +193,7 @@ async def for_condition_to_code(
|
||||
@register_action(
|
||||
"delay", DelayAction, cv.templatable(cv.positive_time_period_milliseconds)
|
||||
)
|
||||
async def delay_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def delay_action_to_code(config, action_id, template_arg, args):
|
||||
var = cg.new_Pvariable(action_id, template_arg)
|
||||
await cg.register_component(var, {})
|
||||
template_ = await cg.templatable(config, args, cg.uint32)
|
||||
@@ -292,27 +206,16 @@ async def delay_action_to_code(
|
||||
IfAction,
|
||||
cv.All(
|
||||
{
|
||||
cv.Exclusive(
|
||||
CONF_CONDITION, CONF_CONDITION
|
||||
): validate_potentially_and_condition,
|
||||
cv.Exclusive(CONF_ANY, CONF_CONDITION): validate_potentially_or_condition,
|
||||
cv.Exclusive(CONF_ALL, CONF_CONDITION): validate_potentially_and_condition,
|
||||
cv.Required(CONF_CONDITION): validate_potentially_and_condition,
|
||||
cv.Optional(CONF_THEN): validate_action_list,
|
||||
cv.Optional(CONF_ELSE): validate_action_list,
|
||||
},
|
||||
cv.has_at_least_one_key(CONF_THEN, CONF_ELSE),
|
||||
cv.has_at_least_one_key(CONF_CONDITION, CONF_ANY, CONF_ALL),
|
||||
),
|
||||
)
|
||||
async def if_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
cond_conf = next(el for el in config if el in (CONF_ANY, CONF_ALL, CONF_CONDITION))
|
||||
condition = await build_condition(config[cond_conf], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, condition)
|
||||
async def if_action_to_code(config, action_id, template_arg, args):
|
||||
conditions = await build_condition(config[CONF_CONDITION], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, conditions)
|
||||
if CONF_THEN in config:
|
||||
actions = await build_action_list(config[CONF_THEN], template_arg, args)
|
||||
cg.add(var.add_then(actions))
|
||||
@@ -332,14 +235,9 @@ async def if_action_to_code(
|
||||
}
|
||||
),
|
||||
)
|
||||
async def while_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
condition = await build_condition(config[CONF_CONDITION], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, condition)
|
||||
async def while_action_to_code(config, action_id, template_arg, args):
|
||||
conditions = await build_condition(config[CONF_CONDITION], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, conditions)
|
||||
actions = await build_action_list(config[CONF_THEN], template_arg, args)
|
||||
cg.add(var.add_then(actions))
|
||||
return var
|
||||
@@ -355,42 +253,33 @@ async def while_action_to_code(
|
||||
}
|
||||
),
|
||||
)
|
||||
async def repeat_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def repeat_action_to_code(config, action_id, template_arg, args):
|
||||
var = cg.new_Pvariable(action_id, template_arg)
|
||||
count_template = await cg.templatable(config[CONF_COUNT], args, cg.uint32)
|
||||
cg.add(var.set_count(count_template))
|
||||
actions = await build_action_list(
|
||||
config[CONF_THEN],
|
||||
cg.TemplateArguments(cg.uint32, *template_arg.args),
|
||||
[(cg.uint32, "iteration"), *args],
|
||||
)
|
||||
actions = await build_action_list(config[CONF_THEN], template_arg, args)
|
||||
cg.add(var.add_then(actions))
|
||||
return var
|
||||
|
||||
|
||||
_validate_wait_until = cv.maybe_simple_value(
|
||||
{
|
||||
cv.Required(CONF_CONDITION): validate_potentially_and_condition,
|
||||
cv.Optional(CONF_TIMEOUT): cv.templatable(cv.positive_time_period_milliseconds),
|
||||
},
|
||||
key=CONF_CONDITION,
|
||||
)
|
||||
def validate_wait_until(value):
|
||||
schema = cv.Schema(
|
||||
{
|
||||
cv.Required(CONF_CONDITION): validate_potentially_and_condition,
|
||||
cv.Optional(CONF_TIMEOUT): cv.templatable(
|
||||
cv.positive_time_period_milliseconds
|
||||
),
|
||||
}
|
||||
)
|
||||
if isinstance(value, dict) and CONF_CONDITION in value:
|
||||
return schema(value)
|
||||
return validate_wait_until({CONF_CONDITION: value})
|
||||
|
||||
|
||||
@register_action("wait_until", WaitUntilAction, _validate_wait_until)
|
||||
async def wait_until_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
condition = await build_condition(config[CONF_CONDITION], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, condition)
|
||||
@register_action("wait_until", WaitUntilAction, validate_wait_until)
|
||||
async def wait_until_action_to_code(config, action_id, template_arg, args):
|
||||
conditions = await build_condition(config[CONF_CONDITION], template_arg, args)
|
||||
var = cg.new_Pvariable(action_id, template_arg, conditions)
|
||||
if CONF_TIMEOUT in config:
|
||||
template_ = await cg.templatable(config[CONF_TIMEOUT], args, cg.uint32)
|
||||
cg.add(var.set_timeout_value(template_))
|
||||
@@ -399,12 +288,7 @@ async def wait_until_action_to_code(
|
||||
|
||||
|
||||
@register_action("lambda", LambdaAction, cv.lambda_)
|
||||
async def lambda_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def lambda_action_to_code(config, action_id, template_arg, args):
|
||||
lambda_ = await cg.process_lambda(config, args, return_type=cg.void)
|
||||
return cg.new_Pvariable(action_id, template_arg, lambda_)
|
||||
|
||||
@@ -418,106 +302,48 @@ async def lambda_action_to_code(
|
||||
}
|
||||
),
|
||||
)
|
||||
async def component_update_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
async def component_update_action_to_code(config, action_id, template_arg, args):
|
||||
comp = await cg.get_variable(config[CONF_ID])
|
||||
return cg.new_Pvariable(action_id, template_arg, comp)
|
||||
|
||||
|
||||
@register_action(
|
||||
"component.suspend",
|
||||
SuspendComponentAction,
|
||||
maybe_simple_id(
|
||||
{
|
||||
cv.Required(CONF_ID): cv.use_id(cg.PollingComponent),
|
||||
}
|
||||
),
|
||||
)
|
||||
async def component_suspend_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
comp = await cg.get_variable(config[CONF_ID])
|
||||
return cg.new_Pvariable(action_id, template_arg, comp)
|
||||
|
||||
|
||||
@register_action(
|
||||
"component.resume",
|
||||
ResumeComponentAction,
|
||||
maybe_simple_id(
|
||||
{
|
||||
cv.Required(CONF_ID): cv.use_id(cg.PollingComponent),
|
||||
cv.Optional(CONF_UPDATE_INTERVAL): cv.templatable(
|
||||
cv.positive_time_period_milliseconds
|
||||
),
|
||||
}
|
||||
),
|
||||
)
|
||||
async def component_resume_action_to_code(
|
||||
config: ConfigType,
|
||||
action_id: ID,
|
||||
template_arg: cg.TemplateArguments,
|
||||
args: TemplateArgsType,
|
||||
) -> MockObj:
|
||||
comp = await cg.get_variable(config[CONF_ID])
|
||||
var = cg.new_Pvariable(action_id, template_arg, comp)
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
template_ = await cg.templatable(config[CONF_UPDATE_INTERVAL], args, int)
|
||||
cg.add(var.set_update_interval(template_))
|
||||
return var
|
||||
|
||||
|
||||
async def build_action(
|
||||
full_config: ConfigType, template_arg: cg.TemplateArguments, args: TemplateArgsType
|
||||
) -> MockObj:
|
||||
async def build_action(full_config, template_arg, args):
|
||||
registry_entry, config = cg.extract_registry_entry_config(
|
||||
ACTION_REGISTRY, full_config
|
||||
)
|
||||
action_id = full_config[CONF_TYPE_ID]
|
||||
builder = registry_entry.coroutine_fun
|
||||
return await builder(config, action_id, template_arg, args)
|
||||
ret = await builder(config, action_id, template_arg, args)
|
||||
return ret
|
||||
|
||||
|
||||
async def build_action_list(
|
||||
config: list[ConfigType], templ: cg.TemplateArguments, arg_type: TemplateArgsType
|
||||
) -> list[MockObj]:
|
||||
actions: list[MockObj] = []
|
||||
async def build_action_list(config, templ, arg_type):
|
||||
actions = []
|
||||
for conf in config:
|
||||
action = await build_action(conf, templ, arg_type)
|
||||
actions.append(action)
|
||||
return actions
|
||||
|
||||
|
||||
async def build_condition(
|
||||
full_config: ConfigType, template_arg: cg.TemplateArguments, args: TemplateArgsType
|
||||
) -> MockObj:
|
||||
async def build_condition(full_config, template_arg, args):
|
||||
registry_entry, config = cg.extract_registry_entry_config(
|
||||
CONDITION_REGISTRY, full_config
|
||||
)
|
||||
action_id = full_config[CONF_TYPE_ID]
|
||||
builder = registry_entry.coroutine_fun
|
||||
return await builder(config, action_id, template_arg, args)
|
||||
ret = await builder(config, action_id, template_arg, args)
|
||||
return ret
|
||||
|
||||
|
||||
async def build_condition_list(
|
||||
config: ConfigType, templ: cg.TemplateArguments, args: TemplateArgsType
|
||||
) -> list[MockObj]:
|
||||
conditions: list[MockObj] = []
|
||||
async def build_condition_list(config, templ, args):
|
||||
conditions = []
|
||||
for conf in config:
|
||||
condition = await build_condition(conf, templ, args)
|
||||
conditions.append(condition)
|
||||
return conditions
|
||||
|
||||
|
||||
async def build_automation(
|
||||
trigger: MockObj, args: TemplateArgsType, config: ConfigType
|
||||
) -> MockObj:
|
||||
async def build_automation(trigger, args, config):
|
||||
arg_types = [arg[0] for arg in args]
|
||||
templ = cg.TemplateArguments(*arg_types)
|
||||
obj = cg.new_Pvariable(config[CONF_AUTOMATION_ID], templ, trigger)
|
||||
|
@@ -1,100 +0,0 @@
from esphome.const import __version__
from esphome.core import CORE
from esphome.helpers import mkdir_p, read_file, write_file_if_changed
from esphome.writer import find_begin_end, update_storage_json

INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"

INI_BASE_FORMAT = (
    """; Auto generated code by esphome

[common]
lib_deps =
build_flags =
upload_flags =

""",
    """

""",
)


def format_ini(data: dict[str, str | list[str]]) -> str:
    content = ""
    for key, value in sorted(data.items()):
        if isinstance(value, list):
            content += f"{key} =\n"
            for x in value:
                content += f"    {x}\n"
        else:
            content += f"{key} = {value}\n"
    return content
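For example, format_ini() renders scalar options as simple key = value pairs and list options as indented blocks, with keys emitted in sorted order:

format_ini({"board": "esp32dev", "lib_deps": ["ArduinoJson", "${common.lib_deps}"]})
# board = esp32dev
# lib_deps =
#     ArduinoJson
#     ${common.lib_deps}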


def get_ini_content():
    CORE.add_platformio_option(
        "lib_deps",
        [x.as_lib_dep for x in CORE.platformio_libraries.values()]
        + ["${common.lib_deps}"],
    )
    # Sort to avoid changing build flags order
    CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))

    # Sort to avoid changing build unflags order
    CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))

    # Add extra script for C++ flags
    CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])

    content = "[platformio]\n"
    content += f"description = ESPHome {__version__}\n"

    content += f"[env:{CORE.name}]\n"
    content += format_ini(CORE.platformio_options)

    return content


def write_ini(content):
    update_storage_json()
    path = CORE.relative_build_path("platformio.ini")

    if path.is_file():
        text = read_file(path)
        content_format = find_begin_end(
            text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
        )
    else:
        content_format = INI_BASE_FORMAT
    full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
    full_file += INI_AUTO_GENERATE_END + content_format[1]
    write_file_if_changed(path, full_file)


def write_project():
    mkdir_p(CORE.build_path)

    content = get_ini_content()
    write_ini(content)

    # Write extra script for C++ specific flags
    write_cxx_flags_script()


CXX_FLAGS_FILE_NAME = "cxx_flags.py"
CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
Import("env")

# Add C++ specific flags
"""


def write_cxx_flags_script() -> None:
    path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
    contents = CXX_FLAGS_FILE_CONTENTS
    if not CORE.is_host:
        contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
    contents += "\n"
    write_file_if_changed(path, contents)
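Putting get_ini_content() and write_ini() together, the generated platformio.ini ends up shaped roughly like this; the device name, version string and option set below are illustrative, not taken from this diff:

; Auto generated code by esphome

[common]
lib_deps =
build_flags =
upload_flags =

; ========== AUTO GENERATED CODE BEGIN ===========
[platformio]
description = ESPHome 2025.9.0
[env:my-device]
board = esp32dev
extra_scripts =
    pre:cxx_flags.py
lib_deps =
    ${common.lib_deps}
; =========== AUTO GENERATED CODE END ============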