Mirror of https://github.com/esphome/esphome.git (synced 2025-08-02 00:17:48 +00:00)

Merge remote-tracking branch 'upstream/dev' into integration

This commit is contained in: commit f9357f5e6e

.ai/instructions.md (new file, 222 lines)
@@ -0,0 +1,222 @@
# ESPHome AI Collaboration Guide

This document provides essential context for AI models interacting with this project. Adhering to these guidelines will ensure consistency and maintain code quality.

## 1. Project Overview & Purpose

* **Primary Goal:** ESPHome is a system to configure microcontrollers (like ESP32, ESP8266, RP2040, and LibreTiny-based chips) using simple yet powerful YAML configuration files. It generates C++ firmware that can be compiled and flashed to these devices, allowing users to control them remotely through home automation systems.
* **Business Domain:** Internet of Things (IoT), Home Automation.
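
As an illustration of the YAML-driven approach, a minimal configuration might look like the following sketch (the device name and board are placeholder values; the exact keys depend on the target platform):

```yaml
# Hypothetical minimal configuration for an ESP32 board
esphome:
  name: example-device

esp32:
  board: esp32dev
```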

## 2. Core Technologies & Stack

* **Languages:** Python (>=3.10), C++ (gnu++20)
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
* **Key Libraries/Dependencies:**
    * **Python:** `voluptuous` (for configuration validation), `PyYAML` (for parsing configuration files), `paho-mqtt` (for MQTT communication), `tornado` (for the web server), `aioesphomeapi` (for the native API).
    * **C++:** `ArduinoJson` (for JSON serialization/deserialization), `AsyncMqttClient-esphome` (for MQTT), `ESPAsyncWebServer` (for the web server).
* **Package Manager(s):** `pip` (for Python dependencies), `platformio` (for C++/PlatformIO dependencies).
* **Communication Protocols:** Protobuf (for native API), MQTT, HTTP.

## 3. Architectural Patterns

* **Overall Architecture:** The project follows a code-generation architecture. The Python code parses user-defined YAML configuration files and generates C++ source code. This C++ code is then compiled and flashed to the target microcontroller using PlatformIO.
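
The config-to-C++ idea can be sketched in miniature. This is a toy illustration only, not ESPHome's actual generator; all names (`generate_cpp`, the config keys) are invented for the example:

```python
# Toy sketch of the code-generation idea: a validated config dict is
# turned into C++ source text. Not ESPHome's real implementation.
def generate_cpp(config: dict) -> str:
    # Declare the component instance
    lines = [f'auto *{config["id"]} = new {config["class"]}();']
    # Emit one setter call per configured parameter
    for key, value in config.get("params", {}).items():
        lines.append(f'{config["id"]}->set_{key}({value});')
    return "\n".join(lines)

cpp = generate_cpp({"id": "var1", "class": "MyComponent", "params": {"param": 42}})
```

The real pipeline does far more (validation, dependency resolution, build-flag management), but the shape is the same: Python objects in, C++ text out.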

* **Directory Structure Philosophy:**
    * `/esphome`: Contains the core Python source code for the ESPHome application.
    * `/esphome/components`: Contains the individual components that can be used in ESPHome configurations. Each component is a self-contained unit with its own C++ and Python code.
    * `/tests`: Contains all unit and integration tests for the Python code.
    * `/docker`: Contains Docker-related files for building and running ESPHome in a container.
    * `/script`: Contains helper scripts for development and maintenance.

* **Core Architectural Components:**
    1. **Configuration System** (`esphome/config*.py`): Handles YAML parsing and validation using Voluptuous, schema definitions, and multi-platform configurations.
    2. **Code Generation** (`esphome/codegen.py`, `esphome/cpp_generator.py`): Manages Python-to-C++ code generation, template processing, and build-flag management.
    3. **Component System** (`esphome/components/`): Contains modular hardware and software components with platform-specific implementations and dependency management.
    4. **Core Framework** (`esphome/core/`): Manages the application lifecycle, hardware abstraction, and component registration.
    5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.

* **Platform Support:**
    1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (S2, S3, C3, etc.) and both the ESP-IDF and Arduino frameworks.
    2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
    3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
    4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.

## 4. Coding Conventions & Style Guide

* **Formatting:**
    * **Python:** Uses `ruff` and `flake8` for linting and formatting. Configuration is in `pyproject.toml`.
    * **C++:** Uses `clang-format` for formatting. Configuration is in `.clang-format`.

* **Naming Conventions:**
    * **Python:** Follows PEP 8. Use clear, descriptive snake_case names.
    * **C++:** Follows the Google C++ Style Guide.

* **Component Structure:**
    * **Standard Files:**

        ```
        components/[component_name]/
        ├── __init__.py        # Component configuration schema and code generation
        ├── [component].h      # C++ header file (if needed)
        ├── [component].cpp    # C++ implementation (if needed)
        └── [platform]/        # Platform-specific implementations
            ├── __init__.py    # Platform-specific configuration
            ├── [platform].h   # Platform C++ header
            └── [platform].cpp # Platform C++ implementation
        ```

    * **Component Metadata:**
        - `DEPENDENCIES`: List of required components
        - `AUTO_LOAD`: Components to automatically load
        - `CONFLICTS_WITH`: Incompatible components
        - `CODEOWNERS`: GitHub usernames responsible for maintenance
        - `MULTI_CONF`: Whether multiple instances are allowed
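
        In a component's `__init__.py`, these metadata entries are plain module-level constants. A hedged sketch (component and user names are placeholders, not real components):

        ```python
        # Hypothetical metadata for components/my_component/__init__.py
        DEPENDENCIES = ["i2c"]            # requires the i2c component to be configured
        AUTO_LOAD = ["sensor"]            # sensor is loaded automatically alongside this component
        CONFLICTS_WITH = ["my_other_component"]  # cannot be used together with this component
        CODEOWNERS = ["@example-user"]    # GitHub user responsible for maintenance
        MULTI_CONF = True                 # multiple instances of this component are allowed
        ```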

* **Code Generation & Common Patterns:**
    * **Configuration Schema Pattern:**

        ```python
        import esphome.codegen as cg
        import esphome.config_validation as cv
        from esphome.const import CONF_ID, CONF_KEY

        CONF_PARAM = "param"  # A constant that does not yet exist in esphome/const.py

        my_component_ns = cg.esphome_ns.namespace("my_component")
        MyComponent = my_component_ns.class_("MyComponent", cg.Component)

        CONFIG_SCHEMA = cv.Schema({
            cv.GenerateID(): cv.declare_id(MyComponent),
            cv.Required(CONF_KEY): cv.string,
            cv.Optional(CONF_PARAM, default=42): cv.int_,
        }).extend(cv.COMPONENT_SCHEMA)

        async def to_code(config):
            var = cg.new_Pvariable(config[CONF_ID])
            await cg.register_component(var, config)
            cg.add(var.set_key(config[CONF_KEY]))
            cg.add(var.set_param(config[CONF_PARAM]))
        ```

    * **C++ Class Pattern:**

        ```cpp
        namespace esphome {
        namespace my_component {

        class MyComponent : public Component {
         public:
          void setup() override;
          void loop() override;
          void dump_config() override;

          void set_key(const std::string &key) { this->key_ = key; }
          void set_param(int param) { this->param_ = param; }

         protected:
          std::string key_;
          int param_{0};
        };

        }  // namespace my_component
        }  // namespace esphome
        ```

    * **Common Component Examples:**
        - **Sensor:**

            ```python
            from esphome.components import sensor

            CONFIG_SCHEMA = sensor.sensor_schema(MySensor).extend(cv.polling_component_schema("60s"))

            async def to_code(config):
                var = await sensor.new_sensor(config)
                await cg.register_component(var, config)
            ```

        - **Binary Sensor:**

            ```python
            from esphome.components import binary_sensor

            CONFIG_SCHEMA = binary_sensor.binary_sensor_schema().extend({ ... })

            async def to_code(config):
                var = await binary_sensor.new_binary_sensor(config)
            ```

        - **Switch:**

            ```python
            from esphome.components import switch

            CONFIG_SCHEMA = switch.switch_schema().extend({ ... })

            async def to_code(config):
                var = await switch.new_switch(config)
            ```

    * **Configuration Validation:**
        * **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
        * **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
        * **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `cv.only_with_arduino`.
        * **Schema Extensions:**

            ```python
            CONFIG_SCHEMA = (
                cv.Schema({ ... })
                .extend(cv.COMPONENT_SCHEMA)
                .extend(uart.UART_DEVICE_SCHEMA)
                .extend(i2c.i2c_device_schema(0x48))
                .extend(spi.spi_device_schema(cs_pin_required=True))
            )
            ```

## 5. Key Files & Entrypoints

* **Main Entrypoint(s):** `esphome/__main__.py` is the main entrypoint for the ESPHome command-line interface.
* **Configuration:**
    * `pyproject.toml`: Defines the Python project metadata and dependencies.
    * `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
    * `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.

## 6. Development & Testing Workflow

* **Local Development Environment:** Use the provided Docker container or create a Python virtual environment and install dependencies from `requirements_dev.txt`.
* **Running Commands:** Use the `script/run-in-env.py` script to execute commands within the project's virtual environment. For example, to run the linter: `python3 script/run-in-env.py pre-commit run`.
* **Testing:**
    * **Python:** Run unit tests with `pytest`.
    * **C++:** Use `clang-tidy` for static analysis.
    * **Component Tests:** YAML-based compilation tests are located in `tests/`. The structure is as follows:

        ```
        tests/
        ├── test_build_components/   # Base test configurations
        └── components/[component]/  # Component-specific tests
        ```

        Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.

* **Debugging and Troubleshooting:**
    * **Debug Tools:**
        - `esphome config <file>.yaml` to validate configuration.
        - `esphome compile <file>.yaml` to compile without uploading.
        - Check the Dashboard for real-time logs.
        - Use component-specific debug logging.
    * **Common Issues:**
        - **Import Errors:** Check component dependencies and `PYTHONPATH`.
        - **Validation Errors:** Review configuration schema definitions.
        - **Build Errors:** Check platform compatibility and library versions.
        - **Runtime Errors:** Review generated C++ code and component logic.

## 7. Specific Instructions for AI Collaboration

* **Contribution Workflow (Pull Request Process):**
    1. **Fork & Branch:** Create a new branch in your fork.
    2. **Make Changes:** Adhere to all coding conventions and patterns.
    3. **Test:** Create component tests for all supported platforms and run the full test suite locally.
    4. **Lint:** Run `pre-commit` to ensure the code is compliant.
    5. **Commit:** Commit your changes. There is no strict format for commit messages.
    6. **Pull Request:** Submit a PR against the `dev` branch. The pull request title should be prefixed with the component being worked on (e.g., `[display] Fix bug`, `[abc123] Add new component`). Update documentation and examples, and add `CODEOWNERS` entries as needed. Pull requests should always be submitted with `PULL_REQUEST_TEMPLATE.md` filled out correctly.

* **Documentation Contributions:**
    * Documentation is hosted in the separate `esphome/esphome-docs` repository.
    * The contribution workflow is the same as for the codebase.

* **Best Practices:**
    * **Component Development:** Keep dependencies minimal, provide clear error messages, and write comprehensive docstrings and tests.
    * **Code Generation:** Generate minimal and efficient C++ code. Validate all user inputs thoroughly. Support multiple platform variations.
    * **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.

* **Security:** Be mindful of security when making changes to the API, web server, or any other network-related code. Do not hardcode secrets or keys.

* **Dependencies & Build System Integration:**
    * **Python:** When adding a new Python dependency, add it to the appropriate `requirements*.txt` file and `pyproject.toml`.
    * **C++ / PlatformIO:** When adding a new C++ dependency, add it to `platformio.ini` and use `cg.add_library`.
    * **Build Flags:** Use `cg.add_build_flag(...)` to add compiler flags.

.github/copilot-instructions.md (vendored symbolic link, 1 line, new file)
@@ -0,0 +1 @@
../.ai/instructions.md

.github/dependabot.yml (vendored, 9 additions)
@@ -9,6 +9,9 @@ updates:
       # Hypotehsis is only used for testing and is updated quite often
       - dependency-name: hypothesis
   - package-ecosystem: github-actions
+    labels:
+      - "dependencies"
+      - "github-actions"
     directory: "/"
     schedule:
       interval: daily
@@ -20,11 +23,17 @@ updates:
       - "docker/login-action"
       - "docker/setup-buildx-action"
   - package-ecosystem: github-actions
+    labels:
+      - "dependencies"
+      - "github-actions"
     directory: "/.github/actions/build-image"
     schedule:
       interval: daily
     open-pull-requests-limit: 10
   - package-ecosystem: github-actions
+    labels:
+      - "dependencies"
+      - "github-actions"
     directory: "/.github/actions/restore-python"
     schedule:
       interval: daily

.github/workflows/auto-label-pr.yml (vendored, 450 lines, new file)
@@ -0,0 +1,450 @@
name: Auto Label PR

on:
  # Runs only on pull_request_target due to having access to an App token.
  # This means PRs from forks will not be able to alter this workflow to get the tokens
  pull_request_target:
    types: [labeled, opened, reopened, synchronize, edited]

permissions:
  pull-requests: write
  contents: read

env:
  TARGET_PLATFORMS: |
    esp32
    esp8266
    rp2040
    libretiny
    bk72xx
    rtl87xx
    ln882x
    nrf52
    host
  PLATFORM_COMPONENTS: |
    alarm_control_panel
    audio_adc
    audio_dac
    binary_sensor
    button
    canbus
    climate
    cover
    datetime
    display
    event
    fan
    light
    lock
    media_player
    microphone
    number
    one_wire
    ota
    output
    packet_transport
    select
    sensor
    speaker
    stepper
    switch
    text
    text_sensor
    time
    touchscreen
    update
    valve
  SMALL_PR_THRESHOLD: 30
  MAX_LABELS: 15

jobs:
  label:
    runs-on: ubuntu-latest
    if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
    steps:
      - name: Checkout
        uses: actions/checkout@v4.2.2

      - name: Get changes
        id: changes
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          # Get PR number
          pr_number="${{ github.event.pull_request.number }}"

          # Get list of changed files using gh CLI
          files=$(gh pr diff $pr_number --name-only)
          echo "files<<EOF" >> $GITHUB_OUTPUT
          echo "$files" >> $GITHUB_OUTPUT
          echo "EOF" >> $GITHUB_OUTPUT

          # Get file stats (additions + deletions) using gh CLI
          stats=$(gh pr view $pr_number --json files --jq '.files | map(.additions + .deletions) | add')
          echo "total_changes=${stats:-0}" >> $GITHUB_OUTPUT

      - name: Generate a token
        id: generate-token
        uses: actions/create-github-app-token@v2
        with:
          app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
          private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}

      - name: Auto Label PR
        uses: actions/github-script@v7.0.1
        with:
          github-token: ${{ steps.generate-token.outputs.token }}
          script: |
            const fs = require('fs');

            const { owner, repo } = context.repo;
            const pr_number = context.issue.number;

            // Get current labels
            const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
              owner,
              repo,
              issue_number: pr_number
            });
            const currentLabels = currentLabelsData.map(label => label.name);

            // Define managed labels that this workflow controls
            const managedLabels = currentLabels.filter(label =>
              label.startsWith('component: ') ||
              [
                'new-component',
                'new-platform',
                'new-target-platform',
                'merging-to-release',
                'merging-to-beta',
                'core',
                'small-pr',
                'dashboard',
                'github-actions',
                'by-code-owner',
                'has-tests',
                'needs-tests',
                'needs-docs',
                'too-big',
                'labeller-recheck'
              ].includes(label)
            );

            console.log('Current labels:', currentLabels.join(', '));
            console.log('Managed labels:', managedLabels.join(', '));

            // Get changed files
            const changedFiles = `${{ steps.changes.outputs.files }}`.split('\n').filter(f => f.length > 0);
            const totalChanges = parseInt('${{ steps.changes.outputs.total_changes }}') || 0;

            console.log('Changed files:', changedFiles.length);
            console.log('Total changes:', totalChanges);

            const labels = new Set();

            // Get environment variables
            const targetPlatforms = `${{ env.TARGET_PLATFORMS }}`.split('\n').filter(p => p.trim().length > 0).map(p => p.trim());
            const platformComponents = `${{ env.PLATFORM_COMPONENTS }}`.split('\n').filter(p => p.trim().length > 0).map(p => p.trim());
            const smallPrThreshold = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
            const maxLabels = parseInt('${{ env.MAX_LABELS }}');

            // Strategy: Merge to release or beta branch
            const baseRef = context.payload.pull_request.base.ref;
            if (baseRef !== 'dev') {
              if (baseRef === 'release') {
                labels.add('merging-to-release');
              } else if (baseRef === 'beta') {
                labels.add('merging-to-beta');
              }

              // When targeting non-dev branches, only use merge warning labels
              const finalLabels = Array.from(labels);
              console.log('Computed labels (merge branch only):', finalLabels.join(', '));

              // Add new labels
              if (finalLabels.length > 0) {
                console.log(`Adding labels: ${finalLabels.join(', ')}`);
                await github.rest.issues.addLabels({
                  owner,
                  repo,
                  issue_number: pr_number,
                  labels: finalLabels
                });
              }

              // Remove old managed labels that are no longer needed
              const labelsToRemove = managedLabels.filter(label =>
                !finalLabels.includes(label)
              );

              for (const label of labelsToRemove) {
                console.log(`Removing label: ${label}`);
                try {
                  await github.rest.issues.removeLabel({
                    owner,
                    repo,
                    issue_number: pr_number,
                    name: label
                  });
                } catch (error) {
                  console.log(`Failed to remove label ${label}:`, error.message);
                }
              }

              return; // Exit early, don't process other strategies
            }
            // Strategy: Component and Platform labeling
            const componentRegex = /^esphome\/components\/([^\/]+)\//;
            const targetPlatformRegex = new RegExp(`^esphome\/components\/(${targetPlatforms.join('|')})/`);

            for (const file of changedFiles) {
              // Check for component changes
              const componentMatch = file.match(componentRegex);
              if (componentMatch) {
                const component = componentMatch[1];
                labels.add(`component: ${component}`);
              }

              // Check for target platform changes
              const platformMatch = file.match(targetPlatformRegex);
              if (platformMatch) {
                const targetPlatform = platformMatch[1];
                labels.add(`platform: ${targetPlatform}`);
              }
            }

            // Get PR files for new component/platform detection
            const { data: prFiles } = await github.rest.pulls.listFiles({
              owner,
              repo,
              pull_number: pr_number
            });

            const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);

            // Strategy: New Component detection
            for (const file of addedFiles) {
              // Check for new component files: esphome/components/{component}/__init__.py
              const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
              if (componentMatch) {
                try {
                  // Read the content directly from the filesystem since we have it checked out
                  const content = fs.readFileSync(file, 'utf8');

                  // Strategy: New Target Platform detection
                  if (content.includes('IS_TARGET_PLATFORM = True')) {
                    labels.add('new-target-platform');
                  }
                  labels.add('new-component');
                } catch (error) {
                  console.log(`Failed to read content of ${file}:`, error.message);
                  // Fallback: assume it's a new component if we can't read the content
                  labels.add('new-component');
                }
              }
            }

            // Strategy: New Platform detection
            for (const file of addedFiles) {
              // Check for new platform files: esphome/components/{component}/{platform}.py
              const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
              if (platformFileMatch) {
                const [, component, platform] = platformFileMatch;
                if (platformComponents.includes(platform)) {
                  labels.add('new-platform');
                }
              }

              // Check for new platform files: esphome/components/{component}/{platform}/__init__.py
              const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
              if (platformDirMatch) {
                const [, component, platform] = platformDirMatch;
                if (platformComponents.includes(platform)) {
                  labels.add('new-platform');
                }
              }
            }

            const coreFiles = changedFiles.filter(file =>
              file.startsWith('esphome/core/') ||
              (file.startsWith('esphome/') && file.split('/').length === 2)
            );

            if (coreFiles.length > 0) {
              labels.add('core');
            }

            // Strategy: Small PR detection
            if (totalChanges <= smallPrThreshold) {
              labels.add('small-pr');
            }

            // Strategy: Dashboard changes
            const dashboardFiles = changedFiles.filter(file =>
              file.startsWith('esphome/dashboard/') ||
              file.startsWith('esphome/components/dashboard_import/')
            );

            if (dashboardFiles.length > 0) {
              labels.add('dashboard');
            }

            // Strategy: GitHub Actions changes
            const githubActionsFiles = changedFiles.filter(file =>
              file.startsWith('.github/workflows/')
            );

            if (githubActionsFiles.length > 0) {
              labels.add('github-actions');
            }

            // Strategy: Code Owner detection
            try {
              // Fetch CODEOWNERS file from the repository (in case it was changed in this PR)
              const { data: codeownersFile } = await github.rest.repos.getContent({
                owner,
                repo,
                path: '.github/CODEOWNERS',
                ref: context.payload.pull_request.head.sha
              });

              const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
              const prAuthor = context.payload.pull_request.user.login;

              // Parse CODEOWNERS file
              const codeownersLines = codeownersContent.split('\n')
                .map(line => line.trim())
                .filter(line => line && !line.startsWith('#'));

              let isCodeOwner = false;

              // Precompile CODEOWNERS patterns into regex objects
              const codeownersRegexes = codeownersLines.map(line => {
                const parts = line.split(/\s+/);
                const pattern = parts[0];
                const owners = parts.slice(1);

                let regex;
                if (pattern.endsWith('*')) {
                  // Directory pattern like "esphome/components/api/*"
                  const dir = pattern.slice(0, -1);
                  regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
                } else if (pattern.includes('*')) {
                  // Glob pattern
                  const regexPattern = pattern
                    .replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
                    .replace(/\\*/g, '.*');
                  regex = new RegExp(`^${regexPattern}$`);
                } else {
                  // Exact match
                  regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
                }

                return { regex, owners };
              });

              for (const file of changedFiles) {
                for (const { regex, owners } of codeownersRegexes) {
                  if (regex.test(file)) {
                    // Check if PR author is in the owners list
                    if (owners.some(owner => owner === `@${prAuthor}`)) {
                      isCodeOwner = true;
                      break;
                    }
                  }
                }
                if (isCodeOwner) break;
              }

              if (isCodeOwner) {
                labels.add('by-code-owner');
              }
            } catch (error) {
              console.log('Failed to read or parse CODEOWNERS file:', error.message);
            }

            // Strategy: Test detection
            const testFiles = changedFiles.filter(file =>
              file.startsWith('tests/')
            );

            if (testFiles.length > 0) {
              labels.add('has-tests');
            } else {
              // Only check for needs-tests if this is a new component or new platform
              if (labels.has('new-component') || labels.has('new-platform')) {
                labels.add('needs-tests');
              }
            }

            // Strategy: Documentation check for new components/platforms
            if (labels.has('new-component') || labels.has('new-platform')) {
              const prBody = context.payload.pull_request.body || '';

              // Look for documentation PR links
              // Patterns to match:
              // - https://github.com/esphome/esphome-docs/pull/1234
              // - esphome/esphome-docs#1234
              const docsPrPatterns = [
                /https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
                /esphome\/esphome-docs#\d+/
              ];

              const hasDocsLink = docsPrPatterns.some(pattern => pattern.test(prBody));

              if (!hasDocsLink) {
                labels.add('needs-docs');
              }
            }

            // Convert Set to Array
            let finalLabels = Array.from(labels);

            console.log('Computed labels:', finalLabels.join(', '));

            // Don't set more than max labels
            if (finalLabels.length > maxLabels) {
              const originalLength = finalLabels.length;
              console.log(`Not setting ${originalLength} labels because out of range`);
              finalLabels = ['too-big'];

              // Request changes on the PR
              await github.rest.pulls.createReview({
                owner,
                repo,
                pull_number: pr_number,
                body: `This PR is too large and affects ${originalLength} different components/areas. Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.`,
                event: 'REQUEST_CHANGES'
              });
            }

            // Add new labels
            if (finalLabels.length > 0) {
              console.log(`Adding labels: ${finalLabels.join(', ')}`);
              await github.rest.issues.addLabels({
                owner,
                repo,
                issue_number: pr_number,
                labels: finalLabels
              });
            }

            // Remove old managed labels that are no longer needed
            const labelsToRemove = managedLabels.filter(label =>
              !finalLabels.includes(label)
            );

            for (const label of labelsToRemove) {
              console.log(`Removing label: ${label}`);
              try {
                await github.rest.issues.removeLabel({
                  owner,
                  repo,
                  issue_number: pr_number,
                  name: label
                });
              } catch (error) {
                console.log(`Failed to remove label ${label}:`, error.message);
              }
            }
147 .github/workflows/external-component-bot.yml vendored Normal file
@@ -0,0 +1,147 @@
name: Add External Component Comment

on:
  pull_request_target:
    types: [opened, synchronize]

permissions:
  contents: read # Needed to fetch PR details
  issues: write # Needed to create and update comments (PR comments are managed via the issues REST API)
  pull-requests: write # Needed for the pulls REST API (e.g. listing changed files)

jobs:
  external-comment:
    name: External component comment
    runs-on: ubuntu-latest
    steps:
      - name: Add external component comment
        uses: actions/github-script@v7.0.1
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            // Generate external component usage instructions
            function generateExternalComponentInstructions(prNumber, componentNames, owner, repo) {
              let source;
              if (owner === 'esphome' && repo === 'esphome') {
                source = `github://pr#${prNumber}`;
              } else {
                source = `github://${owner}/${repo}@pull/${prNumber}/head`;
              }
              return `To use the changes from this PR as an external component, add the following to your ESPHome configuration YAML file:

            \`\`\`yaml
            external_components:
              - source: ${source}
                components: [${componentNames.join(', ')}]
                refresh: 1h
            \`\`\``;
            }

            // Generate repo clone instructions
            function generateRepoInstructions(prNumber, owner, repo, branch) {
              return `To use the changes in this PR:

            \`\`\`bash
            # Clone the repository:
            git clone https://github.com/${owner}/${repo}
            cd ${repo}

            # Checkout the PR branch:
            git fetch origin pull/${prNumber}/head:${branch}
            git checkout ${branch}

            # Install the development version:
            script/setup

            # Activate the development version:
            source venv/bin/activate
            \`\`\`

            Now you can run \`esphome\` as usual to test the changes in this PR.
            `;
            }

            async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
              const commentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
              let commentBody;
              if (esphomeChanges.length === 1) {
                commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
              } else {
                commentBody = generateRepoInstructions(prNumber, owner, repo, context.payload.pull_request.head.ref);
              }
              commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;

              // Check for existing bot comment
              const comments = await github.rest.issues.listComments({
                owner: owner,
                repo: repo,
                issue_number: prNumber,
              });

              const botComment = comments.data.find(comment =>
                comment.body.includes(commentMarker)
              );

              if (botComment && botComment.body === commentBody) {
                // No changes in the comment, do nothing
                return;
              }

              if (botComment) {
                // Update existing comment
                await github.rest.issues.updateComment({
                  owner: owner,
                  repo: repo,
                  comment_id: botComment.id,
                  body: commentBody,
                });
              } else {
                // Create new comment
                await github.rest.issues.createComment({
                  owner: owner,
                  repo: repo,
                  issue_number: prNumber,
                  body: commentBody,
                });
              }
            }

            async function getEsphomeAndComponentChanges(github, owner, repo, prNumber) {
              const changedFiles = await github.rest.pulls.listFiles({
                owner: owner,
                repo: repo,
                pull_number: prNumber,
              });

              const esphomeChanges = changedFiles.data
                .filter(file => file.filename !== "esphome/core/defines.h" && file.filename.startsWith('esphome/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/([^/]+)/);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              if (esphomeChanges.length === 0) {
                return {esphomeChanges: [], componentChanges: []};
              }

              const uniqueEsphomeChanges = [...new Set(esphomeChanges)];
              const componentChanges = changedFiles.data
                .filter(file => file.filename.startsWith('esphome/components/'))
                .map(file => {
                  const match = file.filename.match(/esphome\/components\/([^/]+)\//);
                  return match ? match[1] : null;
                })
                .filter(it => it !== null);

              return {esphomeChanges: uniqueEsphomeChanges, componentChanges: [...new Set(componentChanges)]};
            }

            // Start of main code.

            const prNumber = context.payload.pull_request.number;
            const {owner, repo} = context.repo;

            const {esphomeChanges, componentChanges} = await getEsphomeAndComponentChanges(github, owner, repo, prNumber);
            if (componentChanges.length !== 0) {
              await createComment(github, owner, repo, prNumber, esphomeChanges, componentChanges);
            }
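The component names used by the workflow above are derived purely from changed file paths. A minimal standalone sketch of that extraction — the `extractComponents` wrapper is hypothetical, but the regex and the Set-based deduplication are the ones the workflow uses:

```javascript
// Hypothetical standalone version of the workflow's path-based extraction:
// pull the component name out of each 'esphome/components/<name>/...' path,
// drop non-matching files, then deduplicate with a Set.
function extractComponents(filenames) {
  const names = filenames
    .filter(filename => filename.startsWith('esphome/components/'))
    .map(filename => {
      const match = filename.match(/esphome\/components\/([^/]+)\//);
      return match ? match[1] : null;
    })
    .filter(it => it !== null);
  return [...new Set(names)];
}

console.log(extractComponents([
  'esphome/components/wifi/wifi_component.cpp',
  'esphome/components/wifi/wifi_component.h',
  'esphome/components/api/api_server.cpp',
  'esphome/core/helpers.h', // not under components/, ignored
]));
```

Because the capture group is `[^/]+` anchored between `components/` and the next `/`, only files nested inside a component directory match; top-level core files fall through to `null` and are filtered out.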
@@ -11,7 +11,7 @@ ci:
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: v0.12.3
+    rev: v0.12.4
     hooks:
       # Run the linter.
       - id: ruff
@@ -7,7 +7,7 @@ project and be sure to join us on [Discord](https://discord.gg/KhAMKrd).
 
 **See also:**
 
-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
 
 ---
 
@@ -9,7 +9,7 @@
 
 ---
 
-[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/issues/issues) -- [Feature requests](https://github.com/esphome/feature-requests/issues)
+[Documentation](https://esphome.io) -- [Issues](https://github.com/esphome/esphome/issues) -- [Feature requests](https://github.com/orgs/esphome/discussions)
 
 ---
 
@@ -1,4 +1,5 @@
 #include "esphome/core/helpers.h"
+#include "esphome/core/defines.h"
 
 #ifdef USE_ESP32
 
@@ -36,7 +37,15 @@ IRAM_ATTR InterruptLock::~InterruptLock() { portENABLE_INTERRUPTS(); }
 
 LwIPLock::LwIPLock() {
 #ifdef CONFIG_LWIP_TCPIP_CORE_LOCKING
-  // Only lock if we're not already in the TCPIP thread
+  // When CONFIG_LWIP_TCPIP_CORE_LOCKING is enabled, lwIP uses a global mutex to protect
+  // its internal state. Any thread can take this lock to safely access lwIP APIs.
+  //
+  // sys_thread_tcpip(LWIP_CORE_LOCK_QUERY_HOLDER) returns true if the current thread
+  // already holds the lwIP core lock. This prevents recursive locking attempts and
+  // allows nested LwIPLock instances to work correctly.
+  //
+  // If we don't already hold the lock, acquire it. This will block until the lock
+  // is available if another thread currently holds it.
   if (!sys_thread_tcpip(LWIP_CORE_LOCK_QUERY_HOLDER)) {
     LOCK_TCPIP_CORE();
   }
@@ -45,7 +54,16 @@ LwIPLock::LwIPLock() {
 
 LwIPLock::~LwIPLock() {
 #ifdef CONFIG_LWIP_TCPIP_CORE_LOCKING
-  // Only unlock if we hold the lock
+  // Only release the lwIP core lock if this thread currently holds it.
+  //
+  // sys_thread_tcpip(LWIP_CORE_LOCK_QUERY_HOLDER) queries lwIP's internal lock
+  // ownership tracking. It returns true only if the current thread is registered
+  // as the lock holder.
+  //
+  // This check is essential because:
+  // 1. We may not have acquired the lock in the constructor (if we already held it)
+  // 2. The lock might have been released by other means between constructor and destructor
+  // 3. Calling UNLOCK_TCPIP_CORE() without holding the lock causes undefined behavior
   if (sys_thread_tcpip(LWIP_CORE_LOCK_QUERY_HOLDER)) {
     UNLOCK_TCPIP_CORE();
   }
@@ -22,6 +22,10 @@ void Mutex::unlock() {}
 IRAM_ATTR InterruptLock::InterruptLock() { state_ = xt_rsil(15); }
 IRAM_ATTR InterruptLock::~InterruptLock() { xt_wsr_ps(state_); }
 
+// ESP8266 doesn't support lwIP core locking, so this is a no-op
+LwIPLock::LwIPLock() {}
+LwIPLock::~LwIPLock() {}
+
 void get_mac_address_raw(uint8_t *mac) {  // NOLINT(readability-non-const-parameter)
   wifi_get_macaddr(STATION_IF, mac);
 }
@@ -26,6 +26,10 @@ void Mutex::unlock() { xSemaphoreGive(this->handle_); }
 IRAM_ATTR InterruptLock::InterruptLock() { portDISABLE_INTERRUPTS(); }
 IRAM_ATTR InterruptLock::~InterruptLock() { portENABLE_INTERRUPTS(); }
 
+// LibreTiny doesn't support lwIP core locking, so this is a no-op
+LwIPLock::LwIPLock() {}
+LwIPLock::~LwIPLock() {}
+
 void get_mac_address_raw(uint8_t *mac) {  // NOLINT(readability-non-const-parameter)
   WiFi.macAddress(mac);
 }
@@ -204,7 +204,7 @@ def add_pio_file(component: str, key: str, data: str):
         cv.validate_id_name(key)
     except cv.Invalid as e:
         raise EsphomeError(
-            f"[{component}] Invalid PIO key: {key}. Allowed characters: [{ascii_letters}{digits}_]\nPlease report an issue https://github.com/esphome/issues"
+            f"[{component}] Invalid PIO key: {key}. Allowed characters: [{ascii_letters}{digits}_]\nPlease report an issue https://github.com/esphome/esphome/issues"
         ) from e
     CORE.data[KEY_RP2040][KEY_PIO_FILES][key] = data
 
@@ -44,6 +44,10 @@ void Mutex::unlock() {}
 IRAM_ATTR InterruptLock::InterruptLock() { state_ = save_and_disable_interrupts(); }
 IRAM_ATTR InterruptLock::~InterruptLock() { restore_interrupts(state_); }
 
+// RP2040 doesn't support lwIP core locking, so this is a no-op
+LwIPLock::LwIPLock() {}
+LwIPLock::~LwIPLock() {}
+
 void get_mac_address_raw(uint8_t *mac) {  // NOLINT(readability-non-const-parameter)
 #ifdef USE_WIFI
   WiFi.macAddress(mac);
@@ -317,3 +317,15 @@ async def to_code(config):
     if (sorting_group_config := config.get(CONF_SORTING_GROUPS)) is not None:
         cg.add_define("USE_WEBSERVER_SORTING")
         add_sorting_groups(var, sorting_group_config)
+
+
+def FILTER_SOURCE_FILES() -> list[str]:
+    """Filter out web_server_v1.cpp when version is not 1."""
+    files_to_filter: list[str] = []
+
+    # web_server_v1.cpp is only needed when version is 1
+    config = CORE.config.get("web_server", {})
+    if config.get(CONF_VERSION, 2) != 1:
+        files_to_filter.append("web_server_v1.cpp")
+
+    return files_to_filter
@@ -1620,7 +1620,9 @@ void WebServer::handle_event_request(AsyncWebServerRequest *request, const UrlMa
   request->send(404);
 }
 
-static std::string get_event_type(event::Event *event) { return event->last_event_type ? *event->last_event_type : ""; }
+static std::string get_event_type(event::Event *event) {
+  return (event && event->last_event_type) ? *event->last_event_type : "";
+}
 
 std::string WebServer::event_state_json_generator(WebServer *web_server, void *source) {
   auto *event = static_cast<event::Event *>(source);
@@ -54,6 +54,10 @@ void Mutex::unlock() { k_mutex_unlock(static_cast<k_mutex *>(this->handle_)); }
 IRAM_ATTR InterruptLock::InterruptLock() { state_ = irq_lock(); }
 IRAM_ATTR InterruptLock::~InterruptLock() { irq_unlock(state_); }
 
+// Zephyr doesn't support lwIP core locking, so this is a no-op
+LwIPLock::LwIPLock() {}
+LwIPLock::~LwIPLock() {}
+
 uint32_t random_uint32() { return rand(); }  // NOLINT(cert-msc30-c, cert-msc50-cpp)
 bool random_bytes(uint8_t *data, size_t len) {
   sys_rand_get(data, len);
@@ -695,6 +695,10 @@ class LwIPLock {
  public:
   LwIPLock();
   ~LwIPLock();
+
+  // Delete copy constructor and copy assignment operator to prevent accidental copying
+  LwIPLock(const LwIPLock &) = delete;
+  LwIPLock &operator=(const LwIPLock &) = delete;
 };
 
 /** Helper class to request `loop()` to be called as fast as possible.
@@ -27,8 +27,8 @@ dynamic = ["dependencies", "optional-dependencies", "version"]
 [project.urls]
 "Documentation" = "https://esphome.io"
 "Source Code" = "https://github.com/esphome/esphome"
-"Bug Tracker" = "https://github.com/esphome/issues/issues"
-"Feature Request Tracker" = "https://github.com/esphome/feature-requests/issues"
+"Bug Tracker" = "https://github.com/esphome/esphome/issues"
+"Feature Request Tracker" = "https://github.com/orgs/esphome/discussions"
 "Discord" = "https://discord.gg/KhAMKrd"
 "Forum" = "https://community.home-assistant.io/c/esphome"
 "Twitter" = "https://twitter.com/esphome_"
@@ -12,7 +12,7 @@ platformio==6.1.18  # When updating platformio, also update /docker/Dockerfile
 esptool==4.9.0
 click==8.1.7
 esphome-dashboard==20250514.0
-aioesphomeapi==36.0.0
+aioesphomeapi==36.0.1
 zeroconf==0.147.0
 puremagic==1.30
 ruamel.yaml==0.18.14  # dashboard_import
@@ -1,6 +1,6 @@
 pylint==3.3.7
 flake8==7.3.0  # also change in .pre-commit-config.yaml when updating
-ruff==0.12.3  # also change in .pre-commit-config.yaml when updating
+ruff==0.12.4  # also change in .pre-commit-config.yaml when updating
 pyupgrade==3.20.0  # also change in .pre-commit-config.yaml when updating
 pre-commit
 
@@ -8,7 +8,7 @@ pre-commit
 pytest==8.4.1
 pytest-cov==6.2.1
 pytest-mock==3.14.1
-pytest-asyncio==1.0.0
-pytest-xdist==3.7.0
+pytest-asyncio==1.1.0
+pytest-xdist==3.8.0
 asyncmock==0.4.2
 hypothesis==6.92.1
@@ -241,6 +241,9 @@ def lint_ext_check(fname):
         "docker/ha-addon-rootfs/**",
         "docker/*.py",
         "script/*",
+        "CLAUDE.md",
+        "GEMINI.md",
+        ".github/copilot-instructions.md",
     ]
 )
 def lint_executable_bit(fname):