---
title: "Apache Kafka"
description: "Send data and events to Apache Kafka."
logo: apache_kafka.png
ha_category:
ha_release: 0.97
---
The `apache_kafka` integration sends all state changes to an Apache Kafka topic.
Apache Kafka is a real-time data pipeline that can read and write streams of data. It stores its data safely in a distributed, replicated, fault-tolerant cluster.
To use the `apache_kafka` integration in your installation, add the following to your `configuration.yaml` file:
```yaml
# Example configuration.yaml entry
apache_kafka:
  host: localhost
  port: 9092
  topic: home_assistant_1
```
{% configuration %}
host:
  description: The IP address or hostname of an Apache Kafka cluster.
  required: true
  type: string
port:
  description: The port to use.
  required: true
  type: integer
topic:
  description: The Kafka topic to send data to.
  required: true
  type: string
filter:
  description: Filters for entities to be included/excluded.
  required: false
  type: map
  keys:
    include_domains:
      description: Domains to be included.
      required: false
      type: list
    include_entities:
      description: Entities to be included.
      required: false
      type: list
    exclude_domains:
      description: Domains to be excluded.
      required: false
      type: list
    exclude_entities:
      description: Entities to be excluded.
      required: false
      type: list
{% endconfiguration %}
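As a non-authoritative sketch of the `filter` options documented above, a configuration that only forwards `light` and `sensor` state changes while skipping a single entity might look like the following. The domain and entity names are illustrative placeholders, not values required by the integration:

```yaml
# Sketch: forward only selected entities to Kafka.
# The domains and the entity ID below are placeholders.
apache_kafka:
  host: localhost
  port: 9092
  topic: home_assistant_1
  filter:
    include_domains:
      - light
      - sensor
    exclude_entities:
      - sensor.noisy_debug_sensor
```

Restricting the published entities in this way can keep the volume of messages on the topic manageable on busy installations.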