Home Assistant + Confluent Cloud (Kafka)
Ever wanted to hook up Home Assistant to Confluent Cloud, perhaps so you can publish your home weather station data to the world?
It's surprisingly easy to do:
- Set up Home Assistant, weather stations, sensors, etc
- Set up Confluent Cloud: make a cluster, service account, topic, etc…
- Make an API key for Kafka (from the clusters page - docs). Download the credentials file this step generates, and/or keep a note of the API key and secret, as they won't be displayed again (you can smoke-test the key with the sketch after this list)
- In Home Assistant, make your life easy and install the Studio Code Server (VS Code) add-on, which lets you easily edit files on the Home Assistant server
- The Apache Kafka integration lets us connect to Confluent Cloud, as well as any other Kafka. It's already included in HAOS, so it just needs to be configured by editing configuration.yaml. Open the Studio Code Server in the browser and edit the configuration.yaml file (/config/configuration.yaml on Home Assistant Operating System - HAOS)
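Before touching Home Assistant, it's worth confirming that the cluster, topic, and API key all work together. Here's a minimal smoke-test sketch using the confluent-kafka Python client; the broker, topic, key, and secret are all placeholders for your own values:

# Smoke-test the Confluent Cloud credentials (pip install confluent-kafka).
# Broker, topic, API key, and secret below are placeholders - use your own.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "XXXX.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "API_KEY",
    "sasl.password": "API_SECRET",
})
producer.produce("sometopic", value=b"hello from outside Home Assistant")
remaining = producer.flush(10)  # 0 means every queued message was delivered
print("undelivered messages:", remaining)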
Here's a template for the apache_kafka block in configuration.yaml:
apache_kafka:
  ip_address: XXXX.confluent.cloud
  port: 9092
  topic: sometopic
  username: API_KEY
  password: API_SECRET
  security_protocol: SASL_SSL
  filter:
    include_entity_globs:
      - sensor.*
- ip_address contains the hostname of your cloud Kafka broker
- See the integration documentation for how filtering works. I'm just dumping sensor info for now
- All data will be sent to a single topic
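The filter block uses Home Assistant's standard entity filter syntax, so you can also combine domain-wide includes with exclusions. For example (the exclusion glob here is hypothetical):

apache_kafka:
  # ...connection settings as above...
  filter:
    include_domains:
      - sensor
    exclude_entity_globs:
      - sensor.*_battery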
My completed example, which sends just the weather station data:
apache_kafka:
  ip_address: pkc-xxxxx.ap-southeast-2.aws.confluent.cloud
  port: 9092
  topic: weather-2060
  username: SWBXXXXXXXXXXZOY
  password: cfltrExxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxHUw
  security_protocol: SASL_SSL
  filter:
    include_entity_globs:
      - sensor.ha_weather_*
Save the file, restart Home Assistant, and if you're lucky, sensor readings should start arriving in Confluent Cloud within a couple of minutes:

Job done! Now I can transform my data with Flink, and share it with the world for a bit of citizen science.
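If you want a final sanity check outside the Confluent Cloud UI, a minimal consumer along these lines should print each state change as it lands. Again just a sketch with the confluent-kafka Python client; broker, topic, and credentials are placeholders:

# Quick check that Home Assistant events are arriving
# (pip install confluent-kafka). All connection values are placeholders.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "pkc-xxxxx.ap-southeast-2.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "API_KEY",
    "sasl.password": "API_SECRET",
    "group.id": "ha-sanity-check",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["weather-2060"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # The integration publishes each state change as a JSON document
        event = json.loads(msg.value())
        print(event.get("entity_id"), event.get("state"))
finally:
    consumer.close()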