Sending Home Assistant logs to Elasticsearch (ELK)
Do you want a better overview of your Home Assistant logs? Do you want your Home Assistant logs to be searchable over time? Then you can export them to Elasticsearch and use Kibana to query the log data.
First you need an installed ELK stack (Elasticsearch, Logstash, and Kibana). I went for a Docker-based installation with the help of the docker-elk GitHub project.
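If you go the same route, the setup boils down to cloning the repository and starting the stack with Docker Compose. The following is only a rough sketch for recent docker-elk versions; the authoritative steps, including setting the passwords in the .env file, are in the project's README:

```sh
# Rough sketch, assuming a recent docker-elk checkout; see its README for details.
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
# Adjust the passwords in .env first (the logstash.conf further down relies on
# the LOGSTASH_INTERNAL_PASSWORD variable defined there).
docker compose up setup   # one-time initialization of the built-in users
docker compose up -d      # start Elasticsearch, Logstash and Kibana
```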
Second, install the Home Assistant Logspout add-on.
Configure the Logspout add-on:
```yaml
routes:
  - multiline+logstash+tcp://mykibana.example.com:50000
env:
  - name: SYSLOG_HOSTNAME
    value: homeassistant
  - name: INACTIVITY_TIMEOUT
    value: 1m
  - name: MULTILINE_PATTERN
    value: >-
      (\d\d(\d\d)?[-/]\d\d[-/]\d\d[T ]\d\d:\d\d:\d\d)|(^s6-rc:)|(^\[\d\d:\d\d:\d\d\])|(\d\d:\d\d:\d\d\ -)|(^[TDIWEF]:)
  - name: MULTILINE_MATCH
    value: first
  - name: INCLUDE_CONTAINERS
    value: homeassistant
  - name: LOGSTASH_FIELDS
    value: source=my-home-assistant
```
Note that, by default, the logs from all Docker containers in HAOS are sent to Logstash for insertion into Elasticsearch.
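To make the multiline settings concrete: MULTILINE_PATTERN lists the line formats that start a new log entry (Home Assistant timestamps, s6-rc messages, bracketed timestamps, and single-letter log-level prefixes), and MULTILINE_MATCH: first means that lines which do not match are appended to the previous entry. A Python traceback such as the following (an illustrative example, not real output) is therefore shipped to Logstash as a single event:

```text
2024-05-01 12:34:56.789 ERROR (MainThread) [homeassistant.components.template] Error while rendering template
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/template.py", line 123, in async_render
    ...
ValueError: invalid template
```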
Configure Logstash to parse the incoming log data. My logstash.conf file looks like this:
```conf
input {
  beats {
    port => 5044
  }

  tcp {
    port => 50000
    codec => json
  }

  udp {
    port => 5000
    codec => json
  }
}

## Add your filters / logstash plugins configuration here
filter {
  if ([source] == "my-home-assistant") {
    if ([docker][name] == "/homeassistant") {
      grok {
        patterns_dir => ["/usr/share/logstash/pipeline/patterns"]
        match => { "message" => "%{LOGLEVEL:log_level}%{SPACE}\(%{GREEDYDATA:log_thread}\)%{SPACE}\[%{LOGGER_NAME:log_name}\]%{SPACE}%{GREEDYDATA:log_message}" }
      }
      mutate {
        gsub => [ "log_message", "\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|M|K]", "" ]
      }
      if [log_message] =~ /\n/ {
        mutate {
          copy => { "log_message" => "log_trace" }
        }
        mutate {
          gsub => [ "log_message", "(?m)^([^\n]*)$.*", "\1" ]
        }
      }
    } else {
      drop { }
    }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "logstash_internal"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
  }
}
```
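Taking the illustrative traceback from the Logspout section above, the grok pattern would extract roughly the following fields (the exact values of course depend on your real log lines):

```text
log_level   => ERROR
log_thread  => MainThread
log_name    => homeassistant.components.template
log_message => Error while rendering template
```

The first mutate filter strips ANSI colour codes from log_message; when the message spans multiple lines, the complete text is preserved in log_trace and log_message is reduced to its first line.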
and my custom patterns file looks like this:
```text
LOGGER_NAME [a-zA-Z0-9._-]+
UNICODE_START [\\u]
```
All incoming data tagged with source=my-home-assistant is processed by the filter. Data coming from anything other than the HAOS Home Assistant Docker container is dropped.
If you want to properly process data from other Docker containers in the HAOS installation, you will have to write additional grok patterns, for example along the lines of the sketch below.
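As a purely hypothetical illustration (the container name /addon_example and its log format are made up; check docker ps on your HAOS host for the real names), an extra branch in the filter could look like this:

```conf
filter {
  if ([source] == "my-home-assistant") {
    if ([docker][name] == "/homeassistant") {
      # ... the existing Home Assistant grok/mutate filters from above ...
    } else if ([docker][name] == "/addon_example") {
      # Hypothetical add-on that logs "LEVEL: message" lines; adjust the
      # container name and the pattern to whatever your add-on really produces.
      grok {
        match => { "message" => "%{LOGLEVEL:log_level}:%{SPACE}%{GREEDYDATA:log_message}" }
      }
    } else {
      drop { }
    }
  }
}
```

The important part is that any new branch goes before the final else { drop { } }, otherwise the events are discarded before they ever reach the new grok filter.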