
ELK on Docker (Compose)



The ELK/Elastic stack is a common open source solution for collecting and analyzing log data from distributed systems. This article shows you how to run an ELK stack on Docker using Docker Compose, so you can run ELK distributed across your Docker infrastructure or test it on your local system.

To start, just copy the source code below into a file called "docker-compose.yml" and then create another one called "elasticsearch.yml" in the same directory. These two files and a working Docker installation are all you need.

Starting up Docker Compose

The Docker Compose file uses official images from Docker Hub, so there is no need to build your own images to get started. There are two different ways to launch your ELK stack from this example:

  1. As single instances: Get a shell in the directory where you've put the two files and type docker-compose up. This command will start one container each for Elasticsearch, Kibana, and Logstash. The Logstash container will listen on port 8080, the Kibana container on port 80.
  2. As an Elasticsearch cluster of x data nodes: docker-compose scale elasticsearch=x kibana=1 logstash=1. This will start an Elasticsearch cluster with x nodes, one Logstash and one Kibana instance. Logstash will again listen on port 8080, Kibana on port 80.

Sending & accessing data

As Logstash is listening on port 8080, feeding data into the system works via a simple curl: curl -XPOST http://localhost:8080/ -d '{"a": "b", "c": "d"}'. Kibana is configured to listen on port 80. As the Kibana image used here is not configured yet, you will be prompted to set up an index pattern on first access. Logstash will write its data to a daily index with the naming scheme "elk-data-*".
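
To verify that the document actually reached Elasticsearch, you can query the daily index directly. This assumes you have added a "9200:9200" port mapping to the elasticsearch service in the compose file (note that a fixed host port limits you to a single data node):

  # search today's elk-data index for the test document
  curl 'http://localhost:9200/elk-data-*/_search?pretty'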

As Elasticsearch persists its data on local volumes, the data in your cluster will be kept until the volume containers are removed.
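
To inspect or clean up that data, the standard Docker commands apply (the details depend on your Docker and Compose versions):

  # list the volumes known to the Docker daemon
  docker volume ls
  # stop the stack and remove the containers together with their volumes
  docker-compose stop && docker-compose rm -v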

Source Codes
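
A minimal sketch of the two files, assuming the official elasticsearch, logstash, and kibana images and the ports mentioned above; the image tags, the inline Logstash pipeline, and the cluster name are illustrative placeholders, and the plugin options may differ between Logstash versions.

docker-compose.yml:

  elasticsearch:
    image: elasticsearch              # official image from Docker Hub
    volumes:
      # mount the custom Elasticsearch config from the same directory
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
  logstash:
    image: logstash
    links:
      - elasticsearch
    ports:
      - "8080:8080"                   # HTTP input for feeding data
    # accept JSON via HTTP and write it to a daily elk-data index
    command: logstash -e 'input { http { port => 8080 } } output { elasticsearch { hosts => ["elasticsearch"] index => "elk-data-%{+YYYY.MM.dd}" } }'
  kibana:
    image: kibana
    links:
      - elasticsearch                 # Kibana reaches ES at http://elasticsearch:9200
    ports:
      - "80:5601"                     # expose Kibana's UI on port 80

elasticsearch.yml:

  # name of the cluster the data nodes will join
  cluster.name: elk
  # bind to all interfaces so the other containers can connect
  network.host: 0.0.0.0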

Example usage

As mentioned in the introduction, this is an easy solution to run ELK locally on your system, which also makes it easy to feed local files into "your" ELK.

Feeding example data to your dockerized ELK
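
One way to do this is a small shell loop that posts each line of a local log file to the Logstash HTTP listener; the file path below is just an example, and lines containing quotes would need additional escaping:

  # send each line of a local log file to Logstash as a JSON document
  while read -r line; do
    curl -XPOST http://localhost:8080/ -d "{\"message\": \"$line\"}"
  done < /var/log/syslog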

Feeding local syslogs

Another use case might be to use Logstash as a local syslog server. Change line 24 in your docker-compose file to this line:
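
As a sketch, assuming the Logstash service is configured via an inline -e pipeline as in the example above (the exact plugin options depend on your Logstash version), the command could look like this; remember to also publish port 8514 in the logstash service's ports section:

  # replace the HTTP input with a syslog input on port 8514
  command: logstash -e 'input { syslog { port => 8514 } } output { elasticsearch { hosts => ["elasticsearch"] index => "syslog-%{+YYYY.MM.dd}" } }'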

This will cause Logstash to listen for incoming syslog messages on port 8514. Next, change your syslog config and add something like this:
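
For rsyslog, for example, a forwarding rule like the following sends all messages to the local Logstash container (@@ forwards via TCP, a single @ via UDP; Logstash's syslog input accepts both):

  # e.g. in /etc/rsyslog.d/99-logstash.conf: forward everything via TCP
  *.* @@localhost:8514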

After you've reloaded your Logstash container and your syslog daemon, you'll get an index called "syslog-*".

Syslog message in Kibana

