Installing the ELK Stack on Docker Container

Written by Abhishek Jalan (DevSecOps Engineer) on August 12, 2021
Installing the Elasticsearch, Logstash & Kibana (ELK) Stack on Docker with a CentOS base system

In this blog, I am using a Dockerized ELK Stack that results in three Docker containers running in parallel (Elasticsearch, Logstash, and Kibana), with port forwarding set up and a data volume for persisting Elasticsearch data.

The ELK Stack (Elasticsearch, Logstash, and Kibana) can be installed on a variety of operating systems and in many different setups. While the most common installation target is Linux and other Unix-based systems, a less-discussed scenario is running the stack in Docker.

To get the default distributions of Elasticsearch and Kibana up and running in Docker, you can use Docker Compose.

Create a docker-compose.yml file for a single-node Elasticsearch cluster with Logstash and Kibana. The following example brings up three containers so you can see how things work.
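The compose file references an ELK_VERSION build argument and several bind-mounted config files, so the project directory needs a matching layout. A hypothetical skeleton (the version number in .env is only an example; pin whichever release you need):

```shell
# Create the directories that the bind mounts in docker-compose.yml expect
mkdir -p elasticsearch/config logstash/config logstash/pipeline kibana/config

# docker-compose substitutes $ELK_VERSION from a .env file in the same directory
echo 'ELK_VERSION=7.14.0' > .env
```

Each of the elasticsearch/, logstash/, and kibana/ directories also needs a Dockerfile, since the compose file uses `build:` rather than a prebuilt `image:`.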

version: '3.2'

services:
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: volume
        source: elasticsearch
        target: /usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: changeme
      # Use single-node discovery in order to disable production mode and avoid bootstrap checks.
      # see: https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
      discovery.type: single-node
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5044:5044"
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./kibana/config/kibana.yml
        target: /usr/share/kibana/config/kibana.yml
        read_only: true
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge

volumes:
  elasticsearch:
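The compose file bind-mounts ./logstash/pipeline into the Logstash container. As a sketch of what that directory might contain, a minimal pipeline that listens on port 5000 (the TCP and UDP ports published above) and forwards events to Elasticsearch could look like this (the filename logstash.conf and the credentials are assumptions based on the compose file):

```
input {
  tcp {
    port => 5000
  }
  udp {
    port => 5000
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    user => "elastic"
    password => "changeme"
  }
}
```

Note that Logstash reaches Elasticsearch at the hostname elasticsearch because both containers sit on the same elk bridge network.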

Make sure Docker Engine is allotted at least 6-8 GiB of memory; in Docker Desktop, you can adjust this in the Resources settings.

Run docker-compose to bring up the three Docker containers for Elasticsearch, Logstash, and Kibana (add -d to run them in the background):
docker-compose up
Submit a _cat/nodes request to see that the nodes are up and running:
curl -X GET "localhost:9200/_cat/nodes?v&pretty"
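Elasticsearch can take a minute or two to start, so the request above may initially fail with "connection refused". A small retry helper, as a sketch (the URL, retry count, and interval are arbitrary defaults, not part of the stack):

```shell
wait_for_es() {
  # usage: wait_for_es <url> [retries] [interval-seconds]
  local url=$1 retries=${2:-30} interval=${3:-2} i
  for i in $(seq 1 "$retries"); do
    # Any HTTP response (even 401) means the server socket is accepting connections
    if curl -s -o /dev/null "$url"; then
      echo "Elasticsearch is up"
      return 0
    fi
    sleep "$interval"
  done
  echo "Timed out waiting for $url" >&2
  return 1
}
```

If security is enabled in your elasticsearch.yml, pass `-u elastic:changeme` (the password set in the compose file) to curl when querying the cluster.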
Open Kibana to load sample data and interact with the cluster:
http://localhost:5601


When you’re done experimenting, you can tear down the containers and volumes by running the following (note that -v also removes the named elasticsearch volume, deleting any persisted data):

docker-compose down -v

Tags: ELK, Elasticsearch, Kibana, ELK on Docker, ELK single node
Copyright techiio.com @2020 Kolkata, India
Made by Abhishek & Priyanka Jalan