Prerequisites

  • Memory – 4 GB and above
  • Docker Engine – version 18.06.0 or newer
  • Docker Compose – version 1.26.0 or newer
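
You can quickly confirm that the host meets the memory requirement before continuing; free should report at least 4 GB of total RAM:

free -h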

Install the required packages below:

## On Debian/Ubuntu
sudo apt update && sudo apt upgrade
sudo apt install curl vim git

Step 1 – Install Docker and Docker Compose

Install Docker CE:

sudo apt update

Install packages to allow apt to use a repository over HTTPS:

sudo apt -y install lsb-release gnupg apt-transport-https ca-certificates curl software-properties-common

Add Docker’s official GPG key:

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/trusted.gpg.d/docker.gpg

Add stable repository:

sudo add-apt-repository "deb [arch=$(dpkg --print-architecture)] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"

Install Docker CE:

sudo apt update
sudo apt install docker-ce docker-ce-cli containerd.io docker-compose-plugin
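
After the installation completes, it is worth confirming that the Docker service is enabled and running before continuing (a quick sanity check, not part of the original steps):

sudo systemctl enable --now docker
sudo systemctl status docker --no-pager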

Add your non-root user to the docker group so Docker can be run without sudo:

sudo usermod -aG docker $USER
newgrp docker
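
To verify that the group change took effect, try running a container without sudo; hello-world is just a convenient throwaway test image:

docker run --rm hello-world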

Install the Docker Compose standalone binary:

curl -s https://api.github.com/repos/docker/compose/releases/latest | grep browser_download_url  | grep docker-compose-linux-x86_64 | cut -d '"' -f 4 | wget -qi -

Make the binary file executable.

chmod +x docker-compose-linux-x86_64

Move the binary to a directory in your PATH:

sudo mv docker-compose-linux-x86_64 /usr/local/bin/docker-compose
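
Both Compose variants should now report a version; a quick verification:

docker compose version        ## v2 plugin installed with docker-ce above
docker-compose --version      ## standalone binary installed here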

Step 2 – Download and Configure the Elastic Stack with Docker Compose

We will begin by cloning the docker-elk repository from GitHub:

git clone https://github.com/deviantony/docker-elk.git
cd docker-elk

  • Configure Elasticsearch

Elasticsearch's default settings live in elasticsearch/config/elasticsearch.yml; you can either edit that file directly or override individual settings through environment variables on the elasticsearch service in docker-compose.yml, for example:

elasticsearch:
  environment:
    cluster.name: ms-elasticsearch
    xpack.license.self_generated.type: basic
  • Configure Kibana

The Kibana configuration file is stored in kibana/config/kibana.yml. For example, to set the public URL under which Kibana will be reached (kibana.dev.ridhohafidz.com in this setup):

server.publicBaseUrl: http://kibana.dev.ridhohafidz.com
  • Configure .env for credentials

Open the .env configuration file:

nano .env

Change the configuration values as follows:

ELASTIC_VERSION=<VERSION>

## Passwords for stack users
#

# User 'elastic' (built-in)  ## built-in default user
#
# Superuser role, full access to cluster management and data indices.
# https://www.elastic.co/guide/en/elasticsearch/reference/current/built-in-users.html
ELASTIC_PASSWORD='StrongPassw0rd1'

# User 'logstash_internal' (custom)
#
# The user Logstash uses to connect and send data to Elasticsearch.
# https://www.elastic.co/guide/en/logstash/current/ls-security.html
LOGSTASH_INTERNAL_PASSWORD='StrongPassw0rd1'

# User 'kibana_system' (built-in)  ## built-in default user
#
# The user Kibana uses to connect and communicate with Elasticsearch.
# https://www.elastic.co/guide/en/elasticsearch/reference/current/built-in-users.html
KIBANA_SYSTEM_PASSWORD=''  ## Add your own password here
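
The passwords above are only examples; if you prefer randomly generated values, one option (assuming openssl is available, as it is by default on Ubuntu) is:

openssl rand -base64 24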

Step 4 – Bringing Up the Elastic Stack

Bring up the stack with docker-compose:

docker-compose up -d

Sample output:

[+] Building 6.4s (12/17)                                                                                                                   
 => [docker-elk_setup internal] load build definition from Dockerfile                                                                  0.3s
 => => transferring dockerfile: 389B                                                                                                   0.0s
 => [docker-elk_setup internal] load .dockerignore                                                                                     0.5s
 => => transferring context: 250B                                                                                                      0.0s
 => [docker-elk_logstash internal] load build definition from Dockerfile                                                               0.6s
 => => transferring dockerfile: 312B                                                                                                   0.0s
 => [docker-elk_elasticsearch internal] load build definition from Dockerfile                                                          0.6s
 => => transferring dockerfile: 324B                                                                                                   0.0s
 => [docker-elk_logstash internal] load .dockerignore                                                                                  0.7s
 => => transferring context: 188B                                                 
........

Check that the containers are running:

$ docker ps
CONTAINER ID   IMAGE                      COMMAND                  CREATED          STATUS         PORTS                                                                                                                                                                        NAMES
096ddc76c6b9   docker-elk_logstash        "/usr/local/bin/dock…"   9 seconds ago    Up 5 seconds   0.0.0.0:5000->5000/tcp, :::5000->5000/tcp, 0.0.0.0:5044->5044/tcp, :::5044->5044/tcp, 0.0.0.0:9600->9600/tcp, 0.0.0.0:5000->5000/udp, :::9600->9600/tcp, :::5000->5000/udp   docker-elk-logstash-1
ec3aab33a213   docker-elk_kibana          "/bin/tini -- /usr/l…"   9 seconds ago    Up 5 seconds   0.0.0.0:5601->5601/tcp, :::5601->5601/tcp                                                                                                                                    docker-elk-kibana-1
b365f809d9f8   docker-elk_setup           "/entrypoint.sh"         10 seconds ago   Up 7 seconds   9200/tcp, 9300/tcp                                                                                                                                                           docker-elk-setup-1
45f6ba48a89f   docker-elk_elasticsearch   "/bin/tini -- /usr/l…"   10 seconds ag
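
With the containers up, you can confirm that Elasticsearch is answering on port 9200 using the elastic credentials from .env (adjust the password if you changed it):

curl -u elastic:'StrongPassw0rd1' http://localhost:9200
curl -u elastic:'StrongPassw0rd1' 'http://localhost:9200/_cluster/health?pretty'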

Step 5 – Testing and Accessing the Dashboard

Access the Kibana dashboard at the URL http://<server-IP>:5601.

In my case, I access it at http://10.10.10.178:5601.
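
If the page does not load, a quick way to check that Kibana is reachable on port 5601 is to request the response headers with curl (use your own server IP here):

curl -I http://10.10.10.178:5601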

Log in using the credentials set for the Elasticsearch user:

Username: elastic
Password: the ELASTIC_PASSWORD value you set in .env
