Elastic Stack (Elasticsearch, Logstash, Kibana) within Docker Containers

https://github.com/stevencoutts/CICD-Example

To set up the Elastic Stack (Elasticsearch, Logstash, Kibana) within Docker containers with an Nginx log shipper, follow these steps:

Step 1: Install Docker and Docker Compose

Before you get started, ensure that Docker and Docker Compose are installed on your machine.

- **Install Docker**:
    - For Ubuntu/Debian:
      sudo apt-get update
      sudo apt-get install docker.io
    - For CentOS/RHEL:
      sudo yum install -y docker

- **Install Docker Compose**:
  Follow the instructions on https://docs.docker.com/compose/install/ to get Docker Compose installed.
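
Before moving on, it can help to confirm both binaries are actually on your PATH. A minimal Python sketch (the tool names below are the conventional binary names; if you installed Compose v2 as a Docker plugin, it runs as `docker compose` rather than a separate `docker-compose` binary):

```python
import shutil

def check_tools(tools):
    """Return a dict mapping each tool name to whether it is on PATH."""
    return {tool: shutil.which(tool) is not None for tool in tools}

status = check_tools(("docker", "docker-compose"))
for tool, found in status.items():
    print(f"{tool}: {'found' if found else 'missing'}")
```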

Step 2: Create a `docker-compose.yml` file

Create a new directory for your project and navigate into it. Then, create a `docker-compose.yml` file to define the services.

version: '3'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.2
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata:/usr/share/elasticsearch/data
    ports:
      - 9200:9200

  logstash:
    image: docker.elastic.co/logstash/logstash:7.10.2
    container_name: logstash
    volumes:
      - ./logstash/config/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - 5044:5044
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.10.2
    container_name: kibana
    ports:
      - 5601:5601

volumes:
  esdata:

Step 3: Configure Logstash to Ship Nginx Logs

Create a directory for the logstash configuration and place your `logstash.conf` file inside.

mkdir -p logstash/config
touch logstash/config/logstash.conf

Add the following configuration to `logstash.conf`. It accepts newline-delimited JSON events over TCP from the Nginx container and writes them to a daily index in Elasticsearch:

input {
  tcp {
    port => 5044
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "nginx-%{+YYYY.MM.dd}"
  }
}
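
Because the input uses the `json_lines` codec, each event arriving on port 5044 must be a single JSON object terminated by a newline. A short Python sketch of what one event looks like on the wire (the field names are illustrative, not required by Logstash):

```python
import json

# An example access-log event; these field names are illustrative only.
event = {
    "remote_addr": "203.0.113.7",
    "request": "GET / HTTP/1.1",
    "status": 200,
    "body_bytes_sent": 612,
}

# json_lines framing: one JSON object per line, newline-terminated.
wire = json.dumps(event) + "\n"
print(wire, end="")
```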

Step 4: Create an Nginx Dockerfile

Create a Dockerfile for your Nginx server that ships its access logs to Logstash. This example assumes your `nginx.conf` defines a JSON `log_format` for the access log, so the `json_lines` codec can parse each line. It uses BusyBox `nc` as a minimal shipper; a dedicated agent such as Filebeat would be the more robust choice in production.

FROM nginx:alpine

COPY ./nginx.conf /etc/nginx/nginx.conf

# The base image symlinks access.log to stdout; replace it with a regular
# file so it can be tailed and forwarded.
RUN rm /var/log/nginx/access.log && touch /var/log/nginx/access.log

# Forward access-log lines to the Logstash TCP input with BusyBox nc,
# then run nginx in the foreground.
CMD ["sh", "-c", "(tail -F /var/log/nginx/access.log | nc logstash 5044) & exec nginx -g 'daemon off;'"]
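
Whatever shipper you use, the mechanism is the same: follow the access log and forward each complete line over a TCP connection to Logstash. A sketch of that loop in Python, using a `socketpair` so the example is self-contained (in the real stack the socket would connect to `logstash:5044`):

```python
import socket

def ship_lines(lines, sock):
    """Forward each log line over the socket, newline-terminated (json_lines framing)."""
    for line in lines:
        sock.sendall(line.rstrip("\n").encode() + b"\n")

# A socketpair stands in for a real connection to logstash:5044.
shipper_end, logstash_end = socket.socketpair()
ship_lines(['{"status": 200}', '{"status": 404}'], shipper_end)
shipper_end.close()

received = logstash_end.recv(4096).decode()
logstash_end.close()
print(received, end="")
```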

Step 5: Update `docker-compose.yml` with Nginx Service

Add the following to your `docker-compose.yml` file:

services:
  # ... existing services ...
  nginx:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "80:80"
    volumes:
      - ./nginx-access.log:/var/log/nginx/access.log

Step 6: Start the Services

Run `docker-compose up -d` to start all services in detached mode:

docker-compose up -d

Step 7: Access Kibana and Verify Logs

- Open your web browser and go to http://localhost:5601.
- In Kibana, create an index pattern using the `nginx-*` wildcard.
- Verify that logs from Nginx are being shipped to Elasticsearch via Logstash.
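
The Logstash output above writes to a date-stamped index (`nginx-%{+YYYY.MM.dd}`), which is why the `nginx-*` pattern matches every day's data. A small Python sketch of the naming scheme:

```python
from datetime import date
from fnmatch import fnmatch

def index_name(day):
    """Mirror the Logstash sprintf format nginx-%{+YYYY.MM.dd}."""
    return day.strftime("nginx-%Y.%m.%d")

today_index = index_name(date.today())
print(today_index)
# The Kibana index pattern nginx-* matches every daily index.
print(fnmatch(today_index, "nginx-*"))
```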

Step 8: Monitor Your Stack

Once everything is up, you can monitor your services and troubleshoot as needed. Use commands such as `docker-compose ps` to check container status and `docker-compose logs <service_name>` for debugging.

That's it! You now have the Elastic Stack running in Docker containers with Nginx log shipping enabled via Logstash.
