About

Logstash is a data collection and processing pipeline: it ingests data from a source, optionally transforms it, and writes it to a destination.

Pipeline

A Logstash pipeline is composed of the following elements:

  • input (produces the data)
  • filter (optional, processes the data)
  • output (writes the data)

For instance:

  • it can read a log file (input)
  • parse it into JSON via a grok expression (filter)
  • and send the result to a sink such as a database or an index engine (output), as sketched in the configuration below.
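A minimal sketch of such a pipeline configuration follows. The log path, the grok pattern and the Elasticsearch output are assumptions for illustration, not taken from the text above:

input {
  # read a log file (input stage); the path is hypothetical
  file {
    path => "/var/log/app/access.log"
    start_position => "beginning"
  }
}

filter {
  # parse each line into structured fields with a grok expression (filter stage)
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  # send the events to a sink (output stage), here a local Elasticsearch instance
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}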

Plugin

The plugins are hosted on RubyGems.

The official plugins are available on GitHub in the logstash-plugins organization.

Logstash is written in Java; Ruby is supported thanks to JRuby. Plugins may be written in Ruby or Java.
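Plugins can be listed and installed with the logstash-plugin tool shipped in the bin directory of the distribution; the plugin name below is only an example:

# list the plugins bundled with the distribution
bin/logstash-plugin list

# install an additional plugin published on RubyGems (example plugin name)
bin/logstash-plugin install logstash-output-kafka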

Getting Started

Docker

  • Download
docker pull docker.elastic.co/logstash/logstash:7.5.1
  • Run
docker run ^
   --rm ^
   -it ^
   -v %CD%:/usr/share/logstash/pipeline/ ^
   docker.elastic.co/logstash/logstash:7.5.1 ^
   bash
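The command above uses Windows cmd syntax (^ line continuation and %CD%). On Linux or macOS, a rough equivalent would be:

docker run \
   --rm \
   -it \
   -v "$(pwd)":/usr/share/logstash/pipeline/ \
   docker.elastic.co/logstash/logstash:7.5.1 \
   bash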

where:

  • %CD% is the current directory; it is mounted into /usr/share/logstash/pipeline/, which is the location of the pipeline configuration file logstash.conf. If no configuration is provided, the default pipeline (a Beats input) is used; a sketch of such a configuration follows after this list.

  • the Log4j2 configuration file is located at /usr/share/logstash/config/log4j2.properties
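As a sketch, a minimal logstash.conf dropped into the mounted directory could accept events from Beats and print them to the console. The port 5044 is the conventional Beats port and is an assumption here, not taken from the text above:

input {
  # accept events from Beats shippers (e.g. Filebeat)
  beats {
    port => 5044
  }
}

output {
  # print every event to the console for inspection
  stdout {
    codec => rubydebug
  }
}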

First pipeline

cd logstash-7.5.1/bin
# in docker
# cd /usr/share/logstash/bin
logstash -e 'input { stdin { } } output { stdout {} }'

where:

  • the -e flag lets you pass the pipeline configuration directly on the command line.

Typing hello world at the console (stdin) will produce the message below at the console (stdout):

{
       "message" => "hello world",
    "@timestamp" => 2020-01-13T14:02:43.376Z,
      "@version" => "1",
          "host" => "32621775747d"
}

Logstash adds timestamp and host information to the message.
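To exercise the filter stage described in the Pipeline section, the same command-line pipeline can be extended with a filter. The mutate filter and the added field below are only an illustration:

logstash -e 'input { stdin { } } filter { mutate { add_field => { "source" => "console" } } } output { stdout { codec => rubydebug } }'

Each line typed at the console is then echoed back with an extra source field set to console.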
