(EFK) Elasticsearch, Fluentd, Kibana Setup – [Step By Step Guide]


The EFK (Elasticsearch, Fluentd and Kibana) stack is an open source alternative to paid log management, log search and log visualization services like Splunk, SumoLogic and Graylog (Graylog itself is open source, but its enterprise support is paid). These services are used to search large amounts of log data for better insights, tracking, visualization and analytics. Elasticsearch, Fluentd and Kibana are separate open source projects that together form an excellent centralized log management stack — one that is not only free to use and easy to set up, but also scalable and capable of handling very large amounts of log data in real time. This article documents how to set up Elasticsearch, Fluentd and Kibana and put them all together to get the best out of your boring log data. So without further ado, let's jump right into the installation process.

Let us first create a folder to hold our EFK stack. I chose to work in /Users/amyth/installs/efk, but you can use any location you like, really.

1. Installing & Running Elasticsearch

1.1 Java Installation

Let us get started by installing Java, as it is one of the core dependencies of Elasticsearch.
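On a Debian/Ubuntu system this can be done with the package manager; the exact package name varies by distribution, and on macOS you would instead install Java from Oracle's site or via a package manager such as Homebrew. A typical sketch:

```shell
# Install an OpenJDK runtime (package name may differ on your distribution;
# OpenJDK 7 or 8 both work with Elasticsearch 2.x)
sudo apt-get update
sudo apt-get install -y openjdk-7-jre
```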

Once the installation is finished, confirm it by checking the Java version with the following command.
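```shell
java -version
```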

and you should see something like the following.
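The exact strings depend on the Java vendor and version you installed; the output will be along these lines:

```
java version "1.8.0_66"
Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
```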

1.2 Installing Elasticsearch

Next, download Elasticsearch (v2.1.0) and uncompress the downloaded package.
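For example (the URL follows the release layout Elastic used for the 2.x series; adjust the path to wherever you created your EFK folder):

```shell
# Download the Elasticsearch 2.1.0 tarball into our EFK folder and unpack it
cd /Users/amyth/installs/efk
curl -O https://download.elasticsearch.org/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/2.1.0/elasticsearch-2.1.0.tar.gz
tar -xzf elasticsearch-2.1.0.tar.gz
```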

Now let's run an Elasticsearch instance by cd'ing into the elasticsearch folder and running the elasticsearch script in the bin folder. To run Elasticsearch as a daemon, pass the -d argument when calling the script.
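```shell
cd elasticsearch-2.1.0
./bin/elasticsearch
```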

or to run it as a daemon
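```shell
./bin/elasticsearch -d
```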

After running Elasticsearch, confirm you have a running instance by navigating to http://localhost:9200 (Elasticsearch's default HTTP port), where you should see something like the following:
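The node name and build details will differ on your machine, but the response is a small JSON document roughly like this:

```
{
  "name" : "Some-Node-Name",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.1.0",
    ...
  },
  "tagline" : "You Know, for Search"
}
```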

2. Installing & Running Kibana

Now let us install and configure Kibana. First, download Kibana from its download page. Once downloaded, move the file to our EFK install location and uncompress it.
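Something like the following, assuming the archive landed in your Downloads folder — the archive name here is an example and depends on your platform and the Kibana version you downloaded (Kibana 4.3.x is the series that pairs with Elasticsearch 2.1):

```shell
# Move the downloaded archive into the EFK folder and unpack it
# (archive name is illustrative -- use the one you actually downloaded)
mv ~/Downloads/kibana-4.3.0-darwin-x64.tar.gz /Users/amyth/installs/efk/
cd /Users/amyth/installs/efk
tar -xzf kibana-4.3.0-darwin-x64.tar.gz
```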

Next, let's run Kibana using the following command.
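```shell
cd kibana-4.3.0-darwin-x64
./bin/kibana
```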

Now in your web browser navigate to http://localhost:5601 (Kibana's default port) and you should see the Kibana dashboard, something like the following image.

[Image: the Kibana dashboard]


Now before we create indices, let’s get the third and final pillar to our stack up and running.

3. Installing & Running Fluentd

To install Fluentd, the project provides bash scripts (via Treasure Data's td-agent package) that automate the installation process. These scripts are available for:

  • Ubuntu: Trusty, Precise and Lucid
  • Debian: Jessie, Wheezy and Squeeze

Simply fetch and run the script for your distribution using one of the following commands.
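For example, for Ubuntu Trusty or Debian Jessie (the URLs follow the naming scheme of Treasure Data's install scripts; substitute your release's codename):

```shell
# Ubuntu Trusty:
curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-trusty-td-agent2.sh | sh

# Debian Jessie:
curl -L https://toolbelt.treasuredata.com/sh/install-debian-jessie-td-agent2.sh | sh
```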

Once installed, let's start the td-agent service.
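```shell
sudo /etc/init.d/td-agent start
```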

To make sure you have td-agent running, try the status command
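```shell
sudo /etc/init.d/td-agent status
```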

4. Putting Together the EFK (Elasticsearch, Fluentd and Kibana) Stack

4.1 Get Required Fluentd Plugins

Now let us put it all together to make it work. First, we need a few Fluentd plugins installed. Let's install them using the following commands.
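The essential one for this setup is the Elasticsearch output plugin; td-agent ships its own gem command for installing plugins:

```shell
# Install the Elasticsearch output plugin for Fluentd/td-agent
sudo td-agent-gem install fluent-plugin-elasticsearch
```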

4.2 Send Syslog to Elasticsearch via Fluentd

Next, we want to send some log data through Fluentd to Elasticsearch. In this case we'll configure Fluentd to forward syslog data to ES. To do so, open the file /etc/td-agent/td-agent.conf and replace the existing configuration with the configuration below.
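A minimal sketch of such a configuration, using td-agent 2-era syntax — the `system` tag and the flush interval are my choices, and the match pattern must agree with the tag on the source:

```
<source>
  type syslog
  port 5140
  tag system
</source>

<match system.**>
  type elasticsearch
  host localhost
  port 9200
  logstash_format true
  flush_interval 10s
</match>
```

The source block makes Fluentd listen for syslog messages on port 5140; the match block ships every event tagged `system.*` to the local Elasticsearch instance.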

Now let's relaunch Fluentd so it picks up the new configuration.
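```shell
sudo /etc/init.d/td-agent restart
```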

We also need to tell syslog/rsyslog to stream its log data to Fluentd, so let's open the syslog configuration file.
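```shell
sudo vi /etc/rsyslog.conf
```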

and add the following line to it. This tells syslog to forward log data to 127.0.0.1 (our local host) on port 5140, the port Fluentd's syslog input listens on by default.
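In rsyslog syntax, a single `@` means forward over UDP (a double `@@` would mean TCP):

```
*.* @127.0.0.1:5140
```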

Now, to reload the configuration so that it includes our recent changes, let's restart the syslog/rsyslog service.
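```shell
sudo service rsyslog restart
```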

Now let's create an Elasticsearch index named kibana with dynamic mapping enabled.
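One way to do this is with a curl request against the running Elasticsearch instance; the mapping body shown is a sketch (dynamic mapping is already the default in ES 2.x, so the setting is made explicit here for clarity):

```shell
# Create the "kibana" index with dynamic mapping explicitly enabled
curl -XPUT 'http://localhost:9200/kibana' -d '{
  "mappings": {
    "_default_": { "dynamic": true }
  }
}'
```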

Now go to your Kibana dashboard by navigating to http://localhost:5601 in your web browser, choose the Settings tab and enter kibana* in the “index name or pattern” field. Then uncheck “Index contains time-based events” and click the create button.




Now go to the Discover tab and you should be able to see and search the logs coming from your syslog.