Logstash / Elasticsearch / Kibana for Windows Event Logs

http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs

Part 1 of 4 – Part 2, Part 3, Part 4

Have you heard of Logstash / ElasticSearch / Kibana? I don’t wanna oversell it, but it’s AMAZING!

I’ll start with a screenshot. You know you want this. I have to blur a few things to keep some 53cr375 about my environment.
[Screenshot: Kibana dashboard of Windows event logs]

This is my configuration for collecting Windows event logs. I’m still working out the differences between the Windows XP, Server 2008R2, and Windows 7 computers I’m collecting logs from, but this has already proven very useful.

If you don’t know about it yet, you should really go watch this webinar: http://www.elasticsearch.org/webinars/introduction-to-logstash/ I’ll wait.

Before I start into a dump of my install notes, I’ll link a few resources.

The getting started guide is a great place to start
http://logstash.net/docs/1.3.3/tutorials/getting-started-centralized

I based my setup very heavily upon this guy’s setup. Part 1, Part 2, Part 3

Logstash / Elasticsearch are moving so quickly this is likely obsolete before I’m done writing it. Go to the sources for more current info.
http://www.elasticsearch.org/overview/
http://www.elasticsearch.org/overview/logstash/
http://www.elasticsearch.org/overview/elasticsearch/
http://www.elasticsearch.org/overview/kibana/

$10 for the PDF version. Buy it, print it. 3 hole punch it and stuff it in a binder. Write on it. Throw it in frustration. Learn to love it.
http://www.logstashbook.com/

I ran into a couple issues. Turns out I wasn’t alone. Google showed me the way.
https://groups.google.com/forum/m/#!topic/logstash-users/XXZLmB2TeNo

INSTALL LINUX

Install Debian. I like using the netinstall; it’s small and you always get up-to-date packages.
http://www.debian.org/distrib/netinst

During the install, when you get to software selection, I only selected “SSH server” and “Standard system utilities”.

When prompted during the install, I made myself a standard user account named “raging”.

raging@logcatcher:~$ uname -a; lsb_release -a
Linux logcatcher 3.2.0-4-amd64 #1 SMP Debian 3.2.51-1 x86_64 GNU/Linux
No LSB modules are available.
Distributor ID: Debian
Description:    Debian GNU/Linux 7.3 (wheezy)
Release:        7.3
Codename:       wheezy

INSTALL SUDO

I don’t trust myself running as root, so I use sudo instead. I change to the root user, install sudo, add my raging account to the sudo group, and list the current IP address. If this were more than just a temporary setup, I’d set a static IP or a DHCP reservation.

su root
apt-get install sudo
usermod -a -G sudo raging
ip address show

And this is about as far as I bother on the local console. From here on I’ll SSH in from a more comfortable computer.
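
Something like this from any other machine on the network, using whatever address ip address show reported (the address here is just a placeholder):

ssh raging@192.168.1.50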

INSTALL JAVA

You’ll have to get the correct download link for the Oracle Java tarball yourself; I guarantee the AuthParam in the link I originally used has long since expired. More info: https://wiki.debian.org/JavaPackage

sudo vi /etc/apt/sources.list

You’ll need to add contrib (and non-free, if you want it) to the existing "deb ... wheezy main" line so the java-package tool is available. Then refresh the package lists, turn the Oracle JRE tarball into a Debian package, and install it:

sudo apt-get update
sudo apt-get install java-package
make-jpkg jre-7u51-linux-x64.tar.gz
sudo dpkg -i oracle-j2re1.7_1.7.0+update51_amd64.deb
sudo update-alternatives --auto java
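
A quick sanity check that the Oracle JRE is now the active runtime:

java -version
# should report java version "1.7.0_51"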

INSTALL ELASTICSEARCH

Download and install elasticsearch. There’s probably a newer version available.
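
Something along these lines works; the 0.90.10 version below is just an example from around this time, so swap in whatever the elasticsearch download page lists now:

wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.90.10.deb
sudo dpkg -i elasticsearch-0.90.10.deb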

Make the elasticsearch data directory and set permissions

sudo mkdir /data
sudo mkdir /data/logs
sudo mkdir /data/data
sudo chown -R elasticsearch:elasticsearch /data/logs
sudo chown -R elasticsearch:elasticsearch /data/data
sudo chmod -R ug+rw /data/logs
sudo chmod -R ug+rw /data/data

Configure elasticsearch

sudo vi /etc/elasticsearch/elasticsearch.yml

Uncomment / update the values to something that makes sense for your environment; the end result is shown after the list.

Change the value of cluster.name to "logcatcher"
Change the value of node.name to something memorable like "logstorePrime"
Change the value of path.data to your data directory - mine was /data/data
Change the value of path.logs to your logs directory - mine was /data/logs
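
With those changes, the relevant lines in /etc/elasticsearch/elasticsearch.yml end up looking like this:

cluster.name: logcatcher
node.name: "logstorePrime"
path.data: /data/data
path.logs: /data/logs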

Elasticsearch wants a higher open-files limit than the Debian default, so bump it. I’ll just link to the source for this part:
http://www.elasticsearch.org/tutorials/too-many-open-files/

sudo vi /etc/security/limits.conf
Add:
elasticsearch soft nofile 32000
elasticsearch hard nofile 32000
sudo vi /etc/pam.d/su
Uncomment pam_limits.so
sudo service elasticsearch restart
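
After the restart, a quick curl against port 9200 confirms elasticsearch is answering (this assumes you left the default HTTP port alone and are on the box itself):

curl http://localhost:9200/
# should return a short JSON blob with the node name and version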

If you end up with garbage data in elasticsearch, this is how to clear all indexes
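
That just means pointing a DELETE at the special _all index name (again assuming elasticsearch is local on the default port); be warned, this really does wipe everything:

curl -XDELETE 'http://localhost:9200/_all/'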

http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-delete-index.html

INSTALL REDIS

Shamelessly stolen from http://redis.io/topics/quickstart

# grab and unpack the 2.8.4 source (check redis.io for the current release)
wget http://download.redis.io/releases/redis-2.8.4.tar.gz
tar xzf redis-2.8.4.tar.gz
cd redis-2.8.4
make
sudo cp src/redis-server /usr/local/bin/
sudo cp src/redis-cli /usr/local/bin/
sudo mkdir /etc/redis
sudo mkdir /var/redis
sudo cp utils/redis_init_script /etc/init.d/redis_6379
sudo cp redis.conf /etc/redis/6379.conf
sudo mkdir /var/redis/6379
sudo update-rc.d redis_6379 defaults
sudo vi /etc/redis/6379.conf
    Set daemonize to yes
sudo service redis_6379 start
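
To make sure it actually started, ping it with the CLI that was copied to /usr/local/bin:

redis-cli ping
# replies PONG when the server is up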

INSTALL LOGSTASH

Download logstash and copy it to /opt/logstash/.
There’s likely a newer version available.

sudo mkdir /opt/logstash
sudo cp logstash-1.3.3-flatjar.jar /opt/logstash/logstash.jar
sudo mkdir /etc/logstash
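
If you still need the jar itself, the 1.3.3 flatjar lived on download.elasticsearch.org; the exact URL below is from memory, so grab the current link from the logstash download page if it 404s:

wget https://download.elasticsearch.org/logstash/logstash/logstash-1.3.3-flatjar.jar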

If you’re only running one computer for logstash / elasticsearch, you might only need one logstash instance and no redis. I’m planning to scale this to 2 computers to have failover.
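
To give a sense of how the pieces fit, here’s a minimal sketch of an indexer-side logstash config that pulls events off the local redis list and feeds them into the logcatcher cluster. The file name, redis key, and hosts are placeholders for illustration, not necessarily what I ended up using:

# /etc/logstash/indexer.conf (example name)
input {
  redis {
    host => "127.0.0.1"      # the redis broker installed above
    data_type => "list"
    key => "logstash"        # shippers push events onto this list
  }
}
output {
  elasticsearch {
    cluster => "logcatcher"  # matches cluster.name in elasticsearch.yml
  }
}

It would get started with something like: java -jar /opt/logstash/logstash.jar agent -f /etc/logstash/indexer.conf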