Logstash/Kibana for rsyslog anyone?
Been looking at Logstash and Kibana for collating and presenting rsyslog data. They run on top of Elasticsearch, which uses Lucene as its backend.
http://lucene.apache.org/
http://www.elasticsearch.org/overview/
http://www.elasticsearch.org/overview/logstash/
http://www.elasticsearch.org/overview/kibana/
Now, whilst I can get lucene3 and elasticsearch RPMs through the Katello repos, I can only find JAR files for Logstash, which I think include Elasticsearch and Lucene.
It kind of falls into that "too cool for school" category of tool in my mind at the moment, but it's showing remarkable capability already. Does anybody know if there are modular RPMs for Logstash & Kibana anywhere?
Cheers
D
Responses
I was messing around with this configuration for JBoss central logging on a cluster. I have it working, but I haven't had much use for it.
I'll try to give a short summary:
1.) mkdir /opt/logs && cd /opt/logs
2.) Download latest logstash.jar
3.) Download and install elasticsearch into /opt/logs/elasticsearch
4.) Download and install Kibana into /opt/logs/Kibana-x.x.x
Elasticsearch
I had to change "network.bind_host" to my external IP in /opt/logs/elasticsearch/config/elasticsearch.yml
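For reference, the changed line in elasticsearch.yml would look something like this (192.0.2.10 is a placeholder, substitute your own external IP):

```yaml
# Bind Elasticsearch to the external interface instead of localhost
network.bind_host: 192.0.2.10
```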
To start it:
cd /opt/logs/elasticsearch/bin && ./elasticsearch
Logstash
I'm going to give you a JBoss config; there are tutorials out there for syslog configurations. You need to create your own config file and call it whatever you want. These are the contents of /opt/logs/jboss.conf:
input {
  file {
    path => '/opt/jboss-eap-6.1/domain/servers/robtest1/log/server.log'
    format => 'json_event'
    type => 'log4j'
    tags => 'robtest1'
  }
  file {
    path => '/opt/jboss-eap-6.1/domain/servers/robtest2/log/server.log'
    format => 'json_event'
    type => 'log4j'
    tags => 'robtest2'
  }
}
filter {
  multiline {
    type => "log4j"
    pattern => "^\\s"
    what => "previous"
  }
  mutate {
    add_field => [ "log4j_ip", "%{@source_host}" ]
  }
  mutate {
    gsub => ["log4j_ip", ":.*$", ""]
  }
}
output {
  elasticsearch_http {
    host => 'X.X.X.X'
    port => 9200
    type => 'log4j'
    flush_size => 10
  }
  stdout { }
}
X.X.X.X is my external IP in this case.
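The second mutate filter in the config strips the port from the copied @source_host field (which typically arrives as host:port), leaving just the host. The same substitution can be sanity-checked in shell with sed (the sample value is made up):

```shell
# "%{@source_host}" usually looks like "10.0.0.5:51234"; the gsub
# ":.*$" -> "" removes everything from the first colon onward.
src="10.0.0.5:51234"
ip=$(printf '%s' "$src" | sed 's/:.*$//')
echo "$ip"   # prints "10.0.0.5"
```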
I have a bash script set to start logstash:
#!/bin/bash
java -jar logstash-1.1.12-flatjar.jar agent -f jboss.conf -- web --backend elasticsearch:///?local
Kibana
cd /opt/logs/Kibana-X.X.X/bin && ruby Kibana.rb
That will start the Kibana web interface. Browse to X.X.X.X:5601 and see if content is showing up.
That is a small primer on what I had to do to get it working.
Just noticed I was using an old version of Kibana, the newer installation instructions are here:
http://www.elasticsearch.org/overview/kibana/installation/
Much easier this way, just had to copy the content to my document root and hit the browser.
Robert,
Thanks a lot for sharing.
I have given this a try with the latest version of logstash (logstash-1.3.3-flatjar.jar) and have made some changes to jboss.conf to get rid of some "deprecation" warnings:
jboss.conf
input {
  file {
    path => '<my-path>/jboss-eap-6.2/standalone/log/server.log'
    codec => json {
      charset => "UTF-8"
    }
    type => 'log4j'
    tags => 'A-Server'
  }
}
filter {
  multiline {
    pattern => "^\\s"
    what => "previous"
  }
  mutate {
    add_field => [ "log4j_ip", "%{@source_host}" ]
  }
  mutate {
    gsub => ["log4j_ip", ":.*$", ""]
  }
}
output {
  elasticsearch_http {
    host => '<my-ip>'
    port => 9200
    flush_size => 10
  }
  stdout { }
}
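The multiline filter's pattern "^\\s" (the regex ^\s) is what glues Java stack traces back into a single event: any line starting with whitespace is appended to the previous event. A quick shell check of which lines such a pattern matches, on a made-up log4j event:

```shell
# Continuation lines of a log4j stack trace start with whitespace;
# count how many lines of this sample match ^\s (POSIX: ^[[:space:]]).
sample='2014-01-01 ERROR boom
    at com.example.Foo.bar(Foo.java:42)
    at com.example.Main.main(Main.java:7)'
matches=$(printf '%s\n' "$sample" | grep -c '^[[:space:]]')
echo "$matches"   # prints "2" - the two indented "at ..." lines
```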
Also, the command-line option --backend elasticsearch:///?local seems to have gone, so I have changed the script to
#!/bin/bash
java -jar logstash-1.3.3-flatjar.jar agent -f jboss.conf -- web
Since this topic is about experiences, I'd like to know if anyone has any tips regarding scaling of Elasticsearch, or its limitations.
For example, things one should be aware of when dumping data into an index: sharding, parent-child relationships between documents.
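On the scaling question: logstash's elasticsearch output writes to one index per day by default (logstash-YYYY.MM.dd), which keeps retention cheap, since expiring old data is a whole-index delete rather than a document-by-document purge. A sketch of the naming scheme, with a hypothetical curl delete for an aged-out day:

```shell
# Default logstash index name for today's events (UTC)
index="logstash-$(date -u +%Y.%m.%d)"
echo "$index"

# Dropping an expired day is then a single index delete, e.g.:
# curl -XDELETE "http://X.X.X.X:9200/logstash-2014.01.01"
```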
