Logstash + Elasticsearch + Kibana
Centralized Log server
(as Splunk replacement)

Marko Ojleski
DevOps Engineer
$plunk
Business as usual, until…
#Outage @03:00AM
Check logs….?!?
10 network devices
40 servers
100 logs
Massive RAGE
tail
cat
grep
sed
awk
sort
uniq
and looots of |
tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n
it’s just too much
1. collect data
2. parse/filter
3. send data

Logstash

written in JRuby
Author: Jordan Sissel
input

parse/filter

output
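To see the three stages as one config, here is a minimal sketch (the file name and the use of stdin/stdout are illustrative assumptions, not from the talk):

# minimal.conf - hypothetical "hello world" pipeline
input {
  stdin { }                 # 1. collect data: read lines typed on the console
}
# 2. parse/filter: no filter block yet, so events pass through untouched
output {
  stdout {
    codec => rubydebug      # 3. send data: pretty-print each event
  }
}

Run it with something like bin/logstash -f minimal.conf and every line you type comes back as a structured event.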
1. collect data

30+ inputs

Logstash input
file
syslog
tcp
udp
zmq
redis
log4j
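As a sketch of the file input (the path is an assumption, a typical syslog location):

input {
  file {
    path => "/var/log/syslog"   # assumed log location, adjust per host
    type => "syslog"            # tag events so filters/outputs can match on them
  }
}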
Log shippers

Logstash
Beaver (Python)
Lumberjack (Go)
Woodchuck (Ruby)
Nxlog (C)
Sample conf

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}
2. parse/filter

40+ filters

Logstash filters
grok
csv
grep
geoip
json
mutate
xml
key/value
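For instance, a hedged sketch of mutate and geoip (the field names assume the grok example shown below):

filter {
  mutate {
    lowercase    => [ "client" ]     # normalize the extracted field
    remove_field => [ "noise" ]      # drop an unwanted field (hypothetical name)
  }
  geoip {
    source => "client"               # look up the IP captured by grok
  }
}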
Grok filter

REGEX pattern collection
Grok filter

(?<![0-9])(?:(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2}))(?![0-9])

IP

`$=`;$_=%!;($_)=/(.)/;$==++$|;($.,$/,$,,$,$",$;,$^,$#,$~,$*,$:,@%)=(
$!=~/(.)(.).(.)(.)(.)(.)..(.)(.)(.)..(.)......(.)/,$"),$=++;$.++;$.++;
$_++;$_++;($_,$,$,)=($~.$"."$;$/$%[$?]$_$$,$:$%[$?]",$"&$~,$#,);$,++
;$,++;$^|=$";`$_$$,$/$:$;$~$*$%[$?]$.$~$*${#}$%[$?]$;$$"$^$~$*.>&$=`

Just another Perl hacker.
Grok filter

120+ regex patterns
USERNAME
IP
HOSTNAME
SYSLOGTIMESTAMP
LOGLEVEL
etc…
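The bundled patterns are plain "NAME regex" lines that can reference each other; roughly like this (simplified approximations, not copied verbatim from the shipped patterns file):

USERNAME [a-zA-Z0-9._-]+
INT (?:[+-]?(?:[0-9]+))
IP (?:%{IPV6}|%{IPV4})
LOGLEVEL ([Dd]ebug|[Ii]nfo|[Ww]arn(?:ing)?|[Ee]rror|[Ff]atal)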
Grok filter

2.10.146.54 - 2013-12-01T13:37:57Z - some really boring message

%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}

client => 2.10.146.54
time => 2013-12-01T13:37:57Z
message => some really boring message
Grok filter

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}
3. send data

50+ outputs

Logstash output
statsd
stdout
tcp
elastic
redis
mongo
zmq
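Several outputs can run side by side; a sketch in the 1.x-era syntax (the host value is an assumption, and newer releases use hosts instead):

output {
  stdout { codec => rubydebug }   # debug copy to the console
  elasticsearch {
    host => "localhost"           # assumed Elasticsearch node
  }
}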
Elasticsearch
Distributed RESTful search server

1. RESTful API
2. JSON-oriented
3. Horizontal scale
4. HA
5. Full Text search
6. Based on Lucene
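Since everything sits behind the REST API, indexed events can be queried with plain curl (the index name follows Logstash's default logstash-YYYY.MM.DD convention; the date and field are taken from the grok example above):

curl -XGET 'http://localhost:9200/logstash-2013.12.01/_search?pretty' -d '{
  "query": { "match": { "client": "2.10.146.54" } }
}'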
Logstash => elasticsearch

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}

output {
  elasticsearch {}
}
Kibana
Awesome Elasticsearch Web Frontend to search/graph

1. Clean and simple UI
2. Fully customizable
3. Bootstrap based
4. Old version running on Ruby
5. Milestone 3 fully rewritten in HTML/Angular.js
Real Life Scenarios
Scenario 1

L2 switch, Cisco ASA, L3 switch
  -> UDP -> Syslog broker (lightweight shipper)
  -> UDP -> Logstash (main log server)
  -> Elasticsearch -> Kibana
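On the Logstash side, scenario 1 only needs a UDP input listening for what the syslog broker forwards; a sketch (the port is an assumption):

input {
  udp {
    port => 5514          # assumed forwarding port from the syslog broker
    type => "network"     # tag switch/ASA traffic for later filtering
  }
}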
Scenario 2

Apache, IIS, JBoss (each with a lightweight shipper)
  -> TCP -> Logstash (main log server)
  -> Elasticsearch -> Kibana
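Scenario 2 maps to one TCP input per application, each tagged with its own type so filters can branch on it, just like the server1 example earlier (ports are illustrative assumptions):

input {
  tcp { port => 5544 type => "apache" }   # Apache logs via its shipper
  tcp { port => 5545 type => "iis" }      # IIS logs via its shipper
  tcp { port => 5546 type => "jboss" }    # JBoss logs via its shipper
}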