Analysing your logs with the ELK stack & Docker
Intro
Do it yourself
Docker Hub: the elk image
Docker ELK repo (docker-compose based):
https://github.com/deviantony/docker-elk
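To try it yourself, clone the repo and bring the stack up (a minimal sketch, assuming Docker and Docker Compose are installed; services and ports are defined in the repo's docker-compose.yml):

$ git clone https://github.com/deviantony/docker-elk.git
$ cd docker-elk
$ docker-compose up -d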
Getting started
Importing data is as simple as:
$ nc localhost 5000 < /path/to/logfile.log
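This works because the stack's Logstash pipeline listens for raw TCP traffic on port 5000. Roughly, the input block looks like this (a sketch; the actual file ships with the repo):

input {
  tcp {
    port => 5000
  }
}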
Wrong date
However, Logstash stamps events with the import time, not the time in the log line itself. Use the date filter to parse the original timestamp:
filter {
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
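For reference, dd/MMM/yyyy:HH:mm:ss Z matches the timestamp format of Apache access logs, e.g. the illustrative value 17/May/2015:10:05:03 +0000. On a match the date filter writes the parsed value into @timestamp, so Kibana plots events at the time they were logged rather than the time they were imported.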
Filters
Re-run the import and the events now carry their original timestamps.
Enter grok
grok {
  match => { "message" => "%{COMBINEDAPACHELOG}" }
}
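To see what COMBINEDAPACHELOG buys you, take an illustrative access-log line (not from the demo data):

127.0.0.1 - - [17/May/2015:10:05:03 +0000] "GET /index.html HTTP/1.1" 200 3891 "-" "Mozilla/5.0"

grok splits it into named fields: clientip, timestamp, verb, request, httpversion, response, bytes, referrer and agent.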
Grok patterns
https://github.com/elastic/logstash/blob/v1.4.2/patterns/grok-patterns
Writing your own Grok patterns
Put custom patterns in their own directory and point Logstash at it:
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{SYSLOGBASE} %{POSTFIX_QUEUEID:queue_id}: %{GREEDYDATA:syslog_message}" }
  }
}
Contents of ./patterns/postfix:
POSTFIX_QUEUEID [0-9A-F]{10,11}
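An illustrative line this pattern would match (host and queue ID invented for the example):

May 17 10:05:03 mail postfix/smtpd[2342]: 3F1B42A9D1: client=unknown[203.0.113.5]

SYSLOGBASE consumes the timestamp, host and program, POSTFIX_QUEUEID captures 3F1B42A9D1 into queue_id, and GREEDYDATA keeps the rest as syslog_message.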
Duplicates
Re-importing the same file would index every event twice. Fingerprint each message and use the hash as the document ID:
fingerprint {
  source => ["message"]
  concatenate_sources => true
  method => "SHA1"
  target => "fingerprint"
  key => "17272737"
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    document_id => "%{fingerprint}"
  }
}
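Because the document ID is derived from the message content, re-importing the same file overwrites existing documents instead of indexing duplicates. The key is an arbitrary value; when set, the fingerprint filter computes a keyed hash (HMAC) rather than a plain SHA1.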
User agents
Parse the User-Agent string captured in the agent field:
if [agent] != "-" and [agent] != "" {
  useragent {
    add_tag => [ "UA" ]
    source => "agent"
  }
}
if "UA" in [tags] {
if [device] == "Other" { mutate { remove_field => "device" } }
if [name] == "Other" { mutate { remove_field => "name" } }
if [os] == "Other" { mutate { remove_field => "os" } }
}
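The useragent filter derives name, os and device from the raw User-Agent string; when the parser cannot identify a part it returns the literal "Other", which is what the conditionals above strip out.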
GeoIP
Enrich events with location data looked up from the client IP:
geoip {
  source => "clientip"
  target => "geoip"
  database => "/etc/logstash/GeoLiteCity.mmdb"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}
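Appending longitude and then latitude to the same field builds a two-element [lon, lat] array, which is the order Elasticsearch expects for a geo_point; the mutate converts the strings to floats so the field can be mapped as geo_point and plotted on Kibana's tile map.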
Graphs
Questions?
