# Parsing logs locally

*Kirill Makhonin edited this page Nov 1, 2018*
For analyzing logs locally in complex cases, you can use the ELK stack (Elasticsearch + Logstash + Kibana). To do so:
- Download all logs from S3. You may use the AWS CLI:

  ```shell
  aws s3 cp --recursive "s3://<bucket>/<filter>" .
  ```
- Concatenate all logs into a single file:

  ```shell
  find ../raw -type f -exec cat {} \; >> merged.log
  ```
- Clone the [deviantony/docker-elk](https://github.com/deviantony/docker-elk) repository:

  ```shell
  git clone https://github.com/deviantony/docker-elk.git
  ```
- Modify the Logstash configuration (`<docker-elk repo>/logstash/pipeline/logstash.conf`):

  ```
  input {
    tcp {
      port => 5000
    }
  }

  filter {
    grok {
      match => {
        "message" => "%{DATA:timestamp} (?<source>[^\t]+) (?<payload>[^\t]+)"
      }
      remove_field => ["host", "message"]
    }
    date {
      match => [ "timestamp", "ISO8601" ]
      remove_field => ["timestamp"]
    }
    json {
      source => "payload"
      remove_field => ["payload"]
    }
  }

  output {
    elasticsearch {
      hosts => "elasticsearch:9200"
    }
  }
  ```
- Start the ELK stack using Docker Compose:

  ```shell
  docker-compose up
  ```
- Wait for the stack to finish loading
- Create an index pattern in Kibana:

  ```shell
  curl -XPOST -D- 'http://localhost:5601/api/saved_objects/index-pattern' \
    -H 'Content-Type: application/json' \
    -H 'kbn-version: 6.4.2' \
    -d '{"attributes":{"title":"logstash-*","timeFieldName":"@timestamp"}}'
  ```
- Clear the read-only block on the Elasticsearch indices (Elasticsearch sets `index.blocks.read_only_allow_delete` automatically when disk space runs low):

  ```shell
  curl -XPUT -H "Content-Type: application/json" \
    http://localhost:9200/_all/_settings \
    -d '{"index.blocks.read_only_allow_delete": null}'
  ```
- Send the data to Logstash:

  ```shell
  nc localhost 5000 < merged.log
  ```
You can then view the logs in Kibana: http://localhost:5601/
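Before piping `merged.log` into Logstash, it can help to sanity-check that the lines actually match the grok pattern above. Below is a minimal Python sketch (not part of the original pipeline) that approximates the `grok`, `date`, and `json` filters, assuming each line has the shape `<ISO8601 timestamp> <source> <JSON payload>` with single-space separators; adjust the split if your logs are tab-separated:

```python
import json
from datetime import datetime

def parse_line(line):
    """Approximate the Logstash filter chain from logstash.conf:
    grok (split into timestamp / source / payload), then the date
    and json filters. Assumes single-space separators between the
    three fields; change the separator if your logs use tabs.
    """
    timestamp, source, payload = line.rstrip("\n").split(" ", 2)
    event = json.loads(payload)            # json filter: payload is a JSON object
    # date filter: parse the ISO8601 timestamp
    # (note: before Python 3.11, fromisoformat does not accept a trailing "Z")
    event["@timestamp"] = datetime.fromisoformat(timestamp)
    event["source"] = source
    return event

# Usage: report lines that would fail the Logstash pipeline
# with open("merged.log") as f:
#     for n, line in enumerate(f, 1):
#         try:
#             parse_line(line)
#         except ValueError as e:
#             print(f"line {n}: {e}")
```

Running this over `merged.log` before the `nc` step surfaces malformed lines locally instead of as silent `_grokparsefailure` tags in Elasticsearch.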