1) Create a `logstash.conf` file:

```
# Docker can send logs in gelf format
input {
  gelf { }
}

# Filter to support structured logging
filter {
  kv { }
}

# Send all the logs to our elasticsearch
output {
  elasticsearch {
    hosts => ["elasticsearch"]
  }
}
```
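The `kv` filter splits `key=value` pairs found in a log message into separate event fields. A rough shell sketch of that behavior, using a made-up log line (the field names here are illustrative, not part of the setup above):

```shell
# Hypothetical structured log line in key=value form — the shape
# that the kv filter turns into separate fields
line='level=info method=GET status=200'

# Rough approximation of what kv does: one field per key=value pair
for pair in $line; do
  printf '%s -> %s\n' "${pair%%=*}" "${pair#*=}"
done
```

Any container whose application logs in this `key=value` style will show up in Kibana with those keys as searchable fields.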
2) Start Elasticsearch, Logstash, and Kibana, linking them together:
```shell
# Start elasticsearch v2
docker run -d --name elasticsearch elasticsearch:2

# Start latest logstash and use `./logstash.conf` inside the container
docker run -d --name logstash --link elasticsearch:elasticsearch \
  -v $PWD/logstash.conf:/etc/logstash/logstash.conf \
  logstash -f /etc/logstash/logstash.conf

# Start kibana v4 on port 80
docker run -d --name kibana --link elasticsearch:elasticsearch -p 80:5601 kibana:4
```
3) Now you can start any Docker container and have its logs go into Kibana:
```shell
# Save the logstash internal IP address to a temporary environment variable
LOGSTASH_HOST=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' logstash)

# Run nginx using the gelf driver and logstash IP as the log ingestor
docker run --log-driver=gelf --log-opt gelf-address=udp://$LOGSTASH_HOST:12201 nginx
```
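Under the hood, the gelf driver ships each log line as a gzip-compressed JSON payload over UDP to port 12201. A local sketch of what such a payload looks like (the field values are illustrative, and the compression round-trip stands in for the network send):

```shell
# A minimal GELF 1.1 message: JSON with version, host, and short_message
msg='{"version":"1.1","host":"nginx","short_message":"GET / HTTP/1.1 200"}'

# Compress it as the driver would, then decompress locally to inspect it
printf '%s' "$msg" | gzip -c | gunzip -c
```

This is why the `gelf { }` input in `logstash.conf` needs no extra options: it listens on UDP 12201 by default and decodes this format directly.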