
@eunomie
Created April 27, 2017 09:44
How to send container logs to ELK using the gelf log driver

Send Docker logs to ELK through the gelf log driver

There are so many ways to send logs to an ELK stack: logspout, filebeat, journalbeat, etc.

But Docker has a gelf log driver and Logstash a gelf input. So here we are.

Here is a docker-compose setup to test a full ELK stack with a container sending its logs via gelf.

Try

Requirements

  • docker
  • docker-compose
  • 4 GB of RAM (Elasticsearch is set to take 2 GB, and you will also run Logstash + Kibana)

Versions

The docker-compose.yml file runs, from the docker.elastic.co registry:

  • elasticsearch 5.3.0
  • logstash 5.3.0
  • kibana 5.3.0

The docker-compose.old.yml file runs, from Docker Hub:

  • elasticsearch 2.4
  • logstash 2
  • kibana 4

Run

  1. Run the stack

    docker-compose -f docker-compose.yml up
    
  2. Watch the Logstash output for the incoming log entries from the plop container

  3. Wait for Kibana to be up (and wait again if it's not...)

  4. Add logstash-* as the index pattern, with @timestamp as the Time-field name (if the index does not show up, see the quick check after this list)

  5. Go to Discover and see your logs!

    Even if your output is not well formatted, you will see logs with metadata such as command, image id, container id, timestamp, container name, etc.

    {
      "_index": "logstash-2017.04.27",
      "_type": "docker",
      "_id": "AVuucsRtmwIbXgw8iFVb",
      "_score": null,
      "_source": {
        "source_host": "172.18.0.1",
        "level": 6,
        "created": "2017-04-26T14:15:52.292252751Z",
        "message": "My Message Thu Apr 27 08:06:48 UTC 2017",
        "type": "docker",
        "version": "1.1",
        "command": "/bin/sh -c while true; do echo My Message `date`; sleep 1; done;",
        "image_name": "alpine",
        "@timestamp": "2017-04-27T08:06:48.676Z",
        "container_name": "squarescaleweb_plop_1",
        "host": "plop-xps",
        "@version": "1",
        "tag": "",
        "image_id": "sha256:4a415e3663882fbc554ee830889c68a33b3585503892cc718a4698e91ef2a526",
        "container_id": "abbdfad9b05a45037327da369f0ae22c3f5477760f0e0d6bb00f796627b32706"
      },
      "fields": {
        "created": [
          1493216152292
        ],
        "@timestamp": [
          1493280408676
        ]
      },
      "sort": [
        1493280408676
      ]
    }
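
If the logstash-* index never shows up in step 4, a quick sanity check is to list the indices directly (assuming the port mapping from docker-compose.yml, which publishes Elasticsearch on localhost:9200):

# list the logstash-* indices Elasticsearch knows about
curl -s 'http://localhost:9200/_cat/indices/logstash-*?v'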

Under the hood

Logstash defines an input of type gelf listening on port 12201.

It's a UDP port, so don't forget to open it correctly with 12201:12201/udp in the Docker port settings.

Logstash then sends the logs both to stdout (for debugging) and to Elasticsearch.
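
To check that the UDP input is reachable from the Docker host, one option is to push a hand-written GELF message with netcat. This is only a sketch: the JSON payload below is sent uncompressed, and depending on the version of the Logstash gelf input plugin it may need to be zlib-compressed instead; if nothing shows up, rely on the plop container below as the reference test.

# send one minimal, uncompressed GELF message to the published UDP port
echo -n '{"version":"1.1","host":"smoke-test","short_message":"hello gelf","level":6}' \
  | nc -u -w1 localhost 12201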

The containers whose logs we want to see should define the logging configuration. In a docker-compose file (version 2):

logging:
  driver: gelf
  options:
    gelf-address: udp://localhost:12201

Careful: the address the logs are sent to is resolved from the Docker host, not from the container!
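
For example, if Logstash lives on another machine, the address has to be resolvable from the Docker host (logs.example.com below is just a placeholder):

logging:
  driver: gelf
  options:
    gelf-address: udp://logs.example.com:12201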

If you run docker instead of docker-compose:

docker run --log-driver gelf --log-opt gelf-address=udp://localhost:12201
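
A complete one-liner, mirroring the plop service from the compose files below (alpine image, same echo loop), could look like this:

docker run --rm \
  --log-driver gelf \
  --log-opt gelf-address=udp://localhost:12201 \
  alpine /bin/sh -c 'while true; do echo My Message `date`; sleep 1; done;'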

To organize your logs, you can add a tag (only one):

logging:
  options:
    tag: staging

or

docker run --log-opt tag="staging"

And the result looks like:

{
  "_index": "logstash-2017.04.27",
  "_type": "docker",
  "_id": "AVuuiZbeYg9q2vv-JShe",
  "_score": null,
  "_source": {
    "source_host": "172.18.0.1",
    "level": 6,
    "created": "2017-04-27T08:24:45.69023959Z",
    "message": "My Message Thu Apr 27 08:31:44 UTC 2017",
    "type": "docker",
    "version": "1.1",
    "command": "/bin/sh -c while true; do echo My Message `date`; sleep 1; done;",
    "image_name": "alpine",
    "@timestamp": "2017-04-27T08:31:44.338Z",
    "container_name": "squarescaleweb_plop_1",
    "host": "plop-xps",
    "@version": "1",
    "tag": "staging",
    "image_id": "sha256:4a415e3663882fbc554ee830889c68a33b3585503892cc718a4698e91ef2a526",
    "container_id": "12b7bcd3f2f54e017680090d01330f542e629a4528f558323e33f7894ec6be53"
  },
  "fields": {
    "created": [
      1493281485690
    ],
    "@timestamp": [
      1493281904338
    ]
  },
  "sort": [
    1493281904338
  ]
}

You can now filter your logs using the tag.
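
For example, a quick way to check the tag made it into the documents is a URI search against Elasticsearch (published on localhost:9200 in the compose files):

# search the logstash-* indices for documents tagged "staging"
curl -s 'http://localhost:9200/logstash-*/_search?q=tag:staging&size=3&pretty'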

Resources

ADVENTURES IN GELF and the Orchestration workshop, both from Jérôme Petazzoni.

Just for fun, read the headings of the first link:

  • GELF
  • Using a logging driver
  • I would tell you an UDP joke, but
  • DNS to the rescue
  • Hmmm … TCP to the rescue, then?
  • (╯°□°)╯︵ ┻━┻
  • Hot potato
  • Haters gonna hate
  • Workarounds
  • Future directions

"Future directions" is where it's written:

If you need better GELF support, I have good news: you can help! I’m not going to tell you “just send us a pull request, ha ha ha!” because I know that only a very small number of people have both the time and expertise to do that — but if you are one of them, then by all means, do it!

docker-compose.old.yml

version: '2'
services:
  logstash:
    image: logstash:2
    ports:
      - "12201:12201/udp"
      - "5044:5044"
    environment:
      - "xpack.monitoring.elasticsearch.url=http://elasticsearch:9200"
    volumes:
      - .:/usr/share/logstash/pipeline
    links:
      - elasticsearch:elasticsearch
    depends_on:
      - elasticsearch
    command: logstash -f /usr/share/logstash/pipeline/
  kibana:
    image: kibana:4
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_URL: http://elasticsearch:9200
      XPACK_SECURITY_ENABLED: "false"
    links:
      - elasticsearch:elasticsearch
    depends_on:
      - elasticsearch
  elasticsearch:
    image: elasticsearch:2.4
    ports:
      - "9200:9200"
    environment:
      - "http.host=0.0.0.0"
      - "transport.host=127.0.0.1"
      - "ES_JAVA_OPTS=-Xmx2g -Xms2g"
      - "xpack.security.enabled=false"
  plop:
    image: alpine
    logging:
      driver: gelf
      options:
        gelf-address: udp://localhost:12201
        tag: "staging"
    links:
      - logstash:logstash
    depends_on:
      - logstash
    command: /bin/sh -c "while true; do echo My Message `date`; sleep 1; done;"

docker-compose.yml

version: '2'
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:5.3.0
    ports:
      - "12201:12201/udp"
      - "5044:5044"
    environment:
      - "xpack.monitoring.elasticsearch.url=http://elasticsearch:9200"
    volumes:
      - .:/usr/share/logstash/pipeline
    links:
      - elasticsearch:elasticsearch
    depends_on:
      - elasticsearch
    command: logstash -f /usr/share/logstash/pipeline/ --config.reload.automatic
  kibana:
    image: docker.elastic.co/kibana/kibana:5.3.0
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_URL: http://elasticsearch:9200
      XPACK_SECURITY_ENABLED: "false"
    links:
      - elasticsearch:elasticsearch
    depends_on:
      - elasticsearch
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.3.0
    ports:
      - "9200:9200"
    environment:
      - "http.host=0.0.0.0"
      - "transport.host=127.0.0.1"
      - "ES_JAVA_OPTS=-Xmx2g -Xms2g"
      - "xpack.security.enabled=false"
  plop:
    image: alpine
    logging:
      driver: gelf
      options:
        gelf-address: udp://localhost:12201
        tag: "staging"
    links:
      - logstash:logstash
    depends_on:
      - logstash
    command: /bin/sh -c "while true; do echo My Message `date`; sleep 1; done;"

logstash.conf

input {
  gelf {
    type => docker
    port => 12201
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
  }
}
@ananthulasrikar

Great, thanks for the help.

@curry684

curry684 commented Sep 3, 2018

    links:
      - logstash:logstash
    depends_on:
      - logstash

This is actually obsolete as links implicitly creates a depends_on relationship and ensures startup/shutdown is done in the correct order.

Nice writeup apart from that bit of nitpicking, thanks!

@muthukumarramu

I am unable to create an index pattern. It shows the error below.

[screenshot]

@barddes

barddes commented May 27, 2021

It seems you don't have an index matching the pattern "logstash-*"; maybe it was created with another name, or it doesn't exist at all.

@muthukumarramu

Can you give me step-by-step instructions to create the index?

@reyostallenberg

I am unable to create an index pattern. It shows the error below.

[screenshot]

You should add logstash.conf in the same directory as docker-compose.yml and set https://gist.github.com/eunomie/e7a183602b8734c47058d277700fdc2d#file-docker-compose-yml-L16 to:

command: logstash -f /usr/share/logstash/pipeline/logstash.conf --config.reload.automatic

@lewislbr

lewislbr commented Aug 5, 2022

gelf-address should be gelf-address: udp://127.0.0.1:12201 (see docker/for-win#12465)
