pino-fluentd

Send pino logs to Fluentd. This plugin is heavily inspired by pino-elasticsearch.

Under the hood, the official fluent-logger-node module is used.
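
For context, here is a rough sketch of what forwarding a record with fluent-logger-node looks like; the tag prefix, record fields and options below are illustrative and not the plugin's actual internals:

// Rough sketch of forwarding a record with fluent-logger-node.
const logger = require('fluent-logger')

// Point the client at the Fluentd forward input (defaults shown explicitly).
logger.configure('pino', { host: '127.0.0.1', port: 24224, timeout: 3.0 })

// Emit a record; the resulting tag is "pino.log", which is what the
// <match pino.*> section later in this README would match.
logger.emit('log', { level: 30, msg: 'hello world' })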

What is Fluentd?

Fluentd is an open-source data collector for a unified logging layer. It allows you to unify data collection and consumption for better use and understanding of data.

Install

If you want to use pino-fluentd, you must first install it globally on your machine:

npm install pino-fluentd -g

Usage

  pino-fluentd

  To send pino logs to Fluentd:

  cat log | pino-fluentd --tag debug --trace-level info

  Flags
  -h   | --help                  Display Help
  -v   | --version               Display Version
  -H   | --host                  the IP address of Fluentd; default: 127.0.0.1
  -p   | --port                  the port of Fluentd; default: 24224
  -t   | --tag                   the name of the tag to use; default: pino
  -k   | --key                   the name of the key to use; default: log
  -T   | --timeout               set the socket to timeout after the given milliseconds of inactivity
  -ri  | --reconnect-interval    the reconnect interval in milliseconds
  -fi  | --flush-interval        the flush interval in milliseconds
  -l   | --trace-level           trace level for the Fluentd client; default: 'error' (trace, debug, info, warn, error, fatal)
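
As a minimal end-to-end sketch, any pino application that writes to stdout can be piped into pino-fluentd. The file name app.js and the log messages below are just examples:

// app.js
const pino = require('pino')
const logger = pino()

logger.info('hello from pino')
logger.error({ reason: 'something broke' }, 'an error happened')

// Run it with (assuming pino-fluentd is installed globally):
//   node app.js | pino-fluentd --tag pino --key log --host 127.0.0.1 --port 24224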

You can then use Elasticsearch and Kibana to browse and visualize your logs, or forward them to any other output supported by Fluentd. A full list of Fluentd data outputs is available on the Fluentd website.

Setup and testing

Setting up pino-fluentd is easy, and you can use the bundled docker-compose-*.yml files to bring up Fluentd with or without Elasticsearch and Kibana containers.

You will need docker and docker-compose; then, in this project folder, launch docker-compose -f docker-compose-*.yml up.

You can test it by launching node example | pino-fluentd in this project folder. You will need pino-fluentd installed globally.

Forward to Elasticsearch

Fluentd can be configured to forward your logs to Elasticsearch (and, for example, search them with Kibana). In order to use Fluentd with Elasticsearch, you need to install the fluent-plugin-elasticsearch plugin on your Fluentd instance:

gem install fluent-plugin-elasticsearch

In your Fluentd configuration file, use @type elasticsearch. Additional configuration is optional; a basic setup looks like this:

# File fluent.conf
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<match pino.*>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    flush_interval 10s
  </store>
  <store>
    @type stdout
  </store>
</match>

Once Fluentd receives logs from pino-fluentd and flushes them to Elasticsearch (every 10 seconds in this example), you can view, search and visualize the log data using Kibana.
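
If you prefer to check from the command line that the documents actually reached Elasticsearch, here is a quick sketch. It assumes Node 18+ (for the global fetch) and the default fluentd index name used by fluent-plugin-elasticsearch when logstash_format is not enabled; adjust the index name and query field to your setup:

// check-es.js - hypothetical sanity check that log documents reached Elasticsearch
fetch('http://localhost:9200/fluentd/_search?q=msg:hello')
  .then((res) => res.json())
  .then((body) => console.log(JSON.stringify(body.hits, null, 2)))
  .catch((err) => console.error('query failed:', err))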

License

Licensed under MIT