Split Tornado logs by component
===============================

DIRAC offers the ability to write logs for each component. They can be found in ``<DIRAC FOLDER>/startup/<DIRAC COMPONENT>/log/current``.

In the case of Tornado, logs come from many components and can be hard to sort.

Fluent-bit can collect logs from files, rearrange their content, and then send them elsewhere, for example to an ELK instance or simply to other files.
With ELK, it then becomes possible to monitor and display information through Kibana and Grafana, using filters to sort logs; otherwise one can simply read the split log files, one per component.

The idea is to handle logs independently from DIRAC. It is also possible to collect server metrics such as CPU, memory and disk usage, which makes it possible to correlate logs with server usage.
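
As an illustration, Fluent-bit ships input plugins for such metrics. A minimal sketch of inputs that could later be added to the Fluent-bit configuration described below (the ``metric.*`` tags are only illustrative and must match whatever your outputs are set to catch; the ``disk`` plugin reports disk I/O rather than space usage)::

    [INPUT]
        name cpu
        tag  metric.cpu

    [INPUT]
        name mem
        tag  metric.mem

    [INPUT]
        name disk
        tag  metric.disk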

DIRAC Configuration
-------------------

First of all, you should configure a JSON Log Backend in your ``Resources`` and ``Operations`` like::

    Resources

Fluent-bit Installation
-----------------------

On each DIRAC server, install Fluent-bit (https://docs.fluentbit.io)::

    curl https://raw.githubusercontent.com/fluent/fluent-bit/master/install.sh | sh

Edit ``/etc/fluent-bit/fluent-bit.conf`` and add::

    @INCLUDE dirac-json.conf

Create the following files in ``/etc/fluent-bit``.

dirac-json.conf (Add all needed components and choose the output you want)::

    [SERVICE]
        flush 1
        log_level info

        match metric

``dirac-json.conf`` is the main file; it defines the following steps::

    [SERVICE] where we declare our JSON parser (for the DIRAC JSON log backend)
    [INPUT]   where we describe the DIRAC component log files and the way they will be parsed (JSON)
    [FILTER]  where we apply modifications to the parsed data: for example adding a levelname "DEV" whenever
              logs are not well formatted (typically "print" statements in the code), adding fields such as the
              hostname to know which host the logs come from, or more complex treatments like the dirac.lua
              script (described later)
    [OUTPUT]  where we describe the destination of the formatted logs; here we have stdout, files on disk and
              Elasticsearch
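
To make this concrete, here is a minimal, illustrative skeleton of such a file. It is not the actual ``dirac-json.conf``: the log path, tags, output file path, Lua function name and Elasticsearch host/index are assumptions to adapt to your setup::

    [SERVICE]
        flush        1
        log_level    info
        parsers_file dirac-parsers.conf

    [INPUT]
        # one tail input per DIRAC component log file (path is hypothetical)
        name   tail
        path   /opt/dirac/startup/Tornado_Tornado/log/current
        parser dirac_parser_json
        tag    dirac.tornado

    [FILTER]
        # records without a levelname (e.g. bare "print" output) get "DEV"
        name      modify
        match     dirac.*
        Condition Key_does_not_exist levelname
        Add       levelname DEV

    [FILTER]
        # add the host the logs come from
        name   record_modifier
        match  dirac.*
        record hostname ${HOSTNAME}

    [FILTER]
        # more complex treatments live in the Lua script described later
        # ("rewrite_record" is a hypothetical function name)
        name   lua
        match  dirac.*
        script dirac.lua
        call   rewrite_record

    [OUTPUT]
        name  stdout
        match dirac.*

    [OUTPUT]
        # the file output writes one file per tag, i.e. one per component
        name  file
        match dirac.*
        path  /var/log/dirac

    [OUTPUT]
        name  es
        match dirac.*
        host  elk.example.org
        port  9200
        index dirac-logs

Declaring one ``[INPUT]`` and one tag per component is what makes it possible to match components separately in the outputs, or all at once with a wildcard.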

dirac-parsers.conf::

    [PARSER]
        Name dirac_parser_json
        Format json
        Time_Key asctime
        Time_Format %Y-%m-%d %H:%M:%S,%L
        Time_Keep On

``dirac-parsers.conf`` describes the source format to parse and the time field (here ``asctime``) used as the reference timestamp.
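
For instance, a record such as ``{"asctime": "2023-10-09 14:03:27,123", "levelname": "INFO", ...}`` (the values are only illustrative) gets its timestamp from the ``asctime`` field, matched against ``%Y-%m-%d %H:%M:%S,%L``, while ``Time_Keep On`` keeps the original field in the record.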

dirac.lua::

With a one-week log retention, the Logrotate config file should look like::
        endscript
    }

along with a crontab line like

``0 0 * * * logrotate /etc/logrotate.d/diraclogs``
