Fluent Bit Misinterpreting an Incoming JSON Event as a String #5886
Unanswered
srinathjamboju92 asked this question in Q&A
Replies: 0 comments
I have Fluent Bit running on my AWS EKS cluster; the EKS worker nodes use dockerd, which writes container logs to /var/log/containers/*.log.
I'm running Fluent Bit 1.8.14 as a DaemonSet and sending the logs to Elasticsearch 8.1.0-10. Fluent Bit receives the incoming event as a JSON object, but it mangles the log format, converting the whole log into a string, and Elasticsearch rejects it. I do not understand why Fluent Bit parses JSON into a string in the first place. Below is the Fluent Bit configuration:
[SERVICE]
Daemon Off
Flush 1
Log_Level debug
Parsers_File parsers.conf
Parsers_File custom_parsers.conf
HTTP_Server On
HTTP_Listen 0.0.0.0
HTTP_Port 2020
Health_Check On
[INPUT]
Name tail
Path /var/log/containers/*.log
Tag kube.*
Parser docker_no_time
Skip_Long_Lines Off
[FILTER]
Name kubernetes
Match kube.*
Merge_Log On
Labels Off
Annotations Off
K8S-Logging.Parser Off
K8S-Logging.Exclude Off
[OUTPUT]
Name es
Match kube.*
Host ${FLUENT_ELASTICSEARCH_HOST}
Port ${FLUENT_ELASTICSEARCH_PORT}
HTTP_User ${FLUENT_ELASTICSEARCH_USERNAME}
HTTP_Passwd ${FLUENT_ELASTICSEARCH_PASSWORD}
Logstash_Format On
Trace_Error On
net.keepalive on
net.keepalive_idle_timeout 60
Retry_Limit False
Replace_Dots Off
tls On
Generate_ID On
Suppress_Type_Name On
[OUTPUT]
Name stdout
Match kube.*
custom_parsers.conf: |
[PARSER]
Name docker_no_time
Format json
Time_Key time
Time_Format %Y-%m-%dT%H:%M:%S.%L
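My understanding of what the docker_no_time parser does to one raw line from /var/log/containers, sketched in Python (a simplified illustration, not Fluent Bit's actual implementation): the dockerd JSON envelope is decoded and time is extracted as the timestamp, but the value of the log field stays a plain string.

```python
import json
from datetime import datetime

def parse_docker_line(raw_line):
    """Simplified stand-in for Fluent Bit's json parser with
    Time_Key=time: decode the dockerd JSON envelope and pull
    the timestamp out of the record."""
    record = json.loads(raw_line)
    # Time_Key time / Time_Format %Y-%m-%dT%H:%M:%S.%L
    # (truncate nanoseconds to microseconds for strptime's %f)
    ts = datetime.strptime(record.pop("time")[:26], "%Y-%m-%dT%H:%M:%S.%f")
    return ts, record

raw = ('{"log": "[2022/08/12 01:26:45] [ warn] failed to flush chunk\\n", '
       '"stream": "stderr", "time": "2022-08-12T01:26:45.691360715Z"}')
ts, record = parse_docker_line(raw)
# The envelope IS parsed as JSON, but record["log"] is still a string.
print(type(record["log"]).__name__)  # str
```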
Sample incoming JSON event:
{ "log": "[2022/08/12 01:26:45] [ warn] [engine] failed to flush chunk '1-1659980212.689915400.flb', retry in 755 seconds: task_id=1470, input=tail.0 \u003e output=es.0 (out_id=0) smarsh\n", "stream": "stderr", "time": "2022-08-12T01:26:45.691360715Z" }
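For what it's worth, my understanding of Merge_Log (sketched below in Python purely as an illustration, not the actual filter code) is that it only lifts the contents of log into the record when that string is itself valid JSON; otherwise log stays a concrete string value. That would explain why Elasticsearch sees [log] sometimes as an object and sometimes as a string:

```python
import json

def merge_log(record):
    """Rough approximation of the kubernetes filter's Merge_Log:
    if the "log" value is a string holding valid JSON, merge its
    keys into the record; otherwise leave the record untouched."""
    try:
        inner = json.loads(record["log"])
    except (json.JSONDecodeError, TypeError, KeyError):
        return record
    if not isinstance(inner, dict):
        return record
    merged = dict(record)   # Keep_Log defaults to On, so "log" is retained
    merged.update(inner)
    return merged

plain = {"log": "[2022/08/12 01:26:45] [ warn] failed to flush chunk\n",
         "stream": "stderr"}
jsonl = {"log": '{"level": "warn", "msg": "failed to flush chunk"}',
         "stream": "stderr"}
print(merge_log(plain) == plain)     # True  -> "log" stays a plain string
print("level" in merge_log(jsonl))   # True  -> inner JSON was merged
```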
And the error received when trying to flush the log to Elasticsearch:
{"took":1,"errors":true,"items":[{"create":{"_index":"logstash-2022.08.15","_id":"feb19628-ba0e-8fab-63c6-734e8a82de79","status":400,"error":{"type":"mapper_parsing_exception","reason":"object mapping for [log] tried to parse field [log] as object, but found a concrete value"}}}]}
With Trace_Error enabled and Log_Level set to debug, I see:
{"create":{"_index":"logstash-2022.08.15","_id":"06f274e6-1758-21f7-d14e-74389ce44f6d"}}
{"@timestamp":"2022-08-15T07:40:35.826Z","log":"log:[2022/08/12 01:26:45] [ warn] [engine] failed to flush chunk '1-1659980212.689915400.flb', retry in 755 seconds: task_id=1470, input=tail.0 \u003e output=es.0 (out_id=0)\n stream:stderr time:2022-08-12T01:26:45.691360715Z","kubernetes":{"pod_name":"fluent-bit-latest-1.8.14-debug","namespace_name":"fluent-bit","pod_id":"49332c95-e767-4639-9fed-7d80fbeb2a23","host":"ip-10-31-34-206.us-west-2.compute.internal","container_name":"fluent-bit","docker_id":"a9a6712a87393f30e0b8f597745ce604cfe1a4e8a9cb87236643e823c19e46dc","container_hash":"fluentbit-remote.artifacts.internalrepo.com/fluent/fluent-bit@sha256:29576c59fbb3d767d52665cbcca7f9220c6615fae7139a7a3faa07d7f18dde61","container_image":"fluentbit-remote.artifacts.internalrepo.com/fluent/fluent-bit:1.8.14-debug"}}
[2022/08/15 07:41:16] [error] [output:es:es.0] error: Output
{"took":1,"errors":true,"items":[{"create":{"_index":"logstash-2022.08.15","_id":"06f274e6-1758-21f7-d14e-74389ce44f6d","status":400,"error":{"type":"mapper_parsing_exception","reason":"object mapping for [log] tried to parse field [log] as object, but found a concrete value"}}}]}
When the incoming event itself is a JSON object, why would Fluent Bit convert the whole object into a string?
I also tried sending the logs to Elasticsearch without any parser at all, but the whole log is still processed as a string, not as a JSON object.
I do not know how to fix this, the documentation is not clear, and I have received no response to several posts on Slack, so I am reaching out here for help.