Long log lines read by fluentd are split into several documents sent to Elasticsearch
Issue
- How can Docker's maximum log line size (16KB) be made configurable?
- Why do long lines in my container logs get split into multiple lines?
- The maximum message size appears to be 16KB, so an 85KB message is split into 6 records delivered in separate chunks.
- Fluentd runs with its default configuration, and Docker uses the json-file log driver.
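
A common way to rejoin these split records on the Fluentd side is the fluent-plugin-concat filter: Docker's json-file driver writes each 16KB partial line without a trailing newline, so records can be concatenated until the `log` field ends with `\n`. The sketch below is illustrative and assumes the plugin is installed and that container records are tagged `docker.**`; the tag pattern and field name may differ in your deployment.

```
# Hypothetical filter block, assuming fluent-plugin-concat is available.
# Buffers consecutive records whose "log" field lacks a trailing newline
# (i.e. partial 16KB fragments) and emits one merged record per log line.
<filter docker.**>
  @type concat
  key log                       # field holding the container log text
  multiline_end_regexp /\n$/    # a record ending in newline completes the line
  separator ""                  # join fragments without inserting anything
</filter>
```

Note that this reassembles the line after collection; the 16KB split performed by Docker itself is not configurable through the json-file driver.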
Environment
- Red Hat OpenShift Container Platform (OCP) 3.x