Long log lines read by Fluentd are split into several documents sent to Elasticsearch

Issue

  • How do I make Docker's maximum log line size (16 KB) configurable?
  • Why do long lines in my container logs get split into multiple lines?
  • The maximum message size appears to be 16 KB, so a single 85 KB message ends up as six messages in different chunks (see the sketch after this list).
  • Fluentd is configured with the default configuration (Docker json-file log driver).
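
The 16 KB cap is hard-coded in Docker's log copier and is not configurable through the json-file driver options, so the usual workaround is to reassemble the fragments in Fluentd before they reach Elasticsearch. Below is a minimal sketch using the fluent-plugin-concat filter plugin (not part of the default configuration). It relies on the fact that the json-file driver terminates only the final fragment of a split line with a newline; the docker.** tag pattern and the log record key are assumptions that must match your own source configuration.

    # Requires the fluent-plugin-concat filter plugin:
    #   fluent-gem install fluent-plugin-concat
    # Assumption: events are tagged docker.** and carry the raw line
    # in the "log" field, as produced by the json-file log driver.
    <filter docker.**>
      @type concat
      key log
      # Docker appends "\n" only to the last fragment of a split line,
      # so a trailing newline marks the end of one logical record.
      multiline_end_regexp /\n$/
      # Join fragments without inserting anything between them.
      separator ""
      # Optional: flush a buffered fragment after 5 seconds if the
      # terminating fragment never arrives.
      flush_interval 5
    </filter>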

Environment

  • Red Hat OpenShift Container Platform
    • 3.X
  • json-file logging driver
  • Fluentd, Elasticsearch stack
