Long log lines read by fluentd are split into several documents sent to Elasticsearch

Solution Verified

Issue

  • Can Docker's maximum log line size (16 KB) be made configurable?
  • Why do long lines in container logs get split into multiple lines?
  • The maximum message size appears to be 16 KB, so an 85 KB message is split into 6 separate messages delivered in different chunks.
  • Fluentd is running with its default configuration, and Docker uses the json-file log driver.

Environment

  • Red Hat OpenShift Container Platform (OCP) 3.x
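
The splitting happens in Docker itself: the json-file log driver buffers container output in 16 KB chunks, so any line longer than that is written as several JSON records, and the chunk size is not configurable. A common workaround is to reassemble the partial records in Fluentd before they reach Elasticsearch. The sketch below assumes the fluent-plugin-concat plugin is installed and that container records arrive tagged `docker.**` with the message in the `log` field (both are assumptions that depend on your pipeline); it joins consecutive chunks until one ends with a newline, which marks the end of the original line:

```
# Sketch only: requires fluent-plugin-concat, and assumes the tag
# pattern "docker.**" and the field name "log" used by the json-file
# driver. Adjust both to match your pipeline.
<filter docker.**>
  @type concat
  key log
  # Docker emits the final chunk of a line with a trailing newline;
  # intermediate 16 KB chunks have none, so keep concatenating until
  # a chunk ends in "\n".
  multiline_end_regexp /\n$/
  separator ""
  # Flush an incomplete buffer rather than losing it if no terminating
  # chunk arrives within 5 seconds.
  flush_interval 5
</filter>
```

Newer versions of fluent-plugin-concat also support `use_partial_metadata`, which relies on partial-message markers added by some log drivers instead of the trailing-newline heuristic; whether those markers are present depends on the Docker version and log driver in use.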
