A-MQ raises "java.lang.OutOfMemoryError: Direct buffer memory" on broker startup


Issue

We configure the broker to use the mKahaDB persistence store, but during broker startup it raises the following error to the Karaf shell.
No further errors are written to the container log file.
We use a large number of kahaDB instances within our mKahaDB configuration (i.e. >50); a sketch of such a configuration is shown below.

Exception in thread "ActiveMQ Data File Writer" Exception in thread "ActiveMQ Data File Writer" java.lang.OutOfMemoryError: Direct buffer memory
    at java.nio.Bits.reserveMemory(Bits.java:658)
    at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
    at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:306)
    at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:174)
    at sun.nio.ch.IOUtil.write(IOUtil.java:58)
    at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:205)
    at org.apache.activemq.store.kahadb.disk.journal.Journal.doPreallocationZeros(Journal.java:270)
    at org.apache.activemq.store.kahadb.disk.journal.Journal.preallocateEntireJournalDataFile(Journal.java:242)
    at org.apache.activemq.store.kahadb.disk.journal.DataFileAppender.processQueue(DataFileAppender.java:320)
    at org.apache.activemq.store.kahadb.disk.journal.DataFileAppender$1.run(DataFileAppender.java:193)

Subsequent broker restarts do not show this error, but corruption in various KahaDB instances is reported.
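
For reference, the following is a minimal sketch of the kind of mKahaDB layout described above, as it would appear in the broker's activemq.xml. The directory, destination filters, and journal size are illustrative assumptions rather than the reported configuration; the point is that a catch-all perDestination="true" filter (or many explicit filters) can yield well over 50 KahaDB instances, each of which preallocates its own journal data file at startup, as seen in the stack trace.

<!-- Illustrative mKahaDB configuration; names and sizes are assumptions -->
<persistenceAdapter>
  <mKahaDB directory="${activemq.data}/mkahadb">
    <filteredPersistenceAdapters>
      <!-- a dedicated KahaDB instance for one group of queues -->
      <filteredKahaDB queue="orders.>">
        <persistenceAdapter>
          <kahaDB journalMaxFileLength="32mb"/>
        </persistenceAdapter>
      </filteredKahaDB>
      <!-- catch-all: one KahaDB instance per remaining destination,
           which is how the instance count can exceed 50 -->
      <filteredKahaDB perDestination="true">
        <persistenceAdapter>
          <kahaDB journalMaxFileLength="32mb"/>
        </persistenceAdapter>
      </filteredKahaDB>
    </filteredPersistenceAdapters>
  </mKahaDB>
</persistenceAdapter>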

Environment

  • JBoss A-MQ
    • 6.x
