OutOfDirectMemoryError raised from Netty

Solution In Progress - Updated

Issue

We are getting an OutOfDirectMemoryError raised from Netty, even after setting batch-delay to 0 on the http-connector as described in https://access.redhat.com/solutions/3204251
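The batch-delay change was applied roughly as below in the messaging-activemq subsystem of standalone-full.xml (the connector, socket-binding, and endpoint names here are illustrative assumptions; adjust to the actual configuration):

```xml
<!-- messaging-activemq subsystem: disable write batching on the HTTP connector -->
<!-- connector/endpoint names are illustrative, not taken from this environment -->
<http-connector name="http-connector" socket-binding="http" endpoint="http-acceptor">
    <param name="batch-delay" value="0"/>
</http-connector>
```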

We use the same memory settings as EAP 6 and expect EAP 7 to work with the same settings.
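Note that the EAP 7 embedded broker (ActiveMQ Artemis) sends over Netty, which allocates off-heap buffers bounded by -XX:MaxDirectMemorySize rather than by -Xmx, so heap settings carried over from EAP 6 do not necessarily cover this pool. A minimal sketch for standalone.conf, assuming the 3 GiB limit visible in the log below is to be raised (the 6g value is illustrative only):

```shell
# standalone.conf -- illustrative value; Netty's pooled direct buffers are
# capped by -XX:MaxDirectMemorySize (when unset, the cap roughly tracks -Xmx)
JAVA_OPTS="$JAVA_OPTS -XX:MaxDirectMemorySize=6g"
```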

The latest log shows the same issue, but now triggered by large messages:

2019-11-12T14:23:41.077+00:00@vio-5653-jms-0@JMS@WARN  [org.apache.activemq.artemis.core.remoting.impl.netty.NettyConnection] (Thread-3301 (ActiveMQ-server-org.apache.activemq.artemis.core.server.impl.ActiveMQServerImpl$6@5a221463)) Trying to allocate 102426 bytes, System is throwing OutOfMemoryError on NettyConnection org.apache.activemq.artemis.core.remoting.impl.netty.NettyServerConnection@3f502f5a[ID=142b15a4, local= /x.y.z.50:5445, remote=/x.y.z.7:41430], there are currently pendingWrites: [NETTY] -> 0 causes: failed to allocate 16777216 byte(s) of direct memory (used: 3221225472, max: 3221225472): io.netty.util.internal.OutOfDirectMemoryError: failed to allocate 16777216 byte(s) of direct memory (used: 3221225472, max: 3221225472)
	at io.netty.util.internal.PlatformDependent.incrementMemoryCounter(PlatformDependent.java:656)
	at io.netty.util.internal.PlatformDependent.allocateDirectNoCleaner(PlatformDependent.java:611)
	at io.netty.buffer.PoolArena$DirectArena.allocateDirect(PoolArena.java:768)
	at io.netty.buffer.PoolArena$DirectArena.newChunk(PoolArena.java:744)
	at io.netty.buffer.PoolArena.allocateNormal(PoolArena.java:245)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:227)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:147)
	at io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:327)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:187)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:178)
	at org.apache.activemq.artemis.core.remoting.impl.netty.NettyConnection.createTransportBuffer(NettyConnection.java:253)
	at org.apache.activemq.artemis.spi.core.protocol.AbstractRemotingConnection.createTransportBuffer(AbstractRemotingConnection.java:188)
	at org.apache.activemq.artemis.core.protocol.core.impl.PacketImpl.createPacket(PacketImpl.java:354)
	at org.apache.activemq.artemis.core.protocol.core.impl.PacketImpl.encode(PacketImpl.java:320)
	at org.apache.activemq.artemis.core.protocol.core.impl.ChannelImpl.send(ChannelImpl.java:294)
	at org.apache.activemq.artemis.core.protocol.core.impl.ChannelImpl.send(ChannelImpl.java:238)
	at org.apache.activemq.artemis.core.protocol.core.impl.CoreSessionCallback.sendLargeMessageContinuation(CoreSessionCallback.java:111)
	at org.apache.activemq.artemis.core.server.impl.ServerConsumerImpl$LargeMessageDeliverer.deliver(ServerConsumerImpl.java:1336)
	at org.apache.activemq.artemis.core.server.impl.ServerConsumerImpl$2.run(ServerConsumerImpl.java:1189)
	at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:42)
	at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:31)
	at org.apache.activemq.artemis.utils.actors.ProcessorBase.executePendingTasks(ProcessorBase.java:66)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [rt.jar:1.8.0_221]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [rt.jar:1.8.0_221]
	at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118)

The issue now occurs when large messages are being used, as shown by the frame "org.apache.activemq.artemis.core.protocol.core.impl.CoreSessionCallback.sendLargeMessageContinuation(CoreSessionCallback.java:111)".
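Since the failing allocation happens while the broker streams large-message chunks to a consumer, one mitigation worth testing is tightening client flow control so fewer chunks are buffered in flight. A hedged jboss-cli sketch (the resource path and attribute assume the default messaging-activemq layout; verify them against the running configuration):

```shell
# jboss-cli sketch -- illustrative; confirm the resource path in your setup.
# consumer-window-size=0 disables client-side buffering, so large-message
# chunks are requested one at a time instead of being queued in direct memory.
/subsystem=messaging-activemq/server=default/connection-factory=RemoteConnectionFactory:write-attribute(name=consumer-window-size, value=0)
:reload
```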

Environment

Red Hat JBoss Enterprise Application Platform 7.2.4
