netty: Netty uses Unsafe.allocateMemory(), causing a native memory leak

Hey,

In our production environment, we found that the memory occupied by the program keeps increasing. Using the top command, we saw that the RES value keeps growing, eventually exceeding -Xmx. Observing with jconsole, the number of threads and loaded classes is stable, and the heap and metaspace sizes are within the normal range. We therefore began to suspect that the program keeps allocating native memory that is never released.

When we analyzed the program's memory allocations with gperftools, we found that Unsafe_AllocateMemory0 accounts for a very large proportion of the allocated memory, and that this share grows over time.
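To cross-check what gperftools reports against Netty's own accounting, a small probe like the one below can be run inside the application. This is only a sketch, assuming the default PooledByteBufAllocator is in use; the class name is ours, and PlatformDependent.usedDirectMemory() may return -1 when Netty is not tracking direct memory itself.

    import io.netty.buffer.PooledByteBufAllocator;
    import io.netty.buffer.PooledByteBufAllocatorMetric;
    import io.netty.util.internal.PlatformDependent;

    public class DirectMemoryProbe {
        public static void main(String[] args) {
            PooledByteBufAllocatorMetric metric = PooledByteBufAllocator.DEFAULT.metric();
            // Direct memory currently reserved by the pooled allocator (whole chunks held in arenas).
            System.out.println("pooled direct bytes: " + metric.usedDirectMemory());
            // Netty's own direct-memory counter; may be -1 if Netty does not track it itself.
            System.out.println("netty direct bytes:  " + PlatformDependent.usedDirectMemory());
            System.out.println("direct arenas:       " + metric.numDirectArenas());
            System.out.println("chunk size:          " + metric.chunkSize());
        }
    }

Logging these values periodically makes it easy to see whether the RES growth tracks the allocator's reserved chunks or comes from somewhere else.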


These are the original PDF files generated by the gperftools analyzer: redisson-demo_1318.pdf, redisson-demo_1450.pdf

Further debugging shows that it is the redisson-netty thread that calls Unsafe.allocateMemory().


Stack trace:

allocateMemory:614, Unsafe (jdk.internal.misc)
<init>:122, DirectByteBuffer (java.nio)
allocateDirect:317, ByteBuffer (java.nio)
allocateDirect:645, PoolArena$DirectArena (io.netty.buffer)
newChunk:621, PoolArena$DirectArena (io.netty.buffer)
allocateNormal:204, PoolArena (io.netty.buffer)
tcacheAllocateSmall:174, PoolArena (io.netty.buffer)
allocate:136, PoolArena (io.netty.buffer)
allocate:128, PoolArena (io.netty.buffer)
newDirectBuffer:378, PooledByteBufAllocator (io.netty.buffer)
directBuffer:187, AbstractByteBufAllocator (io.netty.buffer)
directBuffer:178, AbstractByteBufAllocator (io.netty.buffer)
ioBuffer:131, AbstractByteBufAllocator (io.netty.buffer)
allocateBuffer:140, MessageToByteEncoder (io.netty.handler.codec)
write:105, MessageToByteEncoder (io.netty.handler.codec)
write:76, CommandEncoder (org.redisson.client.handler)
invokeWrite0:717, AbstractChannelHandlerContext (io.netty.channel)
invokeWrite:709, AbstractChannelHandlerContext (io.netty.channel)
write:792, AbstractChannelHandlerContext (io.netty.channel)
write:702, AbstractChannelHandlerContext (io.netty.channel)
write:120, MessageToByteEncoder (io.netty.handler.codec)
write:45, CommandBatchEncoder (org.redisson.client.handler)
invokeWrite0:717, AbstractChannelHandlerContext (io.netty.channel)
invokeWrite:709, AbstractChannelHandlerContext (io.netty.channel)
write:792, AbstractChannelHandlerContext (io.netty.channel)
write:702, AbstractChannelHandlerContext (io.netty.channel)
write:115, ChannelDuplexHandler (io.netty.channel)
write:97, CommandsQueue (org.redisson.client.handler)
invokeWrite0:717, AbstractChannelHandlerContext (io.netty.channel)
invokeWriteAndFlush:764, AbstractChannelHandlerContext (io.netty.channel)
write:790, AbstractChannelHandlerContext (io.netty.channel)
writeAndFlush:758, AbstractChannelHandlerContext (io.netty.channel)
writeAndFlush:1020, DefaultChannelPipeline (io.netty.channel)
writeAndFlush:299, AbstractChannel (io.netty.channel)
sendData:123, CommandsQueue (org.redisson.client.handler)
write:100, CommandsQueue (org.redisson.client.handler)
invokeWrite0:717, AbstractChannelHandlerContext (io.netty.channel)
invokeWriteAndFlush:764, AbstractChannelHandlerContext (io.netty.channel)
write:790, AbstractChannelHandlerContext (io.netty.channel)
writeAndFlush:758, AbstractChannelHandlerContext (io.netty.channel)
writeAndFlush:808, AbstractChannelHandlerContext (io.netty.channel)
writeAndFlush:1025, DefaultChannelPipeline (io.netty.channel)
writeAndFlush:294, AbstractChannel (io.netty.channel)
send:169, RedisConnection (org.redisson.client)
async:215, RedisConnection (org.redisson.client)
async:187, RedisConnection (org.redisson.client)
async:183, RedisConnection (org.redisson.client)
channelActive:89, BaseConnectionHandler (org.redisson.client.handler)
invokeChannelActive:230, AbstractChannelHandlerContext (io.netty.channel)
invokeChannelActive:216, AbstractChannelHandlerContext (io.netty.channel)
fireChannelActive:209, AbstractChannelHandlerContext (io.netty.channel)
channelActive:1398, DefaultChannelPipeline$HeadContext (io.netty.channel)
invokeChannelActive:230, AbstractChannelHandlerContext (io.netty.channel)
invokeChannelActive:216, AbstractChannelHandlerContext (io.netty.channel)
fireChannelActive:895, DefaultChannelPipeline (io.netty.channel)
fulfillConnectPromise:305, AbstractNioChannel$AbstractNioUnsafe (io.netty.channel.nio)
finishConnect:335, AbstractNioChannel$AbstractNioUnsafe (io.netty.channel.nio)
processSelectedKey:707, NioEventLoop (io.netty.channel.nio)
processSelectedKeysOptimized:655, NioEventLoop (io.netty.channel.nio)
processSelectedKeys:581, NioEventLoop (io.netty.channel.nio)
run:493, NioEventLoop (io.netty.channel.nio)
run:989, SingleThreadEventExecutor$4 (io.netty.util.concurrent)
run:74, ThreadExecutorMap$2 (io.netty.util.internal)
run:30, FastThreadLocalRunnable (io.netty.util.concurrent)
run:834, Thread (java.lang)
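The top of the trace is Redisson's CommandEncoder, a MessageToByteEncoder: allocateBuffer() calls ctx.alloc().ioBuffer(), and with the default PooledByteBufAllocator the first allocation that needs a new chunk ends in ByteBuffer.allocateDirect(). For illustration only, here is a hypothetical encoder (not Redisson's) that takes the same path:

    import io.netty.buffer.ByteBuf;
    import io.netty.channel.ChannelHandlerContext;
    import io.netty.handler.codec.MessageToByteEncoder;
    import io.netty.util.CharsetUtil;

    // Hypothetical encoder following the allocation path in the trace above:
    // MessageToByteEncoder.write() -> allocateBuffer() -> ctx.alloc().ioBuffer()
    // -> PooledByteBufAllocator.newDirectBuffer() -> PoolArena -> ByteBuffer.allocateDirect().
    public class DemoEncoder extends MessageToByteEncoder<String> {
        @Override
        protected void encode(ChannelHandlerContext ctx, String msg, ByteBuf out) {
            // "out" is a pooled direct buffer; once released it goes back to the arena,
            // and the backing chunk stays resident instead of being freed to the OS.
            out.writeCharSequence(msg, CharsetUtil.UTF_8);
        }
    }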

After modifying the Redisson configuration so that Redisson no longer connects to Redis, gperftools no longer shows Unsafe_AllocateMemory0, and the memory occupied by the program remains stable and does not grow. We therefore suspect that Netty is causing the program's native memory leak.

Expected behavior

We hope to get your help so that we can use Redisson without the memory growing indefinitely.

Actual behavior

Steps to reproduce

Minimal yet complete reproducer code (or URL to code)

https://github.com/redisson/redisson/files/7531565/redisson_demo.zip

  1. unzip redisson_demo.zip
  2. modify start.sh to set your redis address
  3. execute start.sh to start the jar

Netty version

4.1.48.Final

JVM version (e.g. java -version)

openjdk version "11" 2018-09-25 (Redis 5.0.12, Redisson 3.12.4, Spring Boot 2.4.2)

OS version (e.g. uname -a)

Linux

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Comments: 21 (9 by maintainers)

Most upvoted comments

@caizy-2020 I think the leak analysis is pointing to something that is not a leak at all: Netty uses native (and heap) memory pooling, and that can indeed look like a "leak", because the memory is only released when the application stops. This issue can be closed, I think.
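For reference, a standalone sketch (our own code, assuming the 4.1.48 allocator defaults) that shows the pooling behaviour described above: a single small direct buffer reserves a whole chunk, and releasing the buffer does not return the chunk to the OS.

    import io.netty.buffer.ByteBuf;
    import io.netty.buffer.PooledByteBufAllocator;

    public class PoolingDemo {
        public static void main(String[] args) {
            PooledByteBufAllocator alloc = PooledByteBufAllocator.DEFAULT;
            long before = alloc.metric().usedDirectMemory();

            ByteBuf buf = alloc.directBuffer(64);   // tiny request...
            long during = alloc.metric().usedDirectMemory();
            buf.release();                          // ...returned to the pool, not to the OS
            long after = alloc.metric().usedDirectMemory();

            // "during" typically jumps by a whole chunk (16 MiB with the default page size
            // and maxOrder on 4.1.48), and "after" stays at that level because the chunk
            // is cached in the arena for reuse.
            System.out.printf("before=%d during=%d after=%d%n", before, during, after);
        }
    }

If the resident footprint has to be bounded, the usual knobs are the JVM flag -XX:MaxDirectMemorySize and, set before any Netty class is loaded, the system properties io.netty.maxDirectMemory, io.netty.allocator.numDirectArenas, or io.netty.allocator.type=unpooled; the right values depend on the workload.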