
Memory Leak Detected in Netty 4.1.100.Final #3274

Closed
Ruchika2203 opened this issue Jun 3, 2024 · 13 comments
Assignees
Labels
status/invalid We don't feel this issue is valid

Comments

@Ruchika2203

Ruchika2203 commented Jun 3, 2024

We are experiencing an out-of-memory (OOM) issue in our application, which appears to be caused by a memory leak in Netty. After enabling io.netty.leakDetection.level=paranoid, we observed logs indicating that ByteBuf.release() was not called before the buffer was garbage-collected.
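
For reference, this is how the detection level is turned on; the JVM flag is what we actually use, and the setLevel() call below is only an equivalent programmatic sketch using Netty's public ResourceLeakDetector API:

// JVM flag at startup:
//   -Dio.netty.leakDetection.level=paranoid
// or programmatically, before any ByteBuf is allocated:
io.netty.util.ResourceLeakDetector.setLevel(io.netty.util.ResourceLeakDetector.Level.PARANOID);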

Application Configuration:

Java Version: 17
Netty Version: 4.1.100.Final
Spring Boot Version: 5.3.3
Build Tool: Maven
Application Type: Java / Maven / Spring Boot

Log Details:

2024-05-22 20:00:45:017 [reactor-http-epoll-2] [ERROR] [SourceId=localhost.localdomain|session] [io.netty.util.ResourceLeakDetector:337] : LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
#1:
Hint: [613b1879-146, L:/192.168.185.101:41270 - Buffered ByteBufHolder in the inbound buffer queue
io.netty.handler.codec.http.DefaultHttpContent.touch(DefaultHttpContent.java:86)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:96)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:30)

#2:
Hint: 'reactor.right.reactiveBridge' will handle the message from this point.
io.netty.handler.codec.http.DefaultHttpContent.touch(DefaultHttpContent.java:86)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:96)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:30)

#3:
io.netty.handler.codec.http.DefaultHttpContent.release(DefaultHttpContent.java:92)
io.netty.util.ReferenceCountUtil.release(ReferenceCountUtil.java:90)
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:90)

#4:
io.netty.handler.codec.http.DefaultHttpContent.retain(DefaultHttpContent.java:68)
io.netty.handler.codec.http.DefaultLastHttpContent.retain(DefaultLastHttpContent.java:84)
io.netty.handler.codec.http.DefaultLastHttpContent.retain(DefaultLastHttpContent.java:30)
io.netty.handler.codec.http.HttpContentDecoder.decode(HttpContentDecoder.java:158)

#5:
Hint: 'reactor.left.httpDecompressor' will handle the message from this point.
io.netty.handler.codec.http.DefaultHttpContent.touch(DefaultHttpContent.java:86)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:96)
io.netty.handler.codec.http.DefaultLastHttpContent.touch(DefaultLastHttpContent.java:30)
io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:116)

Additional Information:
We are not modifying any of the Netty handlers in our application. The memory leak appears to be related to ByteBuf objects not being properly released. The issue persists despite our best efforts to ensure proper handling of Netty resources.
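
For context, the ownership pattern described on the linked reference-counting page is roughly the following; this is a sketch of Netty's documented pattern, not code from our application, since we do not implement custom handlers:

import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.ReferenceCountUtil;

public class ConsumingHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf buf = (ByteBuf) msg;
        try {
            // consume the payload here
        } finally {
            // the last handler to touch the message is responsible for releasing it
            ReferenceCountUtil.release(buf);
        }
    }
}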

Please let us know if there is any additional information required or any potential fixes we can apply to mitigate this issue.

Link to Reference Documentation:
https://netty.io/wiki/reference-counted-objects.html

Thank you for your assistance.

@Ruchika2203 Ruchika2203 added status/need-triage A new issue that still need to be evaluated as a whole type/bug A general bug labels Jun 3, 2024
@violetagg
Member

Spring Boot Version: 5.3.3

Please clarify whether this is the Spring Framework version or the Spring Boot version.

Please also provide the Reactor Netty version.

@violetagg violetagg added for/user-attention This issue needs user attention (feedback, rework, etc...) and removed status/need-triage A new issue that still need to be evaluated as a whole labels Jun 3, 2024
@violetagg violetagg self-assigned this Jun 3, 2024
@Ruchika2203
Author

@violetagg
Spring Boot 2.7.17
Reactor Netty 1.0.38

@violetagg
Member

violetagg commented Jun 3, 2024

@Ruchika2203 Please provide code samples and/or a reproducible example.

@Ruchika2203
Author

This happens in our code base and we do not have a reproducible example for it.
It was found with the paranoid leak-detection setting; is there an alternative way to get more details?
Both applications are Spring Boot, and the right-side application connects to a database.

@violetagg
Member

violetagg commented Jun 5, 2024

@Ruchika2203 Check this FAQ, maybe it will help: https://projectreactor.io/docs/netty/release/reference/index.html#faq.memory-leaks

The stack above shows that the incoming data is kept in Reactor Netty, ready to be consumed by the application. I need to see your code in order to understand why your application does not consume the incoming data.
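
As an illustration of what "consuming" means on the WebClient side (a hedged sketch, not taken from the application's code; webClient and someUri are placeholder names): even when the response body is not needed, it still has to be drained, for example with toBodilessEntity() or releaseBody() on the ClientResponse.

// drains the body even though the caller only cares about the status/headers
Mono<ResponseEntity<Void>> statusOnly = webClient.get()
        .uri(someUri)
        .exchangeToMono(response -> response.toBodilessEntity());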

@Ruchika2203
Author

Hi @violetagg,

Here is one of the code snippets; it calls the right-side app, which is the one that actually connects to the database.
We also see the memory leak only in that datastore app.

code.docx

@violetagg
Member

@Ruchika2203 I checked the code but it does not seem to have issues (although there are structures that I don't know about). Try to enable the Reactor Netty/Spring logs and trace what happens with the request/response. You can correlate with the connection id (https://projectreactor.io/docs/netty/release/reference/index.html#faq.logging-prefix).
In your case the connection id is 613b1879-146, from the log line: Hint: [613b1879-146, L:/192.168.185.101:41270 - Buffered ByteBufHolder in the inbound buffer queue
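
A minimal sketch of enabling those logs, assuming the WebClient can be built on top of a customized Reactor Netty HttpClient (the wiretap category and log level are illustrative choices):

import io.netty.handler.logging.LogLevel;
import org.springframework.http.client.reactive.ReactorClientHttpConnector;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.netty.http.client.HttpClient;
import reactor.netty.transport.logging.AdvancedByteBufFormat;

// Wiretap logs every request/response and prefixes each line with the connection id,
// which can then be matched against the id in the LEAK hint (e.g. 613b1879-146).
HttpClient httpClient = HttpClient.create()
        .wiretap("reactor.netty.http.client.HttpClient", LogLevel.DEBUG, AdvancedByteBufFormat.TEXTUAL);

WebClient webClient = WebClient.builder()
        .clientConnector(new ReactorClientHttpConnector(httpClient))
        .build();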

@Ruchika2203
Author

@violetagg Can we schedule a quick Zoom call to discuss this issue?

@violetagg
Member

@Ruchika2203 It is our policy to collaborate asynchronously through GitHub issues when there is a problem that needs to be addressed. While we try to help anyone who raises an issue, it is often difficult to properly troubleshoot without an example. For this reason, our New Issue template asks for code to reproduce the issue, and our contribution guidelines for Reactor do the same.
In order to make progress on the issue we ask that a minimal example be provided, ideally checked into a new GitHub repository. If that is not possible then we will need to close the issue.

@Ruchika2203
Author

Hi @violetagg,
One more piece of information:
we are using exchange() in our code, and we saw a Stack Overflow post mentioning that exchange() has been deprecated due to a potential memory leak.

Stack Overflow link: https://stackoverflow.com/questions/58410352/spring-boot-webclients-retrieve-vs-exchange

Could this be the probable cause of the Netty OOM?

Code snippet:

Mono<ResponseEntity<JsonNode>> responseEntity = webClient.patch()
        .uri(dataStoreUrlWithTTLMutate, version, sessionBucketName, sessionId, timeToLive)
        .bodyValue(data)
        .exchange()
        .flatMap(response -> response.toEntity(JsonNode.class));
PhoenixCustomPerfLogger.logExecutionTime(stopWatch, StringUtils.join(dataStoreUrlWithTTLMutate, "/", version,
        "/", sessionBucketName, "/", sessionId, "/", timeToLive), HttpMethod.PATCH.name());

@violetagg
Member


If you would like us to look at this issue, please provide the requested information. If the information is not provided within the next 7 days this issue will be closed.


Closing due to lack of requested feedback. If you would like us to look at this issue, please provide the requested information and we will re-open.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Jun 30, 2024
@violetagg violetagg added status/invalid We don't feel this issue is valid and removed type/bug A general bug for/user-attention This issue needs user attention (feedback, rework, etc...) status/need-feedback labels Jun 30, 2024