(Microservices) gRPC transport potential memory leak #14296
Could you try downgrading to v10.4.5?
@kamilmysliwiec that is most probably not the case, as when the leak occurred the application had the following dependencies:

```json
"dependencies": {
  "@grpc/grpc-js": "^1.10.3",
  "@grpc/proto-loader": "^0.7.10",
  "@isaacs/ttlcache": "^1.4.1",
  "@nestjs/axios": "^3.0.2",
  "@nestjs/common": "^10.3.4",
  "@nestjs/config": "^3.2.0",
  "@nestjs/core": "^10.3.4",
  "@nestjs/microservices": "^10.3.4",
  "@nestjs/schedule": "^4.0.1",
  "@nestjs/terminus": "^10.2.3",
  "@songkeys/nestjs-redis": "10.0.0",
  "class-transformer": "^0.5.1",
  "class-validator": "^0.14.1",
  "ioredis": "^5.3.2",
  "launchdarkly-node-server-sdk": "^7.0.4",
  "lodash": "^4.17.21",
  "nestjs-pino": "^4.0.0",
  "reflect-metadata": "^0.2.1",
  "rxjs": "^7.8.1"
},
```

My first immediate reaction was to update the dependencies, as they were pretty outdated. If you wish, I can still try downgrading to the suggested version.
Because of the ^ in front of the package version, we can’t be certain about the exact version that was installed without checking the lockfile.
If you can do that, it would be greatly appreciated!
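For example, the resolved versions can be checked with `npm ls @nestjs/microservices @grpc/grpc-js` or by looking up the package entries in `package-lock.json` / `yarn.lock`.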
The leak is present with the following dependencies:

```json
"dependencies": {
  "@grpc/grpc-js": "^1.12.4",
  "@grpc/proto-loader": "^0.7.13",
  "@isaacs/ttlcache": "^1.4.1",
  "@nestjs/axios": "^3.1.3",
  "@nestjs/common": "10.4.5",
  "@nestjs/config": "^3.3.0",
  "@nestjs/core": "10.4.5",
  "@nestjs/microservices": "10.4.5",
  "@nestjs/schedule": "^4.1.1",
  "@nestjs/terminus": "^10.2.3",
  "@songkeys/nestjs-redis": "10.0.0",
  "class-transformer": "^0.5.1",
  "class-validator": "^0.14.1",
  "ioredis": "^5.4.1",
  "launchdarkly-node-server-sdk": "^7.0.4",
  "lodash": "^4.17.21",
  "nestjs-pino": "^4.1.0",
  "reflect-metadata": "^0.2.2",
  "rxjs": "^7.8.1"
},
```
Thanks @malsatin. That confirms that this issue is very likely unrelated to NestJS itself. Would you like to create an issue in the grpc repository?
I would like to see some community confirmation of the issue, as I'm not entirely sure that the problem is not in our setup.
Have you tried asking on our Discord channel? Send a new post in the #🐈 nestjs-help forum, and make sure to include a link to this issue so you don't need to write it all again.
Is there an existing issue for this?
Current behavior
We are using Amazon EKS to run our services. In October we updated the cluster from version 1.30 to 1.31 and then noticed that our NestJS applications with gRPC health checks started leaking memory and being killed by the OOM killer.
The application code wasn't changed and no rebuilds were made, so it is clearly some change in the Kubernetes gRPC probes that is not handled well by nestjs / grpc-node.
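For context: Kubernetes' built-in gRPC probes (stable since 1.27) call the standard `grpc.health.v1.Health/Check` RPC against the configured port on every probe period, so the kubelet continuously opens gRPC traffic to the service.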
Minimum reproduction code
https://stackblitz.com/edit/nestjs-typescript-starter-fuk8m3pw?file=src%2Fmain.ts
Steps to reproduce
Expected behavior
Memory consumption remains flat, as it was before.
Package
@nestjs/common
@nestjs/core
@nestjs/microservices
@nestjs/platform-express
@nestjs/platform-fastify
@nestjs/platform-socket.io
@nestjs/platform-ws
@nestjs/testing
@nestjs/websockets
Other package
@grpc/grpc-js
NestJS version
10.4.13
Packages versions
Node.js version
22.12.0-alpine3.20
In which operating systems have you tested?
Other
The only related issue I managed to find: grpc/grpc-node#2629.
We use `terminus`/`redis` probes inside the healthcheck endpoint. I tried to make the endpoint always return `ServingStatus.SERVING` - the leak was still present, so the issue is not in the `terminus` module.
I have tried to change the runtime from Node.js to Bun (`oven/bun:1.1.38-alpine`) - the leak was gone without any changes in the application code.
I have tried to use the latest Node.js version (`node:23.3.0-alpine3.20`) - the leak was still present.
Then I tried to add `channelOptions` to the microservice `options` parameter (see the sketch below) - and the leak was gone!
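A minimal sketch of where such `channelOptions` are passed, assuming illustrative values: `grpc.max_connection_age_ms` and `grpc.max_connection_idle_ms` are shown only as commonly tuned `@grpc/grpc-js` channel arguments, and the package name and proto path are placeholders, not the exact configuration used in this report.

```typescript
// main.ts - hypothetical sketch, not the reporter's exact configuration
import { NestFactory } from '@nestjs/core';
import { MicroserviceOptions, Transport } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
    transport: Transport.GRPC,
    options: {
      package: 'health',               // assumed package name
      protoPath: 'proto/health.proto', // assumed proto path
      url: '0.0.0.0:50051',
      // channelOptions is forwarded as-is to the @grpc/grpc-js server.
      // The values below are illustrative, not the ones from this issue.
      channelOptions: {
        'grpc.max_connection_age_ms': 30_000,  // recycle connections after 30s
        'grpc.max_connection_idle_ms': 10_000, // drop idle connections after 10s
      },
    },
  });
  await app.listen();
}
bootstrap();
```

Which arguments actually matter depends on what is keeping connections alive; the snippet only shows where the NestJS gRPC transport accepts `channelOptions` and hands them to `@grpc/grpc-js`.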
After some playtesting with the config I have discovered that:
1. … by itself reduces the leak, but does not eliminate it.
2. … by itself also reduces the leak, but does not eliminate it.
3. … both of them together almost stop the leak.
4. … by itself almost stops the leak.
5. … both parameters together do not affect the leak (it is still present).
I understand that the bug is most probably not in NestJS/Microservices or gRPC-Node, but I think this caveat is at least worth mentioning in the NestJS/Microservices documentation.