Memory issue and "flushSync took too long" error #2106

tmart-ops opened this issue Dec 12, 2024 · 3 comments
@tmart-ops

Environment: Express app using pino 9.5.0 and pino-http 10.3.0, running in a container in Kubernetes

Issue: After a certain number of hours running, the app, which logs about 20k lines per hour, stops outputting logs. Over the next few hours the pod's memory usage climbs and levels off (~1.3 GB), and then the app outputs a flood of errors until the pod is manually restarted.

The issue sometimes crops up multiple times in a day and sometimes only after a span of days. We run multiple replicas of the app, and only one replica hits the issue at a time.

I haven't found much information on flushSync or memory-related errors similar to this.
In the meantime we'll test whether the issue also happens with synchronous logging.
Any help or suggestions would be appreciated. Thanks.
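
Roughly what we have in mind for the synchronous test (a minimal sketch only; assuming pino.destination with sync: true, the exact options may differ):

import pino from "pino";

// Sketch: fully synchronous logging to stdout (fd 1), no transport worker thread.
const syncLogger = pino({}, pino.destination({ dest: 1, sync: true }));

syncLogger.info("synchronous logging test");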

Transport definition

const transports = [
    {
        target: "pino/file",
        sync: false,
        options: { destination: 1, sync: false }
    }
];

Initial error after memory usage reaches the point where it levels off

Error in pino transport Error: _flushSync took too long (10s)
at flushSync (/usr/src/app/node_modules/thread-stream/index.js:531:13)
at writeSync (/usr/src/app/node_modules/thread-stream/index.js:468:7)
at ThreadStream.write (/usr/src/app/node_modules/thread-stream/index.js:249:9)
at Pino.write (/usr/src/app/node_modules/pino/lib/proto.js:217:10)
at Pino.LOG [as info] (/usr/src/app/node_modules/pino/lib/tools.js:62:21)
at onResFinished (/usr/src/app/node_modules/pino-http/logger.js:129:15)
at ServerResponse.onResponseComplete (/usr/src/app/node_modules/pino-http/logger.js:178:14)
at ServerResponse.emit (node:events:526:35)
at onFinish (node:_http_outgoing:1005:10)
at callback (node:internal/streams/writable:608:21)

Subsequent errors - these spam the logs (about 2k every 5 minutes)

Error in pino transport Error: the worker has exited
at ThreadStream.write (/usr/src/app/node_modules/thread-stream/index.js:238:19)
at Pino.write (/usr/src/app/node_modules/pino/lib/proto.js:217:10)
at Pino.LOG [as info] (/usr/src/app/node_modules/pino/lib/tools.js:71:21)
[The rest of the stack trace varies depending on which method called the logger]
@mcollina
Member

@tmart-ops a few notes:

  1. you don't need to use a transport if you are just writing to stdout; just use pino.destination() (see the sketch after this list).
  2. it seems you are creating the transport with sync: true; can you paste your full initialization code?
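
A minimal sketch of option 1, assuming stdout is file descriptor 1 (adapt the options to your setup):

import pino from "pino";

// Sketch: write straight to stdout via pino.destination(), no worker thread involved.
const logger = pino(
    {},
    pino.destination({ dest: 1, sync: false }) // asynchronous writes to fd 1 (stdout)
);

logger.info("hello from the main thread");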

@tmart-ops
Author

@mcollina
I may have edited the code right after you saw it, but we have tried both the async and sync settings; the issue occurs with both. I'll include the additional init code. The only notable modifications are setting hostname to undefined (to prevent logging the hostname) and a custom log level.

import pino from "pino";

const transports = [
    {
        target: "pino/file",
        sync: false,
        options: { destination: 1, sync: false }
    }
];

const customLevels = { critical: 60 };
const base = { hostname: undefined };

const transportsInstance = pino.transport({ targets: transports });

// Log Pino errors
transportsInstance.on("error", (err: unknown) => {
    console.error("Error in pino transport", err);
});

export const PinoLogger = pino({ customLevels, base }, transportsInstance);

@mcollina
Member

mcollina commented Dec 22, 2024

Can you include a full reproduction? Something to replicate the problem.
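
For example, something along these lines (a sketch only; adapt it to match your real setup):

import pino from "pino";

// Sketch of a standalone reproduction: same transport setup as in the issue,
// logging continuously so the failure can be observed in isolation.
const transport = pino.transport({
    targets: [
        { target: "pino/file", options: { destination: 1, sync: false } }
    ]
});

transport.on("error", (err) => {
    console.error("Error in pino transport", err);
});

const logger = pino({}, transport);

// ~20k logs per hour is roughly one line every 180 ms.
setInterval(() => {
    logger.info({ ts: Date.now() }, "reproduction log line");
}, 180);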
