Large request payload crashing API #736
Comments
Hi @seeARMS, are you still running into this issue? If the gRPC options didn't work for you, could you also share the parameters you set?
Where do I set those gRPC options? As I mentioned in the OP, I'm using the logging-bunyan Express middleware, and the gRPC options don't get passed down.
I'm also facing this issue with our services: we see outages on our production servers due to accidental large log entries (which is very annoying :( ). @seeARMS, did you find a way to pass those options on to the Express middleware?
Also @cindy-peng, would you like a repro of the issue in a repository somewhere for better inspection?
Hi @pedroyan, are you also using the Express middleware? If so, do you mind sharing your latest repro? Thank you!
Hi everyone, I encountered the same issue: large log entries cause the server to crash by exceeding the request payload size limit. Specifically, I received the following error:
I managed to resolve it by following these steps:
import { Writable } from 'stream';

export default class SizeLimitedStream extends Writable {
  private buffer: string[] = [];
  private bufferedBytes = 0;
  private sizeLimit = 9000000; // 9MB, kept under the API's request payload limit
  private bunyanStream: NodeJS.WritableStream;

  constructor(bunyanStream: NodeJS.WritableStream) {
    super({ objectMode: true });
    this.bunyanStream = bunyanStream;
  }

  _write(chunk: string, _encoding: BufferEncoding, callback: (error?: Error | null) => void) {
    // Drop any single entry that would exceed the limit on its own.
    if (chunk.length >= this.sizeLimit) {
      callback();
      return;
    }
    // Check the size limit before pushing, so a flush never exceeds it.
    if (this.bufferedBytes + chunk.length >= this.sizeLimit) {
      this.flushBuffer();
    }
    this.buffer.push(chunk);
    this.bufferedBytes += chunk.length;
    callback();
  }

  flushBuffer() {
    if (this.buffer.length === 0) return;
    this.bunyanStream.write(this.buffer.join(''));
    this.buffer = [];
    this.bufferedBytes = 0;
  }
}
import { LoggingBunyan } from '@google-cloud/logging-bunyan';
import SizeLimitedStream from './size-limited-stream';
import Bunyan = require('bunyan');
// .... some code
const loggingBunyan = new LoggingBunyan();
const { stream, level, type } = loggingBunyan.stream('info');
const sizeLimitedStream = new SizeLimitedStream(stream);
const logger = Bunyan.createLogger({
name: 'some-log',
streams: [
{ stream: sizeLimitedStream, level, type },
],
});
logger.info({}, 'something');
// .... some code

Of course, this can and should be improved 😄, but it worked for me, and I hope it works for you too!
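The guard idea in `SizeLimitedStream` can be exercised without any GCP dependency. The sketch below is my own minimal version using only Node's built-in `stream` module; the `MemorySink` class and the 50-byte limit are made up purely for the demo. It buffers writes, flushes before the accumulated size would exceed the limit, and drops any single entry that exceeds the limit on its own:

```typescript
import { Writable } from 'stream';

// Toy sink standing in for the real LoggingBunyan stream (hypothetical).
class MemorySink extends Writable {
  public received: string[] = [];
  _write(chunk: any, _enc: BufferEncoding, cb: (e?: Error | null) => void) {
    this.received.push(String(chunk));
    cb();
  }
}

// Minimal size guard: buffer entries, flush before the limit is exceeded,
// and silently drop any single entry that is itself over the limit.
class SizeGuard extends Writable {
  private buffer: string[] = [];
  private bytes = 0;
  constructor(private sink: Writable, private limit: number) {
    super({ objectMode: true });
  }
  _write(chunk: string, _enc: BufferEncoding, cb: (e?: Error | null) => void) {
    if (chunk.length >= this.limit) { cb(); return; } // oversized: drop
    if (this.bytes + chunk.length >= this.limit) this.flush();
    this.buffer.push(chunk);
    this.bytes += chunk.length;
    cb();
  }
  flush() {
    if (this.buffer.length === 0) return;
    this.sink.write(this.buffer.join(''));
    this.buffer = [];
    this.bytes = 0;
  }
}

const sink = new MemorySink();
const guard = new SizeGuard(sink, 50);
guard.write('a'.repeat(100)); // dropped: exceeds the limit by itself
guard.write('b'.repeat(30));  // buffered
guard.write('c'.repeat(30));  // 30 + 30 >= 50, so the b's flush first
guard.flush();                // push the remaining c's
console.log(sink.received.map((s) => s.length)); // [30, 30]
```

The key property is that no single write to the sink can ever exceed the limit, which is exactly what the payload-size error is about.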
Hey @cindy-peng, apologies for the delay here. Things have been hectic on my side, but I will whip up a repro for you today. @hizaguirre-sp Nice solution! How can I inject this size-limited stream into the GCP logging middleware?

import { Express } from 'express'
import * as gcpLogging from '@google-cloud/logging-bunyan'
const useGcpLogger = async (app: Express) => {
const { mw, logger } = await gcpLogging.express.middleware({
logName: 'logName',
level: 'debug',
})
app.use(mw)
// How can I attach the stream to this logger? The middleware is the code that creates it
// logger.??
}
Ok, so I've hacked together this repro code using a setup similar to the one my team uses in production, but to my surprise, the repro code works... while the same "explosion" logic on our production API takes down the server. My first hypothesis was that one of the dependencies of the logging library was outdated. However, even after updating all direct and indirect dependencies, the production crash persisted. Here is a link to my repro repository in case someone else is also facing this problem and wants to compare their setup against a working (hacked-together) one.
I am having the exact same issue described here: #14
Throughout the codebase, occasionally (and accidentally) I log very large entries and hit an exception which crashes the server:
I am using the logging-bunyan Express middleware as follows:
It doesn't look like I'm able to pass arguments to this middleware as per the solution described here: googleapis/gax-nodejs#157 (comment)
maxEntrySize also doesn't seem to truncate properly.
Ideally, I don't care as much about truncating; I just want to properly handle the exception and not crash the server whenever I inadvertently attempt to log a massive entry.
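Until the library handles this, one application-level workaround is to cap the serialized size of a payload before handing it to the logger. The helper below is my own illustrative sketch, not part of any library: the name `truncateLogEntry` and the 256KB default cap are made up. It measures the JSON-serialized entry and swaps in a short stub when it's too large, so an accidental huge object can't blow past the transport's request limit:

```typescript
// Hypothetical helper: cap the serialized size of a log payload.
function truncateLogEntry(payload: unknown, maxBytes = 262144): unknown {
  const serialized = JSON.stringify(payload) ?? '';
  if (Buffer.byteLength(serialized, 'utf8') <= maxBytes) {
    return payload; // small enough: pass through untouched
  }
  // Too large: log a truncated preview instead of crashing the transport.
  return {
    truncated: true,
    originalBytes: Buffer.byteLength(serialized, 'utf8'),
    preview: serialized.slice(0, 1024),
  };
}

const small = truncateLogEntry({ msg: 'ok' });
const big = truncateLogEntry({ blob: 'x'.repeat(500000) }) as { truncated: boolean };
console.log(big.truncated); // the oversized payload was replaced by a stub
```

Calling this at every log site is tedious, so in practice it would go in a thin wrapper around the logger, e.g. `log.info(truncateLogEntry(obj), msg)`.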
Environment details
@google-cloud/logging-bunyan version: 4.2.2