
Large request payload crashing API #736

Open
seeARMS opened this issue Jan 24, 2024 · 8 comments
Labels: api: logging, priority: p3, type: bug

Comments


seeARMS commented Jan 24, 2024

I am having the exact same issue described here: #14

Throughout the codebase I occasionally (and accidentally) log very large entries, and I'm hitting an exception that crashes the server:

Error: 3 INVALID_ARGUMENT: Request payload size exceeds the limit: 10485760 bytes.
  File "/workspace/node_modules/@grpc/grpc-js/build/src/call.js", line 31, col 19, in callErrorFromStatus
    const error = new Error(message);
  File "/workspace/node_modules/@grpc/grpc-js/build/src/client.js", line 192, col 76, in Object.onReceiveStatus
    callProperties.callback((0, call_1.callErrorFromStatus)(status, /*callerStack*/ ''));
  File "/workspace/node_modules/@grpc/grpc-js/build/src/client-interceptors.js", line 360, col 141, in Object.onReceiveStatus
    '{snip} istener === null || listener === void 0 ? void 0 : listener.onReceiveStatus) === null || _b === void 0 ? void 0 : _b.call(listener, status);
  File "/workspace/node_modules/@grpc/grpc-js/build/src/client-interceptors.js", line 323, col 181, in Object.onReceiveStatus
    '{snip} ener === void 0 ? void 0 : interceptingListener.onReceiveStatus) === null || _b === void 0 ? void 0 : _b.call(interceptingListener, status);
  File "/workspace/node_modules/@grpc/grpc-js/build/src/resolving-call.js", line 94, col 78, in <anonymous>
    (_a = this.listener) === null || _a === void 0 ? void 0 : _a.onReceiveStatus(filteredStatus);
  File "node:internal/process/task_queues", line 77, col 11, in process.processTicksAndRejections

I am using the logging-bunyan Express middleware as follows:

  const { logger: gcpLogger, mw } = await lb.express.middleware({
    logName: "api-" + process.env.FLY_APP_NAME,
    level: "debug",
    defaultCallback: (err: any) => {
      try {
        if (err) {
          console.error("Error occured with logger.", err)
          Sentry.captureMessage(err, { level: "error" })
        }
      } catch (_err) {
        console.error("Error occured with logger in try/catch blog.", _err)
        Sentry.captureException(_err)
      }
      return true
    },

    // GCP should truncate the message down to 200kb.
    // See https://github.com/googleapis/nodejs-logging/pull/607.
    maxEntrySize: 200000,
  })

It doesn't look like I'm able to pass in the arguments to this middleware as per the solution described here: googleapis/gax-nodejs#157 (comment)

maxEntrySize also doesn't seem to properly truncate.

Ideally, I don't care as much about truncating; I just want to properly handle the exception and not crash the server whenever I inadvertently attempt to log a massive entry.
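
As a stopgap on my side, I'm considering clamping anything I hand to the logger before it ever reaches the transport. This is an untested sketch, not something from this library: clampForLogging, MAX_LOG_CHARS and the 100 kB cap are all placeholder names and values of my own.

// Cap the serialized size of a payload so a single entry can never approach
// the 10 MB gRPC request limit. Length is counted in characters, which is
// close enough to bytes for this purpose.
const MAX_LOG_CHARS = 100_000;

function clampForLogging(payload: unknown): unknown {
  try {
    const serialized = JSON.stringify(payload);
    if (serialized === undefined || serialized.length <= MAX_LOG_CHARS) {
      return payload;
    }
    return {
      truncated: true,
      originalLength: serialized.length,
      preview: serialized.slice(0, MAX_LOG_CHARS),
    };
  } catch {
    // Circular or otherwise unserializable values: log a marker instead of throwing.
    return { truncated: true, reason: "unserializable payload" };
  }
}

// Usage with the middleware's logger from above:
// gcpLogger.info(clampForLogging(hugeObject), "incoming request")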

Environment details

  • OS: Linux
  • Node.js version: 18.18.2
  • npm version: 10.3.0
  • @google-cloud/logging-bunyan version: 4.2.2
seeARMS added the priority: p2 and type: bug labels on Jan 24, 2024
product-auto-label bot added the api: logging label on Jan 24, 2024
cindy-peng (Contributor) commented

Hi @seeARMS, are you still running into this issue? If the GRPC options didn't work for you, could you also share the parameters you set?


seeARMS commented Apr 15, 2024

> Hi @seeARMS, are you still running into this issue? If the GRPC options didn't work for you, could you also share the parameters you set?

Where do I set those gRPC options? As I mentioned in the OP, I'm using the logging-bunyan Express middleware, and the gRPC options don't get passed down to it.


pedroyan commented Jun 5, 2024

I'm also facing this issue with our services, where we see outages on our production servers due to accidental large log entries (which is very annoying :( ). @seeARMS, did you find a way to pass those options on to the Express middleware?


pedroyan commented Jun 5, 2024

Also @cindy-peng - would you like a repro of the issue in a repository somewhere for better inspection?

cindy-peng added the priority: p3 label and removed the priority: p2 label on Jul 31, 2024
cindy-peng (Contributor) commented

Hi @pedroyan, are you also using the Express middleware? If so, do you mind also sharing your latest repro? Thank you!

hizaguirre-sp commented

Hi Everyone,

I encountered the same issue where large log entries cause the server to crash due to exceeding the request payload size limit. Specifically, I received the following error:

Error: 3 INVALID_ARGUMENT: Request payload size exceeds the limit: 10485760 bytes.

I managed to resolve it by following these steps:

  1. Create a custom class that inherits from Writable: This class enforces a stream size limit, in this case, 9MB.
import { Writable } from 'stream';

export default class SizeLimitedStream extends Writable {
  private buffer: string[] = [];
  private bufferedSize = 0;
  private sizeLimit = 9000000; // 9 MB, below the 10 MB gRPC request limit
  private bunyanStream: NodeJS.WritableStream;

  constructor(bunyanStream: NodeJS.WritableStream) {
    super({ objectMode: true });
    this.bunyanStream = bunyanStream;
  }

  _write(chunk: string, encoding: string, callback: (error?: Error | null) => void) {
    // TODO: check size limit before pushing to buffer (one possible sketch at the end of this comment)
    this.buffer.push(chunk);
    this.bufferedSize += chunk.length;
    if (this.bufferedSize >= this.sizeLimit) {
      this.flushBuffer();
    }
    callback();
  }

  flushBuffer() {
    // Forward everything buffered so far to the underlying stream in a single write.
    this.bunyanStream.write(this.buffer.join(''));
    this.buffer = [];
    this.bufferedSize = 0;
  }
}
  2. Use the class to customize and limit the LoggingBunyan stream:
import { LoggingBunyan } from '@google-cloud/logging-bunyan';
import SizeLimitedStream from './size-limited-stream';
import Bunyan = require('bunyan');

// ... some code

const loggingBunyan = new LoggingBunyan();

const { stream, level, type } = loggingBunyan.stream('info');

const sizeLimitedStream = new SizeLimitedStream(stream);

const logger = Bunyan.createLogger({
  name: 'some-log',
  streams: [
    { stream: sizeLimitedStream, level, type },
  ],
});

logger.info({}, 'something');

// ... some code

Of course, this can and should be improved 😄, but it worked for me, and I hope it works for you too!
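
For the TODO in the _write method above, one possible (untested) way to fill it in is to refuse any single chunk that is already over the limit on its own, and to flush before appending a chunk that would push the buffer past the limit. Truncating the oversized chunk is just my preference here; dropping it entirely would work too:

  _write(chunk: string, encoding: string, callback: (error?: Error | null) => void) {
    // A single chunk that already exceeds the limit: truncate it and forward it directly.
    if (chunk.length >= this.sizeLimit) {
      this.bunyanStream.write(chunk.slice(0, this.sizeLimit));
      callback();
      return;
    }
    // Flush first if appending this chunk would push the buffer over the limit.
    if (this.bufferedSize + chunk.length >= this.sizeLimit) {
      this.flushBuffer();
    }
    this.buffer.push(chunk);
    this.bufferedSize += chunk.length;
    callback();
  }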

pedroyan commented

Hey @cindy-peng - apologies for the delay here. Things have been hectic on my side, but I will whip up a repro today for you.

@hizaguirre-sp Nice solution!! How can I inject this limited stream into the GCP logging middleware?

import { Express } from 'express'
import * as gcpLogging from '@google-cloud/logging-bunyan'

const useGcpLogger = async (app: Express) => {
  const { mw, logger } = await gcpLogging.express.middleware({
    logName: 'logName',
    level: 'debug',
  })
  app.use(mw)

  // How can I attach the stream to this logger? The middleware is the code that creates it
  // logger.??
}
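
If there is no way to inject it through the middleware options, the only fallback I can think of is to skip the express helper and wire things up by hand, losing whatever extras the middleware adds (e.g. per-request log correlation). Untested sketch; useManualGcpLogger and hanging the logger off req.log are just my own conventions:

import { Express, Request, Response, NextFunction } from 'express'
import { LoggingBunyan } from '@google-cloud/logging-bunyan'
import SizeLimitedStream from './size-limited-stream'
import Bunyan = require('bunyan')

const useManualGcpLogger = (app: Express) => {
  const loggingBunyan = new LoggingBunyan({ logName: 'logName' })
  const { stream, level, type } = loggingBunyan.stream('debug')

  // Wrap the GCP stream in the size limiter before handing it to bunyan.
  const logger = Bunyan.createLogger({
    name: 'logName',
    streams: [{ stream: new SizeLimitedStream(stream), level, type }],
  })

  app.use((req: Request, _res: Response, next: NextFunction) => {
    (req as any).log = logger // hypothetical convention; adapt to however your handlers expect it
    next()
  })
}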


pedroyan commented Sep 13, 2024

Ok, so I've hacked together this repro code using a setup similar to the one my team uses in production, but to my surprise, the repro code works...

[screenshot: repro output, no crash]

... while the same "explosion" logic on our production API takes down the server.

[screenshot: production server going down]

My first hypothesis was that one of the dependencies of @google-cloud/logging-bunyan was not working as expected when trying to limit the request size.

However, even after updating all direct and indirect dependencies of @google-cloud/logging-bunyan by deleting the yarn.lock file and reinstalling from scratch in our solution, the problem still persists. I've also checked that the version of @google-cloud/logging-bunyan in my solution matches the one in the repro project.

Here is a link to my repro repository, in case someone else is facing this problem and wants to compare their setup against a working (hacked-together) one:

https://github.com/pedroyan/log-explosion-repro
