
Improve encoding/decoding performance #728

Open
@Domiii

Description


Currently, Dbux encoding/decoding is its biggest performance bottleneck.

  • Change to custom encoding/decoding → use type information to replace the current array-of-dictionaries approach
    • → Should yield a many-fold speed-up.
    • NOTE: currently, all data is encoded as dictionaries, of which we can end up with millions or even more.
    • Currently, an array-of-objects is actually stored as an array-of-dictionaries, such as [{ veryLongProp1: 1, anotherPropHere: 2 }, { veryLongProp1: 3, anotherPropHere: 4 }, ...]
      • It should become something like { props: ['veryLongProp1', 'anotherPropHere'], data: [1, 2, 3, 4, ...] }
  • Profile the entire data transmission stack: which part takes how long?
  • Don't waitForAck on the client.
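To make the proposed columnar format concrete, here is a minimal sketch of the encode/decode pair (the names `encodeColumnar`/`decodeColumnar` are hypothetical, not existing Dbux code). It stores each property name once instead of repeating it per entry, which is where the size and speed win would come from:

```javascript
// Encode an array-of-objects as { props, data }: property names stored
// once, values flattened into a single array in row-major order.
function encodeColumnar(entries, props) {
  const data = [];
  for (const entry of entries) {
    for (const prop of props) {
      data.push(entry[prop]);
    }
  }
  return { props, data };
}

// Rebuild the array-of-objects by slicing `data` in chunks of props.length.
function decodeColumnar({ props, data }) {
  const entries = [];
  for (let i = 0; i < data.length; i += props.length) {
    const entry = {};
    props.forEach((prop, j) => {
      entry[prop] = data[i + j];
    });
    entries.push(entry);
  }
  return entries;
}
```

A real implementation would derive `props` from per-type schema information rather than passing it in by hand, and would still run the resulting flat structure through msgpack.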

Some More Notes

  • We are using msgpack (code here).
  • On Node, one could use v8.serialize.
    • Some performance comparisons can be found here
    • Its maximum size is generally 4 GB on 64-bit systems.

Metadata

Assignees

No one assigned

Labels

enhancement (New feature or request), performance (Performance of runtime analysis tools can be quite a nuisance.)

Projects

No projects

Milestone

No milestone
