
Unexpected result metrics for simple benchmarks #186

Open
pmdartus opened this issue Jul 29, 2019 · 2 comments

Comments

@pmdartus
Member

Observations

When running the fibonacci example from the getting started page locally, the CLI outputs the following table:

┌─────────────────┬─────────────┬─────┬───────────────┬───────────────┐
│ Benchmark name  │ Metric (ms) │ N   │ Mean ± StdDev │ Median ± MAD  │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ js-execution    │ -           │ -   │ -             │ -             │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ └─ fibonacci 15 │ script      │ 237 │ 0.097 ± 12.6% │ 0.095 ± 10.5% │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ └─ fibonacci 15 │ aggregate   │ 236 │ 0.872 ± 16.2% │ 0.903 ± 5.5%  │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ └─ fibonacci 15 │ paint       │ 230 │ 0.032 ± 10.4% │ 0.031 ± 3.2%  │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ └─ fibonacci 38 │ script      │ 248 │ 0.077 ± 10.9% │ 0.080 ± 6.3%  │
├─────────────────┼─────────────┼─────┼───────────────┼───────────────┤
│ └─ fibonacci 38 │ aggregate   │ 238 │ 0.306 ± 7.0%  │ 0.305 ± 4.9%  │
└─────────────────┴─────────────┴─────┴───────────────┴───────────────┘

In the case of the fibonacci example, the paint and aggregate metrics don't make much sense.

It would be great to pass the set of metrics the benchmark is interested in to the benchmark block or the describe block, instead of gathering all the metrics by default.
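A minimal sketch of what such an API could look like, assuming a hypothetical `metrics` option on the describe block (this option does not exist in Best today, and the exact names are only illustrative):

```javascript
// Hypothetical API sketch: the benchmark declares upfront which
// metrics it cares about, so the runner could skip collecting the rest.
// The `metrics` option is an assumption, not part of Best's current API.
describe('js-execution', { metrics: ['script'] }, () => {
    benchmark('fibonacci 15', () => {
        run(() => fibonacci(15));
    });
});
```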

Versions

  • node: 10.16.0
  • best: 4.0.0-alpha4
@pmdartus pmdartus changed the title Unexpected result metrics Unexpected result metrics for simple benchmarks Jul 29, 2019
@diervo
Contributor

diervo commented Jul 29, 2019

@pmdartus I would argue for this to be a global Best configuration option: 1) because I think it's easier to reason about, and 2) because it would be hard to change the way the test runs at "runtime" (otherwise we would have to parse the JS at build time and search for those options).

@pmdartus
Member Author

You are right; I think it would be really elegant to do that at the project level.
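At the project level, the option could live in the project configuration file. A hypothetical sketch, assuming a `metrics` field that does not exist yet (both the field name and values are assumptions for illustration only):

```javascript
// best.config.js — hypothetical sketch of a project-level metrics option.
// The `metrics` field is an assumption, not an existing Best option.
module.exports = {
    projectName: 'fibonacci-benchmarks',
    // Only collect the script metric; skip paint/aggregate, which are
    // meaningless for a pure-computation benchmark like fibonacci.
    metrics: ['script'],
};
```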
