migrate to class #252
base: master
Conversation
Filenames should be lowercase and kebab-cased. Please provide a benchmark so that we can compare potential performance gains or losses. AFAIK we use light-my-request for fastify.inject, and some people use inject in their production setups. Is there a reason we cannot use prototypal notation?
Okay, I will change the filename, but now you also have
❯ npm run benchmark
> light-my-request benchmark
> node test/benchmark.js
Request x 269,265 ops/sec ±3.62% (78 runs sampled)
Custom Request:
Request With Cookies x 205,368 ops/sec ±2.22% (84 runs sampled)
Request With Cookies n payload x 169,358 ops/sec ±4.85% (77 runs sampled)
ParseUrl x 702,748 ops/sec ±9.31% (60 runs sampled)
ParseUrl and query x 99,315 ops/sec ±4.00% (75 runs sampled)
Fastest is: ParseUrl
❯ npm run benchmark
> light-my-request benchmark
> node test/benchmark.js
Request x 238,129 ops/sec ±5.85% (71 runs sampled)
Custom Request:
Request With Cookies x 169,174 ops/sec ±4.09% (78 runs sampled)
Request With Cookies n payload x 189,661 ops/sec ±2.57% (82 runs sampled)
ParseUrl x 1,263,631 ops/sec ±1.82% (89 runs sampled)
ParseUrl and query x 99,157 ops/sec ±3.26% (81 runs sampled)
Fastest is: ParseUrl
Which of these benchmarks is the default branch and which one is this PR?
Bun.js implemented ServerResponse through a class, which does not allow calling
Good argument. Rename the files and make them lowercase and kebab-case, please... ;)
Both are the PR. Below is the default branch:
❯ npm run benchmark
> light-my-request benchmark
> node test/benchmark.js
Request x 286,519 ops/sec ±2.13% (88 runs sampled)
Custom Request x 19,788 ops/sec ±2.32% (84 runs sampled)
Request With Cookies x 204,017 ops/sec ±2.81% (79 runs sampled)
Request With Cookies n payload x 200,765 ops/sec ±2.78% (79 runs sampled)
ParseUrl x 1,001,307 ops/sec ±4.34% (78 runs sampled)
ParseUrl and query x 115,094 ops/sec ±3.48% (83 runs sampled)
Fastest is: ParseUrl
It seems that the CustomRequest benchmark is broken. Can you have a look at it?
test/index.test.js (Outdated)
date: res.headers.date,
connection: 'keep-alive',
Why are date and connection gone?
Previously, you read these options from _header, which is not documented on ServerResponse (also, Bun does not export this attribute).
Now we read headers via .getHeader(name).
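For illustration, here is a minimal sketch in plain Node.js (not light-my-request's mock response, and not code from this PR) of why the two approaches differ: Date and Connection are added implicitly when Node serializes the header block into the internal _header string, so the documented getHeader() accessor never returns them.

```js
// Sketch only: implicitly added headers (Date, Connection) are not visible
// through the documented getHeader() API, while explicitly set ones are.
const http = require('http')

const server = http.createServer((req, res) => {
  res.setHeader('x-custom', 'yes')
  res.end('ok', () => {
    console.log(res.getHeader('x-custom')) // 'yes' (explicitly set)
    console.log(res.getHeader('date'))     // undefined (added only during serialization)
    // res._header is the undocumented, already-serialized header string; it does
    // contain Date and Connection, which is why the old code could read them there.
    server.close()
  })
})

server.listen(0, () => {
  // agent: false avoids a lingering keep-alive socket so server.close() completes
  http.get({ port: server.address().port, agent: false }, (res) => res.resume())
})
```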
This is a breaking change, which would probably delay approval of this PR. Can you implement it in a non-breaking way?
I think not, because Node did it in a hidden way and Bun doesn't add these headers at all.
I can wait, no problem)
It could also result in a rejection of your PR, as Bun is incompatible with Node.js despite @Jarred-Sumner's claim that it is a drop-in replacement.
Please reconsider.
So I think I can fix these headers in Bun, but you are also using undocumented options; don't you think that is not quite the right way?
I think users shouldn't read or interact with them at all.
We are devs; we should always agree on a reasonable solution.
But for an informed decision, it is not enough to write "undocumented options". Can you please elaborate on which options we are using that are undocumented?
On the other hand, we basically have an API contract to fulfill. If you remove headers, we will break expectations.
That's why I wrote that you should implement it in a non-breaking way, if you can. Fastify v5 is coming soon, where we can introduce breaking changes, so we are in the perfect time window to discuss whether those headers should be added or not. If not, we can remove them in a few months, when we release fastify v5.
But you should not be sad if removing the additional headers is not accepted.
I totally agree with you. When I said "undocumented parameters" I meant ServerResponse._header:
https://github.com/fastify/light-my-request/blob/v5.10.0/lib/response.js#L177C28-L177C35
https://nodejs.org/api/http.html#class-httpoutgoingmessage (not a word about _header)
PR:
❯ npm run benchmark
> light-my-request benchmark
> node test/benchmark.js
Request x 237,109 ops/sec ±4.16% (75 runs sampled)
Custom Request x 14,722 ops/sec ±6.88% (76 runs sampled)
Request With Cookies x 169,585 ops/sec ±4.24% (79 runs sampled)
Request With Cookies n payload x 159,198 ops/sec ±4.37% (79 runs sampled)
ParseUrl x 1,109,862 ops/sec ±4.22% (80 runs sampled)
ParseUrl and query x 113,048 ops/sec ±3.30% (84 runs sampled)
Fastest is: ParseUrl
Node 16, master: (results collapsed)
Node 18, master: (results collapsed)
Node 20, master: (results collapsed)
I really appreciate the effort to make more libraries work in Bun, but IMO we should strongly default to fixing things in Bun instead of in libraries whenever possible. It's our fault when it doesn't work, not the library's fault. We can make some tweaks to continue using classes in Bun internally without breaking libraries. I also think it's important for changes like this to have no negative performance impact, and it's hard to tell from the benchmarks posted.
I'm somewhat skeptical of this change. Why does Bun need classes? All of Node.js is implemented using prototypical inheritance anyway.
Why are you skeptical? Do you have any reason to use a prototype instead of extending a class? It's better for reading and looks healthier.
@Jarred-Sumner I wish we could do that for Node.js without breaking a good chunk of npm. My 2 cents is that this is something you want to fix on the Bun side, because this definitely won't be the only module with this problem.
Oh, of course
Prototypical inheritance has not been deprecated anywhere in the Node.js codebase.
Nor has it been "deprecated" from the language. It is the language. The
I would argue that classes are syntax sugar for a reason, though: they are easier to read and modify than prototype-inheritance-based code. If there is no perf degradation, or a negligible one, I would be +1 on this change, as maintainability is important. Since light-my-request is not even a runtime dependency, nano-optimizations don't matter as much.
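For illustration, the same object written both ways (the names are made up for this sketch, not taken from the PR); class notation is sugar over the same prototype mechanism:

```js
// Illustrative only: a prototype-notation constructor and an equivalent class.
// Both produce instances whose methods live on the prototype.
function ProtoResponse (statusCode) {
  this.statusCode = statusCode
}
ProtoResponse.prototype.ok = function () {
  return this.statusCode >= 200 && this.statusCode < 300
}

class ClassResponse {
  constructor (statusCode) {
    this.statusCode = statusCode
  }

  ok () {
    return this.statusCode >= 200 && this.statusCode < 300
  }
}

console.log(new ProtoResponse(200).ok()) // true
console.log(new ClassResponse(404).ok()) // false
console.log(typeof ClassResponse)        // 'function' (still a constructor function)
```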
main vs. pr benchmark results (collapsed)
Some asynchronous tests have become faster
@sirenkovladd Can you regenerate the benchmark after the fix?
This benchmark is already after the commit; I just didn't push before writing the text.
But if CustomRequest is important, I can go back to the old code or keep only one constructor call.
If it were prod runtime code, I would say that a 2-3% perf difference is significant. For test code, admittedly, less so. I'm neutral on this PR; I would say that class syntax is generally easier to maintain, but there is a slight perf decrease.
Yeah, the ~3% perf decrease is normal for Node when transforming from prototype to class notation. That is also why transpiling TS classes to ES5 makes them faster on Node :D. For me it is a hard -1.
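For context on that remark, here is roughly what a class looks like after an ES5 downlevel transpile (a hand-written approximation, not actual tsc output; Chain is just a stand-in name):

```js
// Approximation of ES5 output for `class Chain { run () { ... } }`:
// the class collapses back into a constructor function plus prototype methods.
var Chain = (function () {
  function Chain (dispatch) {
    this.dispatch = dispatch
  }
  Chain.prototype.run = function () {
    return this.dispatch()
  }
  return Chain
}())

console.log(new Chain(function () { return 42 }).run()) // 42
```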
I'm sure it's just an inaccuracy of the benchmark. I reran the benchmark with no changes:
It is the PR
I made these changes:
const suite = new Bench({
  time: 10000,
})
master vs. pr benchmark results (collapsed)
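The new Bench({ time: 10000 }) call above looks like tinybench's API; that is an assumption on my part (the printed output earlier in the thread is benchmark.js-style). A self-contained sketch under that assumption, with a placeholder task:

```js
// Minimal tinybench sketch (assumed library); the task is a placeholder,
// not the actual light-my-request benchmark suite.
const { Bench } = require('tinybench')

async function main () {
  // time: 10000 runs each task for ~10 seconds, trading runtime for less noise
  const suite = new Bench({ time: 10000 })

  suite.add('ParseUrl', () => {
    const url = new URL('/test?foo=bar&baz=qux', 'http://localhost')
    return url.search
  })

  await suite.run()
  console.table(suite.table())
}

main()
```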
Sounds like good news to me.
And which Node version did you use? Why did you increase the max old cache?
node v20.6.0
So I can say that everything will be faster, except
where

function getCustomRequestLight (CustomRequest) {
  class _CustomLMRRequest {
    constructor (...opt) {
      Object.assign(this, new Request(...opt))
    }
  }
  Object.setPrototypeOf(_CustomLMRRequest.prototype, CustomRequest.prototype)
  Object.getOwnPropertyNames(Request.prototype)
    .filter(prop => prop !== 'constructor')
    .forEach(prop => { _CustomLMRRequest.prototype[prop] = Request.prototype[prop] })
  return _CustomLMRRequest
}

difference between
I think it's a bug that
fastify unit tests break with this PR
I feel that the risks of this PR are really not worth the reward. I don't think we should migrate to classes.
I think it has value, but it should be done with care.
I can create a new PR with smaller changes if it helps it merge faster.
I think so, but maybe wait until we have merged the fastify integration testing. Then we will see if it breaks in fastify.
I think my PR actually broke 1 test in fastify (I tested it): https://github.com/fastify/fastify/blob/main/test/childLoggerFactory.test.js#L61C1-L91
This test expects an error when calling
And in this PR I changed that: when you provide a callback, all errors will be delivered there. With a Promise, everything works fine.
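For reference, a hedged sketch of the two inject styles being discussed (the route is a placeholder, not the failing childLoggerFactory test):

```js
// Illustrative only: how errors surface depending on whether inject is given a
// callback or awaited as a promise. Per the PR author, with this change errors
// raised during injection are delivered to the callback instead of thrown.
const fastify = require('fastify')()

fastify.get('/', async () => 'hello')

// Callback style
fastify.inject({ method: 'GET', url: '/' }, (err, res) => {
  if (err) {
    console.error('error delivered to the callback:', err)
    return
  }
  console.log('callback:', res.statusCode, res.payload)
})

// Promise style
fastify.inject({ method: 'GET', url: '/' })
  .then((res) => console.log('promise:', res.statusCode, res.payload))
  .catch((err) => console.error('promise rejected:', err))
```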
Make a sub-PR (same change, but with a slightly edited function
- migrate to class syntax (Chain, request, response)
- bun.js support
- update usage of deprecated attribute
Checklist
- npm run test and npm run benchmark
- the Code of conduct