Batching with array #108
base: new-index
Conversation
src/electrum/server.rs
Outdated
&mut self,
cmd: Value,
empty_params: &Value,
start_time: Instant,
Shouldn't the start_time start ticking separately for each handle_value call?
Yes, it makes sense. Done in b8f4524.
Should we limit the number of requests per batch?
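A minimal sketch of the per-request timing, assuming a hypothetical free-standing handle_value (the real method takes &mut self on the server struct); each batch element gets its own Instant so its timing is not skewed by earlier requests in the same array:

```rust
use std::time::Instant;

use serde_json::{json, Value};

// Hypothetical stand-in for the real per-request handler.
fn handle_value(cmd: &Value, start_time: Instant) -> Value {
    // ... dispatch the RPC method here ...
    json!({
        "id": cmd.get("id").cloned().unwrap_or(Value::Null),
        "result": Value::Null,
        "elapsed_micros": start_time.elapsed().as_micros() as u64,
    })
}

// Restart the clock for every element of the batch array.
fn handle_batch(batch: &[Value]) -> Vec<Value> {
    batch
        .iter()
        .map(|cmd| handle_value(cmd, Instant::now()))
        .collect()
}

fn main() {
    let batch = vec![
        json!({"id": 1, "method": "server.version", "params": []}),
        json!({"id": 2, "method": "blockchain.headers.subscribe", "params": []}),
    ];
    println!("{}", Value::Array(handle_batch(&batch)));
}
```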
@@ -137,3 +140,44 @@ fn test_electrum() -> Result<()> {
    Ok(())
}

/// Test the Electrum RPC server using an headless Electrum wallet
The comment here should be updated, "using a raw TCP socket" or something similar
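As a hedged illustration only (the address, port, request ids, and assertions are assumptions, not the actual integration test), a batch request over a raw TCP socket could look like:

```rust
use std::io::{BufRead, BufReader, Write};
use std::net::TcpStream;

use serde_json::Value;

// Hypothetical helper: connect, send a JSON-array batch, read back one reply line.
fn batch_over_raw_tcp(addr: &str) -> std::io::Result<Vec<Value>> {
    let mut stream = TcpStream::connect(addr)?;
    let batch = r#"[{"id":1,"method":"server.version","params":[]},{"id":2,"method":"blockchain.relayfee","params":[]}]"#;
    // The server reads newline-terminated messages.
    stream.write_all(batch.as_bytes())?;
    stream.write_all(b"\n")?;

    // Assumed: an array batch is answered with a single line containing a JSON array.
    let mut reader = BufReader::new(stream);
    let mut line = String::new();
    reader.read_line(&mut line)?;
    let replies: Vec<Value> = serde_json::from_str(&line).expect("reply should be a JSON array");
    Ok(replies)
}

fn main() -> std::io::Result<()> {
    // Assumed regtest address/port for the Electrum RPC server.
    let replies = batch_over_raw_tcp("127.0.0.1:60401")?;
    assert_eq!(replies.len(), 2);
    Ok(())
}
```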
Do we have limits for "new-line separated" batch requests? Is it different here? How much do you propose as the limit here?
We don't, but it is different here. Newline-separated requests are streamed, and individual responses are sent as they become available. We only hold up to 10 pending requests in the queue, then block and stop reading from the TCP socket until a pending request gets processed to free room (a bounded channel). With array batching, all the requests and all the responses have to be buffered in memory at once, which could use significant resources and be a DoS vector if there are no limits in place.
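For illustration, a hedged sketch of what such a size guard might look like; the constant name, limit value, error shape, and free-standing function are assumptions from this thread, not the actual electrs code:

```rust
use serde_json::{json, Value};

// Assumed limit; the thread floats 10 (pending-queue depth) and 20 as candidates.
const MAX_BATCH_SIZE: usize = 20;

// Reject oversized array batches up front, before any responses are buffered,
// so a client cannot force the server to hold unbounded data in memory.
fn handle_batch_limited(batch: Vec<Value>) -> Value {
    if batch.len() > MAX_BATCH_SIZE {
        return json!({
            "error": format!("batch too large: {} > {}", batch.len(), MAX_BATCH_SIZE)
        });
    }
    let replies: Vec<Value> = batch
        .into_iter()
        .map(|cmd| {
            // Placeholder dispatch: echo the request id with a null result.
            json!({
                "id": cmd.get("id").cloned().unwrap_or(Value::Null),
                "result": Value::Null,
            })
        })
        .collect();
    Value::Array(replies)
}

fn main() {
    let oversized: Vec<Value> = (0..25)
        .map(|i| json!({"id": i, "method": "server.ping", "params": []}))
        .collect();
    println!("{}", handle_batch_limited(oversized));
}
```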
Do you think we can afford 20 as the limit for batch requests? I think it's a common number because it is used as the gap limit...
Currently electrs supports batching by separating requests with newlines; however, other implementations also support batching via a JSON array, as explained in the cited issue. This adds support for batching requests via a JSON array.
Closes Blockstream/esplora#516