
feat: ratelimit prediction #2188

Open · VincentRPS wants to merge 65 commits into master
Conversation

VincentRPS
Member

Summary

This replaces the previous rate limit system, which only handled rate limits after they occurred, with
a new system that not only handles rate limits but also tries to predict them before they can happen and affect the bot. The system is designed to be scalable, with bucket storage that can be swapped out for something like Redis.
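For context, a minimal sketch of the two ideas in the summary: sleeping before a request once the known budget is spent (the "prediction" part), and keeping buckets behind a small storage interface that something like Redis could implement. The class names are illustrative, not the PR's actual code.

```python
import asyncio
from typing import Optional, Protocol


class Bucket:
    """Tracks one rate limit bucket as reported by the API."""

    def __init__(self, limit: int, remaining: int, reset_after: float) -> None:
        self.limit = limit
        self.remaining = remaining
        self.reset_after = reset_after
        self.rate_limited = False

    async def acquire(self) -> None:
        # "Prediction": sleep before sending once the known budget is spent,
        # instead of waiting for the API to answer with a 429.
        if self.rate_limited or self.remaining <= 0:
            await asyncio.sleep(self.reset_after)
            self.remaining = self.limit
            self.rate_limited = False
        self.remaining -= 1


class BucketStorage(Protocol):
    """Anything able to store buckets; a Redis-backed class could implement this."""

    async def get(self, key: str) -> Optional[Bucket]: ...
    async def set(self, key: str, bucket: Bucket) -> None: ...


class MemoryBucketStorage:
    """Default in-process backend."""

    def __init__(self) -> None:
        self._buckets: dict[str, Bucket] = {}

    async def get(self, key: str) -> Optional[Bucket]:
        return self._buckets.get(key)

    async def set(self, key: str, bucket: Bucket) -> None:
        self._buckets[key] = bucket
```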

Information

  • This PR fixes an issue.
  • This PR adds something new (e.g. new method or parameters).
  • This PR is a breaking change (e.g. methods or parameters removed/renamed).
  • This PR is not a code change (e.g. documentation, README, typehinting,
    examples, ...).

Checklist

  • I have searched the open pull requests for duplicates.
  • If code changes were made then they have been tested.
    • I have updated the documentation to reflect the changes.
  • If type: ignore comments were used, a comment is also left explaining why.
  • I have updated the changelog to include these changes.

@VincentRPS VincentRPS self-assigned this Jul 31, 2023
@VincentRPS VincentRPS requested a review from a team as a code owner July 31, 2023 02:49
@pullapprove4

pullapprove4 bot commented Jul 31, 2023

Please add a changelog entry

@pullapprove4

pullapprove4 bot commented Jul 31, 2023

Changelog requirements met

Resolved review threads: CHANGELOG.md, discord/client.py, discord/http.py (2), discord/rate_limiting.py (3)
VincentRPS and others added 3 commits July 31, 2023 15:51
pain. it was all pain.
Forgot to add bucket storage to slots.

Signed-off-by: VincentRPS <[email protected]>
@pullapprove4

pullapprove4 bot commented Aug 2, 2023

Please add a changelog entry

@pullapprove4

pullapprove4 bot commented Aug 2, 2023

Changelog requirements met

Summary: if two or more requests on the same bucket get rate limited at the same time, they will override each other.

To fix this without using any locks, we set the temp bucket before any rate limit can happen and mark it as rate limited once one occurs, so the only thing being overridden is the `rate_limited` variable, which is set to True.
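As an illustration of that approach (not the PR's exact code; `HTTPClient` here is a hypothetical wrapper that reuses the `Bucket` and storage classes from the sketch above), pre-registering the bucket before the request is sent means concurrent 429s only ever write the same value into `rate_limited`:

```python
class HTTPClient:
    """Hypothetical request wrapper around an aiohttp.ClientSession."""

    def __init__(self, session, storage) -> None:
        self._session = session   # aiohttp.ClientSession
        self._storage = storage   # e.g. MemoryBucketStorage from the sketch above

    async def request(self, method: str, url: str, bucket_key: str):
        bucket = await self._storage.get(bucket_key)
        if bucket is None:
            # Register a temporary bucket *before* the request goes out,
            # so it already exists by the time any 429 can arrive.
            bucket = Bucket(limit=1, remaining=1, reset_after=0.0)
            await self._storage.set(bucket_key, bucket)

        await bucket.acquire()
        response = await self._session.request(method, url)

        if response.status == 429:
            data = await response.json()
            bucket.reset_after = float(data.get("retry_after", 1.0))
            bucket.rate_limited = True  # concurrent 429s all write the same value here
        return response
```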
@VincentRPS
Member Author

PR is mostly finalized by this point. The final goal now is to try and test this system on larger bots to identify any possible faults before it is merged into master.

This PR will not be refactoring webhooks, at least for now. That will be the goal of a second PR in 2.6 or 2.7, as the work for that would be a bit more complicated, especially with interaction followups. There is a chance we could make interactions use the HTTPClient instead of Webhooks, though, which would be much easier to handle since it would integrate directly with the current rate limit prediction system.

Resolved review thread: discord/http.py
VincentRPS and others added 3 commits January 25, 2024 07:10
rate_limited is now set to False by default.
There is a lock on .use to prevent multiple queued requests from all being released at the same time.

Signed-off-by: VincentRPS <[email protected]>
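A sketch of what the locking change described above might look like (illustrative, not the actual diff). The commit names `.use`; the lock serialises waiters so a queued burst cannot all fire the instant the bucket resets:

```python
import asyncio


class Bucket:
    """Revised variant of the earlier sketch, with the lock on .use."""

    def __init__(self, limit: int, reset_after: float) -> None:
        self.limit = limit
        self.remaining = limit
        self.reset_after = reset_after
        self.rate_limited = False  # now False by default, per the commit message
        self._lock = asyncio.Lock()

    async def use(self) -> None:
        # Waiters pass through one at a time: whoever holds the lock sleeps
        # out the reset and refills the bucket, then the next waiter re-checks.
        async with self._lock:
            if self.rate_limited or self.remaining <= 0:
                await asyncio.sleep(self.reset_after)
                self.remaining = self.limit
                self.rate_limited = False
            self.remaining -= 1
```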
@switchupcb

switchupcb commented Jul 9, 2024

This may be a helpful guide for this PR, if you haven't read it already.
@emretech

> PR is mostly finalized by this point.
@VincentRPS

You will need to redo this PR if it is based on Bluenix's comment, which describes an incorrect implementation of rate limiting.

You can't predict rate limits based on when your bot sends requests, because your bot isn't the single source of truth for the rate limit.

The Discord server determines which bucket (window) the request applies to, and thus how the rate limit works. Your bot can send 50 requests per second at a technical level and still be rate limited, depending on the timing of the rate limit bucket window and when the server receives each request.
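(Illustration of that point: whatever the client predicts locally, the authoritative state comes back in Discord's documented rate limit headers such as `X-RateLimit-Remaining` and `X-RateLimit-Reset-After`, so a handler can at least re-anchor its local bucket on every response. A minimal sketch, with the surrounding bucket object assumed from the earlier sketches:)

```python
def update_bucket_from_headers(bucket, headers) -> None:
    """Overwrite locally predicted counts with what the server reported."""
    remaining = headers.get("X-RateLimit-Remaining")
    reset_after = headers.get("X-RateLimit-Reset-After")
    if remaining is not None:
        bucket.remaining = int(remaining)
    if reset_after is not None:
        bucket.reset_after = float(reset_after)
```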

You can read switchupcb/disgo#14 and switchupcb/disgo#22 for more information that validates these claims.

But I wouldn't read all that.


Bluenix admitted in the same thread his solution didn't work.

discord/discord-api-docs#5144 (comment)

> Did you test this at 50 requests per second?

> I do hit the global ratelimit, yes. This could have to do with how requests take different amounts of time and appear in later windows (which you brought up about my implementation for route-specific ratelimits), but I am not sure this explains everything, so I suppose you can consider this issue reproduced by me.

@EmmmaTech
Contributor

Yeah, no thanks. Where were you when other Discord.py forks updated their ratelimit handlers?

@switchupcb

> Where were you when other Discord.py forks updated their ratelimit handlers?
@emretech

The answer to that question doesn't change the truth or the tests used to assert the truth.

@EmmmaTech
Contributor

> Where were you when other Discord.py forks updated their ratelimit handlers?
@emretech

> The answer to that question doesn't change the truth or the tests used to assert the truth.

I want to know more about how "reliable" your method is. How many tests have you conducted? Have you tested your implementation alongside Bluenix's proposal? I'm really just curious at this point, because you seem to have dominated that original Discord API Docs issue I linked to with a bunch of your research (which was a weird thing to do, by the way; an issue tracker is not the right place for that).

@switchupcb

switchupcb commented Jul 9, 2024

> I want to know more about how "reliable" your method is. How many tests have you conducted?
@emretech

https://github.com/switchupcb/disgo can handle over 50 RPS, but doesn't, because its correct rate limit implementation throttles requests.

The rate limit test is located here: https://github.com/switchupcb/disgo/blob/v10/wrapper/tests/integration/ratelimit_test.go

You can also view the actions run on every commit and read the information I linked, which documents a test indicating that Discord can't be consumed using a "rolling" rate limit implementation.

> I'm really just curious at this point, because you seem to have dominated that original Discord API Docs issue I linked to with a bunch of your research (which was a weird thing to do, by the way; an issue tracker is not the right place for that).

The entire point of that thread was to determine the actual rate limit implementation, so it could be included in the documentation.

> Have you tested your implementation alongside Bluenix's proposal?

Bluenix admitted in the same thread his solution didn't work.

discord/discord-api-docs#5144 (comment)

> Did you test this at 50 requests per second?

> I do hit the global ratelimit, yes. This could have to do with how requests take different amounts of time and appear in later windows (which you brought up about my implementation for route-specific ratelimits), but I am not sure this explains everything, so I suppose you can consider this issue reproduced by me.

@EmmmaTech
Contributor

> Bluenix admitted in the same thread his solution didn't work.

Oh my god, I guess I missed that part. I suppose you're right then, sorry. I'm reading your documentation from switchupcb/disgo#14 now.

@Lulalaby Lulalaby modified the milestones: v2.6, v2.7 Jul 9, 2024
@Lulalaby
Member

needs a redo

Labels: feature (Implements a feature), PA: All Contributors pending, priority: low (Low Priority), python (Pull requests that update Python code), status: awaiting review (Awaiting review from a maintainer)

Project status: In Progress

6 participants