
Too much memory use & possible memory leak #380

Open
purerosefallen opened this issue Jan 29, 2023 · 4 comments
Labels
bug Something isn't working help wanted Extra attention is needed

Comments

@purerosefallen
Contributor

This is Nanahira, the maintainer of MyCard and the owner of the instance at https://hibi.moecube.com. Recently the instance ate up 4 GB of our server's memory and set off our monitoring system's alarm.
Is this software expected to consume that much memory, or is this my own misconfiguration? Any help is appreciated, thanks.

@mnixry mnixry added the question Further information is requested label Jan 29, 2023
@mnixry
Member

mnixry commented Jan 29, 2023

Hello Nanahira, and thank you for maintaining the hibi.moecube.com instance.

Regarding the issue of the instance consuming 4 GB of server memory and triggering the monitoring system's alarm, it may be caused by caching enabled by default. It is normal for this to occur when the request volume is high. You can refer to #114 for more information. I recommend upgrading to the latest version of HibiAPI, including all dependencies, as this may help with the memory usage issue.

However, if memory still grows with low request volume, there may be a potential memory leak in the program that requires further analysis.

@purerosefallen
Contributor Author

After a careful check of the logs of both our gateways and our CDN, we found only about 1,000 requests per day, which is not a high volume. Do that memory footprint and request rate look normal on other instances, especially the official one?
In addition, we will try migrating to Redis caching to see whether it solves the problem. Thanks for your reply.

@mnixry
Member

mnixry commented Jan 29, 2023

For your reference, this is the memory graph of the public demo api.obfs.dev; it uses diskcache as its cache backend and serves an average of 120–160 requests per minute.

It does appear that even with diskcache, memory usage is still slowly increasing. Given the reported request volume, there is likely a memory leak in the program.
Debugging this may not be straightforward, and the memory usage is growing relatively slowly. I will try to add an APM tool to monitor the program's detailed memory profile and then work on a fix (I suspect the leak may come from an upstream library rather than from code in this repository, which could make it harder to fix), so I will leave this issue open.
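For anyone who wants to narrow a leak like this down locally before an APM is in place, Python's standard-library `tracemalloc` can diff heap snapshots and point at the allocation sites responsible for growth. This is a generic, self-contained sketch, not HibiAPI code; the "leak" here is simulated:

```python
import tracemalloc

tracemalloc.start()
baseline = tracemalloc.take_snapshot()

# Stand-in for a leaking workload: allocations that are never released.
leaked = [bytearray(1024) for _ in range(1_000)]

snapshot = tracemalloc.take_snapshot()
stats = snapshot.compare_to(baseline, "lineno")

# The entries with the largest positive size_diff identify where memory grew.
for stat in stats[:3]:
    print(stat)
```

In a real service you would take snapshots periodically (e.g. every N requests) and compare consecutive ones; a line whose `size_diff` keeps growing across comparisons is a leak candidate, whether it lives in this repository or in an upstream library.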

As a workaround, you can set up a scheduled restart to reduce the impact of the memory leak.
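As a concrete sketch of such a scheduled restart, assuming the instance runs as a systemd service (the unit name `hibiapi` here is hypothetical; adjust it to your deployment), a drop-in override can bound how long the leak accumulates:

```ini
# /etc/systemd/system/hibiapi.service.d/periodic-restart.conf
# Hypothetical unit name; run `systemctl daemon-reload` after adding this file.
[Service]
Restart=always
RuntimeMaxSec=86400
```

`RuntimeMaxSec=` terminates the service after it has run for the given number of seconds (24 hours here), and `Restart=always` brings it back up automatically.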

@mnixry mnixry reopened this Jan 29, 2023
@purerosefallen purerosefallen changed the title Too much memory use Too much memory use & possible memory leak Jan 29, 2023
@mnixry mnixry added bug Something isn't working help wanted Extra attention is needed and removed question Further information is requested labels Jan 30, 2023
@Ruriko

Ruriko commented Oct 29, 2023

If you're using the default config, you have to set how much memory the in-memory cache may use: by default no size is defined, so it ends up consuming all your memory. For example, I use `mem://?size=256`, which works fine for low traffic.
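The principle behind this advice can be shown with plain stdlib Python, independent of HibiAPI's actual cache backend: an uncapped cache grows without limit, while a size-capped one evicts old entries and its memory stays bounded.

```python
from functools import lru_cache

@lru_cache(maxsize=256)  # cap entries; maxsize=None would grow without bound
def expensive_lookup(key: int) -> int:
    return key * key  # stand-in for a real upstream request

# 1,000 distinct keys, but the cache never holds more than 256 of them.
for k in range(1_000):
    expensive_lookup(k)

info = expensive_lookup.cache_info()
print(info.currsize)  # → 256
```

The same trade-off applies to any cache URI or backend: without a size (or TTL) limit, a steady stream of distinct cache keys looks exactly like a memory leak.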
