Hello, @Ewen2015! I'm here to assist you with any bugs, questions, or contributions you may have. Let's work together to tackle the issue you've encountered.
The issue you're experiencing — the model status staying green even after the model is removed from the server — is likely caused by stale cached state. The LBModelManager class uses Redis to manage load-balancing configurations and their cooldown states, so the model status can remain green if that cache is not invalidated when the model is removed [1].
Additionally, the ModelProviderService class handles model management, including enabling and disabling models. When a model is removed, the remove_model_credentials method is called, which should also invalidate the cache. If it does not, the cached credentials remain valid from the UI's perspective and the status stays green [2].
To address this, ensure the cache is invalidated when a model is removed, e.g. by calling the delete method of the ProviderCredentialsCache class to drop the cached model provider credentials [3].
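To illustrate the pattern, here is a minimal, self-contained sketch of delete-on-remove cache invalidation. The class and method names (`ProviderCredentialsCache`, `remove_model_credentials`) are taken from the discussion above, but the body is illustrative only: an in-memory dict stands in for Redis, and the key scheme and signatures are assumptions, not Dify's actual implementation.

```python
# Sketch of the invalidation pattern discussed above. NOT Dify's real code:
# the dict below stands in for Redis, and the key format is hypothetical.

class ProviderCredentialsCache:
    """Caches model-provider credentials under a tenant/provider/model key."""

    def __init__(self):
        self._store = {}  # stand-in for a Redis connection

    def _key(self, tenant_id, provider, model):
        return f"provider_credentials:{tenant_id}:{provider}:{model}"

    def get(self, tenant_id, provider, model):
        return self._store.get(self._key(tenant_id, provider, model))

    def set(self, tenant_id, provider, model, credentials):
        self._store[self._key(tenant_id, provider, model)] = credentials

    def delete(self, tenant_id, provider, model):
        # Invalidate the cached entry so the next status check re-queries
        # the provider instead of reusing stale "green" state.
        self._store.pop(self._key(tenant_id, provider, model), None)


def remove_model_credentials(cache, tenant_id, provider, model):
    # ... the persistent credentials record would be deleted here ...
    # The crucial step for this bug: drop the cached copy as well.
    cache.delete(tenant_id, provider, model)


cache = ProviderCredentialsCache()
cache.set("t1", "ollama", "llama", {"base_url": "http://localhost:11434"})
remove_model_credentials(cache, "t1", "ollama", "llama")
print(cache.get("t1", "ollama", "llama"))  # -> None
```

If the `cache.delete(...)` call is skipped, the stale entry keeps answering status checks, which matches the always-green symptom reported below.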
Self Checks
Dify version
0.6.11
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
`ollama rm llama`
Model status is always green even if the model is removed from the server.
✔️ Expected Behavior
Model status turns to red when a model is removed.
❌ Actual Behavior
Model status is always green even if the model is removed from the server.