TL;DR
Please allow setting the base URL for the GPT analysis feature via an optional parameter.
Optionally the model as well, but even just the base URL would be helpful 😄
Detailed design
I'm using legitify in an environment with limited internet access, so I can't reach the default ChatGPT endpoint. I can, however, run Ollama with local models and have it pretend to be GPT-3.

It would be great if you could add an optional parameter that lets users set a custom base URL. As long as the endpoint exposes an OpenAI-compatible API, this should work well. While my personal use case might be a bit special, it would also let more privacy-minded people use the feature without exposing any data to an external party.

Taking this one step further, allowing the user to select a custom model here:
https://github.com/Legit-Labs/legitify/blob/main/internal/gpt/analyzer.go#L112
would make it possible to use pretty much any model, as long as it is served through an OpenAI-compatible API.
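To make the model part concrete, here is a minimal sketch of what a configurable model could look like around that line. The struct, constructor, and field names below are my own assumptions for illustration, not the actual legitify code:

```go
package gpt

import (
	gogpt "github.com/sashabaranov/go-gpt3"
)

// Hypothetical sketch: these names do not come from the legitify code base.
// They only illustrate threading an optional model name through the analyzer
// instead of hard-coding one.
type analyzer struct {
	client *gogpt.Client
	model  string
}

func newAnalyzer(client *gogpt.Client, model string) *analyzer {
	if model == "" {
		model = "gpt-3.5-turbo" // fallback; legitify's actual default model may differ
	}
	return &analyzer{client: client, model: model}
}
```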
Additional information
I did some research, and it should be possible to add the base URL option where the gogpt client is created, here:
https://github.com/Legit-Labs/legitify/blob/main/internal/gpt/analyzer.go#L34
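Assuming the gogpt import is a recent version of sashabaranov's go-gpt3 / go-openai client (which exposes DefaultConfig, NewClientWithConfig, and a BaseURL config field), a minimal sketch could look like this; the function and parameter names are made up for illustration:

```go
package gpt

import (
	gogpt "github.com/sashabaranov/go-gpt3"
)

// newGptClient builds the GPT client. If baseURL is non-empty, the client is
// pointed at that endpoint instead of the library default (api.openai.com),
// which lets an OpenAI-compatible server such as a local Ollama instance be used.
// NOTE: newGptClient and the baseURL parameter are hypothetical names for this sketch.
func newGptClient(apiKey, baseURL string) *gogpt.Client {
	if baseURL == "" {
		return gogpt.NewClient(apiKey)
	}
	config := gogpt.DefaultConfig(apiKey)
	config.BaseURL = baseURL // e.g. "http://localhost:11434/v1" for Ollama's OpenAI-compatible API
	return gogpt.NewClientWithConfig(config)
}
```

With something like that in place, a CLI flag or environment variable (whatever fits legitify's existing configuration style) could feed baseURL, defaulting to empty so the current behavior is unchanged.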