Add context for LLM #180
Please elaborate a little further with a use case.
For getting responses from the LLM, part of the conversation is sent to the API. The number of earlier conversation segments sent to the API is governed by the constant MAX_TRANSCRIPTION_PHRASES_FOR_LLM (default 12), defined in the source.
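As a rough illustration of how such a sliding window might work (a minimal sketch: the constant name comes from the comment above, but the function and its surroundings are hypothetical, not Transcribe's actual code):

```python
# Sketch of a sliding transcript window. Only the constant name is taken
# from the comment above; everything else here is illustrative.
MAX_TRANSCRIPTION_PHRASES_FOR_LLM = 12  # default per the comment above

def build_llm_messages(transcript_phrases: list[str], system_prompt: str) -> list[dict]:
    """Keep only the most recent phrases when calling the LLM API."""
    recent = transcript_phrases[-MAX_TRANSCRIPTION_PHRASES_FOR_LLM:]
    messages = [{"role": "system", "content": system_prompt}]
    messages += [{"role": "user", "content": phrase} for phrase in recent]
    return messages
```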
I believe by "context" he means providing the model your background. For example, when using this on a Zoom interview, being able to feed it your resume or the job description.
I put the LLM directions in the "You:" first statement and the resume context in the other file. However, when I download the new pull request it results in bugs. There must be a better way to control what is sent to the LLM to respond to. Transcription has been good for me on the free version, and the cost to run the rest is very low.
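One way to get that kind of control (a hypothetical sketch, not how Transcribe is actually wired: the file name and prompt wording are made up) is to keep the background text in a separate file and prepend it to the system prompt on every request:

```python
from pathlib import Path

def load_context(path: str = "context.txt") -> str:
    """Read personal background (e.g. a resume) from a separate file.
    The file name is hypothetical; keeping it out of the code makes it
    easy to swap per conversation."""
    p = Path(path)
    return p.read_text(encoding="utf-8") if p.exists() else ""

system_prompt = (
    "You are assisting in a live conversation. "
    "Use the following background about the user when responding:\n"
    + load_context()
)
```

Keeping the context in its own file also means pulling new Transcribe changes would not clobber it, which may avoid the merge bugs described above.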
@vivekuppal I am unsure if this will work. I notice the parameters.yaml says the
Hello friend, can you share the code?
Hi Everyone, @tan-banana, @btsogt21, @chenosyong, @raulvasquez, I am eager to help move things forward. |
It looks like we are trying to solve some problems with certain specific use cases in transcription. I would be happy to help with your needs. The best way to collaborate is to share the source code. As the Transcribe codebase continues to evolve, you may find it difficult to keep a personal copy of the codebase in sync with new changes.
I would be happy to see this use case included. User Story: Personalized Responses Based on User Background.
How can we go about setting up "context" for the LLM to base its responses on? For example, I would like to feed it information about myself and get responses tailored to that information.
Along similar lines, how could we go about implementing conversation "memory" for the LLM when requesting responses?
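One common pattern covering both (a sketch assuming an OpenAI-style chat-messages API; none of these names are confirmed to exist in Transcribe today) is a fixed system message for the background context plus a trimmed, rolling message history for memory:

```python
class ChatSession:
    """Hypothetical sketch: fixed 'context' plus rolling 'memory'."""

    def __init__(self, background: str, max_turns: int = 12):
        self.system = {"role": "system",
                       "content": f"User background:\n{background}"}
        self.history: list[dict] = []  # memory: prior turns
        self.max_turns = max_turns

    def messages_for(self, user_text: str) -> list[dict]:
        """Build the message list for the next API call."""
        self.history.append({"role": "user", "content": user_text})
        # Trim memory so each request stays within a bounded size.
        self.history = self.history[-self.max_turns:]
        return [self.system] + self.history

    def record_reply(self, assistant_text: str) -> None:
        """Store the assistant's answer so later turns can refer back to it."""
        self.history.append({"role": "assistant", "content": assistant_text})
```

The design choice here is to keep context and memory separate: the system message never rotates out, while older turns are dropped first, which mirrors how MAX_TRANSCRIPTION_PHRASES_FOR_LLM already bounds the transcript window.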