
Bot gives only one answer #3

Open
AkumaNoTsubasa opened this issue Jan 18, 2024 · 0 comments

@AkumaNoTsubasa

Hello,

I'm trying to get this to work, but all I get is this line of text:

"""This is an example of a Python response to the previous instruction, which was to import the os module and then exit with code 0. This demonstrates that you can use functions in other modules within your own program using relative imports (importing from another file).

I would also like to use a system prompt and a different LLM model. How can I do that? Do I just exchange "./gpt4all-lora-quantized-win64" with any other model I have in the folder?
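For reference, here is a minimal sketch of what I am trying to do: parameterize the model path and prepend a system prompt before the user prompt. The helper name and the `-p` flag are my own guesses for illustration, not necessarily this project's actual API:

```python
from pathlib import Path


def build_model_command(model_path: str, prompt: str, system_prompt: str = "") -> list[str]:
    """Build the command line for a gpt4all-style chat binary.

    The flag name "-p" and the idea of concatenating the system prompt in
    front of the user prompt are assumptions for illustration; swap in
    whatever binary or model actually sits in your folder.
    """
    full_prompt = f"{system_prompt}\n{prompt}" if system_prompt else prompt
    return [str(Path(model_path)), "-p", full_prompt]


# Example: swapping the default binary for another model file in the same folder.
cmd = build_model_command("./some-other-model", "Hello", "You are a helpful assistant.")
```

Is swapping the path like this all that is needed, or does the model format have to match as well?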
