
Optimizing Dynamic Prompts #9

Open
Anchaliya75 opened this issue Dec 19, 2024 · 1 comment

Comments

@Anchaliya75

Description

Can we optimize prompts that are dynamic, i.e., prompts that contain variables?
An example prompt:

"""
You are an AI chatbot. You must never answer/respond to the off-topic categories given here:
{{off-topics-list}}
A description of each off-topic is also in the list:
{{off-topic-description}}
Generate a response based on the given
{{query}}
"""

Can we optimize these kinds of prompts? Are there any checks that will ensure the LLM doesn't change any of these variables?

@deathsaber

Hi. I had this same use case and handled it by replacing the variables with temporary placeholders, then, in the fine-tuned prompt, putting the variables back where they belonged.
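A minimal sketch of that placeholder approach (the helper names and the `__VAR_n__` token format are my own assumptions, not from any particular library): mask each `{{variable}}` with an opaque token before optimization, restore the originals afterwards, and add a check that the optimizer preserved every variable.

```python
import re

# Matches {{variable}} markers, including hyphenated names like {{off-topics-list}}.
VAR_PATTERN = re.compile(r"\{\{([\w-]+)\}\}")

def mask_variables(prompt: str):
    """Replace each {{variable}} with an opaque token the optimizer is
    unlikely to rewrite; return the masked prompt and a token map."""
    mapping = {}

    def _sub(match):
        token = f"__VAR_{len(mapping)}__"
        mapping[token] = match.group(0)  # remember the original {{name}}
        return token

    return VAR_PATTERN.sub(_sub, prompt), mapping

def unmask_variables(prompt: str, mapping: dict) -> str:
    """Put the original {{variable}} markers back after optimization."""
    for token, original in mapping.items():
        prompt = prompt.replace(token, original)
    return prompt

def variables_intact(original: str, optimized: str) -> bool:
    """The 'check' asked about above: verify the optimized prompt still
    contains exactly the same set of {{variable}} markers."""
    return sorted(VAR_PATTERN.findall(original)) == sorted(VAR_PATTERN.findall(optimized))
```

For example, `mask_variables("Answer {{query}} avoiding {{off-topics-list}}")` yields a prompt with `__VAR_0__` and `__VAR_1__` in place of the variables; after the optimizer runs, `unmask_variables` restores them, and `variables_intact` can reject any optimized prompt that dropped or altered a marker.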
