
Question about the input size #6

Open
614TChen opened this issue May 16, 2023 · 1 comment
Comments

@614TChen

Hi, interesting paper and great work! I tried to run the code but I ran into an input-length problem. When I run the schema linking prompt to get the schema linkings, the entire prompt is too long to feed to the model. Just wondering if I'm running the code wrongly. By the way, I get this issue on both GPT-3.5 Turbo and Vicuna-13B.

@MohammadrezaPourreza
Owner

Hi, thank you so much for your comment on our paper. This is due to the small context window of the models you have chosen. The prompts we are using have around 6000 tokens, so if you want to keep the prompts as is, you should use models with larger context windows like Codex and GPT-4. Another solution is to reduce the size of the prompts.
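One way to catch this before the API rejects a request is to estimate the prompt's token count against the model's context limit. A minimal sketch (the 4-characters-per-token ratio and the limits below are rough assumptions, not exact figures; a real tokenizer such as tiktoken gives exact counts):

```python
# Rough pre-flight check that a prompt fits a model's context window.
# The limits and the chars-per-token ratio are approximations.
CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4096,   # approximate; varies by model revision
    "gpt-4": 8192,
    "vicuna-13b": 2048,
}

def estimate_tokens(text: str) -> int:
    """Very rough heuristic: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_context(prompt: str, model: str, completion_budget: int = 512) -> bool:
    """True if the estimated prompt tokens plus a reserved completion
    budget fit inside the model's context window."""
    return estimate_tokens(prompt) + completion_budget <= CONTEXT_LIMITS[model]
```

With a ~6000-token schema linking prompt, `fits_context` would flag `gpt-3.5-turbo` and `vicuna-13b` but pass `gpt-4`, which matches the behavior described above.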
