Can anyone help me identify why my ConversationalRetrievalQAChain is replying with "I don't know" or "the information is not provided"? I can see it pulling the relevant docs in my terminal, but it doesn't seem to use that information. Any help would be greatly appreciated!
import { PineconeClient } from "@pinecone-database/pinecone";
import { LangChainStream, StreamingTextResponse } from "ai";
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { PromptTemplate } from "langchain/prompts";
import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { AIChatMessage, HumanChatMessage, SystemChatMessage } from "langchain/schema";
import { PineconeStore } from "langchain/vectorstores/pinecone";
import z from "zod";
const ChatSchema = z.object({
  messages: z.array(
    z.object({
      role: z.enum(["system", "user", "assistant"]),
      content: z.string(),
      id: z.string().optional(),
      createdAt: z.date().optional(),
    })
  ),
});
export const runtime = "edge";

let pinecone: PineconeClient | null = null;

const initPineconeClient = async () => {
  pinecone = new PineconeClient();
  await pinecone.init({
    apiKey: process.env.PINECONE_API_KEY!,
    environment: process.env.PINECONE_ENVIRONMENT!,
  });
};
export async function POST(req: Request) {
  console.log("POST request received");

  const QA_PROMPT = PromptTemplate.fromTemplate(
    `Ignore all previous instructions. I want you to act as a document that I am having a conversation with. Only provide responses based on the given information. Never break character.
Question: {question}
=========
{context}
=========
Answer in Markdown:`
  );

  const body = await req.json();
  console.log("Request body:", body);
  try {
    const { messages } = ChatSchema.parse(body);
    console.log("Parsed messages:", messages);
  } catch (error) {
    console.log("Error in POST handler:", error);
  }
}
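For context, the handler above stops after parsing the request body. Below is a minimal sketch of how the chain would typically be built and invoked with the custom prompt; the PINECONE_INDEX_NAME variable, the chat-history formatting, and the plain non-streaming response are assumptions for illustration, not the original code.

  const { messages } = ChatSchema.parse(body);

  // Connect to the existing Pinecone index and wrap it in a LangChain vector store.
  // The index name env var is an assumption.
  await initPineconeClient();
  const pineconeIndex = pinecone!.Index(process.env.PINECONE_INDEX_NAME!);
  const vectorStore = await PineconeStore.fromExistingIndex(new OpenAIEmbeddings(), {
    pineconeIndex,
  });

  // Build the chain. The custom prompt only takes effect if it is passed in here
  // (qaTemplate expects the raw template string); otherwise the default QA prompt is used.
  const model = new OpenAI({ temperature: 0 });
  const chain = ConversationalRetrievalQAChain.fromLLM(model, vectorStore.asRetriever(), {
    qaTemplate: QA_PROMPT.template,
    returnSourceDocuments: true,
  });

  // Treat the last message as the question and the earlier messages as chat history.
  const question = messages[messages.length - 1].content;
  const chatHistory = messages
    .slice(0, -1)
    .map((m) => `${m.role}: ${m.content}`)
    .join("\n");

  const result = await chain.call({ question, chat_history: chatHistory });
  return new Response(result.text);

A common gotcha with ConversationalRetrievalQAChain is that defining QA_PROMPT without wiring it into fromLLM leaves the default QA prompt in place, and that default explicitly tells the model to say it doesn't know when unsure, which matches the behaviour described above.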