Similar issue using a locally hosted Ollama.
[ERROR] {'api_name': 'extract', 'message': 'Failed To Process File: A_Compact_GuideTo_RAG.pdf or LLM Unable To Parse Content', 'file_created_at': '2025-02-28 09:18:43', 'error_message': "Ollama call failed with status code 400. Details: <bound method ClientResponse.text of <ClientResponse(http://localhost:11434/api/chat) [400 Bad Request]>
<CIMultiDictProxy('Content-Type': 'application/json; charset=utf-8', 'Date': 'Fri, 28 Feb 2025 17:28:28 GMT', 'Content-Length': '29')>
>", 'file_name': 'A_Compact_GuideTo_RAG.pdf', 'status': 'Failed', 'db_url': 'neo4j://localhost:7687', 'userName': 'neo4j', 'database': 'neo4j', 'failed_count': 1, 'source_typ
INFO: 127.0.0.1:51052 - "GET /health HTTP/1.1" 200 OK
The graph is created with Chunk nodes, but after that the chunks cannot be fed back to the LLM for further breaking down.
The error occurs in the aprocess_response function in langchain_experimental/graph_transformers/llm.py. How can this be fixed?
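The traceback above only shows an un-awaited ClientResponse.text coroutine, so the actual 400 error body from Ollama is hidden. As a debugging step (a sketch, not the project's code), one can POST the same request to the local Ollama /api/chat endpoint directly and read the response body; the model name below is an assumption, substitute whatever model the graph builder is configured with:

```python
import json


def build_ollama_chat_payload(model: str, content: str) -> str:
    """Build the JSON body for a manual POST to http://localhost:11434/api/chat.

    Hypothetical helper for reproducing the 400 outside the app so the
    real error message (hidden behind the un-awaited coroutine in the
    traceback) can be inspected.
    """
    payload = {
        "model": model,  # assumption: replace with the model configured in the app
        "messages": [{"role": "user", "content": content}],
        "stream": False,  # single JSON response instead of a stream
    }
    return json.dumps(payload)
```

Sending this body with curl or requests and printing the response text should reveal why Ollama rejects the request (for example, a model that is not pulled locally, or a request feature the model does not support).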