
Using Ollama with deepseek-r1:1.5b, can't generate graph #1123

Open
qpb8023 opened this issue Feb 24, 2025 · 1 comment
Labels
bug Something isn't working

Comments

qpb8023 commented Feb 24, 2025

The error occurs in the aprocess_response function in langchain_experimental/graph_transformers/llm.py.
How can this be fixed?
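A possible workaround, not an official fix: small models like deepseek-r1:1.5b sometimes emit a nested dict (e.g. {'涉及人物': {'描述': ...}}) where the Relationship schema expects a plain string for the relation type, which is exactly what the Pydantic error below complains about. One defensive option is to coerce such values to a string before they reach the model constructor. The function name here is hypothetical, chosen for illustration; it is not part of langchain_experimental.

```python
def coerce_relation_type(value):
    """Flatten an LLM-produced relation value into the plain string
    that the Pydantic Relationship model requires.

    - strings pass through unchanged
    - non-empty dicts collapse to their first key,
      e.g. {'涉及人物': {'描述': ...}} -> '涉及人物'
    - anything else is stringified as a last resort
    """
    if isinstance(value, str):
        return value
    if isinstance(value, dict) and value:
        return str(next(iter(value)))
    return str(value)
```

Applying a sanitizer like this to the parsed response (before Relationship(...) is instantiated) would avoid the crash, at the cost of discarding the nested detail the model hallucinated. A more robust fix is to use a larger model that follows the output schema reliably.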

-----
{'head': '作者', 'head_type': '相关人物', 'relation': {'涉及人物': {'描述': '与相关的人物角色有关'}}, 'tail': '研究成果', 'tail_type': '成果'}
-----
2025-02-24 17:26:41,801 - Deleted File Path: /Users/tiutiutiu/Desktop/llm-graph-builder/backend/merged_files/《教育心理学》.pdf and Deleted File Name : 《教育心理学》.pdf
[ERROR]{'api_name': 'extract', 'message': 'Failed To Process File:《教育心理学》.pdf or LLM Unable To Parse Content ', 'file_created_at': '2025-02-24 16:21:15 ', 'error_message': "1 validation error for Relationship\ntype\n  Input should be a valid string [type=string_type, input_value={'涉及人物': {'描述. input_type=dict]\n    For further information visit https://errors.pydantic.dev/2.9/v/string_type", 'file_name': '《教育心理学》.pdf', 'status': 'Failed', 'db_url': 'neo4j://192.168.6.6:7688', 'userName': 'neo4j', 'database': 'neo4j', 'failed_count': 1, 'source_type': 'local file', 'source_url': None, 'wiy': None, 'logging_time': '2025-02-24 09:26:41 UTC', 'email': None}
2025-02-24 17:26:41,809 - File Failed in extraction: 1 validation error for Relationship
type
  Input should be a valid string [type=string_type, input_value={'涉及人物': {'描述...的人物角色有关'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/string_type
Traceback (most recent call last):
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/score.py", line 222, in extract_knowledge_graph_from_file
    uri_latency, result = await extract_graph_from_file_local_file(uri, userName, password, database, model, merged_file_path, file_name, allowedNodes, allowedRelationship, token_chunk_size, chunk_overlap, chunks_to_combine, retry_condition, additional_instructions)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/src/main.py", line 238, in extract_graph_from_file_local_file
    return await processing_source(uri, userName, password, database, model, fileName, [], allowedNodes, allowedRelationship, token_chunk_size, chunk_overlap, chunks_to_combine, True, merged_file_path, retry_condition, additional_instructions=additional_instructions)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/src/main.py", line 383, in processing_source
    node_count,rel_count,latency_processed_chunk = await processing_chunks(selected_chunks,graph,uri, userName, password, database,file_name,model,allowedNodes,allowedRelationship,chunks_to_combine,node_count, rel_count, additional_instructions)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/src/main.py", line 478, in processing_chunks
    graph_documents =  await get_graph_from_llm(model, chunkId_chunkDoc_list, allowedNodes, allowedRelationship, chunks_to_combine, additional_instructions)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/src/llm.py", line 225, in get_graph_from_llm
    graph_document_list = await get_graph_document_list(
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/src/llm.py", line 208, in get_graph_document_list
    graph_document_list = await llm_transformer.aconvert_to_graph_documents(combined_chunk_document_list)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/venv/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py", line 1034, in aconvert_to_graph_documents
    results = await asyncio.gather(*tasks)
  File "/Users/tiutiutiu/.pyenv/versions/3.10.16/lib/python3.10/asyncio/tasks.py", line 304, in __wakeup
    future.result()
  File "/Users/tiutiutiu/.pyenv/versions/3.10.16/lib/python3.10/asyncio/tasks.py", line 232, in __step
    result = coro.send(None)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/venv/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py", line 977, in aprocess_response
    Relationship(
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/venv/lib/python3.10/site-packages/langchain_core/load/serializable.py", line 125, in __init__
    super().__init__(*args, **kwargs)
  File "/Users/tiutiutiu/Desktop/llm-graph-builder/backend/venv/lib/python3.10/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for Relationship
type
  Input should be a valid string [type=string_type, input_value={'涉及人物': {'描述...的人物角色有关'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/string_type

qpb8023 added the bug label on Feb 24, 2025
@seemakurthy

Similar issue using a locally hosted Ollama.

[ERROR]{'api_name': 'extract', 'message': 'Failed To Process File:A_Compact_GuideTo_RAG.pdf or LLM Unable To Parse Content ', 'file_created_at': '2025-02-28 09:18:43 ', 'error_message': "Ollama call failed with status code 400. Details: <bound method ClientResponse.text of <ClientResponse(http://localhost:11434/api/chat) [400 Bad Request]>\n<CIMultiDictProxy('Content-Type': 'application/json; charset=utf-8', 'Date': 'Fri, 28 Feb 2025 17:28:28 GMT', 'Content-Length': '29')>\n>", 'file_name': 'A_Compact_GuideTo_RAG.pdf', 'status': 'Failed', 'db_url': 'neo4j://localhost:7687', 'userName': 'neo4j', 'database': 'neo4j', 'failed_count': 1, 'source_typ
INFO: 127.0.0.1:51052 - "GET /health HTTP/1.1" 200 OK

The graph is created with Chunk nodes, but after that it is not able to feed them back to the LLM for further breakdown.
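A diagnostic sketch, not a fix: a 400 Bad Request from Ollama's /api/chat usually means the request body is malformed or carries an option the model rejects. Building the minimal valid payload by hand and sending it directly helps isolate whether the problem is in the payload the graph builder constructs or in the model itself. The URL and model name below are assumptions taken from the logs in this thread.

```python
import json

# Minimal /api/chat request body per the Ollama REST API:
# a model name, a list of messages, and stream disabled.
payload = {
    "model": "deepseek-r1:1.5b",
    "messages": [{"role": "user", "content": "Reply with the word: ok"}],
    "stream": False,
}
body = json.dumps(payload)

# Send it directly, bypassing the graph builder, e.g.:
#   curl -s http://localhost:11434/api/chat -d '<body>'
# A 200 here but a 400 from the app points at the app's request construction.
```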
