
Fixed system message and custom chat history
Maximilian-Winter committed May 20, 2024
1 parent 7e4a6da commit 94c8afd
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "llama-cpp-agent"
-version = "0.2.9"
+version = "0.2.10"
 description = "A framework for building LLM based AI agents with llama.cpp."
 
 readme = "ReadMe.md"
4 changes: 2 additions & 2 deletions src/llama_cpp_agent/llm_agent.py
@@ -382,9 +382,9 @@ def get_response_role_and_completion(
     ):
         if len(chat_history.get_chat_messages()) == 0:
             if system_prompt:
-                chat_history.add_message({"role": "system", "content": system_prompt})
+                chat_history.add_message({"role": Roles.system, "content": system_prompt})
             else:
-                chat_history.add_message({"role": "system", "content": self.system_prompt})
+                chat_history.add_message({"role": Roles.system, "content": self.system_prompt})
 
         if message is not None and add_message_to_chat_history:
             chat_history.add_message(
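The change swaps the raw string "system" for an enum member when seeding the system message, so a custom chat history that type-checks roles accepts it. A minimal sketch of the pattern, assuming illustrative names (the `Roles` and `ChatHistory` below are simplified stand-ins, not the library's exact API):

```python
from enum import Enum


class Roles(Enum):
    """Illustrative role enum, mirroring the Roles.system usage in the diff."""
    system = "system"
    user = "user"
    assistant = "assistant"


class ChatHistory:
    """A custom history that validates roles, which is why raw strings broke."""

    def __init__(self):
        self.messages = []

    def add_message(self, message):
        # Reject plain strings so every message carries a well-typed role.
        if not isinstance(message["role"], Roles):
            raise TypeError("role must be a Roles member, not a str")
        self.messages.append(message)


history = ChatHistory()
# With the fix, the enum member is passed, so validation succeeds:
history.add_message({"role": Roles.system, "content": "You are a helpful assistant."})
```

Before the fix, passing `{"role": "system", ...}` to such a history would raise the `TypeError` above; using the enum member keeps string serialization available via `Roles.system.value`.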
